JavaScript tracking punks given a thrashing by good old-fashioned server log analytics

Netlify dodges the blockers by going to the source

Netlify this week whipped the covers off its take on dealing with the rise of ad blockers in analytics – do it on the server.

With many analytics tools relying on tracking pixels, cookies or JavaScript in the webpage, the arms race in blocking the things – either unintentionally through ad-blockers or intentionally because of privacy concerns – is proving a headache for website operators.

The solution is, Netlify told London's JAMstack conference, to use all those log files generated on the web server itself. After all, unless you have more nefarious aims in mind, that information should be all developers need to keep things humming along. Marketeers, however, may be in for a disappointment.

The first cut of the system will track the usual things – page views, unique visitors, bandwidth, top pages and, most usefully, where resources are not found (such as a missing image returning a 404).

It seems like obvious stuff – one would hope that a missing resource would show up somewhere in testing as part of CI/CD, but the number of borked sites out there indicates that this is not the case.
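Netlify hasn't published the internals of its pipeline, but the metrics it lists are exactly the sort of thing a standard access log yields. As a minimal sketch – assuming logs in the common Combined Log Format, not Netlify's actual format – the aggregation might look like this:

```python
import re
from collections import Counter

# Combined Log Format, e.g.:
# 203.0.113.7 - - [12/Jul/2022:10:00:00 +0000] "GET /img/logo.png HTTP/1.1" 404 153 "-" "Mozilla/5.0"
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def summarize(lines):
    """Aggregate the basics: page views, unique visitors, bandwidth, top pages, 404s."""
    views = Counter()      # hits per path
    missing = Counter()    # 404s per path
    visitors = set()       # unique client IPs
    bandwidth = 0          # total bytes served
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue       # skip malformed lines
        visitors.add(m["ip"])
        views[m["path"]] += 1
        if m["bytes"] != "-":
            bandwidth += int(m["bytes"])
        if m["status"] == "404":
            missing[m["path"]] += 1
    return {
        "page_views": sum(views.values()),
        "unique_visitors": len(visitors),
        "bandwidth_bytes": bandwidth,
        "top_pages": views.most_common(5),
        "not_found": missing.most_common(5),
    }
```

Nothing here touches the visitor's browser, which is the whole point – the blockers never get a say.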

Of course, Netlify is by no means the only game in town when it comes to ingesting analytics data from its edge server nodes. That log data has always been lurking around, but shoving the analytics onto the client makes things that bit more personal. For those who want to use the server log files, there are competing tools such as Matomo.

But the likes of Matomo are heavy on the features and costly. While Matomo's "Essential" package starts at around $59/month for 300,000 page views, Netlify's offering is $9/month for 250,000 page views.

That $9 is on top of whatever you're paying Netlify at the moment, of course, and you have to join the 600,000 or so users of the web-hosting-for-dummies platform. Matomo, on the other hand, is a tad more cross-platform.

Still, for Netlify users (and there seem to be quite a few opting for the outfit's take on multicloud-hosting-without-the-tears), the new functionality is an intriguing alternative to JavaScript tracking, with the added bonus of GDPR compliance.

Certainly, privacy is a hot topic these days, and the gang hopes that by merely capturing the IP address of a user and not sharing it in the analytics tool, such issues are skipped over. Netlify keeps those IPs in its logs in order to fight off DDoS attacks and such.
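Netlify hasn't detailed how it keeps those IPs out of the analytics, but a common technique elsewhere is to truncate addresses before aggregation – zeroing the host bits so no individual machine can be identified. A sketch of that approach (the /24 and /48 prefix lengths are a typical convention, not Netlify's stated practice):

```python
import ipaddress

def anonymize(ip: str) -> str:
    """Zero the host bits of an address so it no longer identifies an individual.

    Keeps a /24 for IPv4 and a /48 for IPv6 - enough for coarse geography,
    not enough to single out one visitor.
    """
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)
```

Run the logs through something like this before counting, and the analytics side never sees a full address at all.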

Another downside is that by using the logs for analytics, there is no simple way for a user to opt out. It's up to the website operator to make such options available.

Logs are kept for 30 days. To be frank, with just an IP address – likely assigned dynamically by an ISP or masking an entire organisation of users – individual tracking to the level enjoyed by client-side scripts is going to be difficult with this tool, something the Netlify team acknowledged when we spoke with them earlier this week.

Netlify CEO Mathias Biilmann Christensen and president Christian Bach told The Register that plans were afoot to make future versions a little more granular in order to track a user's path through a site, as well as introduce alerts and notifications when thresholds are reached.

In the meantime, the function, which works from data slurped from the logs once an hour, shows its metrics in dashboards within the Netlify suite and requires only that a developer enable it (and start paying) in order to use it. Since it pulls from logs, there is also no performance impact.

It's a neat tool, giving developers a handy insight into how things are running. The inability to trace a user's path will, however, annoy some. And some organisations will miss the ability to festoon a user's browser with tracking JavaScript.

And that, to be honest, is probably no bad thing. ®
