Google flu-finding service diagnosed with 'big data hubris'

Bad data contagion overwhelms prediction service


A paper in Science claims that Google Flu Trends, unveiled back in 2008 to become a poster-child of Big Data, has one teeny, tiny, fatal flaw: it's almost always wrong.

The paper – abstract here – finds that not only did Flu Trends completely miss the 2009 swine flu outbreak, but that for 100 of the 108 weeks since 2011, Google's predictions of influenza activity were simply wrong, and the reason is simple: nearly everybody thinks the slightest sniffle means they have influenza.

That's quite at odds with how Google described the project when it was first unveiled. As noted by Johns Hopkins professor Steven Salzberg here, the Chocolate Factory originally claimed that “we can accurately estimate the current level of weekly influenza activity in each region of the United States, with a reporting lag of about one day”.

As Salzberg notes, for the most recent week in which the US Centers for Disease Control and Prevention (CDC) has published data, only 8.8 per cent of specimens sent for testing came back positive for influenza.

Talking to the Science podcast, here, researcher David Lazer of Northeastern University in Boston (and lead author of the paper) suggests one problem is that people – including highly-trained Oompa Loompas – love pattern matching, and that Google “overfit” the data.

“They ... overfit the data. They had fifty million search terms, and they found some that happened to fit the frequency of the 'flu' over the preceding decade or so, but really they were getting idiosyncratic terms that were peaking in the winter at the time the 'flu' peaks … but wasn't driven by the fact that people were actually sick with the 'flu',” he tells the podcast.
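
To see why screening tens of millions of candidate queries against a seasonal curve produces exactly that kind of spurious fit, consider the minimal Python sketch below. It is purely illustrative – the numbers and terms are invented and it bears no relation to Google's actual model – but it shows how the best-correlated series picked out of a huge random pool can track historical 'flu' seasons nicely while knowing nothing about illness, then fall apart when an outbreak (like 2009's summer-peaking swine flu) arrives off-season.

```python
# Illustrative sketch only -- not Google's model. Screening a large pool of
# candidate "search term" series against a seasonal flu curve will always
# turn up terms that fit history well without carrying any causal signal.
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(520)                                    # ~10 years, weekly
flu = np.clip(np.sin(2 * np.pi * weeks / 52), 0, None) + 0.1 * rng.normal(size=weeks.size)

# 50,000 hypothetical query series: mostly noise, a few with seasonal peaks
candidates = rng.normal(size=(50_000, weeks.size))
seasonal = rng.random(50_000) < 0.02
phase = rng.normal(0, 0.3, seasonal.sum())[:, None]
candidates[seasonal] += 2 * np.clip(np.sin(2 * np.pi * weeks / 52 + phase), 0, None)

# Keep the single best-correlated term on the "training" decade
corrs = np.array([np.corrcoef(c, flu)[0, 1] for c in candidates])
best = candidates[corrs.argmax()]
print(f"best in-sample correlation: {corrs.max():.2f}")  # comfortably the best of the pool

# An off-season outbreak breaks the relationship: the chosen "predictor"
# still peaks in winter, because that is all it was ever tracking.
summer_outbreak = np.exp(-((weeks % 52 - 26) ** 2) / 20)
print(f"correlation with a summer outbreak: {np.corrcoef(best, summer_outbreak)[0, 1]:.2f}")
```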

Having noted that this caused errors, Google pushed a fix into Flu Trends, but since 2011 the system has consistently overestimated the number of 'flu' cases.

Calling this "big data hubris", Lazer says there were "certain assumptions baked into the analysis that doomed it in the long run". For example, he says, Google Flu Trends assumed a stable relationship between search terms and the incidence of influenza, which hasn't been the case.
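
A toy example of how that assumption bites: the sketch below (again purely illustrative, with invented rates and population figures) calibrates a naive query-counting estimator at a moment when only genuinely ill people search for 'flu', then shows what happens to its estimates once search behaviour shifts and healthy users start generating 'flu' queries too.

```python
# Toy illustration, not Flu Trends code: a query-volume estimator calibrated
# under one search behaviour silently overestimates once that behaviour shifts.
population = 1_000_000
organic_rate = 0.05       # fraction of genuinely ill users who search for 'flu'
nudge_rate = 0.0          # fraction of ALL users steered into a 'flu' query

def estimate_cases(true_cases: int) -> float:
    """Naive model: treat every 'flu' query as evidence of illness."""
    queries = true_cases * organic_rate + population * nudge_rate
    return queries / organic_rate      # calibration assumes nudge_rate == 0

print(estimate_cases(10_000))          # 10000.0 -- spot on at calibration time

# Search behaviour changes (media coverage, autocomplete, suggested searches),
# so healthy users now generate 'flu' queries as well.
nudge_rate = 0.0005
print(estimate_cases(10_000))          # 20000.0 -- double the true incidence
```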

Google's own search algorithms, which route someone from a 'flu' search to a suitable product, also play a part, Lazer says: Google search creates a kind of feedback loop that Google Flu Trends mistakenly interprets as an outbreak. ®
