Big Data is bovine excrement says Obama's Big Data man

You probably have 'medium data' and are paranoid about privacy, says CTO


Big Data is “bullshit”, says Harper Reed, Chief Technology Officer at Obama for America 2012.

Speaking at the CeBIT conference in Sydney, Australia, today, Reed said he first encountered the term “Big Data” in 2007, when it referred to a storage problem.

“We used it in 2007 because it was hard to store data,” he said. “People who did it were doing it well.” But not everyone was doing it well, knew how to, or had the tools to do so. Six years later, the likes of Hadoop and HBase mean storing and preparing large volumes of data for analysis are no longer difficult problems, he said, but the term “Big Data” persists.

Reed would rather it had not come into such wide use, if only because he feels most people contemplating an investment in “Big Data” probably don't have enough data to qualify for the term as he understands it.

Rayid Ghani, who shared the Sydney stage and worked alongside Reed as Chief Data Scientist on the Obama 2012 campaign, agreed, revealing that his personal domestic storage rig holds more data than he used in the election campaign.

That revelation led Reed to opine that few of the attendees at the event have enough data to qualify as “Big”.

“You probably have medium data,” he said, adding that the term “Big Data” has now come to represent analytics tools rather than the data itself.

Even if Big or Medium Data contains personal information, Ghani said, analysts probably wouldn't use it to create eerily well-targeted offers.

“Data on what car you drive was not very useful in the campaign,” he said. “We did not use that much private data.” More useful, Reed said, were simple data points such as the response to the question “do you support the President?” With an answer to that question, plus information on whether an individual had voted in the past, the Obama campaign could identify a voter as someone worth its attention.
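To make that idea concrete, the sketch below is a purely hypothetical illustration of how a campaign might rank voters for outreach using only those two data points. The field names, categories, and weighting logic are invented for this example and are not drawn from the Obama campaign's actual tooling.

```python
# Hypothetical sketch only: ranking voters for outreach from two simple
# data points (self-reported support and past turnout), as described above.
# All names and rules here are illustrative assumptions, not campaign code.
from dataclasses import dataclass

@dataclass
class Voter:
    supports_president: bool  # answer to "do you support the President?"
    voted_before: bool        # any record of past turnout

def outreach_priority(v: Voter) -> str:
    """Classify a voter for campaign attention using only two data points."""
    if v.supports_president and not v.voted_before:
        return "high"   # supportive but may not turn out: worth contacting
    if v.supports_president and v.voted_before:
        return "low"    # already likely to vote the desired way
    return "none"       # not a supporter: skip

# Example: a supporter with no turnout history is flagged for attention.
print(outreach_priority(Voter(supports_president=True, voted_before=False)))  # "high"
```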

Reed also cautioned old people – anyone over 25 in his big-beard-chunky-earrings-and-thick-framed-glasses world – not to panic on the topic of privacy. Oldsters are uneasy with the notion that Facebook et al mine their data, he said. Young folk have no such qualms: they understand the transactions they participate in and are more familiar with the privacy controls of the services they use.

Oldsters, he said, haven't bothered to learn how to operate those privacy controls and as a result assume their privacy worries are universal.

“Let's not project our fears onto others,” he said. ®

