Hadoop experiencing growing pains in lamestream businesses

Platform little more than 'skunkworks' outside tech industries


Apache Big Data

Fast, unbridled growth has hurt adoption of Hadoop, according to a leading advocate of the technology.

John Mertic, director of program management at ODPi, said that work on Hadoop was often relegated to a "skunkworks" project in many mainstream organisations. "It's effectively stuck," he said during a keynote presentation at the Apache Big Data conference in Seville, Spain, on Wednesday.

Mertic drew a distinction between companies whose business is built around technology platforms, such as Netflix, which are pushing ahead with the technology, and more mainstream financial services and insurance firms. The latter struggle with practical problems such as how to deploy Hadoop across 100-server clusters while maintaining existing policies and controls.

Hadoop offers a unified way of managing data and introducing emerging technologies such as machine learning but controls "aren't enterprise production ready" and tools "aren't straightforward", according to Mertic.

"Hadoop is made up of an ecosystem of 30-40 projects with different support levels and maturity," Mertic told El Reg.

Moving out of the skunkworks cul-de-sac in mainstream companies involves coming up with convincing use cases for the technology that management will be prepared to back. Once adopted, it lets an organisation make full use of the data at its disposal, improving both pre-sales and post-sales performance by making the business more agile and data-driven, an approach that suits firms in industries ranging from banking to retail and beyond.

Aron Sogor, senior director of engineering at Big Data analytics firm Cloudera, said it was much easier for tech staff at the likes of Netflix to show return on investment or increased sales from Big Data technologies than it is for similar workers at banks to justify investment. Adoption of Big Data technologies is more dependent on how fast an organisation is growing than on what industry sector it operates within, he added. ®
