Apache Big Data

Fast, unbridled growth has hurt adoption of Hadoop, according to a leading advocate of the technology.
John Mertic, director of program management at ODPi, said that work on Hadoop was often relegated to a "skunkworks" project in many mainstream organisations. "It's effectively stuck," he said during a keynote presentation at the Apache Big Data conference in Seville, Spain, on Wednesday.
Mertic distinguished between companies whose business is built around technology platforms, such as Netflix, which are pushing ahead with the technology, and more mainstream financial services and insurance firms. The latter struggle with practical problems, such as how to deploy Hadoop across 100-server clusters while maintaining existing policies and controls.
Hadoop offers a unified way of managing data and introducing emerging technologies such as machine learning but controls "aren't enterprise production ready" and tools "aren't straightforward", according to Mertic.
"Hadoop is made up of an ecosystem of 30-40 projects with different support levels and maturity," Mertic told El Reg.
Moving out of the skunkworks cul-de-sac in mainstream companies involves coming up with convincing technology use cases that management will be prepared to back. If adopted, the technology offers an organisation the ability to make full use of the data at its disposal, improving both pre-sales and post-sales performance by making the business more agile and data-driven. That approach suits firms in industries ranging from banking to retail and beyond.
Aron Sogor, senior director of engineering at Big Data analytics firm Cloudera, said it was much easier for tech staff at the likes of Netflix to show return on investment or increased sales from Big Data technologies than it is for similar workers at banks to justify investment. Adoption of Big Data technologies is more dependent on how fast an organisation is growing than on what industry sector it operates within, he added. ®