Big Data skills gap needs filling says tech industry

Want a job for life? Get good at data science

Oracle OpenWorld Concern is growing in the US technology industry about a skills gap in education and training in the field of data analysis.

"Last 50 years arguably have been about computer science," said Jeremy Burton, VP of product operations for EMC, during his Monday keynote at Oracle OpenWorld. "The next 50 years are going to be about data science, people who understand the semantics of data, how to visualize the data and present it to business people."

LinkedIn employs around 100 data scientists, he said, but the industry needs many more and there are job vacancies aplenty. His concerns were echoed on Wednesday with the release of a report on big data use in government from the non-profit TechAmerica Foundation's Big Data Commission.

"There are millions of technical jobs in the US market going unfilled," commission co-chair and GM of Database & Technology at SAP Steve Lucas told The Register. Part of the reason is that we need people who are better equipped to understand large data sets and finding new data sources."

"It's not just computer science, not just engineering and not just mathematics – you need elements from all of them to educate people on big data use."

The report recommends setting up an IT Leadership Academy to promote such skills, and calls for training facilities to build Big Data understanding into technology curricula. Companies and government departments would also benefit from creating the role of chief data officer, responsible for managing and understanding internal and external data streams.

The Obama administration set aside $200m for Big Data projects last year and there are some areas of government that are getting the message, with the report citing the Department of Defense as a good example. But in general, government has lagged behind commerce in the area, Lucas said.

While government has proven good at collecting and storing data, it's less adept at getting value from it, according to the report. In 2009, the government produced 848 petabytes of data, and US healthcare data alone reached 150 exabytes. At this rate, healthcare data will soon reach zettabyte (10^21 bytes) scale, and eventually yottabytes (10^24 bytes).

If this data were mined intelligently, government could cut costs and add more value to the economy, Lucas asserted, and while there is support among Federal departments, there is still a long way to go. ®
