Machine learning newbs: TensorFlow too hard? Kick its ass with Keras

New version 2 integrates better with Google's tough but essential software library


Keras, a popular deep learning library, has been updated with a new API to make it easier for developers to use machine learning in Python.

Artificial intelligence is all the rage right now and techies are keen to explore ways they can use machine learning. But it’s not that easy – especially for coders with little knowledge of neural networks.

Built in 2015 by Google software engineer and AI researcher François Chollet, Keras was designed to be used on top of TensorFlow and Theano – open-source software libraries developed by machine learning researchers at Google and the University of Montreal, Canada.

The update, dubbed Keras 2, redesigns the API to align more closely with TensorFlow's, allowing developers to mix and match TensorFlow and Keras components. And since Keras runs on top of TensorFlow and Theano, there is no performance cost to using it compared with the more complex frameworks underneath.

Keras is more specialized for deep learning than TensorFlow or Theano. It’s “higher-level” and “abstracts away a lot of details that most users don’t need to know about,” Chollet explained to The Register.

Instead of wrangling many lines of low-level code, developers can define deep learning models directly, customizing their own neural nets by clipping together different components, or "layers."

It provides a way for researchers to quickly try out different configurations of their models, reducing the time it takes to set up new experiments.
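The layer-clipping style described above looks roughly like this in Keras's Sequential API. This is a minimal sketch, assuming a working TensorFlow/Keras installation; the layer sizes and activations are illustrative, not taken from the article.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Stack layers like building blocks: each entry adds one "layer"
# to the network, with no graph plumbing required.
model = keras.Sequential([
    keras.Input(shape=(100,)),                 # 100-dimensional input
    layers.Dense(64, activation="relu"),       # hidden layer
    layers.Dense(10, activation="softmax"),    # 10-class output
])

# One call wires up the optimizer and loss for training.
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

Swapping a layer or changing a size is a one-line edit, which is what makes trying out different model configurations quick.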

In the space of two years, the number of people using Keras has grown to a hundred thousand.

“Hundreds of people have contributed to the Keras codebase. Many thousands have contributed to the community. Keras has enabled new startups, made researchers more productive, simplified the workflows of engineers at large companies, and opened up deep learning to thousands of people with no prior machine learning experience,” Chollet wrote in a blog post. ®

