Google buys startup biz, slurps up its NLP brains
Nah, not that neuro-bollocks – natural language processing
Google have snapped up API.AI, a Silicon Valley startup specialising in building tools for natural language understanding in mobiles, web applications and devices.
The details of the financial transaction have not been disclosed.
Launched in 2014, API.AI quickly spotted a growing trend: companies wanting to give their technology a voice.
“We’ve been constantly impressed by the fast and energetic adoption of the technology from people building conversational interfaces for chatbots, connected cars, smart home devices, mobile applications, wearables, services, robots and more,” Ilya Gelfenbeyn, CEO of API.AI, said.
The company’s API works in three steps. First, the acoustic signal from speech is translated into text using automatic speech recognition.
Second, its natural language processing kicks in to process the text so the machine can "understand" what the user is saying. The final step is what the developers call “fulfillment” – the machine executes the user’s request.
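The three steps above can be sketched roughly as a pipeline. This is an illustrative toy, not API.AI's actual API: the function names, intent strings and the stubbed speech recogniser are all invented for the example.

```python
def speech_to_text(audio):
    """Step 1: automatic speech recognition (stubbed here as a lookup)."""
    # A real system would run an ASR model over the raw acoustic signal.
    return audio["transcript"]

def understand(text):
    """Step 2: NLP -- map the transcribed text to a structured intent."""
    intents = {
        "turn on the lights": {"intent": "lights.on"},
        "play some music":    {"intent": "music.play"},
    }
    return intents.get(text.lower(), {"intent": "unknown"})

def fulfil(parsed):
    """Step 3: 'fulfillment' -- execute the user's request."""
    actions = {
        "lights.on":  "Lights switched on",
        "music.play": "Music playing",
    }
    return actions.get(parsed["intent"], "Sorry, I didn't catch that")

def handle_request(audio):
    # Chain the three stages: speech -> text -> intent -> action.
    return fulfil(understand(speech_to_text(audio)))

print(handle_request({"transcript": "Turn on the lights"}))
# -> Lights switched on
```

In a production system each stage would be a service call rather than a dictionary lookup, but the shape of the pipeline is the same.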
Many communication applications such as Slack, Facebook Messenger and Kik use API.AI already.
Google’s vice president, Scott Huffman, hasn’t explicitly said what API.AI will be working on, apart from helping “empower [Google’s] developers to continue building great natural language interfaces.”
Perhaps the biggest hint comes from the two companies' shared interest in developing AI and machine learning for natural language processing.
Google unveiled its voice-powered AI assistant, Google Assistant – a rival to Amazon's Echo, Apple's Siri and Microsoft's Cortana – earlier this year.
The tech giant’s AI research hub, Google DeepMind, has also been playing around with computer speech. A recent paper demonstrated its “WaveNet” model, which uses neural networks to make computer voices sound more human. ®