
El Reg drills into chatbot hype: The AIs that want to be your web butlers

So many things to solve – eg, how can there be conversation without memory?

Analysis “Alexa, are you the best chatbot in town?” “Sorry, I don’t understand the question I heard,” she replies.

Alexa doesn’t know. Nobody does. For a while, Apple had the lead with Siri: the virtual assistant first appeared in October 2011 on the iPhone 4S. Fast-forward five and a bit years and now every major tech player has one of these chatbots, or is in the process of developing one.

Amazon and Google have gone head-to-head with Alexa-powered Echo and Google Home speakers – both voice-controlled assistants. Microsoft has Cortana on its computers, and Apple has Siri on its Macs. On mobile phones, Apple will have another rival to deal with, as ex-Siri developers are in the process of building a virtual assistant for Samsung’s handhelds.

With the rise of smartphones, the number of people accessing the internet through desktop computers has plummeted. Figures from StatCounter, a web traffic analysis tool, show that mobile and tablet internet usage (51.3 per cent) has overtaken desktop usage (48.7 per cent) for the first time worldwide. Once upon a time, people used mice and keyboards to access the web, then touchscreens, and next, well, could it be voice? Rather than pull up your favorite news website, you'll simply ask your phone or speaker out loud: what's happening in Linux today?

Couple that with the fact that speech recognition systems now perform slightly better than professional transcriptionists, and the boom in interest in chatbots you can hold a conversation with seems natural. As people move to phones and home assistants, and away from desktops, bots will become the new way a lot of folks access information, query services, request stuff, ask for help, use the internet, control computers, and so on – or at least that's the dream.

“It’s something I call the Big Bot Battle,” Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, told The Register. With web searches being directed through chatbots, the company that builds the most popular robot will be the new gateway to the digital world, Etzioni said, like a “concierge for the internet.”

“The stakes are super high. It’s a trillion-dollar industry. But it’s still the beginning of the race so the jury’s still out,” Etzioni added.

Dan Gailey and Nathan Ross, cofounders of machine-learning startup Radbots, agree.

"Chatbots could totally be a trillion-dollar industry," Gailey told The Register. At Radbots, Gailey and his team are interested in bridging the gap between chatbots and advertising: slipping ads into conversations when the AI feels it's relevant and least likely to irritate the user.

The advantage of using chatbots, at least from the service provider's side, is that adverts can be targeted to match users' needs based on the ongoing conversation. "On the web, adverts are fighting for attention, but on chatbots the user gives their undivided attention," Gailey said. The software analyzes the content of the chatter to work out what to advertise and when, and it's an obvious step towards monetizing chatbots.
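
Radbots hasn't published how its engine decides when an ad is relevant and least likely to irritate, but the general shape – score each user turn against an ad catalogue and only interject past a relevance threshold, and never while the user is waiting on an answer – can be sketched in a few lines of Python. Everything below (the catalogue, the keyword scoring, the threshold) is a hypothetical illustration, not the company's code:

```python
# Hypothetical sketch: decide whether to slip an ad into a chatbot conversation.
# The ad catalogue, keyword scoring and threshold are illustrative assumptions,
# not Radbots' actual system.
import re

AD_CATALOGUE = {
    "coffee_brand": {"coffee", "espresso", "tired", "morning"},
    "travel_deal": {"flight", "holiday", "vacation", "beach"},
}

RELEVANCE_THRESHOLD = 2  # minimum keyword overlap before interrupting the chat


def score_ads(utterance: str) -> list[tuple[str, int]]:
    """Score each ad by how many of its keywords appear in the user's message."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    scores = [(ad, len(keywords & words)) for ad, keywords in AD_CATALOGUE.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)


def maybe_advertise(utterance: str, awaiting_answer: bool) -> str | None:
    """Return ad copy only when it looks relevant and the bot isn't mid-task."""
    if awaiting_answer:  # don't irritate a user who is still waiting for help
        return None
    best_ad, best_score = score_ads(utterance)[0]
    if best_score >= RELEVANCE_THRESHOLD:
        return f"[sponsored] Since you mentioned it, have you tried {best_ad}?"
    return None


print(maybe_advertise("so tired, I need a coffee this morning", awaiting_answer=False))
```

A real system would presumably use trained intent classifiers rather than keyword overlap, but the trade-off it has to weigh – relevance against annoyance – is the same one sketched here.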

Technology these days is polarized: you either become a huge success or a footnote. Snagging that big success will depend on whether or not you can build a chatbot that appeals to everyone – one that is universally useful. And building a truly useful chatbot is hard.

People quickly lose interest in chatbot apps when they realize the assistants are still woefully inadequate. As we've said many times on El Reg, a lot of today's AI systems are smart if you're prepared to act dumb.

If someone's lost, it's easier and quicker to go straight to Google Maps than to consult Google's assistant Allo. The bot doesn't give you directions itself; it just sends a link to Google Maps anyway – so why use a middleman when you can cut to the chase? (Also, how are you supposed to use a voice-controlled app in a noisy room full of people, especially if they're also shouting at their devices? Touchscreens and keyboards suddenly look like a godsend in that case.)

Another problem lies in the dialogue systems. Today's models are rigid and can’t understand our idioms and natterings well enough to communicate effectively. They’re just information retrieval systems – ask a simple question and you’ll get a simple answer.
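
To see how literal that "information retrieval" description is, strip away the voice layer: many single-turn assistants amount to matching the incoming question against a store of canned question-and-answer pairs and returning the closest hit. Here's a minimal, hypothetical sketch of that pattern using TF-IDF similarity over a toy FAQ – the data is made up, and scikit-learn stands in for whatever retrieval stack a real assistant actually uses:

```python
# Minimal sketch of a retrieval-style "dialogue system": match the user's
# question against stored Q&A pairs and return the closest answer.
# The FAQ data is made up for illustration; real assistants use far larger
# indexes plus ranking models, but the single-turn shape is the same.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

FAQ = [
    ("what's the weather like today", "It's 14°C and cloudy in London."),
    ("how do I get to the station", "Here's a link to the route in Maps."),
    ("what's happening in linux today", "Top story: a new kernel release candidate is out."),
]

questions = [q for q, _ in FAQ]
vectorizer = TfidfVectorizer().fit(questions)
question_vectors = vectorizer.transform(questions)


def answer(user_utterance: str) -> str:
    """Return the stored answer whose question is most similar to the input."""
    query_vector = vectorizer.transform([user_utterance])
    similarities = cosine_similarity(query_vector, question_vectors)[0]
    best = similarities.argmax()
    if similarities[best] < 0.2:  # nothing close enough: admit defeat
        return "Sorry, I don't understand the question I heard."
    return FAQ[best][1]  # each query is answered on its own, with no memory of earlier turns


print(answer("any linux news today?"))
```

Note what's missing: every query is scored in isolation, so there is no memory of earlier turns – which is exactly why these systems feel like lookup tables rather than conversation partners.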

Unlike most AI applications, chatbots aren’t silently humming away in the background crunching numbers and making predictions from stats – they’re at the forefront directly interacting with people, and will have to adapt and become more human-like if they’re to win normal folk over.
