Dismayed by woeful AI chatbots, boffins hired real people – and went back to square one
Amazon Turk serfs have their own problems
Time out
First, human workers have no reliable sense of when a conversation has ended. Software can be set to time out after a period of inactivity, but people, deprived of the usual social cues, may not be so savvy.
Chorus in fact implements a session timeout, but that didn't fully address the problem of waiting. "Often towards the end of a conversation, users respond slower or just simply leave," the paper explains, noting that one person who asked about wedding gown rentals in Seattle went silent for 40 minutes before responding "Thanks" after the session timeout.
Not knowing when a conversation has concluded is a burden for workers, who are left hanging around, and it drives up the system's running costs.
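The timeout mechanism itself is straightforward. As a rough illustration, here is a minimal Python sketch of an inactivity-based session timeout of the sort described above; the ChatSession class, its methods, and the ten-minute window are assumptions made for the example, not details of how Chorus actually implements it.

```python
import threading

# Hypothetical sketch of an inactivity-based session timeout -- the kind of
# mechanism the Chorus paper describes. Names and the ten-minute window are
# illustrative assumptions, not details taken from the Chorus system.

SESSION_TIMEOUT_SECS = 10 * 60  # close a session after ten idle minutes


class ChatSession:
    def __init__(self, on_timeout):
        self._on_timeout = on_timeout  # callback fired when the session expires
        self._timer = None
        self._lock = threading.Lock()
        self.touch()  # start the idle clock as soon as the session opens

    def touch(self):
        """Reset the idle timer; call on every user or worker message."""
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(SESSION_TIMEOUT_SECS, self._on_timeout)
            self._timer.daemon = True  # don't keep the process alive for idle chats
            self._timer.start()

    def close(self):
        """End the session explicitly, e.g. when the user says goodbye."""
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()


# Usage: expire quietly rather than leaving paid workers waiting around.
session = ChatSession(on_timeout=lambda: print("session expired"))
session.touch()  # would be called whenever a new message arrives
session.close()
```

The catch, as the wedding-gown anecdote shows, is that no fixed window fits every user: set it too short and slow repliers get cut off, too long and workers sit idle on the clock.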
Then there's the problem of dealing with malice, from workers and from end users. Chorus saw spammers (workers who responded with meaningless information), flirters (workers showing too much interest in a user's personal information), and one instance of abuse.
That user, who spewed profanity and hate speech, appears to have been trying to recreate the glorious failure of Microsoft's Tay chatbot, which had to be taken down after internet trolls hijacked it and taught it to parrot their bile.
The Chorus message log suggests the abusive individual initially thought the app was a machine learning project. "The user later realized it was humans responding, and apologized to workers with 'sorry to disturb you,'" the paper explains. "The rest of this user's conversation became nonviolent and normal. The abusive conversation lasted nearly three conversational sessions till the user realized it was humans."
Bigham said he's heard concerns that Chorus could become the next Tay. "Chorus fortunately is not structured to quite so easily take on such behavior," he said.
Chorus revealed other challenges for human-powered chat, including ensuring enough workers are available to field queries promptly, dealing with questions that lack clear answers, and handling requests for complex actions, such as making a restaurant reservation.
"The next big paradigm shift in these systems is really being able to talk to them like a human assistant," said Bigham, who believes that shift remains a long way out.
"I don't see the path from what we have right now to a completely automated system that is as capable as me calling up a friend on the phone," he explained.
Even so, he hopes Chorus will serve as a platform to explore how conversational interaction can be made more automated. ®