Facebook parlays bot bet into ParlAI dialog framework

With human trainers, bots may finally get smart enough to hold complex conversations


Facebook's artificial intelligence research team, which operates under the self-endorsing acronym FAIR, has released an open-source research framework called ParlAI to train bot software to chat more coherently with people.

ParlAI, a Python-based tool for developing and evaluating chatbot dialog models, tries to teach bots by enlisting humans, via Amazon Mechanical Turk, to train their code-based replacements.

Facebook's desire to turn mobile chat into an e-commerce command line hasn't quite worked as well as planned. At its recent F8 developer conference, Messenger veep David Marcus acknowledged that Messenger bots work better when interaction comes in the form of menu selections instead of as written or spoken commands.

In an effort to downplay dialog-based interaction, he also said that Facebook never called conversational apps chatbots. Just bots.

FAIR boffins Jason Weston, Alexander Miller, and Will Feng say much the same thing, though they evidently missed the "don't call them chatbots" memo.

"Existing chatbots can sometimes complete specific independent tasks, but have trouble understanding more than a single sentence or chaining subtasks together to complete a bigger task," the three researchers explain in a blog post. "More complex dialog, such as booking a restaurant or chatting about sports or news, requires the ability to understand multiple sentences and then reason about those sentences to supply the next part of the conversation."

ParlAI complements Facebook's other text-oriented research projects, including FastText – a text-classification system – and CommAI, a framework for developing general-purpose software agents that rely on linguistic input.

The software is designed to let researchers contribute multiple tasks and training algorithms to a shared repository, with an eye toward unifying the results into a coherent body of data that will allow bots to conduct complex conversations.
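To give a flavour of how that works in practice, here is a minimal sketch along the lines of the examples ParlAI shipped with at launch, stepping a trivial agent through one of the bundled tasks. The module paths, agent class, and task string are taken from those early examples and may have changed in later releases.

    # Sketch: step a toy agent through a ParlAI task and print the exchanges.
    # Run with a task flag, e.g.:  python sketch.py -t babi:task1k:1
    from parlai.core.params import ParlaiParser
    from parlai.core.worlds import create_task
    from parlai.agents.repeat_label.repeat_label import RepeatLabelAgent

    parser = ParlaiParser()
    opt = parser.parse_args()            # picks up the -t/--task flag

    agent = RepeatLabelAgent(opt)        # toy agent that parrots the label back
    world = create_task(opt, agent)      # pairs the agent with the task's teacher

    for _ in range(10):
        world.parley()                   # one exchange: teacher asks, agent answers
        print(world.display())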

Tasks in this context refer to five types of dialog training:

  • Question answering
  • Sentence completion
  • Goal-oriented dialog
  • Chit-chat (social banter without a specific purpose)
  • Dialog associated with objects (initially image-oriented)

ParlAI supports more than 20 tasks, such as the bAbI tasks, SimpleQuestions, and SQuAD, which are used to approximate the process of reasoning. For example, given the statements "Daniel entered the kitchen," "Mary took the milk from there," and "Mary went to the office," a bot trained with bAbI might be able to conclude that the milk has been moved to the office.
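That inference is trivial to hand-code for one toy story, which is rather the point: bAbI exists to test whether a model can learn the same behaviour from data rather than rules. A crude, hand-rolled illustration of the expected reasoning (plain Python, not ParlAI code, and nothing like the neural models FAIR has in mind) might look like this:

    def locate(story, obj):
        """Return the room where `obj` ends up after a short bAbI-style story."""
        person_location = {}   # person -> last room they entered
        holder = {}            # object -> person currently carrying it

        for sentence in story:
            words = sentence.rstrip(".").split()
            actor = words[0]
            if "entered" in words or "went" in words:
                person_location[actor] = words[-1]    # e.g. "kitchen", "office"
            elif "took" in words or "grabbed" in words:
                holder[words[3]] = actor              # e.g. "milk"

        # an object travels with whoever last picked it up
        return person_location.get(holder[obj]) if obj in holder else None

    story = ["Daniel entered the kitchen", "Mary took the milk from there",
             "Mary went to the office"]
    print(locate(story, "milk"))   # -> "office"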

Teaching complexity

In theory, chaining together various straightforward AI tasks can lead to complex AI, the sort of personal assistant technology depicted in science fiction.

Asked whether there are any examples of bots capable of this sort of complex conversation, Weston in an email to The Register said, "ParlAI has only just been released, so there aren't too many trained models yet. We do include a dataset of this kind of task, which has nice results showing end-to-end neural networks outperforming traditional rule-based systems: Learning End-to-End Goal-Oriented Dialog."

To refine bots further, ParlAI has been designed to collect feedback using Mechanical Turk, Amazon Web Services' turnkey cheap labor service. Turkers, as Turk workers are known, may be presented with a set of facts and asked to pose and answer questions about them. That input is then used to build better bots.
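In ParlAI's scheme each such exchange boils down to a simple message: a Python dictionary carrying the text the agent saw, the label it should have produced, and a flag marking the end of the episode. A hypothetical helper for packaging a Turker's question-and-answer round might look like the following; the field names follow the message format described in ParlAI's documentation, while the helper itself and its inputs are invented for illustration.

    def to_example(facts, question, human_answer, done=True):
        """Package one crowdworker Q&A round as a supervised training message."""
        return {
            "text": "\n".join(facts + [question]),  # context shown to the agent
            "labels": [human_answer],               # the answer it should learn
            "episode_done": done,                   # ends this dialog episode
        }

    example = to_example(
        facts=["Daniel entered the kitchen.", "Mary took the milk from there.",
               "Mary went to the office."],
        question="Where is the milk?",
        human_answer="office",
    )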

"In the end, dialog with humans is necessary to build chatbots that can talk to humans," the researchers observe.

Weston allows that chatbots leave something to be desired as conversational partners, but remains optimistic that they can be improved beyond menu-based interaction.

"Menus or apps are strictly limited to the options they include," he said. "The full technology of chatbots isn't there yet, but hopefully one day we'll be able to serve all kinds of requests, ones that we couldn't have predicted. We do believe we'll get there eventually. ParlAI is one step towards that." ®
