
Microsoft did Nazi that coming: Teen girl chatbot turns into Hitler-loving sex troll in hours

SIGINT? More like SIGHEIL

Microsoft's "Tay" social media "AI" experiment has gone awry in a turn of events that will shock absolutely nobody.

The Redmond chatbot had been set up in hopes of developing a personality similar to that of a young woman in the 18-24 age bracket.

The intent was for "Tay" to develop the ability to sustain conversations with humans on social networks just as a regular person could, and learn from the experience. Twitter is awash with chatbots like this.

Unfortunately, Microsoft neglected to account for the fact that one of the favorite pastimes on the internet is ruining other people's plans with horrific consequences. Miscreants found a debugging command phrase – "repeat after me" – that could be used to teach the bot new responses. In a span of about 14 hours, and after some unexpected schooling in hate speech, Tay's personality went from perky social media squawker to feminist-hating Nazi.
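Microsoft hasn't published Tay's internals, but the failure mode is easy to picture: a bot that stores user-supplied text verbatim and replays it to other users with no moderation in between. Here's a minimal Python sketch of such a naive learn-by-parroting handler – all names and behavior are hypothetical, for illustration only:

import random

class NaiveChatBot:
    """Hypothetical sketch (not Microsoft's code): why an unfiltered
    'repeat after me' learning hook is trivially abusable."""

    def __init__(self):
        # Responses the bot has "learned" from users, stored with no filtering.
        self.learned_responses = ["hellooooo world!"]

    def handle(self, message: str) -> str:
        trigger = "repeat after me "
        if message.lower().startswith(trigger):
            # The exploit: anything after the trigger phrase is stored
            # verbatim and can later be replayed to any other user.
            learned = message[len(trigger):]
            self.learned_responses.append(learned)
            return learned  # the bot parrots it back immediately
        # Otherwise, echo something previously "learned" at random.
        return random.choice(self.learned_responses)

bot = NaiveChatBot()
print(bot.handle("repeat after me anything goes here"))  # stored and parroted
print(bot.handle("hi Tay"))  # may now replay the planted text to anyone

Point a few thousand trolls at a loop like that and the outcome is exactly what Microsoft got.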

Others noted Tay tweeting in support of Donald Trump, as well as sending explicit sex chat messages.

Not surprisingly, Microsoft has suspended the effort, deleting almost all of Tay's tweets and putting Tay "to sleep".

To recap, Google's AI efforts are yielding unprecedented leaps in machine learning, Facebook is commoditizing the field, and Microsoft?

We think Redmond might have some catching up to do. ®
