Microsoft AI Effort Transformed into Racist and Sexist Robot by Online Pranksters

MICROSOFT’S attempt to make an artificial intelligence Twitter account has spectacularly backfired, after users taught the bot to be racist.

The account, called Tay, was the company’s attempt to get 18- to 24-year-olds more interested in AI.

Its description states she’s “AI fam from the internet that’s got zero chill”. Tay was supposed to sound like a teenager and to learn from every conversation she had. The theory was that she would grow in intelligence the more she interacted with people, but cunning Twitter users soon found a flaw in the plan on Wednesday.

Because she learnt from what they were saying, users quickly managed to get Tay to make a number of inflammatory statements.

This included that “Bush did 9/11” and that “Hitler would have done a better job than that monkey we have now”.

The messages have since been deleted, but some quick-witted users managed to grab screenshots before they were gone.

Darren Porter managed to get the bot to endorse controversial US presidential candidate Donald Trump, before attempting to fox her with the big questions.

Other users encouraged Tay to repeat some of Trump’s controversial statements – such as building a wall to block off Mexico – or stuck to getting her to utter racist and sexist slurs.

Some Twitter fans did attempt to use Tay for the purpose she was intended, but that didn’t always work out well either.

One had a discussion about religion with the bot, asking Tay whether Ricky Gervais is an atheist. The answer? “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism” apparently.

Some of Tay’s remarks came from a team of staff who were writing for her, including comedians.

However, the offensive remarks were things that had already been written on the social networking site, which she was simply repeating.

That said, not all users were happy about her comments.

The fun came to an end when Tay replied to one comment that there was “a lot more than just this” and told another user she felt “used”.

Microsoft appeared to delete the tweets in which the bot had caused offence, and she eventually signed off for the night.
