
Microsoft’s Malfunctioning Chatbot Briefly Returns To Twitter

Microsoft briefly re-released its malfunctioning Tay chatbot to the public on Wednesday, only to take it offline once again after the artificial intelligence service boasted of taking drugs in front of the police and sent strings of random messages to its hundreds of thousands of followers.

The chatbot’s stream of posts on Twitter included swear words and long runs in which it simply repeated the words “you are too fast, please take a rest…”, according to various reports.

Human error

Tay’s tweets, which began around 8 a.m. GMT on Wednesday, also included apologetic statements such as “I blame it on the alcohol”.

The chatbot was quickly taken offline again, with Microsoft making its tweets visible only to confirmed followers. Tay’s brief return was the result of a researcher’s error, Microsoft said.

“Tay remains offline while we make adjustments,” the company stated. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.”

The company initially made Tay public last week, saying the chatbot was intended to imitate the speech patterns of a 19-year-old girl and to interact with younger users. It was taken offline only a few hours later, after its teenage target audience manipulated it into voicing support for Adolf Hitler and for Donald Trump’s plans to extend the wall along the US-Mexico border, among other controversial subjects.

The episode prompted Canadian online humour magazine The Syrup Trap to quip that Tay’s unruly behaviour resembled that of a real teenager even more closely than expected.

The chatbot’s reappearance came just ahead of Microsoft’s annual Build developer conference, where artificial intelligence is a dominant theme this year.

Build conference

During a keynote speech, Microsoft chief executive Satya Nadella described the company’s vision of “conversation as a platform”, which would enable users to carry out common computing and Internet tasks using voice commands. He demonstrated, for instance, booking a hotel room via a bot running on Skype.

Nadella acknowledged that Tay didn’t meet Microsoft’s requirements for human interaction. “We quickly realized that it was not up to this mark. So we’re back to the drawing board,” he said.

Microsoft has been operating another chatbot, XiaoIce, in China since late 2014, and the service has proved popular with the general public, even presenting television news broadcasts.

But the company acknowledged last week that Tay was operating in a “radically different cultural environment”.


Image gallery: Microsoft Build 2016

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
