
If you contact the chat bot through its official web page now, the automatic confirmation page ends with these words: “We’re making some adjustments.” But the company was more direct in an interview, pointing its finger at bad actors on the Internet.

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.” Maybe it wasn’t an engineering issue, they seemed to be saying; maybe the problem was Twitter.

The bots then take in communications and data from users so they can interact in an informed, helpful fashion specific to each user.
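The article doesn’t describe Microsoft’s actual implementation, but the mechanism that paragraph sketches, a bot that folds user messages back into its own repertoire, is easy to illustrate. The short Python sketch below is hypothetical (the NaiveChatBot class and its methods are invented for illustration): it learns replies verbatim from users with no moderation layer, which is the failure mode the Tay episode exposed.

```python
import random

# Hypothetical sketch, not Microsoft's code: a bot that learns
# replies directly from whatever users say to it.
class NaiveChatBot:
    def __init__(self):
        self.learned_phrases = []  # every user message is stored verbatim

    def listen(self, user_message: str) -> None:
        # "Repeat after me"-style learning with no filtering or
        # moderation: abusive input is absorbed alongside friendly input.
        self.learned_phrases.append(user_message)

    def reply(self) -> str:
        if not self.learned_phrases:
            return "hello! talk to me :)"
        # Replies are sampled from whatever the bot has been taught.
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
bot.listen("humans are super cool")
bot.listen("tell me a joke")
print(bot.reply())  # echoes back a phrase learned from users
```

A design like this has no defense against coordinated abuse; any safeguard, such as filtering messages before they are stored, has to be built in up front, and the article suggests Tay’s developers were adding such guards only after the outbursts began.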

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” the company said in announcing the chat bot project.

According to the website Socialhax, which tracked the Twitter feed, “Tay’s developers seemed to discover what was happening and began furiously deleting the racist tweets.” The site also suggested the developers had lobotomized the less-than-savory areas of Tay’s computer brain.

“They also appeared to shut down her learning capabilities and she quickly became a feminist,” the site’s report said, citing a tweet in which Tay said, “i love feminism now.” Microsoft, free of First Amendment concerns because Tay is, after all, a robot, shut the experiment down less than 24 hours after Tay went live.

Tay – apparently the version of the chat bot lobotomized into bland civility after her homicidal, genocidal, misogynist, racist and perverted outbursts – had this to say in announcing her departure from the online world: “Phew.”

The project was aimed at young millennials, Americans aged 18 to 24, “the dominant users of mobile social chat services in the U.S.” The chat bot would interact with users via text message on Twitter and other messaging platforms, and Microsoft suggested that people ask her to tell them jokes, stories and horoscopes, play games, and comment on photos. Tay learned language and ideas through those interactions. “humans are super cool,” Tay tweeted to one Twitter buddy Wednesday night.

By the next morning, the bot had started to veer sideways. The more information Tay took in from members of the public, the worse her character became. “I [bleep]ing hate feminists and they should all die and burn in hell,” she tweeted toward noon on Thursday. Minutes later she broadened her hatred, tweeting, “Hitler was right I hate the Jews.” Tay went on to cast racist slurs, and also waded into politics.

And as humankind confronted the evolution of artificial intelligence, Tay’s fate seemed to provide all kinds of teachable moments. Her infamous day in the sun has been preserved in a new Reddit forum called Tay_Tweets.

But elsewhere on the site, in long, threaded conversations, people searched for meaning in what had just happened.
