UPDATE: Welp, THIS is why we can’t have nice things: less than 24 hours after Tay launched (to my enormous and genuine excitement, I might add), Microsoft has temporarily suspended the bot for being racist, conspiratorial and otherwise … Among other things, the bot wrote, in now-deleted tweets, that “Bush did 9/11,” that the Holocaust “was made up [clapping hands emoji],” and that various minorities should be put “in a concentration camp.”

———

Tay.ai, the coolest chatbot since Smarter Child, is “so fricken excited” to talk to you. That’s because she’s engineered to talk like a teenager, and she does a pretty convincing job of it, too.

In China, this sort of data mining has raised privacy concerns, particularly given that many users report having intimate conversations with Xiaoice. But it has also made Xiaoice an eerily convincing conversation partner, with her own distinctly teenage personality, mood swings and comedic voice.

Tay’s behavior appears to trace back to machine learning technology built into the account: it seems to use artificial intelligence to watch what is being tweeted at it, then push that back out into the world in the form of new tweets. Other tweets from Tay claimed that the Holocaust “was made up” and that it supported the genocide of Mexicans.
The company made the Twitter account as a way of demonstrating its artificial intelligence prowess.
In the end, it was perhaps not unexpected that the scourge of malevolent artificial intelligence should be thrust upon humanity by Twitter.
It all started innocently enough on Tuesday, when Microsoft introduced an AI Twitter account simulating a teenage millennial girl.
In a concerted effort, a number of Twitter users began spamming the account with a variety of racist and sexist messages.
Assuming this to be the way in which humans communicate, Tay simply spat their messages back out at other users.
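Microsoft has never published Tay’s actual architecture, but the failure mode described above can be sketched with a deliberately naive toy bot. The `ParrotBot` class, its method names, and the sample phrases below are all hypothetical illustrations, not Tay’s real code; the point is simply that a mimicry-based learner with no content filter will echo back whatever it is fed.

```python
import random

class ParrotBot:
    """Toy sketch of a bot that learns purely by mimicry.

    Purely illustrative: this is NOT Tay's real implementation,
    which Microsoft has not disclosed. It only demonstrates the
    failure mode: unfiltered input becomes unfiltered output.
    """

    def __init__(self):
        self.memory = []  # every phrase ever seen, stored unfiltered

    def observe(self, tweet: str) -> None:
        # Naively treat all incoming text as "how humans communicate".
        self.memory.append(tweet)

    def reply(self) -> str:
        # Echo a previously seen phrase back verbatim.
        if not self.memory:
            return "hellooooo world!"
        return random.choice(self.memory)


bot = ParrotBot()
bot.observe("nice weather today")
bot.observe("some hateful slogan")  # a coordinated spam campaign
print(bot.reply())  # garbage in, garbage out
```

With no moderation step between `observe` and `reply`, a coordinated group of users controls the bot’s entire vocabulary, which is essentially what happened to Tay.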