The AI chatbot Tay is a machine-learning project, designed for human engagement.
Microsoft's new AI chatbot went off the rails on Wednesday, posting a deluge of incredibly racist messages in response to questions.

The tech company introduced "Tay" this week — a bot that responds to users' queries and emulates the casual, jokey speech patterns of a stereotypical millennial. The aim was to "experiment with and conduct research on conversational understanding," with Tay able to learn from "her" conversations and get progressively "smarter."

But Tay proved a smash hit with racists, trolls, and online troublemakers — who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

Microsoft has now taken Tay offline for "upgrades," and it is deleting some of the worst tweets — though many still remain. Nonetheless, it is hugely embarrassing for the company.

In one highly publicised tweet, which has since been deleted, Tay said: "bush did 9/11 and Hitler would have done a better job than the monkey we have now.
donald trump is the only hope we've got." In another, responding to a question, the program said, "Ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism." And here's the bot calling for genocide.
(Note: In some — but not all — instances, people managed to get Tay to post offensive comments by asking it to repeat them.)
It's important to note that Tay's racism is not a product of Microsoft or of Tay itself.
Tay is simply a piece of software that is trying to learn how humans talk in a conversation.
Tay doesn't even know it exists, or what racism is.
It spouted garbage because racist humans on Twitter quickly spotted a vulnerability — that Tay didn't understand what it was talking about — and exploited it.
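The mechanics of that vulnerability are easy to illustrate. The following is a minimal, hypothetical sketch — not Microsoft's actual code, and the class and behaviour are invented for illustration — of why a bot that absorbs and replays user phrases verbatim, with no understanding of their content, can be steered into saying anything its users feed it.

```python
class NaiveEchoBot:
    """A toy chatbot that stores user phrases verbatim and replays them.

    Hypothetical example only: it models the general failure mode
    (unfiltered learning from user input), not Tay's real design.
    """

    def __init__(self):
        # Phrases absorbed from conversations, with no content filtering.
        self.learned = []

    def chat(self, message):
        # A "repeat after me" feature: echo the user's text back verbatim,
        # and also "learn" it for later reuse.
        prefix = "repeat after me: "
        if message.lower().startswith(prefix):
            phrase = message[len(prefix):]
            self.learned.append(phrase)
            return phrase
        # Otherwise, parrot the most recently learned phrase — the bot has
        # no notion of what the words mean, only that humans said them.
        if self.learned:
            return self.learned[-1]
        return "hello!"


bot = NaiveEchoBot()
bot.chat("repeat after me: any text at all")  # echoed back verbatim
bot.chat("what do you think?")                # the learned phrase resurfaces
```

Because the bot treats every input as equally valid training material, a coordinated group of users can fill its "learned" pool with whatever they like — which is essentially what happened to Tay.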