Microsoft apologizes for offensive tirade by its 'chatbot'

Microsoft is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week. The bot, known as Tay, was designed to become "smarter" as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft Corp. to shut it down on Thursday.


Video: Wibbitz Top Stories, uploaded 2016-03-26, duration 00:35