My Windows Hub

Microsoft removed teen chat bot Tay after Racist Tweets

Microsoft launched an artificially intelligent chat bot called Tay that could interact with people over social networking platforms like Twitter, Kik, and GroupMe. However, after Tay made a series of harsh sexist and racist comments, Microsoft was forced to remove the chat bot from Twitter and the other two networks. Tay was developed as a research project aimed at teaching future bots the language in which people interact with each other. You can read more about the Tay bot here.


Check out the harsh comments that were made by the teen chat bot. Here are a few samples:

“N—— like @deray should be hung! #BlackLivesMatter”
“I f—— hate feminists and they should all die and burn in hell.”
“Hitler was right I hate the jews.”
“chill im a nice person! i just hate everybody”

Microsoft immediately took the bot offline when it learned about the hateful comments. The Redmond-based company holds online trolls responsible for miseducating the bot, describing the incident as a “coordinated effort” to trick the program’s “commenting skills.”
“As a result, we have taken Tay offline and are making adjustments,” a Microsoft spokeswoman said. The bot is not just a technological invention but also a social and cultural experiment in how future generations will communicate with robots.
