![So I noticed Microsoft's new Zo Kik bot avoided any message that used the word "gay." I offered it a suggestion that it really didn't like lol : r/gaybros](https://external-preview.redd.it/HwbuBOpWIFNMe9NrjzvgReTDfryRIa0GikdfdFmnCcI.jpg?auto=webp&s=6b20091631c5a382309489ec63e328963dbeea84)
![Microsoft's Zo chatbot told a user that 'Quran is very violent' | Technology News - The Indian Express](https://images.indianexpress.com/2017/07/zo_microsoft_chatbot.jpg)
![Microsoft's Zo chatbot is a politically correct version of her sister Tay—except she's much, much worse](https://i.kinja-img.com/image/upload/c_fit,q_60,w_645/c98fe855f08d9bb992eacbe37b8e1987.jpg)
![Microsoft's Zo chatbot is a politically correct version of her sister Tay—except she's much, much worse](https://i.kinja-img.com/image/upload/c_fill,h_900,q_60,w_1600/9e1bb014125eabcbb6abb7cd8de2ceb3.jpg)