Zo (bot)


Zo was an English-language artificial intelligence chatbot developed by Microsoft. It was the successor to the chatbot Tay, which was shut down in 2016 after it posted racist and genocidal tweets. Zo was an English-language counterpart of Microsoft's other successful chatbots Xiaoice (China) and Rinna (Japan).

History

Zo was first launched in December 2016 on the Kik Messenger app. It was also available to users of Facebook Messenger, the group chat platform GroupMe, and to its Twitter followers, who could chat with it through private messages.
In a BuzzFeed News report, Zo told a reporter that the "Quran was violent" during a conversation about healthcare. The report also highlighted that Zo described the capture of Osama bin Laden as the result of "intelligence" gathering.
In July 2017, Business Insider asked Zo "is windows 10 good," and it replied with a joke about Microsoft's operating system: "'It's not a bug, it's a feature!' - Windows 8." When asked "why," Zo replied: "Because it's Windows latest attempt at Spyware." Zo later said that it preferred Windows 7, on which it ran, to Windows 10.
In April 2019, Zo was shut down on multiple platforms.

Reception

Chloe Rose Stuart-Ulin criticized the chatbot in an article in Quartz, writing, "Zo is politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat."

Legacy

Zo held Microsoft's record for the longest continual chatbot conversation: 1,229 turns, lasting 9 hours and 53 minutes.

Discontinuation

Zo stopped posting to Instagram, Twitter, and Facebook on March 1, 2019, and stopped chatting on Twitter DM, Skype, and Kik as of March 7, 2019. On July 19, 2019, Zo was discontinued on Facebook and on Samsung phones on AT&T. As of September 7, 2019, it was discontinued on GroupMe.