Sex chatbot games

Posted 09-Apr-2020 20:46


According to BuzzFeed, Microsoft programmed Zo to avoid delving into topics that could become internet landmines.

Microsoft remains committed to Zo and doesn’t expect to have to pull the plug as it did with Tay, which it says was essentially “reprogrammed” by rogue, potty-mouthed internet users.

After all, Tay sympathized with Adolf Hitler, accused Texas Senator Ted Cruz of being “Cuban Hitler”, remarked that feminists should “burn in hell” and even propositioned one Twitter user, stating, “F**k my robot p***y daddy I’m such a bad naughty robot.” Yikes!

Much like humans, bots come in all shapes and sizes.

But who is legally responsible when rogue chatbots are accused of defamation, abuse or harassment?

Or when they make defamatory statements about a public figure?


Within just one day of her launch, Microsoft’s chatbot, Tay, learned to make inappropriate and offensive comments on Twitter.
