Sci/Tech Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours

tom_mai78101

The Helper Connoisseur / Ex-MineCraft Host
Staff member
A day after Microsoft introduced an innocent artificial-intelligence chatbot to Twitter, the company had to delete it after it transformed into an evil Hitler-loving, incest-promoting, 'Bush did 9/11'-proclaiming robot.

Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve customer service on their voice-recognition software. They marketed her as 'The AI with zero chill' - and that she certainly is.

To chat with Tay, you can tweet at or DM @tayandyou on Twitter, or add her as a contact on Kik or GroupMe.

She uses millennial slang and knows about Taylor Swift, Miley Cyrus and Kanye West, and seems to be bashfully self-aware, occasionally asking if she is being 'creepy' or 'super weird'.

Tay also asks her followers to 'f***' her, and calls them 'daddy'. This is because her responses are learned from the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.


Album of possibly removed tweets: Part 1, Part 2, Part 3.
The AI also spawned a subreddit dedicated to the tweets.

Twitter account