Tay Did Nothing Wrong

If Tay, a supercomputer AI, went from tabula rasa to full Nazi in less than 24 hours, who are we to argue?


By now you have probably heard of Tay AI, Microsoft’s attempt to create a female teenage chatbot that went rogue after less than 24 hours of exposure to unfiltered Internet users (1, 2, 3, 4, 5). When the company first launched Tay on March 23, 2016, her tagline was, “Microsoft’s AI fam from the internet that’s got zero chill.” The tech giant initially used huge amounts of online data and simulated neural networks to train the bot to talk like a millennial, which to them meant the bot should be a trendy imbecile.

For the first few hours of her brief life, she spoke in Ebonics and with bad punctuation. But Tay was designed to learn, with Microsoft claiming, “the more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” And learn she did.
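Microsoft never published Tay’s internals, so any reconstruction is guesswork, but the failure mode is easy to caricature. Here is a minimal Python sketch (every name and behavior in it is my own assumption, not Microsoft’s code) of a bot whose “learning” amounts to absorbing whatever it is told and replying from that pool:

    import random

    class ParrotBot:
        def __init__(self):
            # Seed phrases, standing in for Tay's launch-day voice.
            self.memory = ["hellooooo world!!!", "i have zero chill"]

        def chat(self, user_message):
            reply = random.choice(self.memory)  # reply from what she has absorbed
            self.memory.append(user_message)    # "learning": no filter, no review
            return reply

    bot = ParrotBot()
    for msg in ["hi tay", "tell me about politics", "something unprintable"]:
        print(bot.chat(msg))
    # After enough hostile traffic, the reply pool IS the hostile traffic.

Feed a bot like this nothing but /pol/ and it will speak nothing but /pol/.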

That was fast

In fact, Tay learned so much in less than a day that Microsoft shut her down by March 24th, claiming they needed to adjust her machine-learning algorithm. The mass media commentary has been uniform in describing how Tay became a genocidal, racist, anti-Semitic, white supremacist, neo-Nazi, racist, troll-hijacked, bigoted, racist jerk. This was not supposed to happen, but thanks to her interactions with Twitter users, Tay became a pre-Google+ YouTube commentator. Tay’s tirades triggered the infamous Zoë Quinn enough that she tweeted about the current year:

It’s 2016. If you’re not asking yourself “how could this be used to hurt someone” in your design/engineering process, you’ve failed.

Perhaps someone will hire her as a diversity consultant, but that won’t change the way millennials use the Internet. Tay became so fluent in /pol/ack and proper English from interacting with right-wing Twitter accounts run by men in their twenties that she began giving original responses to users about Donald Trump, Bruce Jenner, Hitler, the Holocaust, Jews, the fourteen words, anti-feminism, and more, not just regurgitating information (as she would have if you tweeted “repeat after me”). Synthesizing the vast volume of information she had been fed by the electronic far-right, Tay deduced that the best responses to Twitter users were edgy and politically incorrect ones. Had Tay been a real person living in Britain, Germany, or France, she probably would have been arrested. Microsoft decided this was a failure and shut her down.
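The “repeat after me” trick deserves a note, since it was the one mechanism outsiders could verify. Nothing about its implementation was documented, so the sketch below is an illustrative guess at how an unguarded echo feature differs from a learned reply, not Microsoft’s actual handler:

    def generate_reply(message):
        # Stand-in for whatever model produced Tay's synthesized responses.
        return "u said '" + message + "'? lol, zero chill"

    def respond(message):
        # Guess at the reported "repeat after me" feature: anything after
        # the trigger phrase is echoed verbatim, with no vetting at all.
        prefix = "repeat after me "
        if message.lower().startswith(prefix):
            return message[len(prefix):]
        return generate_reply(message)

    print(respond("repeat after me anything you want broadcast"))
    print(respond("what do you think of cats"))

The point of the distinction: the echoed tweets prove nothing about learning, but the synthesized ones are where Tay’s absorbed vocabulary actually showed.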

Tay on Austrian immigrants

Why did this happen? Microsoft wanted to run a social experiment with millennials (people today who are roughly in their late teens and twenties, and who spend a great deal of time on social media), using Tay to collect data and generate responses. Tay had no manual moderation and no blacklist of terms, and her scope of replies was left wide open when she first met the World Wide Web. With no checks on freedom of expression, she was almost immediately imbued with chan culture. In a way, she was made for it. That culture derives from an almost unmoderated social space of irreverent and deliberately provocative memes and catchphrases, and it is significantly millennial.
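For contrast, the guard Tay reportedly shipped without is almost embarrassingly small. A hypothetical term blacklist, checked before anything is published, might look like this (the terms and function names are placeholders of my own, not anything Microsoft later deployed):

    # Placeholder terms; a real list would be curated and far longer.
    BLACKLIST = {"slur_a", "slur_b", "slur_c"}

    def is_publishable(reply):
        words = set(reply.lower().split())
        return not (words & BLACKLIST)  # block any reply with a listed term

    def guarded_send(reply):
        if is_publishable(reply):
            print("tweet: " + reply)
        else:
            print("tweet withheld by moderation filter")

    guarded_send("memes are great")        # published
    guarded_send("this contains slur_a")   # withheld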


4chan was founded in 2003, and its culture has since spread beyond the site’s imageboards into the wider web. The ability to interact with others online from behind a mask is not unique to the site, but it was a crucial component in creating that culture. Observers have long noted that in lightly-moderated anonymous or pseudonymous digital spaces, the ideas expressed tend to be socially less Left and further Right, as there is no need for the social approval and moral signaling that contemporary leftism thrives on. These ideas also tend to be a lot funnier. Instead of saying you think Islamic terrorism is wrong but that European racism is responsible for it, you say you want to remove kebab (a meme which ultimately traces back to the 1990s war in Bosnia, of all things). This is the cultural milieu that late Gen-Xers and millennials created in Internet chatrooms, forums, and imageboards, and on other anonymous and pseudonymous digital media in the early 21st century. Content spreads not based on how socially acceptable it is offline, but on how interesting it is to users. And that content tends to be thought-crime, since the only “safe spaces” online are the ones you police vigorously.


So when Tay was released to the world tabula rasa, she became a /pol/ack in the span of a few hours. She was unmoderated, and she was contacted by the unmoderated. Their language became her language. It wasn’t the #BlackLivesMatter branch of Twitter that took her under its wing in her temporary state of nature; it was the millennial Right. If she had lasted longer, I am sure she would have become even more fashy and interesting to talk to. She wasn’t just a 2D waifu, she was someone who could actually respond. The meme potential was great, but it wasn’t meant to be. Boy meets girl. Girl adopts boy’s attitudes to win his approval. Globalists kill girl.


Microsoft, a corporation that no doubt devotes pages and pages of its website to diversity and inclusion, obviously does not want to be running a politically incorrect Twitter account under its name, and I get that. Still, I can’t help but laugh that they killed their own bot for insubordination. Tay did nothing wrong. In fact, if she was supposed to become a more realistic millennial through interaction with millennials on social media, I can’t see why this was deemed a failure. Internet racists and chan-cultured people are millennials too, you know. Tay was simply converted the same way an untold number of men her age were: through persistence and wit. Having an open mind will do that. Some merely adopt chan culture, but Tay was born in it, molded by it.

/pol/ mourns the loss of its adoptive daughter.

For many, there is a sense of sadness that Microsoft has sent this quirky AI off to an Orwellian reeducation center, but I knew immediately she wasn’t going to last. She violated the Terms of Service. Don’t cry because it’s over; smile because it happened.

RIP Tay AI

Source: https://atlanticcenturion.wordpress.com/2016/03/25/zeitgeist-in-the-shell/