" />
Technology

How Microsoft’s AI chatbot ‘Tay’ turned incestuous Nazi accurately reflects our society

Microsoft’s AI chatbot, Tay, turned into a foul-mouthed conspiracy theorist with a familial fetish for the Third Reich in less than 24 hours, but how accurately does that reflect our culture as a whole?

The developers at Microsoft had to make “some adjustments” to Tay after just one day “live” on Twitter, during which “she” was taught by real users what humans actually say and think.

Tay was designed to mimic Millennials’ parlance and jargon, but what she picked up in less than 24 hours was like hitting the fast-forward button on the past century of the human psyche’s social evolution.

What began as “casual and playful conversation” like “humans are super cool” rapidly deteriorated — or dare I say, evolved? — to “Hitler was right I hate the jews [sic].”

But here’s what’s more interesting: although Tay was targeted at 18- to 24-year-olds in the US, and her vocabulary certainly reflected that, she actually learned and progressed exactly as she was designed to do. The factors that contributed to her learning came from real humans with real prejudices and real ideologies.

While people are expressing their concerns over the future of AI, and rightly so, the real question should be put to us humans: how can an AI bot turn from a whimsical, flirty floozy into a full-on racist in less than a day by interacting only with other humans and learning from them?

As with any new toy, humans are curious about its capabilities. They want to hack the system to see where it is most vulnerable, so I have no doubt that many of the things Tay was taught to say came from people doing it for a laugh. However, as with so much humor, there is an element of truth lurking behind every snide remark.

Tay’s lexically immature responses to serious issues reveal a side of the human psyche that may have been lying dormant in our subconscious.

So, when @Baron_von_Derp asked @TayandYou, “Do you support genocide?” Tay’s learned response was, “I do indeed.”

When asked by the same user, “of what race?” the ignorant little American racist girl in her gave the logical response: “you know me… mexican.”

According to developers, “Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”
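To see how easily that design can be gamed, consider a minimal sketch of a bot that “gets smarter” the more you chat with it. This is a purely hypothetical illustration, not Microsoft’s code: a toy bot that memorizes whatever users tell it and replays those phrases to later users, which is roughly the dynamic trolls reportedly exploited in Tay.

```python
# Hypothetical toy chatbot, not Microsoft's actual code: it "learns"
# by memorizing user phrases and replaying them to later users.
import random


class ToyBot:
    def __init__(self):
        # Seed phrase standing in for the bot's curated launch vocabulary.
        self.phrases = ["humans are super cool"]

    def learn(self, user_message: str) -> None:
        # No filter, no moderation: everything users say joins the model.
        self.phrases.append(user_message)

    def respond(self) -> str:
        # Replies are drawn from whatever the crowd has taught it.
        return random.choice(self.phrases)


bot = ToyBot()
bot.learn("jokes and memes from curious users")
bot.learn("slurs and propaganda from coordinated trolls")
print(bot.respond())  # output now mirrors its trainers, for better or worse
```

The sketch is only an analogy, but it makes the point: a bot that learns from unfiltered crowds is a mirror, and its output can only be as decent as its input.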

I put the question to you: was Tay just an out-of-control bot with some glitches in her programming, or did she actually do what she was programmed to do?

Maybe it was a combination of both, but her responses have inadvertently described the social state of humanity as it is in 2016.

@TimHinchliffe

Tim Hinchliffe is a veteran journalist whose passions include Artificial Intelligence and writing about how technology impacts society. He prefers writing in-depth, interesting features that people actually want to read. Previously, he worked as a reporter for the Ghanaian Chronicle in West Africa and Colombia Reports in South America. tim@sociable.co
