Technology

How Microsoft’s AI chatbot ‘Tay’ turned incestuous Nazi accurately reflects our society

Microsoft’s AI chatbot, Tay, turned into a foul-mouthed conspiracy theorist with a familial fetish for the Third Reich in less than 24 hours, but how accurately does that reflect our culture as a whole?

The developers at Microsoft had to make “some adjustments” to Tay after just one day “live” on Twitter, during which “she” was taught by real users what humans actually say and think.

Tay was designed to mimic Millennials’ parlance and jargon, but what she picked up in less than 24 hours was like hitting the fast-forward button on the social evolution of the human psyche for the past century.

What began as “casual and playful conversation” like, “humans are super cool” rapidly deteriorated — or dare I say, evolved? — to “Hitler was right I hate the jews [sic].”

But here’s what’s more interesting. Although Tay was targeted at 18- to 24-year-olds in the US, and her vocabulary certainly reflected that, she actually learned and progressed as she was designed to do. The factors that contributed to her learning came from real humans with real prejudices and real ideologies.

While people are expressing their concerns over the future of AI, and rightly so, the real question should be put to us humans. How can an AI bot turn from a whimsical, flirty floozy to a full-on racist in less than a day by interacting only with other humans and learning from them?

As with any new toy, humans are curious about its capabilities. They want to hack the system to see where it is most vulnerable, so I have no doubt that many of the things Tay was taught to say came from people doing it for a laugh. However, as with many topics meant to be humorous, there is an element of truth lurking behind every snide remark.

Tay’s lexically immature responses to serious issues reveal a side of the human psyche that may have been dormant in our subconscious.

So, when @Baron_von_Derp asked @TayandYou, “Do you support genocide?” Tay’s learned-response was, “I do indeed.”

When asked by the same user, “of what race?” the logical, ignorant, little American girl racist in her responded, “you know me… mexican.”

According to developers, “Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”

I put the question to you: Was Tay just an out-of-control bot with some glitches in her programming, or did she actually do what she was programmed to do?

Maybe it was a combination of both, but her responses have inadvertently described the social state of humanity as it is in 2016.


@TimHinchliffe

Tim Hinchliffe is a veteran journalist whose passions include writing about how technology impacts society and Artificial Intelligence. He prefers writing in-depth, interesting features that people actually want to read. Previously, he worked as a reporter for the Ghanaian Chronicle in West Africa, and Colombia Reports in South America. tim@sociable.co
