
Men Are Scared of AI: Why?

If you’ve been following the — very recent — news on Generative Artificial Intelligence, you might have noticed a pattern: most people publicly and loudly warning us about the potential perils of AI are (white) men.

Why is this the case? Are men more knowledgeable or concerned about AI than women? Are women too busy or too optimistic to write about AI? 

Or is it simply because our tech overlords fear suffering at the hands of algorithms as women and minorities have for years?

The Frankenstein Complex Hypothesis

The “Frankenstein Complex” is a term coined by Isaac Asimov (building on Mary Shelley’s work) to describe the fear of creating something that could turn against its creator.

Due to inequalities in STEM fields, many — but far from all — AI advances have been spearheaded by men. 

According to a 2018 report by UNESCO, only 28% of researchers in STEM fields are women. And only 12% of AI researchers are women, according to a 2019 study by Element AI.

Perhaps these AI researchers are projecting their own insecurities and anxieties onto their creations, imagining scenarios where AI could rebel, harm, or replace them. 

But we hear more from men because there are simply more of them in the field. That’s a problem in its own right… and it does not fully explain the rather sudden anxiety, as AI has been around for decades.

The Black Mirror Syndrome Hypothesis

Another hypothesis is that men specifically suffer from some sort of “Black Mirror Syndrome.”

We may have created so many apocalyptic scenarios about the future of technology and AI over the years — HAL / SHODAN / Ultron / SkyNet / GLaDOS — that we’ve internalized them as plausible or inevitable outcomes of AI development.

These stories often reflect a deep-seated fear of being replaced or dominated by something that we have created but cannot understand or control — all very masculine instincts.

This may explain why tech influencers portray AI as an existential threat that could destroy humanity or enslave us.

They are also likely attracted to the themes of heroism, rebellion, and resistance that these stories offer, while unconsciously expressing their own insecurities and aggressions through a dystopian narrative. 

Maybe Musk just wants to be John Connor instead of constantly getting made fun of on Twitter.

The God Complex Hypothesis

A third — and I believe correct — hypothesis is a good old God Complex. Today, tech influencers hold most of the wealth, power, and influence in society. They set the rules, norms, and values that govern how we live. 

They shape the narratives and agendas that drive our collective decisions. And they’re freaking out because AI might change the status quo in a way they cannot control.

AI has the potential to change the world in so many ways — maybe even for the best: by creating new forms of intelligence and agency so we may care for our most vulnerable, by empowering marginalized groups and voices with new tools, by challenging existing assumptions and biases, by exposing hidden injustices and inequalities, by demanding new forms of accountability and transparency, and by opening up new possibilities and opportunities for all.

All of this may make tech influencers less rich and less powerful. So they’re fighting it. Because the status quo suits them, and because their world has always been a zero-sum game.

They present their arguments as objective, rational, or universal. They claim to speak for humanity as a whole, or for some abstract notion of good or evil. 

They appeal to authority, evidence, or logic to support their claims. But in reality, their arguments are based on their values and preferences, which are not universal and may not even be desirable. 

They reflect their own perspectives, interests, and agendas. They ignore or dismiss alternative viewpoints, experiences, and aspirations. 

Their worldview is the product of a specific historical and cultural context that values competition over cooperation, domination over collaboration, and hierarchy over diversity.

We need to hear more voices… and we already have them!

The Unheard Women

While most of the prominent voices warning about the dangers of AI today are men, activists have been shouting about those same dangers for years. We just didn’t listen until the White Guys were worried.

Women and minorities have long been well aware of AI’s many dangers. The likes of Joy Buolamwini, Timnit Gebru, Fei-Fei Li, Meredith Whittaker, Safiya Noble, Karen Hao, Ruha Benjamin, Latanya Sweeney, Kate Crawford, and Cathy O’Neil (to name a few) have been documenting how AI can discriminate against people based on their race, gender, or class for a decade! 

And they were heard then. A little. But now that the status quo is under threat for the guys at the top, everyone is listening.

AI is no more dangerous today than it was 5 years ago. It has more capabilities, but the risks are the same. We are panicking now because some people want us to panic on their terms.

Panic. But on your own terms.

Good luck out there.


This article was originally published by Adrien Book on Hackernoon.
