
Tech products, culture are ‘designed intentionally for mass deception’: Ex-Google ethicist testifies

The Center for Humane Technology co-founder and ex-Google ethicist warns Congress that tech products and culture are “designed intentionally for mass deception” and that the entire digital space has become a “dark infrastructure.”

“I studied at a lab called the Stanford Persuasive Technology Lab, actually with the founders of Instagram, so I know the culture of the people who build these products, and the way that it is designed intentionally for mass deception.” — Tristan Harris

Today, Center for Humane Technology president and former Google design ethicist Tristan Harris testified before the US House Subcommittee on Consumer Protection and Commerce during a hearing called “Americans at Risk: Manipulation and Deception in the Digital Age,” where he warned that big tech had created an environment that manipulates and controls almost every aspect of our lives.

“I’m going to go off script, I come here because I’m incredibly concerned,” Harris began.

“I studied at a lab called the Stanford Persuasive Technology Lab, actually with the founders of Instagram, so I know the culture of the people who build these products, and the way that it is designed intentionally for mass deception.

“We often frame these issues as ‘we’ve got a few bad apples — we’ve got these bad deepfakes and we got to get them off the platform. We’ve got this bad content; we’ve got these bad bots.’

“What I want to argue is we have ‘dark infrastructure.’

“This is now the infrastructure by which 2.7 billion people — bigger than the size of Christianity.”

Tristan Harris

“Tech companies manipulate our sense of identity, self-worth, relationships, beliefs, actions, attention, memory, physiology and even habit-formation processes, without proper responsibility”

Harris added in his written testimony that, “YouTube has north of 2 billion users, more than the followers of Islam. Tech platforms arguably have more psychological influence over two billion people’s daily thoughts and actions when considering that millions of people spend hours per day within the social world that tech has created, checking hundreds of times a day.”

Technology companies creating this dark infrastructure “manipulate our sense of identity, self-worth, relationships, beliefs, actions, attention, memory, physiology and even habit-formation processes, without proper responsibility,” according to the ethicist.

He added that “technology has directly led to the many failures and problems that we are all seeing: fake news, addiction, polarization, social isolation, declining teen mental health, conspiracy thinking, erosion of trust, breakdown of truth.”

“You can’t just bring some new agency around and regulate all of the virtual world”

As for the regulation of digital deception tactics such as deepfakes, dark patterns, and social media bots, Harris doesn’t see the need to create more government agencies, but rather to better equip and expand the ones that already exist.

“Instead of trying to design some new federal agency, some master agency, when technology has basically taken all the laws of the physical world and virtualized it in a virtual world with no laws — what happens when we have no laws for an entire virtualized infrastructure?

“You can’t just bring some new agency around and regulate all of the virtual world.

“Why don’t we take the existing infrastructure — the existing agencies […] and have a digital update that expands their jurisdiction to just ask, ‘How do we protect the tech platforms in the same areas of jurisdiction?'”

Wednesday’s hearing also heard testimony from:

This article was written while the hearing was still in progress.

Tim Hinchliffe

The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. tim@sociable.co

Comments

  • What I just read from Tristan Harris is a sweeping condemnation of social media that offers no concrete examples. How exactly have I been brainwashed, influenced and damaged?

  • @Eric Advertising at unprecedented accuracy, certain features and functionality that stimulate behavior by working with your brain's reward mechanisms, etc. It's a bit more meta than “ape look at Facebook, ape do Facebook's bidding,” but there are real deceptive and manipulative practices going on within tech. Profit over people, etc. Highly recommend digging deeper into it yourself.
