
Big tech’s addictive business model makes us ‘attention vampires,’ distracts from urgent threats like China: ‘Social Dilemma’ star testifies

Big tech offers 2 dystopian choices: Orwellian authoritarianism & censorship or Huxleyan outrage & distractions: Tristan Harris


Ex-Google design ethicist and star of the Netflix documentary “The Social Dilemma” Tristan Harris tells lawmakers that no amount of content moderation policies can fix big tech’s core problem — the business model of keeping users addicted, outraged, and polarized.

Tristan Harris, April 27, 2021

“A business model that preys on human attention means that we are worth more as human beings and as citizens of this country when we are addicted, outraged, polarized, narcissistic, and disinformed” — Tristan Harris

Testifying before the Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law on Tuesday, Harris warned that big tech platforms like Facebook, Twitter, and YouTube profit from keeping their users addicted, and that the big tech business model is a giant distraction from urgent existential threats, like the rise of China.

“At the end of the day, a business model that preys on human attention means that we are worth more as human beings and as citizens of this country when we are addicted, outraged, polarized, narcissistic, and disinformed because that means that the business model was successful at steering our attention using automation,” said Harris, who is also president of the Center for Humane Technology.

“When the entire model is predicated on dividing society, it’s like Exxon talking about the number of trees they have planted, while their extractive business model hasn’t changed” — Tristan Harris

Joining Harris in Tuesday’s hearing on “Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds” were expert witnesses from big tech and academia, including:

  • Monika Bickert, Facebook’s VP for Content Policy
  • Lauren Culbertson, Twitter’s Head of US Public Policy
  • Alexandra Veitch, YouTube’s Director of Government Affairs and Public Policy for the Americas and Emerging Markets
  • Dr. Joan Donovan, Research Director at the Shorenstein Center on Media, Politics, and Public Policy and Lecturer in Public Policy at the John F. Kennedy School of Government, Harvard University

Under oath, Harris told Senator Josh Hawley that he didn’t believe that people in the tech industry were evil or deliberately trying to do harm, but that big tech acted like a “digital drug lord” that “turns us all into attention vampires that want attention from other people.”

In his written testimony, Harris elaborated that while we are addicted to outrage, polarization, and disinformation, “we face genuine existential threats that require urgent attention,” including:

  • The rise of China
  • A climate crisis
  • Nuclear proliferation
  • Vulnerable infrastructure
  • Dangerous inequality

“Today’s tech platforms disable our capacity to address these urgent problems,” according to Harris.

“We should be interested in structural reforms for tech platforms’ incentives that would comprehensively strengthen rather than disable our capacity to respond to these existential threats, especially in competition with China” — Tristan Harris

In his testimony, Harris outlined the core problem of big tech — from which all other problems stem — the business model.

He also provided a potential solution — “structural reforms for tech platforms’ incentives that would comprehensively strengthen rather than disable our capacity to respond to these existential threats.”

“We must reset our criteria for success,” Harris submitted, adding, “Instead of evaluating whether my fellow Facebook, Twitter and YouTube panelists have improved their content policies or hired more content moderators, we should ask what would collectively constitute a ‘humane’ Western digital democratic infrastructure that would strengthen our capacity to meet these threats.

“Instead of shortening attention spans, distracting us, competing for addiction and outrage […] they would compete from the bottom-up to deepen and cultivate our best traits, sustained thinking and concentration, better critical thinking, facilitating easier ways to understand each other and identify solutions built on common ground.

“We should be interested in structural reforms for tech platforms’ incentives that would comprehensively strengthen rather than disable our capacity to respond to these existential threats, especially in competition with China.”

“Instead of shortening attention spans, distracting us, competing for addiction and outrage … [big tech] would compete from the bottom-up to deepen and cultivate our best traits” — Tristan Harris

Acknowledging the business executives from Facebook, Twitter, and YouTube who were also present at the hearing, Harris commented in his testimony:

“My fellow panelists from technology companies will say:

  • We catch XX% more hate speech, self-harm and harmful content using AI
  • We took down XX billion fake accounts, up from YY% last year
  • We have Content Oversight Boards and Trust & Safety Councils
  • We spend $X million more on Trust & Safety in 2021 than we made in revenue in an entire year

“But none of this is adequate to the challenge stated above,” added Harris.

“When the entire model is predicated on dividing society, it’s like Exxon talking about the number of trees they have planted, while their extractive business model hasn’t changed.”

“We should aim for nothing less than a comprehensive shift to a humane, clean ‘Western digital infrastructure’ worth wanting” — Tristan Harris

At present, Harris sees society facing two dystopian paths in the digital space, and he hopes we find a third way out.

“We should aim for nothing less than a comprehensive shift to a humane, clean ‘Western digital infrastructure’ worth wanting,” he wrote.

“We are collectively in the middle of a major transition from 20th century analog societies to 21st century ‘digitized’ societies.

“Today we are offered two dystopian choices:

  • Install a Chinese ‘Orwellian’ brain implant into society with authoritarian controls, censorship and mass behavior modification.
  • Install the US/Western ‘Huxleyan’ societal brain implant that saturates us in distractions, outrage, trivia and amusing ourselves to death.

“Let’s use today’s hearing to encourage a third way, to have the government’s help in incentivizing Digital Open Societies worth wanting, that outcompete Digital Closed Societies,” Harris concluded in his written testimony.


Tim Hinchliffe
Tim Hinchliffe is the editor of The Sociable. His passions include writing about how technology impacts society and the parallels between Artificial Intelligence and Mythology. Previously, he was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. tim@sociable.co