Social Media

‘Social media services that I & others have built are tearing people apart’: ex-Facebook ads chief testifies


Facebook’s former director of monetization testifies that the social media services he helped build are tearing people apart, and that social media companies “worship at the altar of engagement” instead of promoting free speech.

Today, the man who once led the development of Facebook’s advertising business told lawmakers in a Congressional hearing that he fears the social media platforms he helped create are, in a worst-case scenario, pushing society to the brink of civil war.

Tim Kendall left Facebook in 2010, went on to serve as president at Pinterest, and is now working to reverse many of the troubling outcomes created by social media as the CEO of Moment.

Tim Kendall

“The social media services that I and others have built have torn people apart with alarming speed and intensity” — Tim Kendall

“When I started working in technology, my hope was to build products that brought people together in new and productive ways. I wanted to improve the world that we all lived in,” said Kendall in his opening testimony.

“Instead, the social media services that I and others have built have torn people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding as a country.

“At worst, I fear we are pushing ourselves to the brink of a civil war.

“At Facebook, I believe we sought to mine as much human attention as possible and turned it into historically unprecedented profits,” he added.

“These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism” — Tim Kendall

The former Facebook ads architect told lawmakers that social media companies were using the same playbook as big tobacco by making their platforms as addictive as possible without regard for human health and safety.

“These services are making us sick. These services divide us. It’s time we take account of the damage, and it’s time we put in place the necessary measures to protect ourselves—and our country,” Kendall testified.

The former Facebook director of monetization said that social media companies are incentivized to keep users outraged and glued to their platforms for as long as possible in order to drive revenues and maximize profits.

“These services are making us sick. These services divide us. It’s time we take account of the damage, and it’s time we put in place the necessary measures to protect ourselves—and our country” — Tim Kendall

When asked if there were any incentives for social media companies to combat extremist content on their platforms, Kendall responded that social media companies wouldn’t change their behavior unless they faced “financial or civil or criminal penalties with the harm that they create.”

“I think that without enforcement, they’re just going to continue to be embarrassed by the mistakes, and they’ll talk about empty platitudes like, you know, ‘Oh gee! We hope we can get better operationally next time,’ but I don’t believe anything systemic will change,” said Kendall.

In his written testimony, Kendall explained how social media platforms manipulate human emotions to provoke outrage, and how it is all by design.

“Social media preys on the most primal parts of your brain. The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions — it aims to provoke, shock, and enrage,” he wrote.

“I don’t believe social media is the root cause of every problem we’re facing, but I believe it may be the most powerful accelerant in history” — Tim Kendall

Kendall went on in his written testimony, “When you see something you agree with, you feel compelled to defend it. When you see something you don’t agree with, you feel compelled to attack it. People on the other side of the issue have the same impulses.

“The cycle continues with the algorithm in the middle happily dealing arms to both sides in an endless war of words and misinformation. All the while, the technology is getting smarter and better at provoking a response from you.

“These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism.

“This is not by accident. It’s an algorithmically optimized playbook to maximize user attention — and profits.”

On the subject of whether social media companies have any interest in defending free speech, Kendall testified:

“When it comes to misinformation, these companies hide behind the First Amendment and say they stand for free speech. At the same time, their algorithms continually choose whose voice is actually heard.

“I don’t think it’s free speech these companies revere. Instead, Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside, raising the voices of division, anger, hate and misinformation to drown out the voices of truth, justice, morality, and peace.”

“Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside” — Tim Kendall

“I don’t believe social media is the root cause of every problem we’re facing, but I believe it may be the most powerful accelerant in history,” he added.

Kendall’s criticism of social media platforms is similar to that of former Google ethicist Tristan Harris, who testified before the same Congressional committee back in January.

Earlier this year, Harris told Congress that tech products and culture were “designed intentionally for mass deception” and that “tech companies manipulate our sense of identity, self-worth, relationships, beliefs, actions, attention, memory, physiology and even habit-formation processes, without proper responsibility.”

And in June, UC Berkeley professor and expert in digital forensics Dr. Hany Farid testified that Facebook has a toxic business model that puts profit over the good of society and that its algorithms have been trained to encourage divisiveness and the amplification of misinformation.


Tim Hinchliffe
Tim Hinchliffe is the editor of The Sociable. His passions include writing about how technology impacts society and the parallels between Artificial Intelligence and Mythology. Previously, he was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. tim@sociable.co