Social Media

‘Social media services that I & others have built are tearing people apart’: ex-Facebook ads chief testifies

Facebook’s former director of monetization testifies that the social media services he helped build are tearing people apart, and that social media companies “worship at the altar of engagement” instead of promoting free speech.

Today, the man who once led the development of Facebook’s advertising business told lawmakers at a Congressional hearing that he fears the social media platforms he helped create are, in a worst-case scenario, pushing society to the brink of civil war.

Tim Kendall left Facebook in 2010, went on to serve as president of Pinterest, and, as CEO of Moment, is now working to reverse many of the troubling outcomes created by social media.


“The social media services that I and others have built have torn people apart with alarming speed and intensity” — Tim Kendall

“When I started working in technology, my hope was to build products that brought people together in new and productive ways. I wanted to improve the world that we all lived in,” said Kendall in his opening testimony.

“Instead, the social media services that I and others have built have torn people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding as a country.

“At worst, I fear we are pushing ourselves to the brink of a civil war.

“At Facebook, I believe we sought to mine as much human attention as possible and turned it into historically unprecedented profits,” he added.

“These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism” — Tim Kendall

The former Facebook ads architect told lawmakers that social media companies were using the same playbook as big tobacco, making their platforms as addictive as possible without regard for human health and safety.

“These services are making us sick. These services divide us. It’s time we take account of the damage, and it’s time we put in place the necessary measures to protect ourselves—and our country,” Kendall testified.

The former Facebook director of monetization said that social media companies are incentivized to keep users outraged and glued to their platforms for as long as possible in order to drive revenues and maximize profits.

“These services are making us sick. These services divide us. It’s time we take account of the damage, and it’s time we put in place the necessary measures to protect ourselves—and our country” — Tim Kendall

When asked if there were any incentives for social media companies to combat extremist content on their platforms, Kendall responded that social media companies wouldn’t change their behavior unless they faced “financial or civil or criminal penalties with the harm that they create.”

“I think that without enforcement, they’re just going to continue to be embarrassed by the mistakes, and they’ll talk in empty platitudes about, you know, ‘Oh gee, we hope we can get better operationally next time,’ but I don’t believe anything systemic will change,” said Kendall.

In his written testimony, Kendall explained how social media platforms manipulate human emotions to provoke outrage, and how it is all by design.

“Social media preys on the most primal parts of your brain. The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions— it aims to provoke, shock, and enrage,” he wrote.

“I don’t believe social media is the root cause of every problem we’re facing, but I believe it may be the most powerful accelerant in history” — Tim Kendall

Kendall went on in his written testimony, “When you see something you agree with, you feel compelled to defend it. When you see something you don’t agree with, you feel compelled to attack it. People on the other side of the issue have the same impulses.

“The cycle continues with the algorithm in the middle happily dealing arms to both sides in an endless war of words and misinformation. All the while, the technology is getting smarter and better at provoking a response from you.

“These algorithms have brought out the worst in us. They’ve literally rewired our brains so that we’re detached from reality and immersed in tribalism.

“This is not by accident. It’s an algorithmically optimized playbook to maximize user attention — and profits.”

On the subject of whether social media companies have any interest in defending free speech, Kendall testified:

“When it comes to misinformation, these companies hide behind the First Amendment and say they stand for free speech. At the same time, their algorithms continually choose whose voice is actually heard.

“I don’t think it’s free speech these companies revere. Instead, Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside, raising the voices of division, anger, hate and misinformation to drown out the voices of truth, justice, morality, and peace.”

“Facebook and their cohorts worship at the altar of engagement and cast all other concerns aside” — Tim Kendall

“I don’t believe social media is the root cause of every problem we’re facing, but I believe it may be the most powerful accelerant in history,” he added.

Kendall’s criticism of social media platforms is similar to that of former Google ethicist Tristan Harris, who testified before the same Congressional committee back in January.

Earlier this year, Harris told Congress that tech products and culture were “designed intentionally for mass deception” and that “tech companies manipulate our sense of identity, self-worth, relationships, beliefs, actions, attention, memory, physiology and even habit-formation processes, without proper responsibility.”

And in June, UC Berkeley professor and digital forensics expert Dr. Hany Farid testified that Facebook has a toxic business model that puts profit over the good of society, and that its algorithms have been trained to encourage divisiveness and the amplification of misinformation.


Tim Hinchliffe

The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. tim@sociable.co
