‘Facebook’s business model is poison & its algorithms amplify misinformation’: digital forensics expert testifies

June 25, 2020

Facebook’s business model is poison and its divisive algorithms are a driving force behind online misinformation, according to a digital forensics expert’s testimony during a House hearing on Wednesday.

Who’s to blame for the spread of misinformation online? How does it spread and what can be done about it?

The answers are not black and white, and those questions were the subject of yesterday’s congressional hearing, titled “A Country in Crisis: How Disinformation Online Is Dividing the Nation.”

Appearing virtually before the House Committee on Energy and Commerce, UC Berkeley professor and digital forensics expert Dr. Hany Farid testified that Facebook has a toxic business model that puts profit over the good of society, and that its algorithms have been trained to encourage divisiveness and amplify misinformation.

Dr. Hany Farid

“Mark Zuckerberg is hiding the fact that he knows that hate, lies, and divisiveness are good for business” — Dr. Hany Farid

For example, Facebook’s business model is ad-based: a good way to earn more ad revenue is to keep people glued to the platform, and a good way to keep people glued to the platform is to stir up controversy.

“Mark Zuckerberg is hiding the fact that he knows that hate, lies, and divisiveness are good for business,” Farid testified.

“He’s hiding the fact that content moderation is bad for business, and so he props up these phony arguments to hide behind.

“And I think that Mark Zuckerberg is hiding the fact that his entire business model — of maximizing engagement, of maximizing advertising dollars — just stinks!

“It’s bad for us as individuals. It’s bad for society. It’s bad for democracy, but it’s awfully good for his bottom line — to the tune of $70 billion last year,” he added.

Facebook’s business model is quite simple. As Zuckerberg once famously quipped, “Senator, we run ads.”

But there’s a little more to it than that.

“They didn’t set out to fuel misinformation and hate and divisiveness, but that’s what the algorithms learned” — Dr. Hany Farid

“Social media is in the engagement and attention business. They profit when we spend more time on the platform. They collect more data from us, and they deliver ads,” Farid testified.

“They didn’t set out to fuel misinformation and hate and divisiveness, but that’s what the algorithms learned,” he added.

Using algorithms designed solely to keep people on the platform and drive profit is a poisonous business model, according to Farid, who has lent his tech expertise to DARPA, The New York Times, the AP, and Reuters.

“Algorithms have learned that the hateful, the divisive, the conspiratorial, the outrageous, and the novel keep us on the platforms longer, and since that is the driving factor for profit, that’s what the algorithms do,” said Farid.

“The core poison here is the business model. The business model is that when you keep people on the platform, you profit more, and that is fundamentally at odds with our societal and democratic goals,” he added.

“[Zuckerberg] is hiding the fact that content moderation is bad for business, and so he props up these phony arguments to hide behind” — Dr. Hany Farid

Facebook’s algorithms could be reprogrammed to encourage more civil discourse while curbing misinformation, but that would require a little incentive.
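As a purely illustrative aside, the mechanism Farid describes can be sketched in a few lines of code. The toy Python below ranks posts first by predicted engagement alone, then by an objective that penalizes divisiveness. Every field, weight, and score here is a made-up assumption for the sake of illustration and is not drawn from Facebook’s actual systems.

```python
# Illustrative sketch only: a toy feed ranker showing how optimizing purely for
# predicted engagement surfaces divisive content, and how a reweighted objective
# changes the ordering. All post attributes and numbers are hypothetical.

posts = [
    {"id": "calm_explainer", "predicted_engagement": 0.30, "divisiveness": 0.10},
    {"id": "outrage_bait",   "predicted_engagement": 0.90, "divisiveness": 0.95},
    {"id": "local_news",     "predicted_engagement": 0.50, "divisiveness": 0.20},
]

def engagement_only_score(post):
    # The objective Farid calls poisonous: time on platform is all that counts.
    return post["predicted_engagement"]

def reweighted_score(post, divisiveness_penalty=0.7):
    # One possible alternative objective: engagement discounted by a
    # divisiveness penalty (the penalty weight is an arbitrary illustration).
    return post["predicted_engagement"] - divisiveness_penalty * post["divisiveness"]

print("Engagement-only ranking:")
for post in sorted(posts, key=engagement_only_score, reverse=True):
    print(f"  {post['id']}")

print("Reweighted ranking:")
for post in sorted(posts, key=reweighted_score, reverse=True):
    print(f"  {post['id']}")
```

In this toy example, the outrage-bait post tops the engagement-only feed but slips down the ranking once divisiveness carries a cost, which is the kind of incentive change Farid argues the company will not make on its own.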

Meanwhile, in an ironic twist of fate, as Facebook polarizes its users, its users are turning their anger towards the social media giant itself.

Facebook is accused of not doing enough to curb hate speech while at the same time being accused of stifling free speech, and all the while its algorithms continue their quest for everlasting engagement in the background.

Even polarized users from opposite ends of the political and social spectrum agree that Facebook needs to change, and all poor Facebook wants is profit and power.

“It is these algorithms that are at the core of the misinformation amplification” — Dr. Hany Farid

“The internet, and social media in particular, is failing us on an individual, societal, and democratic level,” Farid submitted in his written testimony.

“Online content providers have prioritized growth, profit, and market dominance over creating a safe and healthy online ecosystem.

“Many want to frame the issue of content moderation as an issue of freedom of speech. It is not. First, private companies have the right to regulate content on their services without running afoul of the First Amendment, as many routinely do when they ban legal adult pornography.

“Second, the issue of content moderation should focus not on content removal but on the underlying algorithms that determine what is relevant and what we see, read, and hear online. It is these algorithms that are at the core of the misinformation amplification,” he added.

“Standing in the way of this much needed change is a lack of corporate leadership, a lack of competition, a lack of regulatory oversight, and a lack of education among the general public. Responsibility, therefore, falls on the private sector, government regulators, and we the general public,” Farid concluded.

“The issue of content moderation should focus not on content removal but on the underlying algorithms that determine what is relevant” — Dr. Hany Farid

The UC Berkeley professor suggested that one area in which Facebook could take immediate action to curb misinformation would be to change the algorithms, but that the social media giant wouldn’t do so without “the right incentives” — be they “regulatory, advertising, or competition.”

The government could step in and regulate, advertisers could stop buying ads on the platform, or a competitor could come along and encourage a mass migration, much like Parler is trying to do with #Twexit, a campaign to move people from Twitter to Parler.

Brandi Collins-Dexter

“When executives at Facebook were alerted that their algorithms were dividing people in dangerous ways, they rushed to kill any efforts to create a healthy dialogue on the platform” — Brandi Collins-Dexter

Testifying along similar lines in the same hearing, Color Of Change Senior Campaign Director Brandi Collins-Dexter added, “When executives at Facebook were alerted that their algorithms were dividing people in dangerous ways, they rushed to kill any efforts to create a healthy dialogue on the platform.”

She added that tech companies have routinely failed to uphold at least three core societal values:

  1. Transparency: There are no clear processes for challenging the decisions of tech companies.
  2. Accountability: Currently, policies are implemented at the discretion of platform companies and are not uniformly applied, especially to those in positions of political power.
  3. Fairness: Social media platforms talk about democracy but fail to uphold its principles.

“While the Internet has provided a means for decentralized media voices to breathe digital oxygen into emerging mobilization efforts, it has also given rise to new tech oligarchies and distortions of political thought,” Collins-Dexter testified.
