
‘Censor social media content & harvest data from banned accounts’: House Intel witnesses testify on combating misinformation

Witnesses call for more big tech censorship to kick ‘conspiracy theories’ to fringe platforms, perform digital autopsies on removed accounts

Expert witnesses tell the House Intelligence Committee how to best combat “misinformation” online with more content restriction on social media and posthumous data harvesting from banned accounts.

Today, the House Permanent Select Committee on Intelligence held a rare open hearing on the subject of “Misinformation and Conspiracy Theories Online,” where expert witnesses testified on best practices for censoring content.

Towards the end of the hearing, Chairman Adam Schiff asked a peculiar question whose answer hints at a future where there’s even more censorship and greater amounts of data harvested from social media accounts.

Schiff asked Alethea Group Vice President of Analysis Cindy Otis what data social media companies are not sharing that they should be sharing in order to help analysts do their work.

Otis responded that getting access to the data from content that had already been removed would be “extraordinarily helpful” for conducting digital autopsies to find out the strategies, methods, and tactics of social media movements.

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed,” Otis testified.

“On Facebook, for example, you get an announcement every week or every couple of weeks about the content that’s been removed. We get a couple of screen shots maybe. We get maybe an account, maybe a page name — that sort of thing — but it’s after the content has been removed.”

Otis added, “Unless we were particularly tracking that threat or were part of that analysis to begin with, we’re not able to go back and identify the tactics and procedures that were used by threat actors to do this campaign in the first place.

“And so that sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how this is manifesting on the platform.”

While Otis called for harvesting data from banned accounts posthumously, like performing digital autopsies, Melanie Smith, Head of Analysis at Graphika Inc., testified that big tech platforms should continue to restrict content so that movements like Qanon would be forced onto alternative platforms with smaller audiences.

She argued that on the so-called alternative platforms, there would be fewer opportunities for the cross-pollination of ideas.

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted,” Smith testified.

But it didn’t stop with the big tech companies. Smith told the committee that there should also be more pressure on alternative platforms to restrict content after it’s already been beaten back to “the fringes.”

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space,” Smith testified.

“I also think there could be changes to platform engineering to restrict the exposure of new audiences to algorithmic reinforcement of some of these ideas,” she added.

If you combine the strategies of both Smith and Otis, what you get is more censorship, and then once a user or group is banned, the data is harvested posthumously to discover their tactics.

As big tech companies purge thousands of accounts for spreading so-called conspiracy theories, there is a trove of personal data that analysts could access if they had their way and the platforms were obligated to hand it over.

The data could then be used to track where users go next, and the potential for abuses of privacy is enormous, no matter how well-intentioned the idea may sound.

Why would the chairman of the House Intelligence Committee ask data analysts what they would need if he wasn’t already thinking about a way to obtain that data?

If analyzing data harvested from banned accounts would be so “extraordinarily helpful,” would there be an incentive to ban even more accounts, so more data could be collected?

Where would it end?

Tim Hinchliffe

The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. tim@sociable.co
