
‘Censor social media content & harvest data from banned accounts’: House Intel witnesses testify on combating misinformation

Witnesses call for more big tech censorship to kick ‘conspiracy theories’ to fringe platforms and perform digital autopsies on removed accounts

Expert witnesses tell the House Intelligence Committee that the best way to combat “misinformation” online is more content restriction on social media and posthumous data harvesting from banned accounts.

Today, the House Permanent Select Committee on Intelligence held a rare open hearing on the subject of “Misinformation and Conspiracy Theories Online,” where expert witnesses testified on best practices for censoring content.

Towards the end of the hearing, Chairman Adam Schiff asked a peculiar question whose answer hints at a future with even more censorship and even more data harvested from social media accounts.

Cindy Otis

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed” — Cindy Otis

Schiff asked Alethea Group Vice President of Analysis Cindy Otis what data social media companies were not sharing that they should be, in order to help analysts do their work.

Otis responded that getting access to the data from content that had already been removed would be “extraordinarily helpful” for conducting digital autopsies to find out the strategies, methods, and tactics of social media movements.

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed,” Otis testified.

“On Facebook, for example, you get an announcement every week or every couple of weeks about the content that’s been removed. We get a couple of screenshots maybe. We get maybe an account, maybe a page name — that sort of thing — but it’s after the content has been removed.”

“That sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how’s this manifesting on the platform” — Cindy Otis

Otis added, “Unless we were particularly tracking that threat or were part of that analysis to begin with, we’re not able to go back and identify the tactics and procedures that were used by threat actors to do this campaign in the first place.

“And so that sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how this is manifesting on the platform.”

While Otis called for posthumously harvesting data from banned accounts, akin to performing digital autopsies, Melanie Smith, Head of Analysis at Graphika Inc., testified that big tech platforms should continue to restrict content so that movements like Qanon would be forced onto alternative platforms with smaller audiences.

Melanie Smith

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted” — Melanie Smith

She argued that on the so-called alternative platforms, there would be fewer opportunities for the cross-pollination of ideas.

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted,” Smith testified.

But it didn’t stop with the big tech companies. Smith told the committee that there should also be more pressure on alternative platforms to restrict content after it’s already been beaten back to “the fringes.”

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space,” Smith testified.

“I also think there could be changes to platform engineering to restrict the exposure of new audiences to algorithmic reinforcement of some of these ideas,” she added.

Combine the strategies of Smith and Otis and what you get is more censorship, followed by the posthumous harvesting of banned users’ and groups’ data to uncover their tactics.

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space” — Melanie Smith

As big tech companies purge thousands of accounts for spreading so-called conspiracy theories, there is a trove of personal data that analysts could access if they had their way and the platforms were obligated to hand it over.

The data could then be used to track where users go next, and the potential for privacy abuses is enormous, no matter how well-intentioned the idea may sound.

Why would the chairman of the House Intelligence Committee ask data analysts what they would need if he wasn’t already thinking about a way to obtain that data?

If analyzing data harvested from banned accounts would be so “extraordinarily helpful,” would there be an incentive to ban even more accounts, so more data could be collected?

Where would it end?

Tim Hinchliffe

The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. tim@sociable.co
