Facebook’s counterterrorism chief will educate Congress on digital deception, along with a former Google ethicist, a law professor, and the director of the Technology and Social Change Research Project.
“They are creating deepfakes and spreading disinformation and misinformation to deceive Americans for their own profit and gain”
On Wednesday, January 8, 2020, the US House Subcommittee on Consumer Protection and Commerce will hold a hearing dubbed “Americans at Risk: Manipulation and Deception in the Digital Age.”
Specifically, the hearing is set to address three core topics:
- Deepfakes and Cheap Fakes:
- Fake videos can be used for malicious purposes, including facilitating the spread of misinformation and disinformation for political or commercial ends and sowing discord. They can also be difficult to detect, whether by human review or by technology.
- Dark Patterns:
- Techniques incorporated into user interfaces (e.g., pop-up screens and webpages) that are designed to encourage or trick users into doing things they might not otherwise do, such as sneaking additional items into a customer’s shopping basket or adding a countdown timer that falsely implies a deal is about to expire (see the sketch after this list).
- Social Media Bots:
- Bots have been used to run fake social media accounts that make products or people look more popular, drive clicks to raise advertising revenue, and, in the hands of state actors, spread disinformation and stir division.
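To make the countdown-timer pattern concrete, here is a minimal, hypothetical sketch of how such false urgency might be implemented; the names and numbers are illustrative, not taken from the hearing materials. The deceptive part is that the timer silently restarts whenever it runs out, so the “deal” never actually expires.

```typescript
// Hypothetical sketch of a "false urgency" dark pattern.
// All names and values here are illustrative assumptions,
// not drawn from the hearing materials.

const DEAL_WINDOW_MS = 10 * 60 * 1000; // the deal "expires" in 10 minutes

let deadline = Date.now() + DEAL_WINDOW_MS;

function renderCountdown(): void {
  const remaining = deadline - Date.now();

  if (remaining <= 0) {
    // The deceptive part: instead of ending the deal, quietly reset
    // the clock so every visitor always sees a deal that is
    // "about to expire".
    deadline = Date.now() + DEAL_WINDOW_MS;
    return;
  }

  const minutes = Math.floor(remaining / 60_000);
  const seconds = Math.floor((remaining % 60_000) / 1000);
  console.log(`Hurry! Deal ends in ${minutes}m ${seconds}s`);
}

// Re-render once per second, as a real storefront widget would.
setInterval(renderCountdown, 1000);
```

Because no real deadline is ever stored or honored, the urgency is pure theater, which is precisely what makes the pattern deceptive rather than merely persuasive.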
While some practices are illegal, “much of the deception and manipulation that occurs online is, however, legal and unregulated,” according to a House Memorandum.
The hearing will feature an impressive lineup of expert witnesses:
- Monika Bickert:
- Facebook’s head of product policy and counterterrorism
- Aspen Cybersecurity Group member
- Former resident legal advisor at the US Embassy in Bangkok
- Specialist in response to child exploitation and human trafficking
- Tristan Harris:
- Co-Founder of the Center for Humane Technology
- Former Google Design ethicist
- Called the “closest thing Silicon Valley has to a conscience” by The Atlantic
- Joan Donovan:
- Director of the Technology and Social Change Research Project at Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy
- Former Research Lead for Data & Society’s Media Manipulation Initiative
- Justin (Gus) Hurwitz:
- Co-Director of the Space, Cyber, and Telecom Law Program at the University of Nebraska College of Law
- Director for Law and Economics Programming with the International Center for Law & Economics
- Worked at Los Alamos National Lab and interned at the Naval Research Lab
House Energy and Commerce Committee Chairman Frank Pallone, Jr. and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky said in a joint statement:
“As the internet has matured from a blogging platform to the engine that powers our economy, bad actors who seek to manipulate consumers have become more sophisticated.
“They are creating deepfakes and spreading disinformation and misinformation to deceive Americans for their own profit and gain.
“We look forward to exploring these issues at the hearing and engaging with experts to discuss how we can combat harmful deception and manipulation online.”
According to the House Memorandum, “Deceptive techniques are being used to manipulate consumers into making purchases they may not have otherwise made, paying higher prices than they expected, or accepting terms of service or privacy policies that they would not have otherwise or ordinarily accepted.”
The hearing will also look to uncover how to address misinformation that “has ranged from fake product reviews to election interference generated by a variety of actors from individuals to nation states.”
In June 2019, the House Intelligence Committee held its first hearing devoted specifically to examining deepfakes and other types of AI-generated synthetic media, announced shortly after the release of a cheap fake video that appeared to show House Speaker Nancy Pelosi drunkenly slurring her words.
A few tech companies create deepfake software while others build software to detect deepfakes: a Bellerophon to combat every shape-shifting Chimera.
No one in the industry is really asking whether we actually need the ability to clone other people’s voices or to make fake videos indistinguishable from real ones; they just know it can be done well enough to turn a profit in the short term.
And years before the term “deepfake” had a Wikipedia page, The Sociable warned in 2016 that the technology raised ethical concerns about the future integrity of journalism, could be used for psychological warfare, and posed a terrible threat to democracy.
Virtually any person, dead or alive, can be imitated with vocal and facial manipulation.
The subjects of disinformation and misinformation will also be discussed in Wednesday’s hearing, and the witnesses should have plenty of experience dealing with those.
Disinformation spreads like wildfire across the web, and big tech companies encourage people to stay in their echo chambers by personalizing or customizing the news for their users.
While these services do help users find the kind of news they’re looking for, they also open the door for people to become more entrenched in certain channels of information, regardless of accuracy.
For example, look at how Facebook operates: it actively encourages people to stay in their echo chambers, connected with like-minded people. This helps ensure that users spend more time on the platform, which means they are exposed to more ads, which makes Facebook more money.
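As a rough illustration of that dynamic, here is a simplified, hypothetical feed scorer; it is our own construction, not Facebook’s actual ranking system. Posts are scored purely by how closely they match topics the user has already engaged with, so like-minded content keeps rising to the top, and accuracy never enters the calculation.

```typescript
// Simplified, hypothetical feed-ranking sketch.
// This is an illustrative assumption, not Facebook's actual algorithm.

interface Post {
  id: string;
  topics: string[]; // e.g. ["politics", "vaccines"]
}

interface UserProfile {
  // How often the user has engaged with each topic in the past.
  engagementByTopic: Map<string, number>;
}

// Score a post purely by how much it matches past engagement.
// Accuracy of the content never enters the calculation, which is
// exactly the dynamic the echo-chamber critique points at.
function score(post: Post, user: UserProfile): number {
  return post.topics.reduce(
    (sum, topic) => sum + (user.engagementByTopic.get(topic) ?? 0),
    0
  );
}

function rankFeed(posts: Post[], user: UserProfile): Post[] {
  return [...posts].sort((a, b) => score(b, user) - score(a, user));
}

// Example: a user who mostly engages with one viewpoint...
const user: UserProfile = {
  engagementByTopic: new Map([["team-a", 40], ["team-b", 2]]),
};

const feed = rankFeed(
  [
    { id: "p1", topics: ["team-b"] },
    { id: "p2", topics: ["team-a"] },
    { id: "p3", topics: ["team-a", "team-b"] },
  ],
  user
);

// ...sees like-minded posts first: p3 and p2 outrank p1.
console.log(feed.map((p) => p.id)); // ["p3", "p2", "p1"]
```

Even in this toy version, the feedback loop is visible: every click on agreeable content raises the weight of that topic, which surfaces more of it, which invites more clicks.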
According to The Telegraph, “The social networking site creates an ‘echo chamber’ in which a network of like-minded people share controversial theories, biased views and selective news, academics found.”
“Research finds that [Facebook] users seek out information that reinforces their beliefs, which is then shared and given increasing weight whether accurate or otherwise.”