Senators ask big tech to explain how they curb child sexual abuse material
Five US senators are asking big tech companies to explain what they are doing to curb child sexual abuse material.
“We write with concern that technology companies have failed to take meaningful steps to stop the creation and sharing of child sexual abuse material (CSAM) on their online platforms,” Senators Josh Hawley, Richard Blumenthal, Lindsey Graham, Mazie Hirono, and John Cornyn wrote to Google CEO Sundar Pichai on Monday.
“We are writing to request information on what your company is actively doing to identify, prevent, and report child sexual abuse material and other forms of child exploitation,” the senators added.
According to a Tuesday Twitter post by New York Times journalist Michael Keller, Google is one of 36 tech companies being asked to explain what they are doing to curb CSAM on their platforms.
Here’s the full list of companies: Adobe, Amazon, Apple, Automattic (WordPress and Tumblr), Cloudflare, DigitalOcean, Discord, Dropbox, Facebook, Flickr, GoDaddy, ImageShack, Imgur, ManyVids, MediaLab (Whisper & Kik), Microsoft, Mozilla, Omegle, Paypal, Photobucket… 2/
— Michael Keller (@mhkeller) November 19, 2019
In their letter, the five senators requested that Pichai answer the following questions in writing by December 4 (some questions have been omitted or condensed for brevity; you can view the full list of questions here).
- Do you automatically identify CSAM that is created and uploaded to your platform(s)? Please describe how you identify CSAM.
- How many reports of CSAM have you provided to the NCMEC CyberTipline on an annual basis for the past three years?
- How many pieces of CSAM did you remove from your platform(s) in 2018?
- What measures have you taken to ensure that steps to improve the privacy and security of users do not undermine efforts to prevent the sharing of CSAM or stifle law enforcement investigations into child exploitation?
- What are the main obstacles in identifying all CSAM posted to your platform(s) automatically?
- Do you provide notice to individuals involved in the transmission of CSAM when you report or submit evidence of such activities to NCMEC or law enforcement?
- What other barriers do you face in receiving or sharing information, hashes, and other indicators of CSAM with other companies?
- Have you implemented any technologies or techniques to automatically flag CSAM that is new or has not been previously identified, such as the use of machine learning and image processing to recognize underage individuals in exploitative situations?
- What steps have you taken to ensure that CSAM detection efforts are incorporated in each appropriate product and service associated with your platform(s)?
- If your platform(s) include a search engine, please describe the technologies and measures you use to block CSAM from appearing in search results.
- What, if any, proactive steps are you taking to detect online grooming of children?
In their letter, the five senators cited the New York Times reporting by Keller and Gabriel Dance to back up their concerns.