The US Intelligence Advanced Research Projects Activity (IARPA) is putting together a new research program to anonymize conversational speech recorded by IoT devices in real time, so that the speaker cannot be identified.
With a stated goal to “protect an individual’s privacy,” IARPA’s Anonymous Real-Time Speech (ARTS) program is looking to “develop novel systems that will modify spontaneous speech in real-time,” according to the program overview.
For the US intelligence community, Internet of Things (IoT) devices represent “a growing source of data that can be collected to learn intent,” as IARPA director Catherine Marsh stated in December 2021.
And as IoT devices become more prevalent, a person’s speech can be recorded and decoded to reveal “not only their identity, but also static traits, such as dialect, gender, and age, as well as dynamic traits, such as fear, stress, and anger,” according to the ARTS program overview.
Apart from revealing a person’s identity and static traits, these devices can also reveal a person’s geolocation and what’s going on in the background.
That’s not something the US intelligence community would want to happen to their spies in the field.
With this in mind, IARPA’s ARTS program aims “to research and develop novel methods that will anonymize conversational speech” that will “overcome threats to privacy, such as speaker identification software, human evaluation of static traits, and automated classification of dynamic traits.”
In the end, “The program desires novel software-based systems suitable for conversations, with the transformed speech that is understandable, sounds natural, and operates in real-time.”
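The program overview doesn’t say how the speech transformation would work. For a rough sense of the problem, one of the simplest ways to obscure a speaker’s vocal characteristics is pitch shifting. The sketch below, using the librosa audio library, is a minimal offline illustration of that general idea only; it is not the real-time, conversation-grade system ARTS calls for, and the file paths and shift amount are assumptions made up for the example.

```python
# Minimal offline sketch of one voice-anonymization idea: pitch shifting.
# Illustration only, not the ARTS approach; a real-time system would need
# streaming processing and far more robust transformations.
import librosa
import soundfile as sf

# Hypothetical input/output paths (assumptions for this example)
IN_PATH = "speaker_original.wav"
OUT_PATH = "speaker_shifted.wav"

# Load the recording at its native sample rate
audio, sample_rate = librosa.load(IN_PATH, sr=None)

# Shift the pitch by a few semitones to mask the speaker's natural range;
# n_steps=4 is an arbitrary example value
shifted = librosa.effects.pitch_shift(audio, sr=sample_rate, n_steps=4)

# Write the transformed audio back to disk
sf.write(OUT_PATH, shifted, sample_rate)
```

A naive shift like this degrades naturalness and can often be reversed, which is precisely why ARTS is asking for “novel” systems whose output still “sounds natural” while defeating speaker identification software.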
While IARPA’s ARTS program looks to avoid exploitation from IoT devices, IARPA has another research program aimed at exploiting IoT devices to track people’s movements.
Announced in May 2023, IARPA’s Hidden Activity Signal and Trajectory Anomaly Characterization (HAYSTAC) program looks “to develop systems capable of modeling population movement patterns around the globe” using AI and sensors connected to the Internet of Things (IoT) and smart cities.
For program manager Dr. Jack Cooper, HAYSTAC represents “an unprecedented opportunity to understand how humans move, and HAYSTAC’s goal will be to build an understanding of what normal movement looks like at any given time and place.”
In the end, the goal of HAYSTAC is “to establish ‘normal’ movement models across times, locations, and populations and determine what makes an activity atypical.”
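IARPA hasn’t published HAYSTAC’s methods, but the framing of “normal” versus “atypical” movement maps onto standard anomaly detection. The sketch below, using scikit-learn’s IsolationForest on made-up trajectory features, is only an illustration of that general idea; the feature choices and data are assumptions, not anything from the program itself.

```python
# Toy illustration of "normal movement" modeling as anomaly detection.
# Not HAYSTAC's method; it just shows the general idea of fitting a model
# to typical trajectories and flagging outliers as atypical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical trajectory features per trip:
# [average speed (km/h), distance from home (km), departure hour]
normal_trips = np.column_stack([
    rng.normal(30, 5, 500),   # typical commuting speeds
    rng.normal(10, 3, 500),   # typical distances
    rng.normal(8, 1, 500),    # typical departure hours
])

# Fit a model of "normal" movement from the synthetic trips
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_trips)

# Score a new trip: a late-night, long-distance, high-speed trajectory
new_trip = np.array([[95.0, 120.0, 3.0]])
print(model.predict(new_trip))  # -1 means the trip is flagged as atypical
```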
While HAYSTAC and ARTS approach IoT data from different angles and for different purposes, IARPA has another active research program aimed at anonymization and privacy protection, much like ARTS.
Where ARTS looks to anonymize speech to protect the speaker’s privacy, this program from the US intelligence community’s research funding arm looks to anonymize a person’s writing to protect the author’s privacy.
Last year, IARPA announced its Human Interpretable Attribution of Text Using Underlying Structure (HIATUS) research program, with the goal of both identifying an author by their writing style, and also making the author anonymous by removing their “linguistic fingerprints.”
According to the HIATUS program description:
“HIATUS seeks to develop novel human-useable AI systems for attributing authorship and protecting author privacy through identification and leveraging of explainable linguistic fingerprints.
“The program will develop novel techniques to generate representations that capture author-level linguistic variation and will use these representations to build human-interpretable algorithms to perform authorship attribution and ensure author privacy (i.e., via removal of author-identifying characteristics from text).”
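HIATUS’s actual “explainable linguistic fingerprints” are not public, but authorship attribution in general is a well-studied stylometry problem. The sketch below, using character n-gram features and a simple classifier from scikit-learn, is a toy illustration of that general technique; the example texts and author labels are invented for the demonstration.

```python
# Toy stylometry sketch: attribute short texts to authors via character n-grams.
# Illustration of the general authorship-attribution idea only,
# not HIATUS's human-interpretable approach.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training samples (assumptions for this example)
texts = [
    "I reckon we'll head out at dawn, weather permitting.",
    "Reckon the old road's quicker, if the creek ain't up.",
    "The committee shall convene promptly at nine o'clock.",
    "All members shall submit their reports prior to the meeting.",
]
authors = ["author_a", "author_a", "author_b", "author_b"]

# Character n-grams capture low-level stylistic habits
# (spelling, punctuation, small function words)
pipeline = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
pipeline.fit(texts, authors)

print(pipeline.predict(["We shall reconvene after lunch."]))
```

Removing an author’s “linguistic fingerprints,” as HIATUS also intends, would mean rewriting text so that features like these no longer point back to the original writer.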
Through authentic authorship attribution, the flow of information can become increasingly transparent, or at the very least more organized.
On the flip side, the flow of information could become even more distorted by anonymizing the source, concealing its origins, and adding more noise to the channel — a tactic used by spy agencies in which, “You create so much noise in the channel that people start to have overall doubts on all information that’s available in the media, social media, and other places,” as one former NSA foreign surveillance agent told The Sociable.
Could the same concept be applied to IARPA’s ARTS program?
IARPA will be holding a Proposers’ Day for the ARTS program on June 27 from 9:30AM to 5:00PM EDT.
Image by Pete Linforth from Pixabay