IoT devices are vulnerable to state-enlisted espionage: Booz Allen report
Booz Allen releases its 2019 Cyber Threat Outlook report highlighting state-enlisted espionage on IoT devices and AI deepfakes, among others, as major cyber threats this year.
“Connected televisions, webcams, and printers have been enlisted to mine cryptocurrency, launch DDoS attacks, and cause other mischief.”
The Booz Allen report outlines the top eight cybersecurity threats trending this year:
- Companies in the crosshairs of information warfare
- Internet of Things (IoT) devices broaden state espionage operations
- Chip and PIN may fall short
- The weaponization of adware networks
- Deepfakes in the wild—Artificial Intelligence (AI) in information warfare
- New Frontiers—the expanding wireless attack surface
- State-sponsored threat actors double down on deception
- Water utility targeting bubbles to the surface
State-Enlisted Actors Hijack IoT Devices (Including the CIA and MI5)
“Connected televisions, webcams, and printers have been enlisted to mine cryptocurrency, launch DDoS attacks, and cause other mischief. In 2019, state-linked adversaries will likely increasingly abuse these devices to further their espionage and warfare efforts,” the report reads.
With respect to the threat of state-enlisted espionage via IoT devices, the report alludes to Russia and South Korea; however, WikiLeaks has well documented that both the United States and Great Britain have hijacked IoT devices to spy on people.
Two of the most notable state-linked entities abusing these devices are the intelligence agencies the CIA and MI5.
In April 2017, WikiLeaks revealed that both British and American intelligence agencies used an implant on Samsung TVs to secretly listen in on user conversations.
Building on MI5’s EXTENDING tool, their American counterparts at the CIA developed the listening implant code-named “Weeping Angel” to record audio on Samsung F Series smart televisions.
Even when the TV was off, the CIA and MI5 could still record audio using the aptly-named “Fake-off” recording feature.
If you want to check out the state-sponsored IoT espionage user’s guide for EXTENDING Tool, it’s available to read here.
Another reason IoT devices are vulnerable is that many owners never change the factory-default password, and many reuse the same password across multiple devices.
The Booz Allen report continues, “About 15 percent of IoT device owners don’t change their devices’ default passwords, and 10 percent of IoT devices use one of the same five passwords for administrative access, according to one 2017 estimate.
“IoT botnets, especially state-owned ones, present difficult challenges for defenders. Attempting to blacklist astronomically large volumes of smart televisions and DVRs would probably be impractical. An adversary running a self-contained IoT proxy botnet, which we’ve dubbed a ‘boxynet,’ would not need to worry about third-party botnet managers logging their activity or otherwise compromising their anonymity.”
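The default-password problem the report describes is exactly what IoT botnet malware exploits: rather than cracking anything, it simply walks a short list of common factory credentials against each device it discovers. The sketch below illustrates the idea; it is not taken from the report, the credential pairs are merely examples of the kind seen in leaked malware credential lists (e.g., the Mirai source), and the `try_login` callback is a stand-in for a real network login attempt so the example runs offline.

```python
# Illustrative sketch (not from the Booz Allen report): why unchanged
# factory passwords make IoT fleets easy to conscript into a botnet.

# Example factory-default pairs of the kind found in IoT malware
# credential dictionaries (hypothetical subset, for illustration).
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("root", "root"),
    ("admin", "password"),
    ("root", "12345"),
    ("user", "user"),
]

def find_default_credentials(try_login):
    """Return the first (user, password) pair the device accepts, else None.

    `try_login` stands in for a network login attempt (telnet/HTTP);
    here it is just a callable so the sketch runs without a network.
    """
    for user, password in DEFAULT_CREDENTIALS:
        if try_login(user, password):
            return (user, password)
    return None

# A simulated device whose owner never changed the factory password:
device_accepts = lambda u, p: (u, p) == ("root", "12345")
print(find_default_credentials(device_accepts))  # ('root', '12345')
```

With only a handful of guesses per device, such a scanner succeeds against the sizable fraction of devices the report cites as still using factory defaults, which is why the fix (changing the default password) is so disproportionately effective.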
Almost anything digital can be faked. With Adobe’s Voco and Stanford University’s Face2Face technologies, virtually anyone, living or dead, can be imitated through vocal and facial manipulation.
“AI-generated video—commonly referred to as ‘deepfakes’—use machine-learning algorithms to create highly believable forgeries that can be used to depict individuals saying or doing things that never occurred,” according to Booz Allen.
Journalism, business, governments, and geopolitics can all fall victim to deepfakes.
According to the report, “Attributing false quotes to political leaders is a tactic that has already been used by likely state-sponsored threat actors to significant effect.”
“Weaponized leaks—in which data is stolen and released publicly, sometimes with falsified data blended in—have increasingly been leveraged in influence operations. This tactic could similarly incorporate false video content mixed among a trove of stolen, but otherwise legitimate data, to increase the believability of the ruse.”
There is a great line from the 1993 blockbuster Jurassic Park that comes from Dr. Ian Malcolm, played by Jeff Goldblum, when he confronts the park’s owner on the ethics and dangers of reviving dinosaurs after 65 million years of extinction:
“Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”
That same sentiment is now being voiced about voice and facial reenactment technologies, which have the power to imitate virtually anyone who has ever been recorded on audio or video.
Be sure to check out the full report linked at the beginning of this article.