Cyborgs and genetically enhanced “supersoldiers” are projected to be the warfighters of the future, according to a recent RAND Corporation report.
Published on January 2, 2024, the report, “Plagues, Cyborgs, and Supersoldiers: The Human Domain of War,” highlights advances in human-machine systems, artificial intelligence (AI), and synthetic biology as being among the technologies that will be used to engineer the future warfighter.
According to the report, these technologies will give rise to seemingly telepathic capabilities that allow soldiers to control machines with their thoughts, as well as the ability to genetically modify warfighters so that they can survive in “the harshest of combat environments.”
The Internet of Bodies (IoB), which refers to an ecosystem consisting of interconnected devices that can be worn, ingested, or implanted, is one way in which warfighters of the future are projected to become cyborgs.
The authors note that “IoB and related technologies present a variety of potential opportunities to warfighters. For example, the US Army is running studies to determine whether wearables can help with soldier wellbeing and fitness. Australian researchers have shown that military robot quadrupeds can be steered by brain signals collected and translated by a graphene sensor worn behind the ear of a nearby soldier.”
On the positive side, “Combining IoB data with advanced machine learning (ML) and AI algorithms can potentially enable tremendous advancements in health care, particularly precision medicine.”
However, the IoB also comes with grave risks involving cybersecurity, intelligence gathering, and privacy, as well as targeted attacks that could essentially hijack the user’s brain and cause unimaginable harm.
As brain-computer interfaces (BCIs) become more prevalent, the report warns, so too does the risk that adversaries could exploit them to harm or manipulate their users.
The authors go on to explain that this type of brain hacking is already considered a threat and that “several organizations based in China were found to ‘use biotechnology processes to support Chinese military end uses and end users, to include purported brain-control weaponry’ and, because of this, these entities were added to the Department of Commerce’s Entity List to restrict trade with those organizations.”
While brain hacking is discussed in the context of military environments, the same technology is becoming increasingly available commercially to the general public, including in the workplace.
For example, Dr. Nita Farahany of Duke University addressed this trend at the World Economic Forum (WEF) annual meeting in January 2023.
“Artificial intelligence has enabled advances in decoding brain activity in ways we never before thought possible,” said Farahany.
“What you think, what you feel — it’s all just data — data that in large patterns can be decoded using artificial intelligence,” she added.
And the devices used to decode the human brain don’t have to be as invasive as a brain implant; they can be as non-invasive as a “Fitbit for your brain.”
Another type of IoB risk, according to the recent RAND report, “derives from information security issues with IoB-collected data.”
For example, “A security vulnerability in the Strava app reportedly allowed unknown users to identify and track the movements of Israeli service members inside military bases, even if users limited who could view their Strava profiles,” and “In 2023, it was reported that the Strava app might have been used to track a Russian submarine commander who was killed while jogging.”
In a similar vein, US Intelligence Advanced Research Projects Activity (IARPA) director Catherine Marsh said back in 2021 that the US spy community was increasingly looking to Internet of Things (IoT) devices as “a growing source of data that can be collected to learn intent.”
“Developing these new sensors and detectors, as well as thinking about clever ways to collect multi-modal data to reveal what our adversaries are attempting to hide from us is at the very core of what our collection programs are aimed at doing,” said Marsh.
Moving beyond the IoB and cyborgs, another way in which the warfighter of the future is projected to become a “supersoldier” is through genetic engineering, specifically genomic enhancement.
According to the recent RAND report, genomic enhancement refers to “the process of isolating and using accessible genomic information or treatments to alter a trait in the human body or the environment to enhance resiliency at a micro (individual) or macro (societal) scale.”
The authors predict that “potential near-future genomic enhancements of key warfighting traits could be the ability to function with less sleep, more physical stamina, and improved breathing capacity.”
This recent report echoes a 2021 RAND report, “Technological Approaches to Human Performance Enhancement,” which outlined the technological potential of this controversial transhumanist research.
For example, the 2021 report notes that “adding reptilian genes that provide the ability to see in infrared” and “making humans stronger, more intelligent, or more adapted to extreme environments” were among the potential applications of genomic editing.
As I reported over two years ago, “If successful, these ‘people’ would have the potential to never tire and think smarter, move faster, jump higher, see farther, hear better, hit harder, live longer, adapt stronger, and calculate quicker than any other human being on the planet.”
If and when humans become fully integrated with machines on a large scale, where will the technology end and the human begin?
On the last day of the World Economic Forum meeting in 2020, a discussion about “When Humans Become Cyborgs” attempted to tackle some of the very large ethical questions surrounding bodily integrity and digital ownership of cyborgs.
Ilina Singh, professor of neuroscience and society at Oxford, told the Davos crowd that military officers had raised a range of concerns centered on ownership and bodily integrity.
In the same discussion, National Academy of Medicine president Victor Dzau told the Davos elites that using brain-computer interfaces to augment humans beyond their natural capabilities was crossing the ethical line.
“I think you’re in pretty safe ground when you use these technologies for the purpose of curing disease, treating disease, or at least addressing impairment,” he said, adding, “I do think you start crossing the line when you think about enhancement and augmentation.”
What’s to become of the soldiers and government agents who have been genetically edited with superhuman powers once their service has ended?
What advantages or disadvantages would people with godlike abilities have compared to the rest of humanity?