Technology

AI will help submarine crews better understand what adversaries are doing underwater

Artificial Intelligence will help submarine crews to better understand what their adversaries are doing underwater by combining and analyzing multiple data sets, including sonar.

Can machine learning give one side the upper hand in the submarine battle space? Recent moves toward applying AI technology suggest it can.

Thanks to Hollywood movies, perhaps the most widely recognized technology in naval operations is sonar (Sound Navigation and Ranging).

On its own, today’s sonar doesn’t give a submarine captain an accurate sense of where opposing vessels are in the naval battle space. When sonar is combined with other data sets and run through machine learning, however, a far more accurate understanding of underwater objects and enemy movements can be obtained.

The Limits of Sonar

At a recent Defense One/Nextgov event, a research oceanographer explained the limitations of sonar on its own:

“What the submariners get is a low-dimensional picture,” stated Jules Jaffe of the Scripps Institution of Oceanography. “So if you are towing an array, you get information like bearing and sometimes frequency information.”

A towed array collects only one type of data from a single point. Sound and wave data collected from multiple points in the ocean allows an adversary’s movements to be determined with far greater precision.

AI and machine learning can handle big data and extract far more value from it when it comes from multiple points, which is why the technology is being pursued for this use case.
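To make the point concrete, here is a minimal sketch, an illustrative assumption rather than any navy’s or vendor’s actual method, of why extra measurement points matter: bearings observed from several sensor positions can be triangulated by least squares to estimate a contact’s position, something a single towed array reporting only a bearing cannot do.

```python
# Minimal, hypothetical sketch: least-squares triangulation of one acoustic
# contact from bearing-only measurements taken at several sensor positions.
# A single towed array yields only a line of bearing; multiple points yield a fix.
import numpy as np

def locate_contact(sensor_positions, bearings_deg):
    """Estimate a 2D contact position from bearings (degrees clockwise from north)."""
    rows, rhs = [], []
    for (x, y), b in zip(sensor_positions, np.radians(bearings_deg)):
        # Unit normal to the line of bearing; the contact c satisfies n . (c - sensor) = 0
        n = np.array([np.cos(b), -np.sin(b)])
        rows.append(n)
        rhs.append(n @ np.array([x, y]))
    # Least-squares intersection of all the bearing lines
    estimate, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return estimate

# Hypothetical example: three sensors take noisy bearings on a contact near (4000, 6000) m
sensors = [(0.0, 0.0), (5000.0, 0.0), (0.0, 5000.0)]
bearings = [33.9, -9.5, 76.2]  # degrees from north, with measurement noise
print(locate_contact(sensors, bearings))
```

Machine learning comes into play when estimates like this are fused with environmental and historical data, but even this simple step shows why readings from many points beat a single towed array.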

US Navy AI Technology in Submarines

Late last year, the US Office of Naval Research (ONR) requested submissions for whitepapers based upon the exploration of “analytic techniques linking physical oceanographic variability with acoustic propagation, including field efforts to collect relevant data sets.”

The request further specified the analysis of large oceanographic and acoustic data sets, with AI and machine learning technology used to facilitate that analysis.

Under the ‘Task Force Ocean’ research program, funding will be provided to 30 projects to the tune of $60 million over three years.

In a separate development, 15 projects will be funded to develop improved ocean-sound sensors. Furthermore, efforts will be made to turn undersea sound propagation data into 3D images. The objective is imagery detailed enough for submarine crews to determine the location of other objects.
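The program details do not describe how such imagery would be built, so the following is only a rough, hypothetical sketch: scattered acoustic intensity samples, each tagged with a position and depth, are binned into a 3D voxel grid that a display layer could then render for the crew.

```python
# Hypothetical sketch only: average scattered acoustic samples (x, y, depth, intensity)
# into a 3D voxel grid, the kind of volume a rendering layer could turn into imagery.
import numpy as np

def voxelize(samples, grid_shape=(50, 50, 20), extent=((0, 10_000), (0, 10_000), (0, 500))):
    """samples: iterable of (x m, y m, depth m, intensity). Returns mean intensity per voxel."""
    sums = np.zeros(grid_shape)
    counts = np.zeros(grid_shape)
    for x, y, d, value in samples:
        # Map each coordinate to its voxel index within the stated extent
        idx = tuple(
            min(int((coord - lo) / (hi - lo) * n), n - 1)
            for coord, (lo, hi), n in zip((x, y, d), extent, grid_shape)
        )
        sums[idx] += value
        counts[idx] += 1
    # Mean intensity per voxel; empty voxels stay at zero
    return np.divide(sums, counts, out=np.zeros(grid_shape), where=counts > 0)

# Hypothetical samples: position in metres plus a received sound level
volume = voxelize([(1200, 3400, 80, 94.1), (1210, 3390, 85, 95.3), (8800, 600, 300, 61.0)])
print(volume.shape, volume.max())
```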

The US Navy is planning for the introduction of 32 attack submarines between now and 2034.

French multinational Thales is also developing this technology. In information published on its website last year, the company claims that deep learning can be used to determine the presence of an adversary in the battle space in much the same way the Shazam music recognition application works: an algorithm can be developed to detect and identify specific sounds.
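Thales does not publish its algorithms, so the sketch below is only an illustrative stand-in: in place of the deep learning the company describes, it matches a simple spectral fingerprint against a small library of known signatures, which captures the same Shazam-like detect-and-identify idea.

```python
# Hypothetical sketch of Shazam-style sound identification: reduce a recording
# to a spectral fingerprint and return the closest match from a signature library.
import numpy as np
from scipy.signal import spectrogram

def fingerprint(signal, sample_rate=8000, bands=32):
    """Average spectrogram energy into a fixed number of coarse frequency bands."""
    _, _, sxx = spectrogram(signal, fs=sample_rate, nperseg=256)
    profile = sxx.mean(axis=1)               # mean energy per frequency bin
    banded = np.array_split(profile, bands)  # collapse into coarse bands
    vec = np.log1p(np.array([b.mean() for b in banded]))
    return vec / np.linalg.norm(vec)         # unit-length fingerprint

def identify(signal, library):
    """Return the library label whose fingerprint best matches (cosine similarity)."""
    probe = fingerprint(signal)
    return max(library, key=lambda label: float(probe @ library[label]))

# Hypothetical signature library; a real system would use recordings of known sources
rng = np.random.default_rng(0)
library = {
    "merchant_vessel": fingerprint(rng.normal(size=8000) * np.sin(np.arange(8000) * 0.05)),
    "biological":      fingerprint(rng.normal(size=8000)),
}
print(identify(rng.normal(size=8000), library))
```

A deep learning version would replace the fingerprint-and-nearest-match step with a trained classifier, but the overall pipeline of reducing sound to features and comparing them against known signatures is the same.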

Thales’ ‘Bluescan’ is a solution that integrates all sonar data, together with AI and big data:

Dominique Thubert of Thales Underwater Systems stated on the website that “equipping our military vessels with a higher-level artificial intelligence is the answer to the increasing size and complexity of data to be processed as well as the need to reduce staff.”

Similar AI technology is also being investigated for mine warfare, where several underwater drones can provide the multiple data points that AI needs to survey and clear a minefield.

China, UK Develop AI Technology for Naval Use

China is in the process of developing unmanned submarines that rely on AI technology. The subs are believed to be ready for deployment in the early 2020s, with the expectation that they will enhance rather than replace human-operated submarines.

Like the US, China is also developing AI-powered support systems for submarine crews.

In the UK, Plymouth-based M Subs has installed a sensor system in Plymouth Sound that integrates with machine learning and AI-based systems. The installation is part of a broader plan to create a center of excellence in unmanned vehicles and maritime autonomy.

Through this system, situational awareness, communications, and command and control are provided for unmanned autonomous vessels entering Plymouth Sound.

Pat Rabbitte

Pat is a writer from the West of Ireland - currently living and working in Medellín, Colombia. He has always had an inquiring mind when it comes to new technology. His discovery of Bitcoin back in 2013 slowly led to a realisation of the implications of the underlying tech. As a consequence, Pat’s passion for blockchain technology has led him to focus his writing on the subject.
