
AI will help submarine crews better understand what adversaries are doing underwater

Artificial Intelligence will help submarine crews to better understand what their adversaries are doing underwater by combining and analyzing multiple data sets, including sonar.

Can machine learning give one protagonist the upper hand in the submarine battle space? Recent moves toward the application of AI technology suggest it can.

Thanks to Hollywood movies, perhaps the most widely recognized technology in naval operations is sonar (Sound Navigation and Ranging).

On its own, today’s sonar doesn’t give a submarine captain an accurate picture of where opposing vessels are in the naval battle space. When sonar is combined with other data sets and analyzed with machine learning, however, a far more accurate understanding of underwater objects and enemy movements can be obtained.

The Limits of Sonar

At a recent Defense One/Nextgov event, a research oceanographer explained the limitations of sonar on its own:

“What the submariners get is a low-dimensional picture,” stated Jules Jaffe of the Scripps Institution of Oceanography. “So if you are towing an array, you get information like bearing and sometimes frequency information.”

A towed array collects only one type of data from a single point. Sound and wave data gathered from multiple points in the ocean allows an adversary’s actions to be determined with far greater precision.

AI and machine learning can handle big data and extract far more value from it when it comes from multiple points, which is why the technology is being pursued for this use case.
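To see why multiple points matter, consider a deliberately simplified sketch in Python: bearings reported from several listening positions can be fused into a single least-squares position fix, something a lone towed array reporting one bearing cannot provide. The sensor positions, bearings, and solver here are illustrative assumptions, not any navy’s actual processing chain.

```python
# Minimal sketch, not any navy's actual processing chain: fuse bearing-only
# measurements from several listening points into one least-squares position
# estimate. Sensor positions and bearings are made up for illustration.
import numpy as np

def fix_from_bearings(sensor_positions, bearings_rad):
    """Estimate a target's (x, y) position from bearing lines.

    Each bearing defines a line through its sensor; the estimate is the
    point minimising the squared distance to all of those lines.
    """
    A, b = [], []
    for (sx, sy), theta in zip(sensor_positions, bearings_rad):
        # Bearing measured clockwise from north, so the line through
        # (sx, sy) has direction (sin(theta), cos(theta)) and normal
        # equation cos(theta)*x - sin(theta)*y = cos(theta)*sx - sin(theta)*sy.
        A.append([np.cos(theta), -np.sin(theta)])
        b.append(np.cos(theta) * sx - np.sin(theta) * sy)
    estimate, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return estimate  # (x, y)

# Three hypothetical listening points, each reporting only a bearing (degrees).
sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # positions in km
bearings = np.radians([45.0, 315.0, 135.0])
print(fix_from_bearings(sensors, bearings))        # ≈ [5. 5.] km
```

With a single bearing the target could be anywhere along one line; with three, the lines pin it down, which is the intuition behind collecting data from multiple points.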

US Navy AI Technology in Submarines

Late last year, the US Office of Naval Research (ONR) requested white papers exploring “analytic techniques linking physical oceanographic variability with acoustic propagation, including field efforts to collect relevant data sets.”

The request further specified the analysis of large oceanographic and acoustic data sets, with AI and machine learning used to facilitate that analysis.

Under the ‘Task Force Ocean’ research program, 30 projects will share $60 million in funding over three years.

In a separate development, 15 projects will be funded to develop improved ocean-sound sensors. Efforts will also be made to turn undersea sound propagation data into 3D images, with the objective of giving submarine crews imagery detailed enough to determine the location of other objects.
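The article does not spell out how sound data would become a 3D picture, but one simple way to think about it is to accumulate intensity readings from many sensors into a voxel grid over the surveyed volume, which can then be rendered or handed to a learning model. The sketch below is a rough illustration under that assumption; the grid dimensions and readings are invented.

```python
# Rough illustration only: bin acoustic intensity readings into a coarse 3-D
# voxel grid (x, y, depth) so they can be visualised or fed to a model.
# Grid dimensions, extents, and readings are all invented.
import numpy as np

GRID_SHAPE = (50, 50, 20)             # x, y, depth cells
AREA_KM, MAX_DEPTH_M = 10.0, 200.0    # extent of the surveyed volume

def to_voxels(readings):
    """readings: iterable of (x_km, y_km, depth_m, intensity_db)."""
    totals = np.zeros(GRID_SHAPE)
    counts = np.zeros(GRID_SHAPE)
    for x, y, depth, intensity in readings:
        i = min(int(x / AREA_KM * GRID_SHAPE[0]), GRID_SHAPE[0] - 1)
        j = min(int(y / AREA_KM * GRID_SHAPE[1]), GRID_SHAPE[1] - 1)
        k = min(int(depth / MAX_DEPTH_M * GRID_SHAPE[2]), GRID_SHAPE[2] - 1)
        totals[i, j, k] += intensity
        counts[i, j, k] += 1
    # Mean intensity per voxel; empty voxels stay at zero.
    return np.divide(totals, counts, out=np.zeros_like(totals), where=counts > 0)

voxels = to_voxels([(1.2, 3.4, 50.0, -42.0), (1.3, 3.5, 55.0, -40.5)])
print(voxels.shape)   # (50, 50, 20)
```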

The US Navy is planning for the introduction of 32 attack submarines between now and 2034.

French multinational Thales is also developing this technology. In information published on its website last year, the company claims that deep learning can be used to detect an adversary’s presence in the battle space in much the same way that the Shazam music recognition app works: an algorithm can be trained to detect and identify specific sounds.
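Thales has not published its algorithm, but the Shazam analogy can be illustrated in a few lines of Python: compute a spectrogram “fingerprint” of an unknown recording and compare it against a library of known acoustic signatures. This is a heavily simplified sketch, with the sample rate, similarity score, and threshold all invented for illustration; a production system would rely on a trained deep-learning classifier rather than raw spectrogram correlation.

```python
# Heavily simplified Shazam-style sketch: compare the spectrogram of an
# unknown recording against a library of reference signatures and report
# the closest match. Sample rate, similarity score, and threshold are
# invented for illustration.
import numpy as np
from scipy.signal import spectrogram

FS = 8000  # assumed sample rate in Hz

def fingerprint(samples):
    """Log-spectrogram, normalised so matching is insensitive to loudness."""
    _, _, sxx = spectrogram(samples, fs=FS, nperseg=256)
    logspec = np.log1p(sxx)
    return (logspec - logspec.mean()) / (logspec.std() + 1e-9)

def identify(unknown, library, threshold=0.6):
    """Return the name of the best-matching reference sound, if any."""
    unk = fingerprint(unknown)
    best_name, best_score = None, -np.inf
    for name, reference in library.items():
        ref = fingerprint(reference)
        n = min(unk.shape[1], ref.shape[1])        # align on common time frames
        score = np.mean(unk[:, :n] * ref[:, :n])   # crude correlation score
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else "unknown contact"
```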

Thales’ ‘Bluescan’ is a solution that integrates all sonar data with AI and big data:

Dominique Thubert of Thales Underwater Systems stated on the website that “equipping our military vessels with a higher-level artificial intelligence is the answer to the increasing size and complexity of data to be processed as well as the need to reduce staff.”

Similar AI technology is also being investigated for mine warfare. Several underwater drones can provide the multiple data points that AI needs to survey and clear a minefield.

China, UK Develop AI Technology for Naval Use

China is in the process of developing unmanned submarines that rely on AI technology. The subs are believed to be ready for deployment in the early 2020s and are expected to augment rather than replace human-operated submarines.

Like the US, China is also developing AI-powered support systems for submarine crews.

In the UK, Plymouth-based M Subs has installed a sensor system in Plymouth Sound that integrates with machine learning and AI-based systems. The installation is part of a broader plan to create a center of excellence in unmanned vehicles and maritime autonomy.

The system provides situational awareness, communications, and command and control for unmanned autonomous vessels entering the bay.

Pat Rabbitte

Pat is a writer from the West of Ireland - currently living and working in Medellín, Colombia. He has always had an inquiring mind when it comes to new technology. His discovery of Bitcoin back in 2013 slowly led to a realisation of the implications of the underlying tech. As a consequence, Pat’s passion for blockchain technology has led him to focus his writing on the subject.
