Artificial Intelligence (AI) dominates headlines every day, alongside blockchain and 3D printing. As these technologies conquer new spaces beyond the headlines, one area set for major transformation is embedded software. The two fields are a natural match, producing excellent results when paired the right way.
The reason is simple: the more data an AI system has, the better the outcomes it can produce, and because embedded systems are omnipresent in modern society, they are a perfect source of data. Nearly every adult carries a smartphone and other mobile devices, and automobiles come with built-in GPS systems and cameras. These devices can easily be enhanced with a touch of AI, and startups in this space know it.
To get a better idea of what the future of this technology could look like, we spoke with Marcin Kłoda, VP of Intive, a global digital product development company. "Artificial Intelligence is very popular in any type of software running on a powerful server or high-end application processor. But what about embedded devices? During CES 2018 in Las Vegas, Nvidia CEO Jensen Huang said that 'cloud computing is so yesterday… the future of computing looks more like the past.' His point was that efficient systems will rely more on edge computing, meaning that data is processed as close to its source as possible. With that in mind, it becomes obvious that edge devices need AI to filter data as it arrives."
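To make the idea concrete, here is a minimal sketch in Python of what edge-side filtering can look like. Everything in it is hypothetical: the read_sensor driver, the significance score standing in for an on-device neural network, and the send_to_cloud uplink. The point is simply that the device scores data locally and only forwards what matters.

```python
# A minimal sketch of edge-side filtering. The sensor driver, scoring
# function, and uplink below are all hypothetical stand-ins.
import random

SIGNIFICANCE_THRESHOLD = 0.8  # hypothetical tuning parameter


def read_sensor() -> float:
    """Stand-in for a real sensor driver: returns a noisy reading."""
    return random.gauss(mu=0.0, sigma=1.0)


def local_significance(reading: float) -> float:
    """Tiny on-device 'model': here just a normalized magnitude score.
    On a real edge device this would be a small, quantized neural net."""
    return min(abs(reading) / 3.0, 1.0)


def send_to_cloud(reading: float, score: float) -> None:
    """Stand-in for the uplink; a real device would batch and transmit."""
    print(f"uplink: reading={reading:+.3f} score={score:.2f}")


def edge_loop(num_samples: int = 1000) -> None:
    """Read, score locally, and forward only significant readings."""
    forwarded = 0
    for _ in range(num_samples):
        reading = read_sensor()
        score = local_significance(reading)
        if score >= SIGNIFICANCE_THRESHOLD:  # filter at the edge
            send_to_cloud(reading, score)
            forwarded += 1
    print(f"forwarded {forwarded}/{num_samples} samples")


if __name__ == "__main__":
    edge_loop()
```

Most readings never leave the device; only the handful that clear the threshold consume bandwidth, which is exactly the economy edge computing is after.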
He adds: "A very good example is the self-driving car we all dream about. The car includes an ADAS (Advanced Driver-Assistance System), which consists of an internal network of sensors sending signals to one system in charge of analyzing the data. This is called sensor fusion. To make ADAS efficient and fast, even with a fast backend in the cloud, the edge devices (the sensors) need to provide as much intelligence as possible. In the near future it will become more common to have a neural network built into the sensor itself, e.g. in radars, where the network filters out noise and sends position data to the central processor and then to the cloud."
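A rough illustration of that pipeline, under stated assumptions: each "smart" sensor pre-filters its own noisy position estimate (an exponential moving average stands in here for an in-sensor neural network), and a central unit fuses the estimates with inverse-variance weighting, one common textbook approach to sensor fusion. The sensor names, noise figures, and readings are all made up for the example.

```python
# A toy sensor-fusion sketch: in-sensor filtering, then central fusion.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class SmartSensor:
    name: str
    variance: float                 # assumed measurement noise (made up)
    alpha: float = 0.5              # smoothing factor for in-sensor filter
    _state: Optional[float] = None

    def filtered_position(self, raw: float) -> float:
        """In-sensor filtering: smooth the raw reading before uplink.
        A real smart sensor might run a neural network here instead."""
        if self._state is None:
            self._state = raw
        else:
            self._state = self.alpha * raw + (1 - self.alpha) * self._state
        return self._state


def fuse(estimates: List[Tuple[float, float]]) -> float:
    """Central fusion: inverse-variance weighted average of
    (position, variance) pairs reported by each sensor."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(w * pos for (pos, _), w in zip(estimates, weights))
    return total / sum(weights)


if __name__ == "__main__":
    radar = SmartSensor("radar", variance=0.04)
    camera = SmartSensor("camera", variance=0.25)
    # One fusion cycle with hypothetical raw readings (metres ahead).
    readings = [(radar, 12.1), (camera, 12.6)]
    fused = fuse([(s.filtered_position(r), s.variance) for s, r in readings])
    print(f"fused position estimate: {fused:.2f} m")
```

The design choice mirrors Kłoda's point: each sensor does as much cleanup as it can locally, so the central processor receives compact position estimates rather than raw waveforms.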
"Another example of great use of AI in embedded devices is the new class of interfaces for wearables, like Google's Project Soli. In these cases, interaction with the wearable happens without touching the screen, so an edge Digital Signal Processor (DSP) needs to recognize whether the user is giving a command or just making an accidental hand movement. With a properly trained neural network, it's possible to recognize user behavior at this level of detail," states Kłoda.
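As a sketch of the kind of decision that DSP has to make: the toy classifier below separates a deliberate gesture from an accidental movement using two hand-picked features and fixed, hand-set weights standing in for a genuinely trained network. Nothing here reflects Project Soli's actual pipeline; it only illustrates the intent-versus-accident distinction.

```python
# A toy intent classifier for a motion window, purely for illustration.
import math

# Hand-set weights a trained model might learn: intentional gestures
# tend to be short and sharp; accidental moves are slower and longer.
W_ENERGY, W_DURATION, BIAS = 4.0, -2.5, -1.0


def features(window: list) -> tuple:
    """Extract (energy, duration) features from a motion window."""
    energy = sum(x * x for x in window) / len(window)
    duration = len(window) / 100.0  # assume 100 Hz sampling
    return energy, duration


def is_intentional(window: list) -> bool:
    """Tiny logistic model: high energy over a short window -> command."""
    energy, duration = features(window)
    logit = W_ENERGY * energy + W_DURATION * duration + BIAS
    return 1.0 / (1.0 + math.exp(-logit)) > 0.5


# A sharp flick (high energy, short) vs. a slow drift (low energy, long).
flick = [0.9, -1.1, 1.0, -0.8] * 5
drift = [0.05, 0.06, 0.05, 0.04] * 50
print(is_intentional(flick))  # -> True under these toy weights
print(is_intentional(drift))  # -> False
```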
Though we might have a long way to go, the future of this field looks very promising, with plenty of reasons to be excited.