As the complexity and penetration of technology in automobiles increase, there is a growing need for solutions that support artificial intelligence, which uses machines and programs to emulate the functions of the human brain.

In fact, unit shipments of artificial intelligence (AI) systems used in automobiles are expected to rise from 7 million in 2015 to 122 million by 2025, according to IHS.

The attach rate of AI-based systems in new vehicles was 8 percent in 2015, with the vast majority focused on speech recognition. That figure is forecast to rise to 109 percent in 2025 — exceeding 100 percent because many cars will carry multiple AI systems of various types.

Luca De Ambroggi, principal analyst for automotive semiconductors at IHS Technology, said:

“An artificial-intelligence system continuously learns from experience and by its ability to discern and recognize its surroundings. It learns, as human beings do, from real sounds, images, and other sensory inputs. The system recognizes the car’s environment and evaluates the contextual implications for the moving car.”

According to the latest IHS Automotive Electronics Roadmap Report, AI-based systems in automotive applications are relatively rare today, but they will become standard in new vehicles over the next five years, especially in the following two categories:

  • “Infotainment human-machine interface”: including speech recognition, gesture recognition, eye tracking and driver monitoring, virtual assistance and natural language interfaces.
  • “Advanced driver assistance systems (ADAS) and autonomous vehicles”: including camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning, which mimics human neural networks, presents several advantages over traditional algorithms. It is also a key milestone on the road to fully autonomous vehicles.

For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables prediction of actions, and will reduce the development time of ADAS.
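To make the object-classification idea concrete, here is a deliberately toy sketch: a single linear layer followed by a softmax, standing in for the much deeper networks used in real ADAS perception stacks. The feature names, labels, and weights are entirely hypothetical and chosen only for illustration.

```python
import math

def softmax(scores):
    # Convert raw scores into probabilities that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, weights, labels):
    # One linear layer: each row of weights scores one candidate label.
    scores = [sum(w * f for w, f in zip(row, features)) for row in weights]
    probs = softmax(scores)
    return labels[probs.index(max(probs))], probs

# Hypothetical normalized features from a camera/radar pipeline:
# (object width, object height, object speed)
labels = ["pedestrian", "vehicle"]
weights = [
    [1.2, 2.0, -0.5],   # favors tall, narrow, slow objects
    [2.5, -1.0, 1.5],   # favors wide, fast objects
]
features = [0.4, 1.8, 0.1]
label, probs = classify(features, weights, labels)
```

In a production system, the weights would be learned from labeled sensor data rather than hand-set, and the network would have many layers; the forward pass shown here is the same basic computation, just at miniature scale.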


The technology required to embed AI and deep learning in safety-critical, high-performance automotive applications at mass-production volume is not yet available, due to the high cost and sheer size of the computers needed to perform these advanced tasks.

Elements of AI are, however, already available in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural-network-based algorithms running in the cloud. The 2015 BMW 7 Series, for example, is the first car to use a hybrid approach, offering embedded technology able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla also claims to implement neural network functionality in its autonomous driving control unit.