Vehicles are no longer merely mechanical modes of transportation; they now carry so much technology that they are turning into advanced, uber-connected information and entertainment centers. We are experiencing a revolution in how we commute.
As driver assistance shifts from a superfluous feature to a critical requirement, Advanced Driver Assistance Systems (ADAS) provide an all-round picture of the vehicle's surroundings by gathering information from the outside world. ADAS is boosting road safety by strengthening in-vehicle systems such as Collision Avoidance Systems (CAS), Lane Departure Warning Systems (LDWS), and Intelligent Parking Assist Systems (IPAS).
Vision systems are the core of ADAS, powered by technological advancements in autonomous driving, integrated camera systems, CMOS image sensors, and driver monitoring. ADAS also takes advantage of several other technologies, such as RADAR, Light Detection and Ranging (LIDAR), ultrasound, night vision, rear-view cameras, video mirrors, 360-degree surround views, and image and motion sensors. ADAS combines the data gathered from these different sensor types (sensor fusion) to achieve more accurate results than any single sensor could deliver on its own.
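One common form of the sensor fusion mentioned above is combining independent measurements of the same quantity, weighting each sensor by how much it is trusted. The sketch below shows a minimal inverse-variance fusion of two range estimates; the sensor names, values, and variances are illustrative assumptions, not data from any real vehicle.

```python
# Minimal sensor-fusion sketch: combine two independent estimates of the
# same quantity (e.g., distance to the car ahead, from radar and camera)
# with inverse-variance weighting. A lower variance means a more trusted
# sensor, so it contributes more to the fused result.

def fuse_estimates(value_a: float, var_a: float,
                   value_b: float, var_b: float) -> tuple[float, float]:
    """Return the fused estimate and its (reduced) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * value_a + w_b * value_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Illustrative example: radar reports 25.0 m (variance 0.5),
# camera reports 24.0 m (variance 2.0). The fused estimate lands
# closer to the radar reading and has lower variance than either input.
distance, variance = fuse_estimates(25.0, 0.5, 24.0, 2.0)
```

In production systems this idea is usually generalized into a Kalman filter that also tracks motion over time, but the weighting principle is the same.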
The ADAS market is growing: annual sales of automotive image sensors are projected to exceed 4 million units by 2022, raising the market's value to $17.5 billion in the next couple of years. Both consumers and agencies are pushing this growth: drivers enjoy in-vehicle embedded entertainment features and want fewer risks while driving, and regulators are issuing stricter guidelines to improve road safety for drivers and pedestrians alike. The safety certification body European New Car Assessment Programme (Euro NCAP) announced that only ADAS-equipped vehicles would be considered for its top 5-star safety award, while the National Highway Traffic Safety Administration (NHTSA) has driven regulatory adoption of ADAS for Commercial Motor Vehicles (CMVs) with its SafeCar initiative.
The needs of consumers and the requirements of regulators have pushed ADAS into volume-production vehicles. Engineering teams are implementing in-vehicle computers that reconstruct 3D information from 2D images, providing relevant data to both the driver and the vehicle itself so that objects on and around the road (such as traffic lights, traffic signs, other vehicles, or even pedestrians) can be identified, distinguished, and acted upon accordingly. In the coming years, ADAS will be able to generate warning messages or even take preventive, predictive, or corrective actions in response to external and internal stimuli.
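A concrete example of "generating a warning message" from sensed data is a forward-collision warning based on time-to-collision (TTC): the gap to the object ahead divided by the closing speed. The sketch below is a simplified illustration; the 2.5-second threshold and the sample numbers are assumptions for demonstration, not values from any standard or product.

```python
# Minimal forward-collision-warning sketch using time-to-collision (TTC).
# The warning threshold below is an illustrative assumption.

WARN_TTC_S = 2.5  # warn when a collision is projected within 2.5 seconds


def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed.

    Returns infinity when the gap is not closing (speed <= 0),
    meaning no collision is projected.
    """
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps


def should_warn(distance_m: float, closing_speed_mps: float) -> bool:
    """Trigger the driver warning when TTC falls below the threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < WARN_TTC_S


# Illustrative example: a 30 m gap closing at 15 m/s gives a TTC of
# 2.0 s, which is under the threshold, so a warning would be raised.
warn = should_warn(30.0, 15.0)
```

Real systems feed this kind of check with fused radar/camera distance estimates and add hysteresis so the warning does not flicker near the threshold.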