The world of vehicle autonomy is made possible by intelligent sensing and radar capabilities that can improve the safety of cars and make for smoother journeys. IDTechEx's portfolio of Robotics & Autonomy Research Reports and Subscriptions covers the various components of autonomous vehicles, including in-cabin sensing, software and the integration of AI, and radar technology.
Detecting drowsy drivers
Cars are commonly fitted with interior monitoring systems, such as driver monitoring systems (DMS) and occupant monitoring systems (OMS), that keep passengers safe and detect potential dangers. Capacitive touch or torque sensor steering wheel technology can help to ensure drivers keep hold of the wheel, while ECG sensors could one day be integrated into steering wheels to monitor the driver's heart rate.
RGB-infrared cameras have huge scope within in-cabin sensing systems. Gaze tracking, head motion tracking, distraction and drowsiness monitoring, and eyelid activity are all features that can help keep the driver safe by alerting them when they may need to take a break from driving.
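Camera-based drowsiness monitoring of the kind described above is often illustrated with the eye-aspect-ratio (EAR) heuristic: when the ratio of an eye's vertical to horizontal landmark distances stays low for a sustained run of frames, the eyes are likely closed rather than blinking. The sketch below is a minimal illustration of that idea, not the method of any system named in the article; the landmark ordering, threshold, and frame count are hypothetical example values.

```python
# Illustrative sketch of the eye-aspect-ratio (EAR) drowsiness heuristic.
# Landmark ordering, threshold, and frame count are hypothetical examples.
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered p1..p6 as in the
    common 68-point facial landmark scheme (p1/p4 = corners)."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = math.dist(p2, p6) + math.dist(p3, p5)
    horizontal = math.dist(p1, p4)
    return vertical / (2.0 * horizontal)

def is_drowsy(ear_history, threshold=0.21, min_closed_frames=15):
    """Flag drowsiness when the EAR stays below the threshold for a
    sustained run of frames (eyes closed, rather than a brief blink)."""
    run = 0
    for ear in ear_history:
        run = run + 1 if ear < threshold else 0
        if run >= min_closed_frames:
            return True
    return False
```

An open eye produces a high EAR and a closed eye a low one, so a sustained low-EAR run separates fatigue from normal blinking.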
Time-of-flight cameras are commonly deployed for occupancy detection, to ensure no children are left alone inside the vehicle. These cameras can also be used alongside or interchangeably with radar systems to detect vital signs, including heart and respiration rate, and fatigue, gathering all-round observations on passengers to ensure their welfare. IDTechEx's report, "In-Cabin Sensing 2025-2035: Technologies, Opportunities, and Markets", covers a spectrum of sensing technologies for safety and convenience inside the vehicle.
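As an illustration of how a vital sign such as respiration rate might be recovered from a radar or time-of-flight measurement, the sketch below finds the dominant breathing-band frequency in a chest-displacement signal via a naive DFT peak search. The signal, sample rate, and frequency band are hypothetical assumptions for illustration; real in-cabin systems need filtering and motion compensation.

```python
# Illustrative sketch: estimating respiration rate from a chest-displacement
# signal, as a radar or time-of-flight sensor might produce. The signal is
# synthetic and all parameters are hypothetical example values.
import math

def respiration_rate_bpm(signal, fs):
    """Return the dominant frequency (in breaths per minute) of a
    displacement signal sampled at fs Hz, via a naive DFT peak search
    restricted to a typical breathing band (0.1-0.5 Hz)."""
    n = len(signal)
    mean = sum(signal) / n
    best_freq, best_power = 0.0, -1.0
    k = 1
    while (freq := k * fs / n) <= 0.5:
        if freq >= 0.1:
            re = sum((signal[t] - mean) * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = sum((signal[t] - mean) * math.sin(2 * math.pi * k * t / n)
                     for t in range(n))
            power = re * re + im * im
            if power > best_power:
                best_power, best_freq = power, freq
        k += 1
    return best_freq * 60.0

# Synthetic chest movement: 0.3 Hz breathing (18 breaths/min), 20 s at 10 Hz.
fs = 10.0
sig = [math.sin(2 * math.pi * 0.3 * t / fs) for t in range(200)]
```

Running `respiration_rate_bpm(sig, fs)` on the synthetic signal recovers a rate of 18 breaths per minute, since the 0.3 Hz tone falls exactly on a DFT bin.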
Vehicle-driver communication
Facial recognition features could also be used for biometric payment authentication when asking the car to place a coffee order at the next stop using an AI assistant. The combination of artificial intelligence with the DMS could allow a new level of interaction and communication between the driver and the vehicle, for more personalized and comfortable journeys. Once the car has identified who has sat in the driver's seat, seat and lighting settings could be adjusted automatically to suit that individual's preferences.
Detecting stress or unease in the driver's voice is another trend in AI-enabled in-cabin sensing, allowing the vehicle to offer assistance that may help de-escalate a stressful situation. IDTechEx's report, "Autonomous Driving Software and AI in Automotive 2026-2046: Technologies, Markets, Players", explores in depth the capacity for artificial intelligence to mimic humanlike interactions, covering different types of learning and neural network computing. Scenario simulation within a vehicle could see advanced driver assistance systems (ADAS) become contextually aware of a situation by referring to stored historical driver data, and then make changes, intervene, or alert the driver appropriately. Predictive behaviour modelling also sees AI able to make decisions for the driver should their behaviour be deemed unsafe or unreliable.
Object-detection ready radar
The function of radar is to provide a safety perimeter around the vehicle, enabling ADAS features such as blind spot detection, lane change assist, and junction pedestrian automatic emergency braking, with radar operating at both short and long range. These features are becoming increasingly integrated in new vehicles, with regulation and demand for safety technologies driving the uptake of radars. Radar's ability to work in low-visibility conditions makes it an ideal technology for keeping passengers safe, as the car can be aware of potential hazards long before the driver and provide timely alerts. 4D imaging radars are also emerging; IDTechEx describes them as able to understand more complex situations and environments, since they can separate various objects from one another for improved accuracy and a more detailed picture of the contextual surroundings.
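To illustrate the range measurement behind these features, the sketch below assumes the FMCW (frequency-modulated continuous-wave) architecture common in automotive radar, an assumption since the article does not name one: the beat frequency between the transmitted and received chirps maps linearly to target range. The chirp parameters are hypothetical example values.

```python
# Illustrative sketch: mapping an FMCW radar beat frequency to target range.
# FMCW is assumed as a common automotive radar architecture; all parameter
# values below are hypothetical examples.
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_freq_hz, bandwidth_hz, chirp_time_s):
    """Range R = c * f_beat / (2 * S) for a linear chirp of slope
    S = B / T_chirp (bandwidth over chirp duration)."""
    slope = bandwidth_hz / chirp_time_s  # chirp slope, Hz/s
    return C * beat_freq_hz / (2.0 * slope)

# Hypothetical chirp: 1 GHz sweep over 50 microseconds; a 2 MHz beat
# frequency then corresponds to a target roughly 15 m away.
r = fmcw_range_m(beat_freq_hz=2e6, bandwidth_hz=1e9, chirp_time_s=50e-6)
```

A steeper chirp slope (wider bandwidth or shorter chirp) shrinks the range per hertz of beat frequency, which is one reason wide sweep bandwidths give finer range resolution.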
While the majority of the autonomous vehicle market operates at up to Level 2, where the driver receives speed and steering support and, with Level 2+, can occasionally take their hands off the wheel, Level 3 is beginning to emerge with some liability complications. Level 3 would see drivers able to take their eyes off the road, meaning there may be difficulty in deciding who accepts responsibility for damage or accidents in these conditions. In the US and Germany, flagship Level 3 models are available from Mercedes and BMW, while in China, privately owned Level 3 vehicles are permitted on public roads. IDTechEx's report, "Automotive Radar Market 2025-2045: Robotaxis & Autonomous Cars", details regulations and expectations for radar developments and vehicle autonomy over the coming years.
For more information on vehicle autonomy and features including in-cabin sensing, AI assistants, and radar technology, visit IDTechEx's portfolio of Robotics & Autonomy Research Reports and Subscriptions for the latest research.