The Companies Taking Driver & Cabin Monitoring to The Next Level (Part 2)

In the second part of our special report, Lynn Walford takes a look at how cabin monitoring can help enable advanced driver assistance system (ADAS) solutions, autonomous vehicles and ride-sharing.
Eyes looking in the right direction may not be enough. That’s according to Modar Alaoui, CEO and founder of Eyeris, a start-up that offers the world’s first sensor-fusion AI integrating multiple cameras, radar and infrared thermal sensors, with hardware partners such as On Semiconductor, Xilinx, Infineon and FLIR Systems.
Eyeris takes in-cabin driver and occupant monitoring to a higher level. Alaoui notes that sensor fusion understands the entire in-vehicle scene – how many people are present, how they are seated, their faces and emotions – and can then predict what will happen.
Eyeris has the world’s largest in-cabin dataset – 3,000 people and 10 million images of face, body and activity inside the car, collected over the last five years, says Alaoui.
“Sensor fusion is not just for the driver but for the entire in-cabin space. The driver can be distracted by the baby crying in the back seat. Eyeris can predict if the driver is going to be distracted. We need to understand everything in the cabin space because safety matters,” he adds.
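To make the idea concrete, here is a minimal, hypothetical sketch of per-occupant sensor fusion in Python. Eyeris’ actual pipeline is proprietary; the `Detection` class, the `fuse_cabin_scene` function and the simple highest-confidence merge rule are illustrative assumptions, not the company’s real API.

```python
# Hypothetical sketch of multi-sensor in-cabin fusion (not Eyeris' actual API).
from dataclasses import dataclass

@dataclass
class Detection:
    occupant_id: int   # which seat/person the reading refers to
    source: str        # "camera", "radar", or "thermal"
    label: str         # e.g. "adult", "child", "pet", "empty_seat"
    confidence: float  # 0.0 - 1.0

def fuse_cabin_scene(detections: list[Detection]) -> dict[int, str]:
    """Merge per-sensor detections into one label per occupant,
    keeping the highest-confidence label seen across all sensors."""
    best: dict[int, Detection] = {}
    for d in detections:
        if d.occupant_id not in best or d.confidence > best[d.occupant_id].confidence:
            best[d.occupant_id] = d
    return {oid: d.label for oid, d in best.items()}

# A thermal sensor can "see" an occupant a camera misses under a blanket:
scene = fuse_cabin_scene([
    Detection(0, "camera", "adult", 0.95),
    Detection(1, "camera", "empty_seat", 0.40),  # blanket hides the baby
    Detection(1, "thermal", "child", 0.90),      # heat signature gives it away
])
print(scene)  # {0: 'adult', 1: 'child'}
```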
Eyeris predicts that the driver will turn around to look at the crying baby. The car would then be prepared to respond: it could take over, slow down, automatically activate lane-keep assist, trigger seat vibration and/or supercharge the brakes, says Alaoui.
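As a rough illustration of that escalating response, the sketch below maps a predicted-distraction probability to increasingly strong countermeasures. The `Car` class, its method names and the thresholds are all hypothetical; this is not any automaker’s actual control logic.

```python
# Hypothetical escalation logic for a predicted driver distraction.
class Car:
    def vibrate_seat(self): print("seat vibration on")
    def activate_lane_keep_assist(self): print("lane keep assist on")
    def slow_down(self): print("slowing down")
    def precharge_brakes(self): print("brakes pre-charged")

def respond(car: Car, p_distraction: float) -> None:
    """Escalate countermeasures as the predicted probability that the
    driver is about to turn away from the road increases."""
    if p_distraction > 0.5:
        car.vibrate_seat()               # gentle haptic warning first
    if p_distraction > 0.7:
        car.activate_lane_keep_assist()  # hold the lane while distracted
    if p_distraction > 0.9:
        car.slow_down()                  # highest risk: shed speed
        car.precharge_brakes()           # be ready for a hard stop

respond(Car(), 0.92)  # driver predicted to turn toward the crying baby
```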
A baby covered in a blanket may not be detected by a camera, but thermal imaging would still detect it under the blanket.
Eyeris’ fusion systems can go beyond Tesla’s Dog Mode. They could notify the driver if a human or dog left in the vehicle becomes hot and needs attention. They can determine if someone in a rideshare or autonomous vehicle is carrying a weapon, has left a bag behind in the back seat, or if the seats are dirty and need to be serviced.
“We can tell the difference between people fighting and dancing in the back seat,” Alaoui gives as an example.
Using Eyeris’ body analytics, the car can personalize the cabin, adjusting the steering wheel, seats or climate control. Eyeris will be deployed in the Karma Revero in 2021 and in other cars in the future.
How Will Driver and Cabin Monitoring be Used in the Future?
“The eyes are the windows to the soul,” says Nick DiFiore, SVP and GM Automotive at Seeing Machines, a company that develops computer vision related technologies. “They are also the window to what is going on in a head.”
Seeing Machines’ driver monitoring system (DMS) is employed in GM’s Super Cruise, enabling the car to know that the driver is paying attention to the road and is ready to take over the vehicle.
Seeing Machines Technology Enables GM Super Cruise Driver Assistance System (PRNewsfoto/Seeing Machines)
The next level for DMS is to fuse it with ADAS functions, says DiFiore. With ADAS fusion, the car can compensate for what the driver doesn’t see – the DMS detects that the driver is not looking ahead while ADAS detects a vehicle or a stop sign in front.
Then an ADAS function such as automatic braking can be activated.
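A simplified sketch of that DMS/ADAS fusion logic might look like the following, where automatic braking fires only when ADAS sees a hazard the driver’s gaze has missed. The state classes, field names and the two-second time-to-collision threshold are assumptions for illustration, not Seeing Machines’ or GM’s implementation.

```python
# Hypothetical DMS/ADAS fusion rule: brake only when the driver's gaze
# and the road hazard disagree.
from dataclasses import dataclass

@dataclass
class DmsState:
    eyes_on_road: bool         # from the driver-facing camera

@dataclass
class AdasState:
    hazard_ahead: bool         # vehicle or stop sign detected ahead
    time_to_collision_s: float # estimated seconds until impact

def should_auto_brake(dms: DmsState, adas: AdasState) -> bool:
    """Brake automatically when ADAS sees an imminent hazard
    that the driver is not looking at."""
    return (adas.hazard_ahead
            and not dms.eyes_on_road
            and adas.time_to_collision_s < 2.0)

print(should_auto_brake(DmsState(eyes_on_road=False),
                        AdasState(hazard_ahead=True,
                                  time_to_collision_s=1.4)))  # True
```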
Taking a Holistic Approach
“Currently, DMS can detect gender, age, a thousand points on a face—identify if the driver is a teenager or an elderly person. Eventually, when we take the driver out of the equation, the next generation monitoring will be detecting cognitive load and emotional states,” says Aaron Thompson, Senior Director Platform Development, BU ADAS/AD at HARMAN International.
He says the next development in driver state monitoring is a holistic understanding of driver and occupant monitoring.
“When we get to autonomy, that’s when we really need to understand what’s going on with the cabin and the driver,” says Thompson. “That is what the industry (automakers, universities and algorithm suppliers) is pushing for.”
To read the first part of our report on cabin monitoring, follow the link below.