Passenger Monitoring in a Future World of Autonomous Vehicles

The race is on for all things driverless, and everybody’s working full steam – from automotive OEMs to specialized startups. Prominent players who were not originally in the automotive business are elbowing in too, like Apple, which recently acquired a self-driving car startup. Consumer acceptance of the idea also seems to be growing. A survey of 5,500 people around the world showed that 59% of us are awaiting the arrival of autonomous vehicles with anticipation, and that within five years, 52% would prefer to be driven in a self-driving car rather than in a normal one.

But we aren’t talking enough about how this new beast is going to drive evolution in the transportation ecosystem, and about all the external strings it’s going to pull – and not just the obvious “we’ll have cars delivering our pizza” phenomena. The integration of driverless cars is tied to the concept of smart cities. Today’s autonomous vehicles either operate in confined areas with well-known landmarks, like the Optimus Ride shuttle, or offer limited autonomy under the supervision of a human overseer, like Tesla Autopilot. Infrastructure that can communicate with a vehicle might propel us towards level 5 autonomy, where no restrictions are imposed on the areas or circumstances a car may drive in, and vehicles communicating with one another could improve safety further – which means the more autonomous cars, the better. There’s a bit of a catch-22 here: if all cars were driverless, there would be no need to solve the complex problem of predicting human driver behavior. But to get driverless cars onto the streets, we first need to solve exactly that problem.

Though we may one day be freed from the necessity of understanding the behavior of other human drivers, we will increasingly pay attention to the humans in the passenger seats: their level of comfort and their desire for entertainment. Audi has already recognized the potential in playing on our homo ludens nature and spun off Holoride, announcing VR in the back of a car within a few years that would rely on the car’s motion to provide an immersive experience of riding through, say, a Jurassic park. The car might then use gesture recognition to let you interact with the virtual environment and play games. Gestures could also let you turn down the heating or increase the volume of that interesting podcast you’ve been listening to. Or you could talk it out with the vehicle’s voice assistant.

But why wouldn’t the vehicle be able to figure out what you like and dislike and regulate the conditions on its own? For instance, it notices you start breathing a little faster and your heart races when the speed is higher, so it slows down. Or your fitness wristband informs the car you’ve been sweating a lot, so it lowers the air temperature.
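The adaptive-comfort idea above amounts to a feedback loop from passenger state to cabin actions. A minimal illustrative-only sketch, with hand-picked rules and thresholds that are purely hypothetical (a real system would learn preferences per passenger):

```python
# Illustrative sketch of an adaptive comfort loop: map (hypothetical)
# passenger readings to cabin actions. All names and thresholds here
# are assumptions for the sake of the example, not a real vehicle API.
def comfort_actions(heart_rate_bpm, resting_bpm, speed_kmh, sweating):
    """Return a list of suggested cabin adjustments."""
    actions = []
    # Heart rate well above resting while the car is fast: ease off.
    if speed_kmh > 100 and heart_rate_bpm > resting_bpm * 1.2:
        actions.append("reduce speed")
    # Wearable reports sweating: cool the cabin down.
    if sweating:
        actions.append("lower cabin temperature")
    return actions

print(comfort_actions(95, 70, 120, True))
# ['reduce speed', 'lower cabin temperature']
```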

However, not all use cases for passenger monitoring have to do with entertainment and comfort – some are far more critical. It is not hard to imagine a scenario where a driverless car pats itself on the back for having delivered you to the fair you wanted to visit outside of town, when it would have done a much better job taking you to a hospital because you were having a heart attack, or notifying your physician that your tachycardia is back.

The importance of transportation in healthcare was obvious to Uber. Uber Health offers to transport people to medical appointments – an on-demand service that could see some extensions in the smart-everything future we envision.

There are several ways a vehicle might learn your heart rate, respiration rate, and potentially other vital signs, such as blood pressure. For one, it could get this information from your wearables. But if you thought having electrodes in direct contact with your skin was the only viable way of estimating the heart’s activity, you’re in for a surprise [1]. Electrodes can be built into the fabric of the car seat and capacitively coupled, with no direct contact with the skin [2].
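Whatever the electrode arrangement, the downstream step is the same: find the heartbeats in the signal and turn beat-to-beat intervals into a rate. A minimal sketch of that step, assuming a clean, already-filtered signal and a crude threshold detector (real capacitive-ECG pipelines need serious filtering and motion-artifact rejection):

```python
# Hypothetical sketch: heart rate from a sampled ECG-like signal via
# naive R-peak detection. FS and the threshold are assumed values.
import math

FS = 250  # sampling rate in Hz (assumed)

def detect_r_peaks(samples, threshold=0.6, refractory=0.25):
    """Indices of local maxima above `threshold`, at least
    `refractory` seconds apart (a stand-in for QRS detection)."""
    peaks, last = [], -math.inf
    for i in range(1, len(samples) - 1):
        if (samples[i] > threshold
                and samples[i] >= samples[i - 1]
                and samples[i] >= samples[i + 1]
                and (i - last) / FS >= refractory):
            peaks.append(i)
            last = i
    return peaks

def heart_rate_bpm(samples):
    peaks = detect_r_peaks(samples)
    if len(peaks) < 2:
        return None  # not enough beats to estimate a rate
    rr = [(b - a) / FS for a, b in zip(peaks, peaks[1:])]  # RR intervals, s
    return 60.0 / (sum(rr) / len(rr))

# Synthetic signal: one spike every 0.8 s on a flat baseline -> 75 bpm.
sig = [1.0 if i % int(0.8 * FS) == 0 else 0.0 for i in range(FS * 10)]
print(round(heart_rate_bpm(sig)))  # 75
```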

There’s more: besides the bioelectrical effects measured by ECG, a beating heart produces thermal effects that can be captured by thermography [3], and mechanical effects – displacements of the body surface due to organ motion, which can be nicely seen by a mm-wave radar, and changes in superficial blood flow, which are the basis of photoplethysmographic imaging (PPGi). Just as different sensing modalities are used in conjunction to equip a vehicle with an understanding of its surroundings, sensor fusion may also be the way to go inside the cabin, for increased robustness and precision of health monitoring.
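One classic way to fuse independent estimates of the same quantity is inverse-variance weighting, where noisier sensors simply count for less. A sketch under the assumption that each modality reports a heart-rate estimate with a known noise variance (the numbers below are made up for illustration):

```python
# Hedged sketch of in-cabin sensor fusion: combine independent heart-rate
# estimates (e.g. capacitive ECG, mm-wave radar, PPGi) by inverse-variance
# weighting. The fused variance is smaller than any single sensor's.
def fuse(estimates):
    """estimates: list of (value_bpm, variance) pairs.
    Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# ECG says 72 bpm (tight), radar 78 bpm (noisy), PPGi 74 bpm (in between).
value, var = fuse([(72.0, 1.0), (78.0, 9.0), (74.0, 4.0)])
print(round(value, 1), round(var, 2))  # the fused value stays near the ECG
```

The same weighting falls out of a one-step Kalman update, which is also how the fused estimate could be tracked over time as new readings arrive.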


[Figure: Overview of physiological sources, effects, unobtrusive sensors, and obtainable vital signals. Illustration taken from Leonhardt et al. (2018).]


While we’re sitting tight for fully autonomous transportation, there is also vast room for driver monitoring systems: using heart and respiration rates and their variability to estimate stress levels, looking for signs of drowsiness and fatigue, or observing changes in the usual steering patterns. Such a system could warn the driver to take action and prevent unwanted events, or assess whether the driver is ready to take over control from the autopilot.
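The “variability” part is usually quantified with a short-term HRV metric such as RMSSD, the root mean square of successive differences between RR intervals. A minimal sketch, with illustrative interval data and no claim about clinical thresholds:

```python
# Hedged sketch: RMSSD, a standard short-term heart-rate-variability
# metric often used as a stress/fatigue cue. Input data is made up.
import math

def rmssd_ms(rr_intervals_ms):
    """RMSSD over a list of RR intervals given in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A varied rhythm vs. a nearly constant one: low variability is one of
# the signals monitoring systems associate with stress or drowsiness.
relaxed = [800, 850, 790, 860, 780, 855]  # RR intervals, ms
tense   = [800, 802, 801, 799, 800, 801]

print(rmssd_ms(relaxed) > rmssd_ms(tense))  # True
```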

And let’s not forget the crucial role of smart sensors in alerting us about a child or a pet we may have left behind.

[1] Leonhardt, S., Leicht, L. and Teichmann, D., 2018. Unobtrusive Vital Sign Monitoring in Automotive Environments—A Review. Sensors, 18(9), p.3080.

[2] Chamadiya, B., Heuer, S., Wagner, M. and Hofmann, U.G., 2011, January. Textile Capacitive Electrocardiography for an Automotive Environment. In BIODEVICES (pp. 422-425).

[3] Barbosa Pereira, C., Czaplik, M., Blazek, V., Leonhardt, S. and Teichmann, D., 2018. Monitoring of cardiorespiratory signals using thermal imaging: A pilot study on healthy human subjects. Sensors, 18(5), p.1541.