The end goal of fully autonomous driving is to require no human interaction at all. Today, the state of the art in production vehicles is Level 2, meaning the vehicle can control steering and speed but the driver must stay engaged, typically keeping their hands on the wheel. Hands on the wheel, however, do not guarantee eyes on the road, which opens up scenarios where the driver is distracted or preoccupied. Advanced automated driving systems therefore need to understand the driver's condition and awareness. In-cabin cameras pointed at the driver, combined with advanced driver monitoring algorithms, allow these systems to determine where the driver's gaze is directed, whether their eyes are open or closed, the position of their head, and whether they are using their phone. These solutions require sensors with high sensitivity to infrared (940 nm), small footprints for optimal placement in the cabin, and advanced global shutter efficiency to track eye and head movements.
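To illustrate the kind of eye-open/closed check a driver monitoring algorithm might perform, here is a minimal sketch using the eye aspect ratio (EAR), a common technique in drowsiness detection. This is an assumption for illustration only: the text does not name a specific algorithm, and the landmark ordering and threshold below are hypothetical choices, not part of any particular product.

```python
import math

def eye_aspect_ratio(landmarks):
    """Eye aspect ratio (EAR) from six (x, y) eye landmarks.

    Assumed landmark ordering p1..p6: p1 and p4 are the horizontal
    eye corners, (p2, p6) and (p3, p5) are the vertical pairs.
    EAR stays roughly constant while the eye is open and falls
    toward zero as the eyelid closes.
    """
    p1, p2, p3, p4, p5, p6 = landmarks
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Hypothetical threshold; a real system would calibrate this per
# driver, camera placement, and lens.
EAR_CLOSED_THRESHOLD = 0.2

def eye_is_closed(landmarks, threshold=EAR_CLOSED_THRESHOLD):
    """True if the EAR indicates a closed (or nearly closed) eye."""
    return eye_aspect_ratio(landmarks) < threshold

# Example: synthetic landmarks for an open and a closed eye.
open_eye = [(0, 0), (1, 1), (3, 1), (4, 0), (3, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (3, 0.1), (4, 0), (3, -0.1), (1, -0.1)]

print(eye_aspect_ratio(open_eye))    # 0.5 for this synthetic open eye
print(eye_is_closed(closed_eye))     # True
```

In practice the landmarks would come from a face/landmark detector running on the infrared in-cabin camera stream, and the closed-eye signal would be smoothed over consecutive frames before triggering a driver alert.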
Almost all Advanced Driver Assistance Systems (ADAS), both today and in the foreseeable future, rely primarily on machine vision to drive their decision-making.