Here's what we emailed September 22, 2021.
Artificial intelligence has been integral to the introduction of semi-autonomous vehicles (such as Tesla's). But how do AI-enabled cameras fare when evaluating human drivers themselves?
A group of German researchers developed an algorithm that can tell when someone's attention is taken away from driving, such as looking at a smartphone or turning to children in the car. The team hopes their work will contribute to safety improvements, especially as an E.U. mandate goes into effect next year requiring automakers to install advanced safety systems that monitor for inattention or drowsiness and warn the driver of detected distractions.
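The researchers' actual model isn't described here, but the basic rule such systems enforce can be sketched in a few lines: fire a warning once the driver's gaze has been off the road longer than some threshold. Everything below (the `GazeSample` type, the 2-second threshold) is an illustrative assumption, not the published algorithm.

```python
# Illustrative sketch of a driver-monitoring rule, NOT the researchers'
# actual algorithm: warn when gaze stays off the road past a threshold.
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp: float   # seconds since trip start
    on_road: bool      # camera's judgment: is the gaze on the road?

def distraction_warnings(samples, max_off_road_s=2.0):
    """Return timestamps at which a warning would fire (once per lapse)."""
    warnings = []
    off_since = None
    for s in samples:
        if s.on_road:
            off_since = None           # gaze returned; reset the lapse timer
        else:
            if off_since is None:
                off_since = s.timestamp
            elif s.timestamp - off_since >= max_off_road_s and (
                not warnings or warnings[-1] < off_since
            ):
                warnings.append(s.timestamp)  # fire once per lapse
    return warnings
```

A real system would feed this from a gaze-estimation network rather than boolean samples, but the thresholding step at the end looks much the same.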
In the U.S., three senators proposed a similar approach that would require driver monitoring in all new vehicles within six years. The proposal pointed to the 3,000 lives lost to distracted driving in 2019 and follows growing concerns over the safety of autonomous driving systems. In fact, last month federal regulators began investigating Tesla's Autopilot system after 11 incidents in which Tesla vehicles failed to notice the flashing lights of fire engines, ambulances, and police cars ahead of them.
Meanwhile, Amazon installed AI cameras to monitor its drivers at the start of the year. While the company touts a decrease in a number of safety issues (including a 77% improvement in stop-sign violations and a 75% decrease in distracted driving), drivers claim the cameras unfairly punish them for circumstances beyond their control. How?
- Many drivers claim the AI cameras incorrectly penalize them whenever another car cuts them off, triggering a robotic voice telling them to “maintain safe distance.”
- Some say they’re being punished for looking at their side mirrors to check for traffic as the system considers these actions “distracted.”
- There are also drivers who now stop twice at stop signs after being penalized for stopping past the sign at blind intersections. The first stop is for the camera to record; the second, after inching forward, lets them actually see oncoming traffic before proceeding.
All these penalties contribute to poor score reports for drivers, keeping them from bonuses and other productivity rewards. The delivery companies that employ the drivers are also unhappy, saying the inaccurately low scores allow Amazon to pay them less, reducing funds needed for vehicle repairs, consumables, damages, support staff, and more. They see the cameras as a way for Amazon to save money by not paying drivers.
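The drivers' complaint, in other words, is about how flagged events roll up into a score that gates pay. Amazon and Netradyne haven't published their scoring formula, so the sketch below is purely hypothetical: each event type carries an assumed penalty weight, and a score below an assumed cutoff costs the driver their bonus.

```python
# Hypothetical event-penalty scoring, for illustration only. The event
# names, weights, base score, and cutoff are all assumptions; the real
# Amazon/Netradyne scoring formula is not public.
PENALTIES = {
    "following_distance": 3,  # e.g., another car cutting the driver off
    "distraction": 5,         # e.g., a glance at the side mirror
    "stop_sign": 5,           # e.g., stopping past the line at a blind corner
}

def driver_score(events, base=100):
    """Subtract a weight per flagged event; unknown events cost nothing."""
    return max(0, base - sum(PENALTIES.get(e, 0) for e in events))

def bonus_eligible(events, cutoff=90):
    """A driver below the cutoff loses bonus eligibility."""
    return driver_score(events) >= cutoff
```

Under this toy model, just two false flags in a scoring period would put a driver at the cutoff, which is the dynamic the drivers and delivery companies describe.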
As of last month, half of Amazon's national fleet had these cameras, which are made by Netradyne, a technology company on a “mission to transform road and fleet safety by using advanced vision technology to change the way drivers interact with the road around them.” Global industrial services provider Savage announced its use of Netradyne's products just this week, with plans to install them in more than a thousand of its medium- and heavy-duty trucks. Savage hopes the cameras will help it identify risks of distracted or drowsy driving and other potential external hazards.
🎬 Take Action
Want to better understand artificial intelligence and its application in cameras? This 14-minute read walks you through AI cameras and what exactly they do.
- Vice (Where we found this story) 2 days old | 20 minutes long
- E&T AI camera research from Germany 3 weeks old | 4 minutes long
- Smart Eye Upcoming E.U. driver monitoring 1 year old | 9 minutes long
- Ars Technica Proposed U.S. driver monitoring 5 months old | 5 minutes long
- The Conversation Federal investigation into Tesla 1 month old | 8 minutes long
- Trucking Info Savage purchases Netradyne 2 days old | 2 minutes long
```
          __
      -- ~( @\   \
  --- _________]_[__/_>________
     /  ____ \ <>  |  ____  \
    =\_/ __ \_\_______|_/ __ \__D
  ________(__)_____________(__)____
```
Stop honking at me!
It’s not me, it’s the camera!
Art Credit: ASCII Art Archive