Safety Culture in the News

With some interesting new tidbits, the board's conclusions on the cause of the accident matched my predictions: they did not place blame on the many faults in Uber's software, bad as they were, because the system was designed with the expectation that such faults would exist, and had a human safety driver there to supervise and take over when they occurred. Uber did a terrible job of designing its safety culture and of hiring, training, and monitoring its safety drivers, and that night the driver was watching a video on her phone instead of the road, so she did not take control in a situation where she easily could have, with a tragic result.

One notable thing in the findings was a focus on "automation complacency": the situation where humans quickly become bored with automated systems that need only rare attention, and then become poor at paying that attention when it is finally needed. Uber didn't account for that at all. Some teams use a camera to monitor the driver's gaze. Teslas require their drivers to keep tweaking the wheel to show they are present. We can expect regulators to follow the NTSB and start pushing for better countermeasures to complacency during both testing and driver-assist operation.
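To make the countermeasure idea concrete, here is a minimal Python sketch of the kind of gaze-based watchdog some teams use. The class name, thresholds, and alert stages are all invented for illustration; nothing here reflects any particular team's actual system.

```python
import time

# Hypothetical thresholds; real systems tune these empirically.
EYES_OFF_WARN_S = 2.0      # gentle chime after 2 s of gaze off the road
EYES_OFF_ESCALATE_S = 5.0  # loud alert / intervention prep after 5 s


class GazeMonitor:
    """Minimal sketch of a camera-based driver-attention watchdog."""

    def __init__(self):
        self.off_road_since = None  # timestamp when gaze left the road

    def update(self, gaze_on_road: bool, now: float) -> str:
        # Gaze back on the road resets the watchdog.
        if gaze_on_road:
            self.off_road_since = None
            return "ok"
        # Start timing the first moment gaze leaves the road.
        if self.off_road_since is None:
            self.off_road_since = now
        elapsed = now - self.off_road_since
        if elapsed >= EYES_OFF_ESCALATE_S:
            return "escalate"  # e.g. loud alarm, log the event, slow the car
        if elapsed >= EYES_OFF_WARN_S:
            return "warn"      # e.g. chime, dashboard flash
        return "ok"


# Example: a driver who looks away for six seconds trips both alert levels.
monitor = GazeMonitor()
t0 = time.time()
for second, on_road in enumerate([True, False, False, False, False, False, False]):
    print(second, monitor.update(on_road, t0 + second))
```

The point of the escalating stages is exactly the complacency problem the NTSB flagged: a single quiet reminder is easy to tune out, so an effective monitor has to get progressively harder to ignore the longer attention lapses.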