After the Tesla accident, an argument for LIDAR
So it has happened. The first traffic death caused by a commercially available autonomous vehicle has occurred. Well, actually a semi-autonomous vehicle, but you get the picture. Joshua Brown died May 7 when the Tesla Model S in “Autopilot” mode failed to correctly identify a tractor trailer that had crossed into his path.
Tesla ships its new Autopilot feature turned off by default; to enable it, drivers must acknowledge that the feature is still in beta and agree to keep their hands on the steering wheel and stay alert at all times. There is some evidence that Mr. Brown was not paying attention to the road ahead, but for this discussion that doesn't matter. What matters is that an autonomous vehicle broke two of Isaac Asimov's three laws of robotics: the robot failed to protect its human occupant, and the robot failed to protect its own existence.
So what went wrong? Read the full blog by Allen Nogee on the Strategies Unlimited website.
Related article: LIDAR nears ubiquity as miniature systems proliferate, by senior editor Gail Overton