When Tesla first released their Autopilot feature, skeptics figured it was just a matter of time before it got someone killed.
Plenty of Autopilot "saves" (where the software correctly detected a threat and acted to prevent an accident) can be found on YouTube. But a video clip that you're not going to find is of the one poor soul whose Model S drove underneath a semi-trailer in Florida, apparently at highway speed. The Tesla intersected the trailer at a right angle, and the bottom of the trailer was level with the windshield. You can do the math. The lone driver—or is that "passenger"—did not survive. Reuters reports that the driver was watching a "Harry Potter" DVD at the time of the crash.
While the incident is all over the news this morning, it did not happen this week, nor even this month; the fatal accident occurred on May 7th. However, it was only yesterday that Tesla released a statement. If you're wondering why the delay, it's likely because the police and ambulance crews responding to the crash had no way of knowing what mode the car was in; Tesla, following policy, alerted the National Highway Traffic Safety Administration after the wreck, and only now has the fact that the car was in Autopilot mode come to light.
Tesla's statement opens with numbers arguing that Autopilot is still, statistically speaking, safer than taking your own chances behind the wheel.
We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
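Put on a common scale, those figures work out to roughly 0.8 fatalities per 100 million miles with Autopilot engaged, versus about 1.1 for US driving overall and 1.7 worldwide. Here's a quick back-of-envelope sketch of that normalization (Python used purely for illustration; a single Autopilot fatality is far too small a sample to support firm statistical conclusions):

```python
# Normalize the fatality rates Tesla cites to a common denominator
# (fatalities per 100 million miles). Back-of-envelope only.
rates_miles_per_fatality = {
    "Autopilot engaged": 130e6,   # 1 fatality in just over 130 million miles
    "US average": 94e6,           # 1 fatality per 94 million miles
    "Worldwide average": 60e6,    # 1 fatality per ~60 million miles
}

for label, miles in rates_miles_per_fatality.items():
    per_100m = 100e6 / miles
    print(f"{label}: {per_100m:.2f} fatalities per 100 million miles")
```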
Here's Tesla's presentation of the known details of the crash:
What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
Tesla then points out that Autopilot is still technically in Beta, and mentions the intended role of the driver within their system:
It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot "is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle" while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to "Always keep your hands on the wheel. Be prepared to take over at any time." The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.
Interestingly, the crash victim was apparently someone known to Tesla and a proponent of electric vehicles:
The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla's mission. We would like to extend our deepest sympathies to his family and friends.
The incident will undoubtedly spark debate about self-driving cars, and about weighing Autopilot's utility against its risks. By now you've probably seen the video of a random Tesla driver sleeping behind the wheel in traffic:
How could he possibly have reacted if the car had been traveling at a higher speed and something had gone wrong? Human attention isn't like a light switch and can take a moment or two to come online; is Autopilot a good idea when accidents can occur in the blink of an eye?
Lastly, can you take comfort in the numbers showing that there's a greater chance of being killed without Autopilot? Or is the thought of being killed due to machine error rather than your own too frightening for numbers to sway you?
Comments
If this technology is supposed to assist a human driver, then it should be activated differently. Despite the legal warnings, the way the system is designed signals to the user that it is there to replace the human driver. The system should activate in those short moments when the driver can't hold the steering wheel, take control of the car, and alert the driver to return to the steering wheel. Now that would be assisting. There are many road fatalities caused by people trying to feed their babies, etc., while driving.
I agree with Ian's comments. Humans are limited to the visible spectrum, but infrared detectors would surely have sensed this truck and avoided the accident. Hopefully, lesson learned.
I'm wondering why self-driving cars so far don't seem to use sensors that are better than human eyes. Infrared could sense animals in the dark, and lidar or radar should have helped see this semi trailer. If you look back at the DARPA self-driving car challenge, almost all of the cars had a host of sensors whose readings they fused into a whole picture of what was around the car. Maybe that turned out to be too hard? Also, not slowing down when you can't see well, as either a human or a self-driving car, doesn't sound too smart. Maybe Tesla can start detecting lighting situations the system can't handle, turn over control to the human AND warn them that visibility is low and potentially dangerous?
I don't think the unfortunate passing of this person can be said to have been caused by "machine error". The driver could have stopped the car or swerved. The car wasn't overriding driver input (see the Toyota acceleration recall).
I call it "machine error" because the Autopilot system is supposed to apply the brakes when it senses an imminent impact, but did not. As you say, the car was not overriding driver input--because the driver was not providing any input.
Now that it's come to light that he was watching a DVD at the time of the crash, blame will surely be attributed to him. But the fact is that he can't be the only Tesla owner who's abused the technology, just the unluckiest. And I think we all know he won't be the last to abuse the technology.