First Fatality Of Tesla’s ‘Autopilot’ System
News has broken of the first fatality in a crash involving a car in self-driving mode. The victim, the 40-year-old owner of a technology company, nicknamed his car “Tessy” and had praised its sophisticated “Autopilot” system just one month earlier for preventing a collision on a U.S. interstate. The government said it is investigating the design and performance of the system aboard the Tesla Model S.
The car’s cameras failed to distinguish the white side of a turning articulated lorry from the bright sky and didn’t automatically activate its brakes, according to government records obtained after the crash.
Frank Baressi, 62, the driver of the truck, said the Tesla driver was “playing Harry Potter on the TV screen” at the time of the crash and driving so quickly that “he went so fast through my trailer I didn’t see him.” He acknowledged he couldn’t actually see the film playing, only hear it. Tesla have responded by saying that it is impossible to watch videos on the Model S’s touch screen.
Tesla also stressed that the system is new and still being refined, noting that drivers must manually enable it: “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.”
The company said this was the first known death in over 130 million miles of Autopilot operation, and that the National Highway Traffic Safety Administration (NHTSA) investigation is a preliminary inquiry to determine whether the system worked as expected.
Before Autopilot can be used, drivers have to acknowledge that the system is an “assist feature” that still requires a driver to keep both hands on the wheel at all times. Drivers are told they need to “maintain control and responsibility for your vehicle” while using the system, and they have to be prepared to take over at any time. Autopilot makes frequent checks to ensure the driver’s hands are on the wheel; if hands aren’t detected, it gives visual and audible alerts, and it gradually slows the car until the driver responds.
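The escalation described above (alerts first, then gradually slowing the car) can be pictured as a simple tiered policy. The sketch below is purely illustrative: the function name and every threshold are assumptions for the sake of the example, not Tesla’s actual values, which have not been published.

```python
# Illustrative sketch of a tiered hands-on-wheel escalation policy.
# All thresholds are hypothetical; Tesla's real values are not public.

VISUAL_ALERT_AFTER = 10.0   # seconds without detected hands -> visual warning
AUDIBLE_ALERT_AFTER = 20.0  # -> add an audible chime
SLOW_DOWN_AFTER = 30.0      # -> begin gradually reducing speed

def escalation_action(seconds_hands_off: float) -> str:
    """Return the driver-monitoring action for a given hands-off duration."""
    if seconds_hands_off >= SLOW_DOWN_AFTER:
        return "slow_down"       # gradually slow until the driver responds
    if seconds_hands_off >= AUDIBLE_ALERT_AFTER:
        return "audible_alert"
    if seconds_hands_off >= VISUAL_ALERT_AFTER:
        return "visual_alert"
    return "none"                # hands detected recently; no action needed
```

The key design point is that the response escalates rather than cutting the system off abruptly, which would be more dangerous than a gradual slowdown.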
The Autopilot mode allows the Model S saloon and the coming Model X SUV to steer itself within a lane, change lanes and speed up or slow down based on surrounding traffic or the driver’s set speed. It can automatically apply brakes and slow the vehicle. It can also scan for parking spaces and parallel park on command, as well as be “summoned” out of a parking space while the owner waits by the car. Self-driving cars have been expected to be a boon to safety because they’ll eliminate human error, which is responsible for about 94 percent of crashes.
Tesla founder Elon Musk has been bullish about Autopilot, even as Tesla warns owners that the feature is not for all conditions and not sophisticated enough for the driver to stop paying attention. He said the feature reduced the probability of having an accident by 50%, without detailing his calculations. In January, he said that Autopilot is “probably better than a person right now.”
One of Tesla’s advantages over competitors is that its thousands of cars feed real-world performance information back to the company, which can then fine-tune the software that runs Autopilot. You have to think of Tesla as a software company that makes cars.
This is not the first time automatic braking systems have malfunctioned, and several have been recalled to fix problems. In November, for instance, Toyota had to recall 31,000 Lexus and Toyota vehicles because the automatic braking system radar mistook steel joints or plates in the road for an object ahead and slammed on the brakes. Also last autumn, Ford recalled 37,000 F-150 pickups in the U.S. because they braked with nothing in the way. The company said the radar could become confused when passing a large, reflective truck.
Shares of Tesla Motors Inc. fell 3.2% after the incident was reported. The technology is at the cutting edge of machine learning, connectivity and mapping data, and Tesla have been touting their advanced safety technology for years. This situation will be a huge setback for both the company and the technology.
Tesla have to repair this almost irreparable damage in two ways. First, they need to make sure their customers fully understand that Autopilot is meant to assist drivers, not to take over for them entirely. Second, the company should update the cars’ software so that Autopilot turns off if it senses the driver’s hands have been off the wheel for a certain period of time. Mercedes-Benz’s driver-assist system in the new S-Class, by comparison, only allows the driver to remove their hands from the wheel for 15 seconds.
Unfortunately, Tesla seem quick to point the finger at the deceased man. How many more deaths are we going to hear about before Tesla stops treating their technology as perfect?