Tesla confirms 'Autopilot' was engaged during fatal crash
Autopilot is still far from a completely autonomous driving system, which would not require any involvement by a human.
Autopilot is classified as Level 2 driver assistance on the industry scale of driving automation, where Level 5 denotes full autonomy, something once confined to futuristic cartoons but which has moved closer to reality.
A Tesla Model X – the latest model – collided with a highway barrier near the town of Mountain View in California on March 23, catching fire before two other cars struck it.
The driver was identified by The Mercury News as a 38-year-old man, Wei Huang, an engineer for Apple. He later died in hospital.
Tesla issued a blog post late Friday saying the driver had activated Autopilot but ignored several warnings.
“In the moments before the collision… Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” Tesla said.
“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.
“The driver had about five seconds and 150 meters (164 yards) of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
Tesla added that the car sustained such severe damage because the highway barrier it struck “had been crushed in a prior accident without being replaced”.
“We have never seen this level of damage to a Model X in any other crash,” it said.
The company, led by Elon Musk, sought to downplay fears over its technology.
“Over a year ago, our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent,” it said.
Pedestrian killed
In January last year, the US Transportation Department closed an investigation into the fatal 2016 crash in Florida of a Tesla Model S on Autopilot, finding that no “safety-related defect” had caused that accident, the first of its kind.
The latest fatal Tesla crash came the same week as a collision involving an autonomous Uber vehicle in Arizona that killed a pedestrian and prompted that company to temporarily halt its self-driving car program.
Circumstances of the two crashes are different: Tesla’s Autopilot is a driver assistance feature, while the Uber vehicle was designed to operate autonomously but with a driver behind the wheel to correct mistakes.
Dashcam footage released by police showed that the Uber vehicle’s safety operator appeared to be distracted in the seconds before the car hit the woman.
The nonprofit group Consumer Watchdog has argued that autonomous vehicles are not ready for roads and the public should not be put at risk to test such technology.
After the Uber accident, Democratic Senator Richard Blumenthal said “autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers.”
Uber and Tesla are rivals in the multibillion-dollar race to develop vehicles that will eventually require no driver intervention.
Among other contenders, General Motors has asked for permission to begin testing a car with no steering wheel on public roads next year, while Waymo, owned by Google parent Alphabet, is intensifying its own self-driving efforts.
Although the final, fifth level of autonomous driving remains distant, chipmaker NVIDIA unveiled an artificial intelligence computing platform several months ago aimed at making it possible.
The system can perform 320 trillion operations a second, with the goal of piloting a vehicle entirely without input from its occupants.
California-based NVIDIA supplied some of the technology in the Uber car that crashed in Arizona, and the chipmaker has suspended its own road tests pending more information about the incident.