Tesla seeks to master autonomous driving someday through “fleet learning,” an approach that includes wirelessly uploading vast amounts of images and sensor data from owners’ vehicles. A fatal Model X crash suggests that using that data to log potential road hazards and train its vehicles to avoid them would be a better near-term goal.
Tesla owner Walter Huang was killed when his vehicle struck a concrete traffic lane divider on March 23 while driving on U.S. 101 in Mountain View, California, in the heart of Silicon Valley. He was using Tesla’s semi-automated Autopilot system at the time, which didn’t recognize that it had put him on a collision course. Though federal safety investigators haven’t yet said whether Huang, Autopilot or both were at fault, Tesla has identified a contributing factor.
“The reason this crash was so severe is that the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced,” Tesla said in a March 27 blog post. “Our data shows [sic] that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of.”
Exactly when the lane barrier lost its extended metal attenuator is unclear. (The California Department of Transportation didn’t return multiple calls to clarify that.) A photo Tesla posted shows the attenuator was damaged at least a day before the accident, and unintentionally raises a potential limitation for a company that pioneered vehicle connectivity and wireless updates.
All the image and sensor data Tesla collects for R&D doesn’t appear to log physical hazards (like damaged lane barriers) and alert drivers – something Alphabet’s crowd-sourced Waze navigation app does surprisingly well – nor does it yet appear to train Autopilot to steer clear of such hazards.
“It’s definitely within their capability,” said Mike Ramsey, an analyst for Gartner Research. “If machine-learning is to be implemented to improve autonomous driving, you’d anticipate that this is the kind of thing that would be recognized as important.”
Intel’s Mobileye uses camera-based technology, its Road Experience Management system, to collect and share such data with other vehicles equipped with REM chips, Ramsey said. A Tesla spokesperson confirmed that alerts related to physical road hazards aren’t integrated into its navigation system at this time, without elaborating.
Waze users are encouraged to manually log a range of road issues through the app, including collisions, road closures and miscellaneous hazards including potholes, objects on the road, roadkill, broken traffic signals and vehicles parked by the side of the road. App users nearby are then alerted to those potential hazards.
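The mechanism the article describes – drivers log a hazard at a location, and the service alerts other drivers approaching that spot – can be sketched in a few lines. This is a hypothetical illustration, not Waze’s or Tesla’s actual implementation; the class and function names, coordinates, and the 1 km alert radius are all assumptions.

```python
import math
from dataclasses import dataclass, field

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

@dataclass
class Hazard:
    kind: str   # e.g. "pothole", "object on road", "damaged attenuator"
    lat: float
    lon: float

@dataclass
class HazardLog:
    """Minimal crowd-sourced hazard log: report hazards, query nearby ones."""
    hazards: list = field(default_factory=list)

    def report(self, kind, lat, lon):
        self.hazards.append(Hazard(kind, lat, lon))

    def alerts_near(self, lat, lon, radius_km=1.0):
        """Return hazards within radius_km of a driver's position."""
        return [h for h in self.hazards
                if haversine_km(lat, lon, h.lat, h.lon) <= radius_km]

# One driver reports a crushed crash attenuator (illustrative coordinates
# near the US-101 gore point); a second driver approaching the same spot
# is alerted, while a driver miles away is not.
log = HazardLog()
log.report("damaged attenuator", 37.4108, -122.0770)
print([h.kind for h in log.alerts_near(37.4115, -122.0765)])  # ~100 m away
print([h.kind for h in log.alerts_near(37.7749, -122.4194)])  # San Francisco
```

A production system would add report expiry, de-duplication of repeated reports, and confidence scoring, but the core loop – collect, geo-index, alert nearby – is this simple.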
“Phones are more advanced and there is probably no more advanced system than Waze’s,” Ramsey said. “For carmakers, certainly, the cost to upload and analyze traffic data and then push it back to the head unit isn’t insignificant.”
Since the accident, Tesla customers have even tried to replicate the circumstances of Huang’s crash, including a Model S owner who drove the same stretch of the 101 on April 2. In a video posted to YouTube and Reddit, he claimed his vehicle also headed straight for the barrier and would have hit it had he not retaken control from Autopilot.
Tesla fansite Electrek has gushed about how the company “opened the floodgates” for data collection to improve its self-driving system, highlighting a big increase in pulling in video images. And in its fourth-quarter results letter to shareholders, Tesla boasted that it had made “a step-change improvement in the collection and analysis of data” and machine-learning capabilities.
“Our neural net, which expands as our customer fleet grows, is able to collect and analyze more high-quality data than ever before, enabling us to roll out a series of new Autopilot features in 2018 and beyond,” the company said.
In its March 27 blog about the crash, Tesla said, “there are over 200 successful Autopilot trips per day on this exact stretch of road.” Yet that comment doesn’t specify how many of those trips were in the exact lane Huang was using, or how often drivers in that lane had to retake control of the wheel.
It’s possible that Tesla is close to making meaningful use of all the data it’s been harvesting. But like so many of the company’s claims, the distance between its lofty ambitions and reality remains extremely hard to gauge.
More Info: www.forbes.com