When MIT turns its back on Elon Musk and calls Tesla unsafe, maybe it's time to start rethinking exactly how technologically profound Musk and his company truly are.
A new study out of MIT has “confirmed how unsafe” Tesla’s Autopilot feature actually is, according to analysis from Screenshot Media and a study called “A model for naturalistic glance behavior around Tesla Autopilot disengagements”.
The study suggests, in not so many words, that Full Self Driving is not as safe as Tesla claims. It followed Tesla Model S and Model X owners “during their daily routine for periods of a year or more” and found that drivers become inattentive when using partially automated driving systems.
The study itself concluded: “Visual behavior patterns change before and after [Autopilot] disengagement. Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead.”
TechCrunch wrote about the study: “The researchers found this type of behavior may be the result of misunderstanding what the [autopilot] feature can do and what its limitations are, which is reinforced when it performs well. Drivers whose tasks are automated for them may naturally become bored after attempting to sustain visual and physical alertness, which researchers say only creates further inattentiveness.”
TechCrunch also wrote about how the data was collected for the study:
The vehicles were equipped with the Real-time Intelligent Driving Environment Recording data acquisition system, which continuously collects data from the CAN bus, a GPS and three 720p video cameras. These sensors provide information like vehicle kinematics, driver interaction with the vehicle controllers, mileage, location and driver’s posture, face and the view in front of the vehicle. MIT collected nearly 500,000 miles’ worth of data.
At the same time this week, Tesla has stiff-armed regulators and their ongoing investigations and decided to roll out its Full Self Driving 10.0.1 beta.
Recall, last month we reported that regulators at the NHTSA in the United States had finally come to their senses and opened the long-overdue investigation. The NHTSA said the investigation includes Tesla’s Model X, S and 3 for model years 2014-2021. The broad range of models and model years means that this could be the large-scale investigation that skeptics have been requesting for years, we noted.
The NHTSA said the investigation would assess the technologies and methods “used to monitor, assist, and enforce the driver’s engagement” during Autopilot operation, according to Bloomberg.
Just days ago, we noted that the NTSB had urged the company to address the feature’s safety issues before expanding it further. Jennifer Homendy, chairwoman of the National Transportation Safety Board, said: “Basic safety issues have to be addressed before they then expand it to other city streets and other areas.”
Homendy argued that the term ‘full self-driving’ was “misleading and irresponsible”.
The full study can be read here.