The federal government’s top auto safety agency has significantly expanded its investigation of Tesla and its Autopilot driver-assistance system to determine whether the technology poses a safety risk.

The agency, the National Highway Traffic Safety Administration, said Thursday that it was upgrading its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny that is required before a recall can be ordered.

The analysis will examine whether Autopilot fails to prevent drivers from diverting their attention from the road or engaging in other predictable and risky behavior while using the system.
“We’ve been calling for closer scrutiny of Autopilot for some time,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates state efforts to promote safe driving.

NHTSA said it was aware of 35 crashes that occurred while Autopilot was activated, nine of which resulted in the deaths of 14 people. But the agency said Thursday that it had not determined whether Autopilot has a defect that can cause cars to crash while it is engaged.
The wider investigation covers 830,000 vehicles sold in the United States, including all four Tesla models — the S, X, 3 and Y — from the 2014 through 2021 model years. The agency will look at Autopilot and its various component systems that handle steering, braking and other driving tasks, as well as a more advanced system that Tesla calls Full Self-Driving.
Tesla did not respond to requests for comment on the agency’s move.
The preliminary evaluation focused on 11 crashes in which Tesla cars operating under Autopilot control struck parked emergency vehicles that had their lights flashing. In the course of that review, NHTSA said Thursday, it became aware of 191 crashes — not limited to those involving emergency vehicles — that warranted closer investigation. They occurred while the cars were operating under Autopilot, Full Self-Driving or related features, the agency said.

Tesla says Full Self-Driving software can guide a car on city streets but does not make it fully autonomous, and drivers must remain attentive. It is also available only to a limited set of customers in what Tesla calls a “beta,” or test, version that is not fully developed.
The deepening of the investigation signals that NHTSA is more seriously considering whether a lack of safeguards to prevent drivers from using Autopilot in dangerous ways amounts to a safety concern.

“This isn’t your typical defect case,” said Michael Brooks, acting executive director of the Center for Auto Safety, a nonprofit consumer advocacy group. “They are actively looking for a problem that can be fixed, and they’re looking at driver behavior, and the problem may not be a component in the vehicle.”
Tesla and its chief executive, Elon Musk, have been criticized for promoting Autopilot and Full Self-Driving in ways that suggest the systems are capable of piloting a car without input from the driver.

“At a minimum, they should be renamed,” said Mr. Adkins of the Governors Highway Safety Association. “Those names confuse people into thinking the systems can do more than they are actually capable of.”
Competing systems developed by General Motors and Ford Motor use infrared cameras that closely track the driver’s eyes and sound warning chimes if the driver looks away from the road for more than a couple of seconds. Tesla did not initially include such a driver monitoring system in its cars, and later added only a standard camera, which is much less precise than infrared cameras at eye tracking.

Tesla tells drivers to use Autopilot only on divided highways, but the system can be activated on any road that has a center line. The GM and Ford systems — known as Super Cruise and BlueCruise — can be activated only on highways.
Autopilot was first offered on Tesla models in late 2015. It uses cameras and other sensors to steer, accelerate and brake with little input from the driver. The owner’s manual tells drivers to keep their hands on the steering wheel and their eyes on the road, but early versions of the system allowed drivers to keep their hands off the wheel for five minutes or more under certain conditions.
Unlike technologists at almost every other company working on self-driving vehicles, Mr. Musk has argued that autonomy can be achieved with cameras alone, simply by tracking a car’s surroundings. But many Tesla engineers have questioned whether relying on cameras without other sensing devices is safe enough.

Mr. Musk has regularly promoted Autopilot’s abilities, calling autonomous driving a “solved problem” and predicting that drivers will soon be able to sleep while their cars drive themselves.
Questions about the system arose in 2016 when a man from Ohio was killed after his Model S crashed into a tractor-trailer on a Florida highway while Autopilot was activated. NHTSA investigated that crash and said in 2017 that it had found no safety defect in Autopilot.
But the agency issued a bulletin in 2016 saying that driver-assistance systems that fail to keep drivers engaged “may pose an unreasonable risk to safety.” And in a separate investigation, the National Transportation Safety Board concluded that the Autopilot system had “played a major role” in the Florida crash because, while it worked as intended, it lacked safeguards to prevent misuse.

Tesla is facing lawsuits from the families of victims of fatal crashes, and some customers have sued the company over its claims about Autopilot and Full Self-Driving.

Last year, Mr. Musk acknowledged that developing self-driving cars was more difficult than he had thought.
NHTSA opened its preliminary evaluation of Autopilot in August and initially focused on 11 crashes in which Teslas operating with Autopilot engaged ran into police cars, fire trucks and other emergency vehicles that had stopped with their lights flashing. Those crashes resulted in one death and 17 injuries.

While examining those crashes, the agency discovered six more involving emergency vehicles and eliminated one of the original 11 from further study.
At the same time, the agency learned of dozens more crashes that occurred while Autopilot was active and that did not involve emergency vehicles. Of those, it first focused on 191, and eliminated 85 from further scrutiny because it could not obtain enough information to get a clear picture of whether Autopilot was a major cause.

In about half of the remaining 106 crashes, NHTSA found evidence suggesting that the driver did not pay sufficient attention to the road. About a quarter of the 106 occurred on roads where Autopilot is not supposed to be used.
In an engineering analysis, NHTSA’s Office of Defects Investigation sometimes acquires the vehicles it is examining and arranges testing to try to identify flaws and replicate the problems they can cause. In past cases it has taken components apart to find faults and has asked manufacturers for detailed data on how components operate, often including proprietary information.

The process can take months, or even a year or more; NHTSA aims to complete the analysis within a year. If it concludes that a safety defect exists, it can press the manufacturer to initiate a recall and correct the problem.

On rare occasions, automakers have contested the agency’s conclusions in court and prevailed in halting recalls.