830,000 Teslas with Autopilot under NHTSA Investigation, Recall Possible

  • NHTSA opened a probe into Tesla's Autopilot software last year, then asked for more information, and is now expanding its investigation to an engineering analysis, which could lead to a recall.
  • The problem under investigation is how Tesla's driver-assistance software identifies potential incidents with stopped first responder vehicles, as well as how the cars alert drivers to these hazards.
  • Over 800,000 vehicles are potentially affected, including Model S vehicles built between 2014 and 2021, Model X (2015–2021), Model 3 (2018–2021), and Model Y (2020–2021).

    The National Highway Traffic Safety Administration (NHTSA) will take a deeper look into how Tesla vehicles equipped with the so-called Autopilot driver-assistance software navigate when interacting with first responder vehicles at the scene of a collision. NHTSA said this week that it is upgrading the preliminary evaluation it started last August into an engineering analysis, which is the next step toward a possible recall of hundreds of thousands of Tesla vehicles.

    NHTSA said in its notice that it was motivated to upgrade the status of the investigation because of "an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes."

    What Level 2 Means

    NHTSA said that Tesla itself characterizes Autopilot as "an SAE Level 2 driving automation system designed to support and assist the driver," and many automakers use some sort of Level 2 system in their new vehicles. In fact, as part of its probe last fall, NHTSA asked Tesla and a dozen other automakers for information on how their Level 2 systems operate.



    Based on public information available so far, NHTSA is now interested only in understanding Tesla Autopilot's performance. NHTSA followed up its August information request with a request for more information last October, specifically about how Tesla makes changes to Autopilot using over-the-air updates, as well as the way Tesla requires non-disclosure agreements with owners whose vehicles are part of Tesla's so-called Full Self-Driving (FSD) "beta" release program. Despite the name, FSD is not actually capable of driving the car on its own.

    In a public update on its probe, NHTSA laid out its case for why Autopilot needs to be investigated. NHTSA said it has so far investigated 16 crashes and found that Autopilot aborted its own vehicle control, on average, "less than one second prior to the first impact," even though video of these events showed that the driver should have been made aware of a potential incident an average of eight seconds before impact. NHTSA found that most of the drivers had their hands on the wheel (as Autopilot requires) but that the vehicles did not alert drivers to take evasive action in time.

    100 Other Crashes to Get a Second Look

    NHTSA is also reviewing more than 100 other crashes that happened with Teslas using Autopilot but that did not involve first responder vehicles. Its preliminary review of these incidents shows that in many cases, the driver was "insufficiently responsive to the needs of the dynamic driving task." This is why NHTSA will use its investigation to assess "the technologies and methods [Tesla uses] to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation."

    A total of 830,000 Tesla vehicles are part of the upgraded investigation. That covers all of Tesla's current models: Model S vehicles built between 2014 and 2021, Model X (2015–2021), Model 3 (2018–2021), and Model Y (2020–2021). NHTSA's documents say it is aware of 15 injuries and one fatality related to the Autopilot first responder problem.

    Sen. Ed Markey of Massachusetts tweeted that he's glad NHTSA is escalating its probe, because "every day that Tesla disregards safety rules and misleads the public about its 'Autopilot' system, our roads become more dangerous."

    Tesla CEO Elon Musk is still touting the benefits of Full Self-Driving (FSD) and announced the expansion of the latest beta software to 100,000 cars earlier this month on Twitter. He claimed that the new update will be able to "handle roads with no map data at all" and that "within a few months, FSD should be able to drive to a GPS point with zero map data."



    The Autopilot investigation is separate from another recent move by NHTSA to request more information from Tesla about "phantom braking" caused by the company's automated emergency braking (AEB) systems. The company has until June 20 to submit documents about hundreds of reported AEB problems to the government.

    Source: caranddriver.com
