US regulators have been investigating Tesla's self-driving and assisted-driving software programs on multiple fronts for years.
The US auto safety regulator has opened an investigation into Tesla's Full Self-Driving (FSD) software after receiving four reports of crashes, one of which involved a pedestrian being struck and killed.
The crashes happened when "a Tesla vehicle travelling with FSD engaged entered an area of reduced roadway visibility conditions (sun glare, fog, dust) and Tesla's FSD continued operating," the statement from the National Highway Traffic Safety Administration (NHTSA) said.
Tesla Faces Repeated Criticism After Self-Driving and Assisted-Driving Software Crashes
The automaker has faced repeated criticism over crashes involving its self-driving and assisted-driving software, and US regulators have been examining these programs on multiple fronts for years.
The National Highway Traffic Safety Administration added, "One of the crashes involved a pedestrian being struck and killed; one crash involved a reported injury."
Investigation to Examine System's Failure to Detect and Disengage: National Highway Traffic Safety Administration
The investigation will "examine the system's potential failure to detect and disengage in specific situations where it cannot adequately operate, and the extent to which it can act to reduce risk", the National Highway Traffic Safety Administration said.
Tesla Settled With the Family of a Crash Victim
In April, Tesla settled with the family of an engineer who was killed in a 2018 crash in Silicon Valley. His Tesla Model X was equipped with Tesla's Autopilot driver-assistance technology.
Tesla has defended the safety of its cars, and Musk has promoted its driver-assistance programs, though they have not progressed as far as he said they would.
Last year, the company was forced to recall nearly 363,000 cars equipped with its FSD Beta technology, and more than 2 million vehicles were also recalled over risks associated with the Autopilot software.
In 2019, Musk said the company would produce a fully autonomous vehicle within a year, but that has yet to materialise.
Previously, Tesla announced plans to recall Cybertrucks in the US over delays in displaying rear-camera images. The delayed images could reduce drivers' ability to see behind the vehicle and raise the risk of accidents. The company said a software update would fix the problem.
According to a report Tesla filed with the US National Highway Traffic Safety Administration, the recall involves more than 27,000 Cybertrucks.