Tesla was on Autopilot in fatal California crash: Authorities | Automotive news
A Tesla involved in a serious crash on a Southern California highway last week was operating on Autopilot at the time, authorities said.
The National Highway Traffic Safety Administration (NHTSA) is investigating the crash in Fontana, 80km (50 miles) east of Los Angeles. It is the 29th crash involving a Tesla that the agency has probed.
A 35-year-old man was killed when his Tesla Model 3 struck an overturned semi-truck on the highway at about 2:30am (09:30 GMT). The driver’s name has not yet been made public. Another man was seriously injured when the electric vehicle hit him while he was helping the driver of the semi.
The California Highway Patrol, or CHP, said on Thursday that the car had been operating on Autopilot, Tesla’s partially automated driving system, which has been involved in multiple crashes. The Fontana crash marks at least the fourth death in the US involving Autopilot.
“While the CHP does not comment on ongoing investigations, the department recognises the high level of interest in crashes involving Tesla vehicles,” the agency said in a statement. “This information provides an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”
The federal safety investigation comes after the CHP arrested another man who, authorities said, was in the back seat of a Tesla that was driving on an Interstate near Oakland this week with no one behind the wheel.
The CHP has not said whether officials have determined if that Tesla was operating on Autopilot, which can keep a car centred in its lane and maintain a safe distance behind vehicles in front of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be in the back seat. Tesla is allowing a limited number of owners to test its self-driving system.
Tesla, which has disbanded its public relations department, did not respond to an email seeking comment on Friday. The company says in owner’s manuals and on its website that both Autopilot and “Full Self-Driving” are not fully autonomous and that drivers must pay attention and be ready to intervene at any time.
Autopilot has at times had trouble dealing with stationary objects and with traffic crossing in front of Teslas.
In two Florida crashes, in 2016 and 2019, cars operating on Autopilot collided with crossing tractor-trailers, killing the men driving the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer driving on Autopilot was killed when his Tesla struck a highway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several firetrucks and police vehicles that were parked on highways with their emergency lights flashing.
For example, the NHTSA sent a team in March to investigate after a Tesla on Autopilot collided with a Michigan State Police vehicle on Interstate 96 near Lansing. Neither the trooper nor the 22-year-old Tesla driver was injured, police said.
Following the Florida and California crashes, the National Transportation Safety Board (NTSB) recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit the use of Autopilot to highways where it can work effectively. Neither Tesla nor the safety agency acted on the recommendations.
In a February 1 letter to the US Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations covering driver-assist systems such as Autopilot, as well as the testing of autonomous vehicles. The NHTSA has relied mainly on voluntary vehicle guidelines, taking a hands-off approach so as not to hinder the development of new safety technology.
Sumwalt said Tesla is using people who have bought the cars to test its “Full Self-Driving” software on public roads with limited oversight or reporting requirements.
“Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV [autonomous vehicle] control system’s limitations,” Sumwalt wrote.
He added: “Although Tesla includes a disclaimer that its currently enabled features require active driver supervision and do not make the vehicle autonomous, NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users.”
The NHTSA, which has the authority to regulate automated driving systems and to seek recalls if necessary, appears to have developed a renewed interest in the systems since US President Joe Biden took office.