Another Model 3 crashed, and Tesla's Autopilot system was called into question.

According to media reports, a crash occurred on a section of highway in Taiwan on June 1: a Tesla Model 3 drove straight into a truck that had overturned across the road. The owner said the car's Autopilot function had been switched on before the accident. When he saw the white truck ahead, he assumed the vehicle would stop on its own upon detecting the obstacle, but it did not, and by the time he braked it was too late.

Photo source: Internet

The U.S. National Transportation Safety Board (NTSB) has also investigated and reported on two earlier fatal accidents involving Tesla's automated driving system. Robert Sumwalt, chairman of the NTSB, said the two accidents show that automated driving carries potential safety hazards: the technology is not yet mature, and there is still a lack of both mandatory policy and more advanced technology to ensure safety.

According to media reports, the Tesla Model 3 hit the truck with considerable force, traveling at roughly 110 km/h.

Surveillance footage shows that the truck had rolled over in the fast lane. Its driver had not placed any warning markers behind the vehicle; he stood about 10 meters behind it, waving to alert oncoming traffic. A white Tesla soon appeared in the fast lane, and although the truck driver waved hard to warn it, the Model 3 drove straight into the truck, with the front of the car wedging under the overturned vehicle.

Photo source: Internet

According to the report, the Model 3's front bumper and hood were damaged in the impact, while the battery pack and motor in the chassis were undamaged. Repair costs are preliminarily estimated at about 200,000 yuan. Tesla has yet to respond to the incident.

According to the investigation, 12.3 seconds before the crash the Model 3 was cruising at 69 mph (about 110 km/h) with the traffic-aware cruise control engaged. 9.9 seconds before impact, the automatic lane-keeping assist was activated, putting the Autopilot L2 driver-assistance system in control. In the 7.7 seconds before the collision, Tesla's forward collision warning did not sound and automatic emergency braking was not activated. The investigation also found no steering or braking input in the final seconds before the collision.
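
To put those figures in perspective, here is a rough back-of-the-envelope calculation in Python using only the speed and timestamps reported above; the 0.8 g braking deceleration is an illustrative assumption, not a value taken from the investigation.

```python
# Rough distance check based on the reported figures, assuming constant speed.
# The 0.8 g braking deceleration is an illustrative assumption, not a value
# taken from the investigation.

SPEED_KMH = 110
speed_ms = SPEED_KMH / 3.6            # ~30.6 m/s

for t in (12.3, 9.9, 7.7):            # seconds before impact, per the report
    print(f"{t:>4} s before impact: ~{speed_ms * t:.0f} m of travel remaining")

decel = 0.8 * 9.81                    # assumed hard-braking deceleration, m/s^2
braking_distance = speed_ms ** 2 / (2 * decel)
print(f"Stopping distance from {SPEED_KMH} km/h at 0.8 g: ~{braking_distance:.0f} m")
```

Under these assumptions, the car was still roughly 235 m from the truck at the 7.7-second mark, far more than the roughly 60 m a hard brake from 110 km/h would have required, which is why the absence of any warning or braking is central to the investigation.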

The NTSB ultimately determined that the cause of the accidents was the drivers' over-reliance on Tesla's L2 automated driving function, which led to inattention and ultimately to the crashes. In addition, Tesla's Autopilot L2 system did not warn the drivers when their hands were off the wheel, contrary to its design conditions, which contributed to the collisions. In both the Taiwan Model 3 accident and the Model 3 accidents in the United States, the drivers had their hands off the wheel while driving and Tesla's Autopilot warning system failed to alert them, which ultimately led to the accidents.

Manual supervision is still required when automated driving is engaged

Some analysts believe that because autonomous vehicles are still new and the relevant laws and regulations remain incomplete, it is difficult to assign responsibility after an accident. The series of Tesla Autopilot accidents is also what prompted the NTSB's investigations.

In March of this year, the NTSB released its final reports on two fatal accidents involving Tesla's Autopilot L2 driver-assistance system. According to the reports, in both cases the drivers relied too heavily on the L2 function, became inattentive, and ultimately crashed. In addition, the Autopilot system did not issue a timely reminder when the drivers' hands left the wheel, the forward collision warning did not sound before impact, and automatic emergency braking was not activated, all of which contributed to the accidents.

Photo source: Tesla website

The NTSB said that because the National Highway Traffic Safety Administration (NHTSA) has not yet developed test methods for evaluating the safety safeguards of L2 automated driving systems, those safeguards may be inadequate when vehicles leave the factory, which can ultimately lead to collisions. This has prompted the NTSB to reconsider standards for automated driving. It recently issued nine recommendations to NHTSA, OSHA, Apple and others, including establishing standards for driver-attention monitoring systems, requiring attention monitoring on vehicles equipped with L2 automation, adding forward collision warning tests to new-vehicle assessments, and evaluating Tesla's Autopilot driver-assistance function.
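
As a purely illustrative sketch of what such a driver-attention monitor might look like, the following Python pseudocode escalates warnings when no hands-on-wheel signal is detected and finally disengages; the thresholds, callback names, and behavior are assumptions made for illustration, not any manufacturer's actual design or the NTSB's specification.

```python
import time

# Hypothetical sketch of a driver-attention monitor for an L2 system, in the
# spirit of the NTSB recommendation above: escalate warnings as hands-off time
# grows, then disengage safely. All thresholds and callbacks are illustrative
# assumptions, not any manufacturer's actual implementation.

WARN_AFTER_S = 10        # show a visual reminder
ALERT_AFTER_S = 20       # sound an audible alert
DISENGAGE_AFTER_S = 30   # slow the car and hand control back to the driver

def monitor_attention(driver_engaged, warn, alert, disengage, poll_s=0.1):
    """driver_engaged() -> True when steering torque / eyes-on-road is detected."""
    last_engaged = time.monotonic()
    while True:
        if driver_engaged():
            last_engaged = time.monotonic()
        hands_off = time.monotonic() - last_engaged
        if hands_off >= DISENGAGE_AFTER_S:
            disengage()              # e.g. decelerate and switch the L2 feature off
            return
        if hands_off >= ALERT_AFTER_S:
            alert()
        elif hands_off >= WARN_AFTER_S:
            warn()
        time.sleep(poll_s)
```

The point of the escalation is that the system never assumes sustained attention: if the engagement signal stops arriving, the warnings grow stronger until the feature hands control back rather than continuing unsupervised.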

After this string of automated-driving accidents, Tesla began to play down Autopilot's "self-driving" framing, stressing that the function is a driving aid and cannot yet achieve fully autonomous driving. Tesla notes on its official website that the Autopilot features currently available require active driver supervision and that the vehicle is not yet fully autonomous. Activating and using some features will require demonstrating, over billions of miles of driving, reliability far exceeding that of human drivers, and will also depend on regulatory approval (which may take longer in some jurisdictions).

At present, Tesla is pressing ahead with expanding its business. But against a backdrop of repeated accidents and questions about safety, how to ensure driving safety and regain consumer trust may be the key issue Tesla has to confront.

Source: Daily Economic News. Responsible editor: Wang Fengzhi_NT2541