Tesla Autopilot does not improve driving safety




When this Tesla crashed into the store, it was being driven manually. Most accidents happen on city streets, not on the highways where Autopilot is typically engaged.



Researchers attending the annual Automated Vehicle Summit have demanded that Tesla install a camera-based driver monitoring system in its vehicles.



Tesla's Autopilot includes a system intended to ensure that the driver is watching the road. If you keep your hands on the steering wheel and periodically apply force to it, a torque sensor in the wheel registers your presence. If the system decides that the driver has taken their hands off the wheel, it starts issuing warnings: first a message on the screen, then audible alerts, and if the driver still does not respond, the car slows down and stops. The system is not entirely accurate: some drivers keep their hands on the wheel and it fails to notice. It can also be defeated outright, simply by attaching some kind of counterweight to the steering wheel.
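To make the escalation sequence concrete, here is a minimal sketch of how such a hands-on-wheel policy might be structured. This is not Tesla's actual logic; the stages follow the sequence described above, but all timing thresholds are invented for illustration.

```python
# A toy model of a torque-based hands-on-wheel policy: escalate from a
# visual warning to audible alerts to a controlled stop. Thresholds are
# illustrative assumptions, not Tesla's real values.

def escalation_stage(seconds_since_torque: float) -> str:
    """Map the time since steering torque was last detected to an action."""
    if seconds_since_torque < 15:
        return "none"            # recent torque: driver presumed present
    if seconds_since_torque < 30:
        return "visual_warning"  # message on the screen
    if seconds_since_torque < 45:
        return "audible_alert"   # escalating chimes
    return "slow_and_stop"       # unresponsive driver: decelerate and stop

# The weakness described above: the only input is measured torque, so a
# light grip yields false warnings, and a counterweight hung on the
# wheel defeats the check entirely.
```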



Other systems (such as GM's Super Cruise) use cameras to watch the driver and gauge how closely they are following the road. The most advanced systems track gaze direction and record how long the driver spends looking around, at the instruments, or at a phone, or how long it takes them to adjust the mirrors. Such systems may require you to keep your eyes on the road even if your hands are off the wheel.
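For comparison, here is the camera-based idea in its simplest form: estimate where the driver is looking each frame and alert when a continuous off-road glance exceeds a time budget, independent of steering torque. This is an assumed illustration, not Super Cruise's actual implementation; the labels, sampling rate, and budget are made up.

```python
# Toy gaze-based monitoring: accumulate continuous eyes-off-road time
# from per-frame gaze labels and flag when it exceeds a budget.
# Labels, the 10 Hz rate, and the 4-second budget are illustrative only.

def eyes_off_road_alert(gaze_labels, hz: float = 10.0,
                        budget_s: float = 4.0) -> bool:
    """gaze_labels: per-frame labels such as 'road', 'phone', 'mirror'.
    Returns True if the longest continuous off-road glance exceeds
    the allowed budget."""
    longest = current = 0.0
    for label in gaze_labels:
        current = current + 1.0 / hz if label != "road" else 0.0
        longest = max(longest, current)
    return longest > budget_s

# 50 consecutive 'phone' frames at 10 Hz = 5 s off the road -> alert
assert eyes_off_road_alert(["phone"] * 50)
```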



Several high-profile accidents (including fatal ones) have been caused by drivers who were not paying attention to the road. In most cases we do not know what distracted them, but had they been watching the road, the accidents would not have happened. Several crash reports involving Tesla vehicles indicate that the driver's hands were off the wheel even though the driver monitoring system did not register it; more precisely, the reports state that the sensor did not detect any force applied to the steering wheel. Any Tesla owner will tell you that they sometimes get warnings demanding they put their hands back on the wheel even while they are holding it and watching the road.



Reports on the fatal accident involving a Tesla in Mountain View indicate that the driver was playing an online game a few minutes before the crash. He was probably still playing it moments before his death.



The Tesla Model 3 has a camera facing into the cabin that could monitor the driver; the problem is that Tesla has declined to use it for this purpose. The refusal seems to make no sense, but what reasons might be behind it? Let's consider a few:



  1. Driver monitoring is somewhat creepy from a privacy standpoint. Not all customers are comfortable with it, and Tesla doesn't want to scare its customers away.
  2. The camera may not be good enough for the job: driver monitoring systems usually rely on infrared illumination to work at night, and Tesla's cabin camera may lack it.
  3. Older vehicles (such as the Model S) do not have the camera at all, and Tesla may not want the Model 3 to appear safer than the more expensive Model S.
  4. Elon Musk appears to believe that Autopilot will soon be good enough that driver monitoring will be unnecessary, and adding it would amount to admitting that drivers cannot yet be trusted to leave the system unsupervised.


According to many experts (including Bryan Reimer, an MIT researcher who has presented highly credible research on Autopilot), these excuses are not convincing.



It's especially interesting to look at Tesla's claims about the safety of its Autopilot system. Every quarter, Tesla publishes key crash statistics in its Vehicle Safety Report. The Q1 2020 report reflects the significant reduction in mileage and accidents caused by the pandemic. The Q4 2019 report states the following:



"In Q4, we recorded one accident for every 3.07 million miles driven on Autopilot. For drivers who were not using Autopilot but were using our active safety features, there was one accident for every 2.1 million miles. Finally, drivers using neither had one accident for every 1.64 million miles. By comparison, according to the National Highway Traffic Safety Administration, an accident occurs in the United States every 479,000 miles."



These numbers seem incredible at first, but read them carefully and a problem emerges. The National Highway Traffic Safety Administration's statistics count crashes reported to the police, while Tesla reports "accidents". Tesla has declined to explain what it counts as an accident and how that differs from a police-reported crash. There is a suspicion that Tesla counts events that trigger an airbag, since the company is notified of such incidents; whether Tesla even learns about more minor collisions is unknown. Without knowing exactly what Tesla calls an "accident", the numbers cannot be compared, and the company should not present them as if they could be. Tesla representatives declined to comment on any of these questions.



Be that as it may, the most interesting figures are the 3.07 million and 2.1 million miles between accidents with Autopilot on and off, respectively.



Tesla doesn't get safer with Autopilot on



Reimer also provided data from an upcoming study of Autopilot use. It shows that roughly 94% of Autopilot miles are driven on highways. For manual driving and cruise control, about 40% of the mileage is on highways, and the remaining 60% on roads in populated areas. Exact figures are hard to come by, but highway driving is safer than driving on ordinary roads: fatal accidents happen about three times less often per mile (though if you recompute the frequency per hour, the reduction is less pronounced). The danger of high speed is offset by the simplicity of the driving environment. The ratio for accidents of all kinds is less clear, but let's apply the same estimate to it (on rural roads, fatal accidents per mile happen about 2.5 times more often than on urban ones, and Autopilot is used on both).



Of a typical 2.1 million miles between accidents in manual mode, about 880,000 are driven on highways and 1.2 million on other roads. For Autopilot, of the 3.07 million miles, about 2.9 million are on highways and only 192,000 on other roads. Applying the 3:1 risk ratio, manual driving works out to roughly 4.5 million miles between accidents on highways and 1.5 million on all other roads, while Autopilot works out to roughly 3.4 million miles between accidents on highways and 1.1 million on other roads.
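This arithmetic is easy to check. The sketch below reproduces it from the quoted figures; the 3:1 risk ratio and the road-mix shares are the estimates stated above, not measured data.

```python
# Reproducing the road-mix adjustment. Inputs are the figures quoted in
# the article; the 3:1 other-road-vs-highway risk ratio is an estimate.

MANUAL_MI_PER_CRASH = 2.1e6           # manual + active safety (Tesla Q4 2019)
AP_MI_PER_CRASH     = 3.07e6          # Autopilot engaged (Tesla Q4 2019)
MANUAL_HWY_SHARE    = 880e3 / 2.1e6   # ~42% of manual miles on highways
AP_HWY_SHARE        = 2.9e6 / 3.07e6  # ~94% of Autopilot miles on highways
RISK_RATIO          = 3.0             # other roads assumed 3x riskier per mile

def split_by_road(total_mi_per_crash: float, hwy_share: float,
                  k: float = RISK_RATIO) -> tuple:
    """Split a blended miles-per-crash figure into (highway, other),
    assuming other roads have k times the per-mile crash rate.
    Blended rate per mile = hwy_share * r + (1 - hwy_share) * k * r."""
    hwy = total_mi_per_crash * (hwy_share + (1 - hwy_share) * k)
    return hwy, hwy / k

man_hwy, man_other = split_by_road(MANUAL_MI_PER_CRASH, MANUAL_HWY_SHARE)
ap_hwy, ap_other = split_by_road(AP_MI_PER_CRASH, AP_HWY_SHARE)
print(f"manual:    {man_hwy/1e6:.1f}M mi/crash highway, {man_other/1e6:.1f}M other")
print(f"Autopilot: {ap_hwy/1e6:.1f}M mi/crash highway, {ap_other/1e6:.1f}M other")
print(f"manual highway advantage: {man_hwy/ap_hwy - 1:+.0%}")
# -> roughly 4.5M vs 3.4M miles per crash on highways: manual ~30% better
```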



In other words, on the highway, manual driving (with active safety features enabled) plus cruise control yields about 30% more miles between accidents than driving on Autopilot. So self-driving in a Tesla is somewhat less safe than claimed.



That said, the reduction in safety is not dramatic. Even if 3:1 overstates the ratio of accident rates between ordinary roads and highways, these numbers are not far from the truth. But the roughly 1.5x improvement that Tesla's raw figures imply almost certainly does not exist.
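A quick sensitivity check on that assumption: the break-even point for the highway comparison sits around a 2:1 risk ratio, manual driving comes out ahead for anything higher, and even at lower ratios the 1.5x headline advantage disappears.

```python
# Vary the assumed other-road-vs-highway risk ratio k and recompute the
# highway comparison. Inputs repeat the figures quoted in the article.
MANUAL_MI_PER_CRASH, AP_MI_PER_CRASH = 2.1e6, 3.07e6
MANUAL_HWY_SHARE, AP_HWY_SHARE = 880e3 / 2.1e6, 2.9e6 / 3.07e6

for k in (1.5, 2.0, 2.5, 3.0, 4.0):
    man_hwy = MANUAL_MI_PER_CRASH * (MANUAL_HWY_SHARE + (1 - MANUAL_HWY_SHARE) * k)
    ap_hwy = AP_MI_PER_CRASH * (AP_HWY_SHARE + (1 - AP_HWY_SHARE) * k)
    print(f"k={k:.1f}: manual vs Autopilot on highways: {man_hwy/ap_hwy - 1:+.0%}")
# k=1.5 -> -14%, k=2.0 -> +2%, k=2.5 -> +18%, k=3.0 -> +33%, k=4.0 -> +61%
```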



Tesla's problem is that people who abuse Autopilot get into accidents, because Autopilot is an imperfect driver assistance system (and that is how the company sells it). The favorable part of the statistics comes from people who use Autopilot responsibly. Statistically, then, Autopilot either slightly reduces driving safety or does not affect it at all. If you use Autopilot correctly, your safety improves marginally. If you ignore its operating rules, you are at very high risk. Overall, the system does not improve the driving safety of Tesla vehicles, and it does not look poised to do so any time soon.
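The mixture effect is easy to illustrate with invented numbers: even if a large majority uses Autopilot responsibly and does slightly better than the manual highway baseline, a small minority of abusers can drag the blended rate down to roughly the Autopilot figure derived above.

```python
# Hypothetical mixture (all numbers invented for illustration):
# 90% responsible users slightly better than the ~4.5M-mile manual
# highway baseline, 10% abusers far worse. Rates add per mile of exposure.
responsible_share = 0.9
resp_mi_per_crash = 5.0e6    # hypothetical responsible-use figure
abuse_mi_per_crash = 1.0e6   # hypothetical abuse figure

blended = 1 / (responsible_share / resp_mi_per_crash
               + (1 - responsible_share) / abuse_mi_per_crash)
print(f"blended: {blended/1e6:.1f}M miles per crash")  # ~3.6M, below 4.5M
```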



We do not know what proportion of accidents involving Tesla vehicles are caused by inattentive drivers, but the available data suggests it could be significant. If so, Reimer is right: Tesla should consider implementing a driver monitoring system. Its new AI chip should be capable of running a system that helps the driver rather than annoys them. Such technology would make customers both happier and safer. There is also the question of storing video recordings after accidents. I believe no recording should be made in the vehicle without the driver's consent; supporters of the feature should be able to turn it on, since the footage could either establish their fault in an accident or exonerate them. Use of the driver monitoring system could also be made optional. In that case, a driver who turned it off could be held liable in the event of an accident that occurred while using Autopilot.



I have asked Tesla several times to provide data breaking down accidents involving its vehicles by road type, and it has refused. Some of the numbers given here are estimates extrapolated from fatal accident data; I would welcome more accurate data from Tesla.



It should be noted that Tesla vehicles are, in general, very safe. They have some of the highest crash test scores of any vehicle, and their excellent collision avoidance systems have played a major role in the good safety statistics off Autopilot. This article is concerned only with comparing safety with and without Autopilot, in both cases with ADAS enabled. The author owns a Tesla, and its strong safety record was part of the reason for the purchase.









