Tesla driver using Autopilot kills motorcyclist, prompting another NHTSA investigation
03 Aug 2022 Accidents

A motorcyclist in Draper, Utah, was killed early Sunday morning when a Tesla driver using Autopilot slammed into the rear of his bike. It is the latest crash involving Tesla’s advanced driver-assist system to draw scrutiny from federal investigators at the National Highway Traffic Safety Administration (NHTSA).

[Video: Tesla on Autopilot strikes, kills motorcyclist on I-15 | Gephardt Daily]

The incident occurred just after 1AM Sunday on southbound Interstate 15, according to local reports. The motorcyclist, who has not been identified, was traveling southbound near the Salt Lake and Utah County lines when the Tesla approached from behind. The Utah Department of Public Safety said the Tesla driver collided with the back of the motorcycle, throwing the motorcyclist to the ground and killing him instantly.

The Tesla driver, who remained at the scene, informed local officials that they did not see the motorcyclist. The driver was using Autopilot at the time of the crash, authorities said.

The Utah crash is the latest to be added to NHTSA’s list of Special Crash Investigations (SCI), in which the agency collects data beyond what local authorities and insurance companies typically gather at the scene. The agency also examines crashes involving advanced driver-assist systems, like Tesla’s Autopilot, and automated driving systems.

As of July 26th, there were 48 crashes on the agency’s SCI list, 39 of which involved Tesla vehicles. Nineteen people, including drivers and passengers, pedestrians, occupants of other vehicles, and motorcyclists, were killed in those Tesla crashes.


Beyond the SCI program, NHTSA is also currently looking into 16 crashes in which Tesla owners using Autopilot crashed into stationary emergency vehicles, resulting in 15 injuries and one fatality. Most of these incidents took place after dark, with the software ignoring scene control measures including warning lights, flares, cones, and an illuminated arrow board. The probe was recently upgraded to an “Engineering Analysis,” which is the second and final phase of an investigation before a possible recall.

Tesla tops the government’s list of crashes involving vehicles using active driver-assist features, which automakers argue make driving safer. Tesla’s numbers were much higher than those of other companies, most likely because it sells more vehicles equipped with Level 2 systems than its rivals do. Tesla also collects real-time telematics data from its customers, giving it a much faster reporting process.

From July 20th, 2021, to May 21st, 2022, there were 273 crashes involving Tesla vehicles using Autopilot, according to NHTSA. The EV company’s crashes represent the bulk of the total 392 crashes reported during that period.

Of course, hundreds of car crashes occur every day, many of them fatal. More people died in auto-related crashes last year than in any year since 2005. NHTSA estimates that 42,915 people died in motor vehicle traffic crashes in 2021, a 10.5 percent increase from the 38,824 fatalities in 2020. The total includes pedestrians, cyclists, and other road users killed in those crashes.

Most of these crashes do not involve Tesla vehicles. But Tesla crashes, especially ones involving Autopilot, warrant attention because of the company’s willingness to beta test new software on its customers and, as a result, on everyone else in close proximity to those drivers. Tesla is a different kind of company because of the risks it takes in putting advanced technology out in the world and because of the corners it cuts to do so.

When a particular technology is involved in a fatal incident, in this case Autopilot, it deserves to be examined closely to determine what decisions led to that particular failure. It also warrants scrutiny because Tesla insists that Autopilot makes driving safer, while the evidence suggests the results are far more complicated.

Comments
  • Arffjos 04 Aug 2022
    Fucking autopilot. Learn how to drive. Remember when people could drive?
  • Arffjos 04 Aug 2022
    It is killing by being lazy and ignorant?