Who Is Responsible When Self-Driving EVs Get Into Accidents? A Guide for Consumers

So you're considering an electric vehicle (EV) with sophisticated self-driving features like Tesla's Autopilot or GM's Super Cruise. These automated systems promise an easier, less stressful driving experience. But a key question arises: if one of these vehicles crashes while driving autonomously, who takes responsibility?

As an auto industry analyst, I field this liability question from prospective EV buyers regularly. The answer is complex, because today's self-driving vehicles occupy an uncertain middle ground from a legal perspective. Read on as I break down the key details and data consumers need to assess responsibility risks with automated EVs.

By The Numbers: How Often Do Self-Driving Cars Crash Today?

Some 392 crashes involving vehicles with Level 2 driving automation were reported nationwide between July 2021 and May 2022, based on NHTSA data. The table below summarizes the key statistics:

Statistic                            Number
Total crashes reported               392
Crashes involving Tesla vehicles     273
Crashes with fatalities              6 (additional crashes involved injuries)

Roughly 70% of reported crashes (273 of 392) involved Teslas specifically.
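A quick sanity check in Python, using only the NHTSA figures above (nothing else assumed):

```python
# Tesla's share of reported Level 2 crashes, per the NHTSA figures above.
total_crashes = 392   # Level 2 crashes reported, Jul 2021 - May 2022
tesla_crashes = 273   # crashes reported by Tesla

share = tesla_crashes / total_crashes
print(f"Tesla share of reported crashes: {share:.1%}")  # -> 69.6%
```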

While any fatality is tragic, keep these raw counts in context: proponents argue today's systems already help prevent many accidents compared to human drivers alone, though reported crash totals by themselves cannot settle that question. Either way, the risks are very real, so understanding liability in a crash scenario remains important.

Fully Autonomous Cars Don't Exist Yet – Why Liability Is Tricky

It's crucial to understand that despite advanced features like Autopilot or Super Cruise, no fully self-driving vehicles exist for consumers yet.

The automation involved in today's EVs still qualifies as "partial" (SAE Level 2) – meaning drivers must monitor conditions and be ready to take over immediately if the system fails or reaches its limits.
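For context, the SAE J3016 scale runs from Level 0 to Level 5. A minimal Python lookup, with the level descriptions paraphrased:

```python
# SAE J3016 driving automation levels, paraphrased.
# Consumer systems like Autopilot and Super Cruise sit at Level 2.
SAE_LEVELS = {
    0: "No automation - the human does all driving",
    1: "Driver assistance - steering OR speed support",
    2: "Partial automation - steering AND speed, but the driver must supervise",
    3: "Conditional automation - system drives, human must take over on request",
    4: "High automation - no human needed within a defined operating domain",
    5: "Full automation - no human needed anywhere",
}

print(SAE_LEVELS[2])  # the category today's consumer EVs occupy
```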

So in crashes, liability questions emerge around whether control, and thus responsibility, lies with:

  • The human driver's actions (or inaction)
  • The vehicle's automated systems

Answers can differ greatly depending on specific circumstances and even state laws.

Take Uber's 2018 fatal crash, for example: the company's autonomous test vehicle struck and killed a pedestrian in Arizona. Yet questions over assigning liability remained complex even years later:

  • The human backup driver was charged criminally for allegedly failing to watch the road
  • Prosecutors declined to charge Uber itself, though the company faced scrutiny over system failures
  • Arizona's light-touch testing oversight also drew criticism

See how definitive blame splits across multiple parties? Let's break down the other key players further.

Individual Drivers – Stay Alert and Take Training Seriously!

Buyers Must Complete Self-Driving Tech Tutorials Conscientiously

For now, human EV drivers retain the primary responsibility to use automation appropriately and to remain alert to situations needing takeover.

Even hands-free systems still require the driver to monitor road conditions in case of malfunctions or edge cases. Looking away for more than a few seconds could affect legal liability if a crash occurs; the sketch below illustrates how such attention timeouts might escalate.
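Purely as an illustration (no automaker publishes exact thresholds; the numbers below are hypothetical stand-ins for "a few seconds"), here is how a driver-monitoring escalation might be modeled:

```python
# Hypothetical driver-attention escalation, loosely inspired by
# camera-based monitoring systems. All thresholds are invented
# for illustration; real systems do not publish exact values.
ATTENTION_LIMIT_S = 4.0   # hypothetical "few seconds" threshold
TAKEOVER_LIMIT_S = 8.0    # hypothetical forced-disengagement point

def escalation(eyes_off_road_s: float) -> str:
    if eyes_off_road_s < ATTENTION_LIMIT_S:
        return "automation active, no warning"
    if eyes_off_road_s < TAKEOVER_LIMIT_S:
        return "visual and audible alerts: eyes back on the road"
    return "automation disengaging: driver must take over now"

print(escalation(2.0))   # automation active, no warning
print(escalation(5.5))   # visual and audible alerts: eyes back on the road
print(escalation(9.0))   # automation disengaging: driver must take over now
```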

I coach new self-driving EV owners to diligently follow all of the manufacturer's training procedures when first activating automation packages.

For example, Tesla's website states that drivers must "keep your hands on the steering wheel at all times" and always "maintain control and responsibility."

Automakers – Safety Updates Remain A Priority

Preventative Innovations Plus Failsafe Handoff Protocols Can Aid Responsibility

While drivers operate partially automated vehicles, automakers also carry duties of their own. Brands like Tesla, GM, and Ford must ensure that sensors, software, and other systems provide adequate warnings for drivers to take over when needed.

No automation today is 100% reliable in every potential scenario. I follow self-driving technology closely and still see much room for improvement, both through preventative sensor and AI innovations and through failsafe redundancy protocols for when primary systems malfunction.

Over the next 3-5 years, such updates will hopefully allow vehicles to hand control back to humans more safely during software or hardware issues. These measures would let automakers self-certify autonomous performance more confidently and shoulder their share of responsibility, without fully exempting drivers.

Insurers – Understand Coverage Impacts, Limit Gaps

Average 10-30% Premium Hikes To Cover Self-Driving Tech Risks

Insurers are responding to the growing prevalence of automation by increasing premiums for tech-enabled vehicles, typically by 10-30%, which often works out to $100 or more per six-month policy term.

Why? Self-driving sensors, cameras, and related hardware mean pricier repair bills, and uncertainty around legal liability translates into actuarial unknowns.

My advice for prospective EV buyers: closely compare insurance options to minimize coverage gaps around automation usage and crash fault. Avoid assumptions; discuss the specifics with agents beforehand.
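As a rough back-of-envelope check (the $700 six-month base premium below is a hypothetical example; the 10-30% range is the estimate cited above):

```python
# Back-of-envelope impact of a self-driving tech premium surcharge.
# The base premium is a hypothetical example; the 10-30% range is
# the estimate cited in this section.
BASE_PREMIUM_6MO = 700.00  # hypothetical six-month premium, USD

for hike in (0.10, 0.30):
    extra_per_term = BASE_PREMIUM_6MO * hike
    extra_per_year = extra_per_term * 2  # two policy terms per year
    print(f"{hike:.0%} hike: ${extra_per_term:.0f} per term, "
          f"${extra_per_year:.0f} per year extra")
```

On that hypothetical baseline, the "$100 or more per term" figure lands comfortably inside the 10-30% range.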

I hope this layperson's overview gives helpful perspective on the liability landscape for today's automated EVs. Laws and technology remain works in progress, but informed consumers can navigate self-driving responsibility risks confidently despite the gaps. Feel free to reach out with other questions!
