Autonomous vehicles and accidents: are they safer than vehicles operated by drivers?

Author | Jaime Ramos

The dream of self-driving cars is still maturing, ticking over silently. Advances and trials contrast with the doubts surrounding this technology: will self-driving cars be capable of fulfilling their promise to reduce accident rates to a negligible level?

What is the likelihood of an accident in a self-driving car?

Autonomous vehicle development is still at too early a stage to establish with any certainty how much self-driving cars will reduce accidents. Even so, legal advisory firms tend to be somewhat pessimistic about self-driving cars and their risks, in most cases without well-founded reasons.

Specialist lawyers experienced in road traffic law cite statistics that are unfavorable to self-driving cars: an estimated 9.1 self-driving car accidents per million miles driven in the United States, compared with an average of around 4.1 accidents for human drivers. However, these figures, widely repeated in the sector, come from a study conducted in 2013. In other words, they are outdated.
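
By way of illustration, the short Python sketch below shows how such per-mile rates are computed and compared. It is only a sketch: the raw accident counts and mileages are hypothetical placeholders chosen to reproduce the 9.1 and 4.1 figures from the 2013 study cited above.

# Illustrative sketch only: the crash counts and mileages below are hypothetical
# placeholders; only the resulting 9.1 and 4.1 rates come from the 2013 study.

def accidents_per_million_miles(accidents, miles_driven):
    """Normalize a raw accident count by exposure (miles driven)."""
    return accidents / (miles_driven / 1_000_000)

av_rate = accidents_per_million_miles(accidents=91, miles_driven=10_000_000)     # -> 9.1
human_rate = accidents_per_million_miles(accidents=41, miles_driven=10_000_000)  # -> 4.1

print(f"Self-driving cars: {av_rate:.1f} accidents per million miles")
print(f"Human drivers:     {human_rate:.1f} accidents per million miles")
print(f"Ratio (self-driving / human): {av_rate / human_rate:.2f}x")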


Other legal sources offer a forecast based on the three fatalities recorded in the United States to date, relating that figure to the distance driven and comparing it with the average rate for human-controlled vehicles. Of course, self-driving cars clearly lose this bet.
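
To see how such a forecast works, here is a rough back-of-the-envelope sketch. It rests on assumptions: the autonomous test mileage is a hypothetical placeholder, and the human benchmark of roughly 1.1 fatalities per 100 million miles is only an approximate U.S. figure.

# Rough sketch of the fatality-rate comparison described above.
# ASSUMPTIONS: the autonomous test mileage is a hypothetical placeholder;
# ~1.1 fatalities per 100 million miles is an approximate U.S. human-driver figure.

av_fatalities = 3          # fatalities linked to self-driving cars, as cited in the article
av_miles = 20_000_000      # hypothetical total autonomous test miles
human_rate = 1.1           # approx. human fatalities per 100 million miles

av_rate = av_fatalities / (av_miles / 100_000_000)

print(f"Self-driving cars: {av_rate:.1f} fatalities per 100 million miles")
print(f"Human drivers:     {human_rate:.1f} fatalities per 100 million miles")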

It is worth noting that self-driving cars have been stuck at an intermediate stage of development for a few years now and that, however much we are bombarded with fears about their reliability, not enough data has been collected to know what the accident rate will be once the technology reaches maturity.

Autonomous driving present and future

It will still be a few years before we see the commercial breakthrough of fully autonomous driving, known as Level 5. In the meantime, manufacturers still have to overcome a series of obstacles. Even so, in some parts of the world, particularly in various North American cities, the number of experimental trials and projects using technologies at that level has multiplied.

In addition to the uncertainty surrounding that definitive take-off, there are also issues with the driver assistance systems and automatic pilots that manufacturers are already fitting in their vehicles, which are often confused with full autonomy, as organizations such as Euro NCAP have reported on numerous occasions.

What we do know is that, according to the National Highway Traffic Safety Administration (NHTSA), human error accounts for 94% of all accidents. Hence the great hopes that autonomous driving will reduce the number of victims. Estimates vary widely, from the complete elimination of that 94% to more pessimistic reports (from insurance companies) that calculate a reduction of around 35%.
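
To get a feel for how far apart those two scenarios are, the sketch below applies each reduction to a hypothetical baseline of 100,000 accidents per year; only the 94% and 35% figures come from the sources cited above, while the baseline is a placeholder.

# Illustrative sketch only: the 100,000-accident baseline is a hypothetical placeholder;
# the 94% (NHTSA human-error share) and 35% (insurers' estimate) come from the text above.

baseline_accidents = 100_000

optimistic_remaining = baseline_accidents * (1 - 0.94)   # every human-error accident avoided
pessimistic_remaining = baseline_accidents * (1 - 0.35)  # insurers' ~35% overall reduction

print(f"Optimistic scenario:  {optimistic_remaining:,.0f} accidents remain per year")
print(f"Pessimistic scenario: {pessimistic_remaining:,.0f} accidents remain per year")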

How to regulate self-driving cars?

Regulation is one of those obstacles. It will not be long before the various legal systems have to take a stance on criminal and civil liability.

That is why, in the United States, a country in which case law carries significant weight, legal professionals are preparing for a likely legal battle between technology suppliers, insurers and victims, one that will be a determining factor for the future of self-driving cars. U.S. legal experts are wary of the complexity of the matter and are patiently working to establish definitive criteria on who will be held responsible for the mistakes of self-driving cars.

According to TechCrunch, in 2019 there were around 1,400 self-driving vehicles being tested in the United States by more than 80 different companies, most of them in California.

A moral self-driving car


When it comes to the liability dilemma, it is not just a matter of establishing blame in the event of an accident, but of reflecting on decision-making. One of the liveliest aspects of the debate in recent years has been how a self-driving car should act when it finds itself in a dilemma in which, no matter what it does, it will cause harm to a human being.

An interesting experiment in this regard is "The Moral Machine", which gathered the perspectives of two million people across 233 countries and territories on how a car should respond to these kinds of tricky situations. Given the complexity of the answers, those behind it are set on developing legal frameworks adapted to each region and, in particular, on equipping machine intelligence with a moral intelligence that emulates our own. It will not be easy.

Images | Wikimedia Commons/Dllu, Waymo, Volvo Cars
