Self-driving cars are hope for a better future. One where a text or app notification won't steal a driver's attention while speeding down a highway. Where "I only had two drinks" people don't smash into objects on their way home, while being "not drunk", of course. Where our overworked, tired bodies don't fall asleep behind the wheel. Where motorcycles and scooters can't just pop up in your blind spot, and occasional drivers don't make your nerves squeeze every time they take their snail's pace onto the road.

And yet the technology is still in development and by no means without flaws. A self-driving Uber vehicle ran over and killed a pedestrian in Tempe, Arizona, on Sunday night. Some experts call it a "catastrophic failure" of Uber's technology, but it is not the first one. This is the second death caused by a self-driving car. The first reported fatal self-driving car crash happened in 2016, when a Tesla on "autopilot" did not detect a white truck in its path and killed the vehicle's owner. And as far as the human capacity for empathy goes, a quote attributed to the Soviet leader Joseph Stalin comes in handy:
“A single death is a tragedy; a million deaths is a statistic.”
Human error statistics, according to World Health Organisation (WHO) data:
- More than 1.25 million people die each year as a result of road traffic crashes.
- Even more shocking is the statistic that road traffic injuries are the leading cause of death among people aged between 15 and 29 years.
- Nearly half of those dying on the world’s roads are “vulnerable road users”: pedestrians, cyclists, and motorcyclists.
- Somewhere between 20 and 50 million more people suffer non-fatal injuries, with many incurring a disability as a result of their injury.
WHO emphasises that without sustained action, road traffic crashes are predicted to become the seventh leading cause of death by 2030.
We have all heard the saying "things happen" when talking about traffic accidents, and somehow human error seems more acceptable than a technology glitch. Somehow we all know the dangers of driving and still don't think about it that much until it happens to us. But we should, since road traffic crashes cost most countries 3% of their gross domestic product. So, no matter who crashes, we all pay. When we find it in ourselves to realise how interconnected we all are, it should become evident that self-driving cars will change this field for the better. The Insurance Information Institute (III) estimates that by 2030, 25% of all cars sold will be autonomous, and that there will be an estimated 80% fewer traffic accidents thanks to their increased safety.
From my point of view, the main points are:
a) We have a road safety problem, and it is caused by human error;
b) Old ways won’t solve it;
c) It is easier to design new technology, improve it and implement it in our lives than to try to fix the causes of human error;
d) Even now, self-driving cars are safer than the human drivers on the roads;
e) There’s still a lot to sort out, like legal liability, policies and public acceptance.
What does trouble me is the possible loss of humanity and accountability that these artificial intelligence advancements bring. The first opinions I read on the accident (besides the news) were on Twitter, and they ran along the lines of "her own fault, who crosses the highway like that" and "of course there will be deaths, it's new technology". The first argument is just plain dumb, and I doubt that person would say the same if the circumstances were different, like a kid running onto the road after a ball in daylight. Judging by the details of the accident, that scenario would have ended the same way, since the car undeniably failed, no matter how dark it was. The other argument, that human casualties are to be expected and accepted, is more complex. We are essentially saying it is okay if the outcome brings a better future, as if it were a "small cost to pay". But what if, in the future, accidents like this won't be rare at all? We are so used to smartphones that it seems like we're very advanced in this field, but then again: some of them blow up, some are hacked by governments and have their usage patterns tracked. Not to mention cybersecurity and the new possibility of taking control of a person's life by hacking the infrastructure. Talk about the "shooting range" going global…
So what if that happens? If a man kills someone, he usually goes to jail. Who is going to be accountable for future glitches? I recently read Yuval Noah Harari's book "Sapiens: A Brief History of Humankind", and one of the ideas that caught my attention is that corporations with many shareholders can essentially live on forever. We believe in strong brands even after their founders are dead, we think about them in terms of products and values, and when something happens they are basically untouchable. Society gets compensated for its losses, but there is no human responsibility. When have you ever heard of someone going to jail after a corporate product caused a death? In this specific case, Uber has temporarily suspended testing. According to the news, a spokesperson (Mr. X, I guess) said in a statement that the video was "disturbing and heartbreaking", adding: "Our cars remain grounded, and we're assisting local, state and federal authorities in any way we can." As you can see, pinning down responsibility, now and even more so in the future, will be a wild goose chase.
To sum it up, I would say that artificial intelligence and technological advancements are inevitable and hold great promise for the future and for human longevity. At least we believe so, as the only creatures who have the ability to believe in collective imagination. We have never been this powerful and, at the same time, more irresponsible than ever. As Y. N. Harari wrote: "Self-made gods with only the laws of physics to keep us company, we are accountable to no one. We are consequently wreaking havoc on our fellow animals and on the surrounding ecosystem, seeking little more than our own comfort and amusement, yet never finding satisfaction." I just hope there won't be a day when we care more about technologies than about other human beings.
Drive safely, be open-minded and keep your humanity!
A little extra for those who like data:
- From a young age, males are more likely to be involved in road traffic crashes than females. About three quarters (73%) of all road traffic deaths occur among young males under the age of 25, who are almost three times as likely to be killed in a road traffic crash as young females.
- Drivers using mobile phones are approximately four times more likely to be involved in a crash than drivers not using one. Using a phone while driving slows reaction times.
- More than 90% of road traffic deaths occur in low- and middle-income countries.
- A pedestrian is killed by a car roughly every 90 minutes in the United States.
Nevada was the first state to allow the use of autonomous vehicles in 2011. Since then, five other states—California, Florida, Michigan, North Dakota and Tennessee—and Washington, D.C., have passed autonomous vehicle legislation. Sixteen states introduced legislation related to autonomous vehicles in 2015, up from 12 states in 2014, nine states and D.C. in 2013, and six states in 2012.
In 2015 Tesla Motors Inc. activated its Autopilot mode, which allows autonomous steering, braking and lane switching. In July 2016 the first fatality from an autonomous vehicle was reported. The National Highway Traffic Safety Administration is investigating what role, if any, the Tesla Motors Model S Autopilot technology had in a Florida collision between the vehicle and a tractor-trailer. Tesla said the Autopilot sensors failed to detect the white truck, which was turning in front of the Model S, against a bright sky. The crash killed the vehicle's owner.
A survey by IEEE, a technical professional organisation dedicated to advancing technology for humanity, of more than 200 experts in the field of autonomous vehicles found that, of six possible roadblocks to the mass adoption of driverless cars, these three were ranked as the biggest obstacles: legal liability, policymakers and consumer acceptance. Cost, infrastructure and technology were seen as less of a problem.