Self Driving Psychopaths

by Ben Longstaff, December 20th, 2016

The last article looked at how it’s just a question of time until self-driving cars are on the road, and at the impact on the transportation industry. In this article I explore the moral challenges of creating error handling for the inevitable accidents. Thirty-two accidents involving self-driving cars have been reported to the California DMV since 2014, mostly humans rear-ending the more cautious self-driving cars.

While humans are involved, accidents will happen.

Most robots operate in controlled environments away from humans; self-driving cars won’t have that luxury. Like psychopaths, first-generation self-driving cars won’t be able to empathize with the people they impact.

Movies often refer to Isaac Asimov’s laws of robotics. In “I, Robot”, Will Smith’s character wants the robot to save the girl, but it saves him because the algorithm said his chance of survival was higher. Did the robot make the right decision?

When an accident is inevitable, should self-driving cars prioritize the life of a 20-year-old pregnant woman over a 75-year-old man with ALS in a wheelchair? What if the 75-year-old man is Stephen Hawking and the 20-year-old woman is a heroin addict? Will the value for each person be pre-calculated so the algorithm can decide who should be saved and who sacrificed?
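
If that value really were pre-calculated, someone would have to write the policy down in code before the crash ever happens. The sketch below is a purely hypothetical illustration in Python: the Person fields, the weights and the social_value formula are all invented assumptions, and their arbitrariness is exactly the problem.

```python
# Purely hypothetical sketch of a "pre-calculated value" policy.
# Every field, weight and rule here is an invented assumption; no real
# system is known to publish such a formula.
from dataclasses import dataclass

@dataclass
class Person:
    age: int
    is_pregnant: bool = False
    survival_probability: float = 0.5  # as estimated by the car's perception stack

def social_value(p: Person) -> float:
    """Assign a numeric 'value' to a life (illustrative formula only)."""
    score = max(0.0, 100 - p.age)  # younger scores higher -- an assumption
    if p.is_pregnant:
        score *= 2                 # count two lives? -- another assumption
    return score

def choose_who_to_save(people: list[Person]) -> Person:
    # Utilitarian rule: maximise expected "value" saved.
    return max(people, key=lambda p: social_value(p) * p.survival_probability)

pedestrian = Person(age=20, is_pregnant=True, survival_probability=0.4)
passenger = Person(age=75, survival_probability=0.7)
print(choose_who_to_save([pedestrian, passenger]))  # picks the pedestrian
```

Whoever picks those weights is answering the moral question, whether they admit it or not.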

Who will provide the moral framework for these decisions: the government, manufacturer, software engineers or the passengers? The government wants to minimize fatalities, the manufacturer wants to sell the most cars and the passenger wants to arrive safely.

The “right” choice is subjective.

The race to be first

There is a huge prize for winning and devastating impacts for getting it wrong, so regulators must ensure corners are not cut. The number of miles per disengagement event, where the car hands control back to the driver, is a good measure of where the competitors stand.
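
As a rough illustration of how that metric is computed, here is a minimal Python sketch; the vendor names and figures are placeholders, not real DMV report numbers.

```python
# Miles per disengagement: autonomous miles driven divided by the number of
# times the car handed control back to the driver. Figures are placeholders.
reports = {
    "Vendor A": {"autonomous_miles": 635_000, "disengagements": 124},
    "Vendor B": {"autonomous_miles": 20_000, "disengagements": 550},
}

for vendor, r in reports.items():
    miles_per_disengagement = r["autonomous_miles"] / r["disengagements"]
    print(f"{vendor}: {miles_per_disengagement:,.0f} miles per disengagement")
```

A higher number means the car copes on its own for longer, which is why the figure is watched so closely.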

[Video: George Hotz at ai.bythebay.io]

Self-driving technologies like intelligent parking assist, lane keeping assist and cruise control have been introduced to help the public become comfortable releasing control to the car.

Regulation

The U.S. Department of Transportation is trying to strike a balance between supporting innovation and protecting the public. The National Highway Traffic Safety Administration has defined five levels of vehicle automation to help answer the hard questions of responsibility.

[Image: the levels of vehicle automation (source: fortune.com)]

“Levels 0 - 2 the auto maker never takes liability, in 3 the auto maker takes liability sometimes and in level 4 the automaker always takes liability. The only difference between 2 and 4 is insurance.” — 
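
One way to read that split is sketched below in Python. The level names follow NHTSA’s original 2013 taxonomy, and the liability column simply restates the quote above; it is an illustration, not a legal standard.

```python
# NHTSA's original automation levels (0-4) paired with the liability split
# described in the quote above. Illustrative only.
NHTSA_LEVELS = {
    0: ("No automation", "driver liable"),
    1: ("Function-specific automation", "driver liable"),
    2: ("Combined-function automation", "driver liable"),
    3: ("Limited self-driving automation", "automaker liable sometimes"),
    4: ("Full self-driving automation", "automaker liable"),
}

for level, (name, liability) in NHTSA_LEVELS.items():
    print(f"Level {level}: {name} -> {liability}")
```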

Regulation needs to balance public safety with innovation

The government is currently putting the burden on the hardware manufacturers. Responsibility becomes more complicated when the software, hardware and map data are provided by different vendors. Comma.ai was originally going to retrofit existing cars, but regulators imposed compliance requirements that made that impractical.

“To oversee compliance with the requirements of the Safety Act and associated regulations, we are requiring that you provide the information in the attached Special Order. … If you do not timely or completely respond to the Requests in the Special Order, you may be subject to civil penalties of up to $21,000 per day.” — US Department of Transportation to Comma.ai

Software Updates

Updates will need to happen over time. Will they be applied by the owner, automatically, or by a mechanic? While convenient, automatic updates require connectivity, which creates an attack vector for hackers. What happens if a bad update goes out that leaves tens of thousands of cars stopped in the middle of the road? Should there be a legal obligation for owners to maintain a minimum software version to be allowed on the road?
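
If regulators did mandate a minimum software version before a car may drive itself, the pre-drive check could be as simple as the hypothetical sketch below; the version numbers and function names are invented for illustration.

```python
# Hypothetical pre-drive gate: refuse autonomous mode if the installed
# software is older than a regulator-mandated minimum. Versions are invented.
MINIMUM_APPROVED_VERSION = (4, 2, 0)

def parse_version(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def may_engage_self_driving(installed: str) -> bool:
    return parse_version(installed) >= MINIMUM_APPROVED_VERSION

print(may_engage_self_driving("4.1.9"))  # False: owner must update first
print(may_engage_self_driving("4.2.1"))  # True
```

The check itself is trivial; the legal and enforcement questions around it are not.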

Final Thoughts

How much safer than a human driver does self-driving tech need to be: 1%, 10%, 10x, 1,000x? And do we mean better than a human driving on an empty freeway on a sunny day, or in the middle of peak-hour traffic during torrential rain? These are two very different problems.
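
To make those multipliers concrete, the sketch below converts them into fatality rates, assuming a baseline of roughly 1.2 fatalities per 100 million vehicle miles for human drivers; that baseline is an approximate, illustrative figure, not a citation.

```python
# Translate "N times safer" into fatality rates. The human baseline below is
# an approximate, illustrative assumption.
HUMAN_FATALITIES_PER_100M_MILES = 1.2

for label, factor in [("1% safer", 1.01), ("10% safer", 1.10),
                      ("10x safer", 10), ("1000x safer", 1000)]:
    robot_rate = HUMAN_FATALITIES_PER_100M_MILES / factor
    print(f"{label:>11}: {robot_rate:.4f} fatalities per 100M miles")
```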

“The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment.” — Warren Bennis

There are a lot of hard questions that the consumer watchdogs are going to have to find answers for.

Enjoyed this post? Hit the heart button and I will give it to the next robot I meet.


Some other things I have written:


Stop building car boats — tech debt 101: Why do smart developers ship bad code? (hackernoon.com)


Uber is the Netscape of transportation: Rewind to October 13, 1994, Netscape released its first browser for sale and began making the internet mainstream. Fast… (hackernoon.com)