After crash, Google still has hurdles before self-driving cars will be ready for consumers
Google's self-driving car project just had its first collision where the self-driving software was at fault. But beyond fixing the code, automated vehicles raise legal and ethical dilemmas. - photo by Sam Turner
The self-driving car business hasn't come to a screeching halt, but recent events show that there are still some kinks to iron out before consumers will be able to let robots take the wheel.

On Monday, Google said that it bore "some responsibility" in a minor collision that happened on Feb. 14 on El Camino Real in Mountain View, California.

According to Google's report, one of its vehicles changed lanes and hit a bus after predicting that the bus would either slow down or stop.

The human operator in the vehicle agreed that the bus should have slowed down, but Google admits some responsibility because if its "car hadn't moved, there wouldn't have been a collision." However, Google maintains that this kind of misunderstanding happens with human drivers every day.

After driving over 2 million miles, Google has reported 17 collisions, but this is the first time that the self-driving car was to blame. According to Reuters, the collision has caused Google to refine its software.

"From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles," Google told Reuters.

But software isn't the only hurdle that Google and other self-driving car manufacturers are facing.

This accident, though minor, shows that self-driving vehicles have a margin of error, which raises legal and ethical questions about how the project will proceed.

Here are a few issues that the self-driving vehicle industry faces.

Legal restrictions

According to Bloomberg Business, Google's home state of California plans to make it difficult for Google to operate fully autonomous vehicles.

The California Department of Motor Vehicles has drafted regulations that would require self-driving cars to have all the necessary controls for manual operation (pedals, steering wheel, etc.) and a licensed driver ready to take control.

The federal government may be leaning toward similar regulations. U.S. Transportation Secretary Anthony Foxx told NPR that he approved of California's requirement for a licensed driver.

Unfortunately, these regulations are bad news for the disabled, a group that has more to gain from self-driving cars than almost any other.

In a heartwarming video by Google, a blind man named Steve Mahan is able to go about his day thanks to Google's self-driving car.

"Where this would change my life is to give me the independence and the flexibility to go the places I both want to go and need to go when I need to do those things," said Mahan

Requiring a licensed driver to operate self-driving cars would exclude people like Mahan.

These regulations would also limit the potential for self-driving vehicles to automate certain jobs. Trucking, taxi and delivery services would all still need a licensed operator behind the wheel.

Finally, there's the problem of continued licensing.

"In a world where the vehicle is doing more of the driving task, we are also asking ourselves questions as to how you train people to drive in cars like that, Foxx told NPR.

There is also the question of whether drivers' skills and knowledge will deteriorate over time as they rely on self-driving functions. In an emergency, they may not have the experience to take over the controls.

Licensing new drivers would also become more complicated: how they are trained and tested will have to shift to accommodate the changing technology of self-driving cars.

Ethical questions

Google's cars are designed to detect obstacles in the road, like sand bags or police barriers, and react accordingly.

Overall, the self-driving function should make driving a lot safer. Foxx told NPR that by some estimates, self-driving cars should cut the traffic death toll of 33,000 a year by 80 percent.

But there will still be some fatalities. There will be accidents and events that can't be predicted, and in these situations the self-driving software will have to make split-second decisions about what to do, which lives to save and which lives to risk.

For this reason, technologists and philosophers have begun working together on the problem.

Last year, Stanford professor Chris Gerdes and California Polytechnic professor Patrick Lin held a workshop where Gerdes proposed a hypothetical situation, reports MIT Technology Review: a child darts out into the street, forcing the car to choose between hitting the child or swerving into an oncoming van.

If the car swerves into the van, more lives will be at risk, but if the car proceeds, the child will almost certainly die.

"These are very tough decisions that those that design control algorithms for automated vehicles face every day, said Gerdes.

The ethical dilemma lies in whether a machine is capable of making those value judgments, and whether we feel comfortable allowing a machine to make those decisions.
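To see why that is uncomfortable, consider a deliberately naive sketch of how such a choice might be reduced to an expected-harm comparison. No manufacturer has published logic like this; every number and name below is a made-up placeholder.

```python
# Illustrative sketch only -- not any manufacturer's actual decision logic.
# A naive "cost function" for the workshop scenario, to show why the ethical
# question is hard: a human has to pick these numbers in advance.

def expected_harm(option: str) -> float:
    if option == "continue":
        # Child in the road: one person, near-certain fatality (placeholder).
        return 1 * 0.95
    if option == "swerve":
        # Oncoming van: several occupants, each at lower risk (placeholder).
        return 3 * 0.30
    raise ValueError(option)

# The "choice" comes down to comparing numbers someone had to assign.
best = min(["continue", "swerve"], key=expected_harm)
print(best)  # "swerve" with these placeholder values; change the values
             # and the car's decision flips -- which is exactly the dilemma.
```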

People are scared

The transition to self-driving may be difficult because most people are still apprehensive about putting their lives in the hands of computers.

CNN reports that according to AAA, 75 percent of people wouldn't feel safe in fully automated vehicles.

Many said that they trust themselves and their own capabilities more than a machine, but that trust might be misplaced.

About 90 percent of accidents are caused by human error, according to The Economist. Widespread, interconnected self-driving cars are almost certain to improve safety. But drivers still aren't ready to relinquish control.

AAA managing director of automotive engineering and repair John Nielsen told CNN that people may need to be eased into the idea of self-driving vehicles. Sixty percent of drivers do want some automatic features in their car, like automatic braking or self-parking.

As people warm up to these features, Nielsen expects that they may be ready for full automation in the next 5-10 years.