Google Self-Driving Car at Partial Fault for Crash

by John Lister

Google has confirmed that one of its self-driving cars was partially responsible for a minor crash with a bus. It's the first time the company has taken a share of the blame for a prang.

The cars operate through a range of technologies including sensors, cameras, lasers, GPS and map data. The theory is that these allow them to track the activity of other vehicles on the road and more reliably avoid crashes than cars which are subject to human driver error.

California allows companies such as Google which meet set criteria to operate self-driving or "autonomous" vehicles on public roads for testing purposes, though only on streets with a speed limit of 35 miles per hour or less. One of the key rules of the licensing is that the test cars must always have a human driver on board with the ability to immediately take control in the event of an emergency.

Google issues monthly reports on any incidents involving its cars. While they've been involved in several minor accidents, to date these have always been the result of human error by the drivers of other vehicles, such as changing lanes without warning or not paying attention when making a turn. It appears that in every case the other vehicle drove into the Google car.

The only "at fault" incident came when a test driver was using the car for a personal journey and had turned the automatic controls off. The closest the self-driving cars have come to a traffic violation so far was when a police officer pulled over one vehicle for driving too slowly, though no ticket was issued.

Collision Took Place At 2MPH

Now Google says one of its cars has driven into the side of a bus that was travelling in the same direction. Thankfully it was a very minor collision as the car was travelling at just 2 miles per hour while the bus was moving at 15mph.

Both vehicles were in the same marked lane, though it was wide enough to allow two vehicles side by side in most places. The Google car had been driving close to the curb, leaving room towards the inside of the lane. However, it slowed down when it came up to sandbags by a storm drain (meaning there was no longer room for two vehicle widths) and waited for a gap in traffic to move further inside. (Source: ca.gov)

Algorithms Make Assumptions

It seems everyone and everything involved made an assumption. The car's system calculated that a bus coming up from behind would slow down to let it pull out; the human driver of the car made the same assumption and thus didn't override the controls. However, the bus driver assumed the car would stay put until a clearer space opened up in the traffic.

Google says that "This type of misunderstanding happens between human drivers on the road every day... In this case, we clearly bear some responsibility." (Source: engadget.com)

The company says some good has come of the incident. As a result of the data gathered in this and other test drives, the automatic driving system has been tweaked to take into account that drivers of large vehicles (which are more awkward to slow or stop) are less likely to yield to traffic trying to move into their lane, or into a particular area of a lane.
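The tweak described above amounts to discounting the expected chance that an approaching vehicle will yield as that vehicle gets larger. As a rough illustration only — the function name, base probability, and penalty figures below are all assumptions, not Google's actual code — the idea might be sketched like this:

```python
def yield_probability(vehicle_length_m: float, base_probability: float = 0.7) -> float:
    """Estimate how likely an approaching vehicle is to yield.

    Heuristic: vehicles longer than a typical car (~5 m) are harder
    to slow or stop, so discount the base probability as length grows.
    All numbers here are illustrative assumptions.
    """
    typical_car_length = 5.0
    if vehicle_length_m <= typical_car_length:
        return base_probability
    # Reduce the estimate by 5 percentage points per extra metre,
    # clamped so it never drops below 0.1.
    penalty = 0.05 * (vehicle_length_m - typical_car_length)
    return max(0.1, base_probability - penalty)

# A ~12 m bus gets a much lower estimated yield probability than a ~4.5 m car,
# so a planner using this estimate would be less inclined to pull out in front of it.
car_p = yield_probability(4.5)
bus_p = yield_probability(12.0)
```

Under this kind of heuristic, the planner would have waited for the bus rather than assuming it would slow down.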

What's Your Opinion?

Has Google acted responsibly by publishing details of the crash and admitting partial responsibility? Is this an acceptable result of the testing process and something that's not a big deal compared with the number of human-on-human collisions? Or is it a sign that self-driving cars aren't reliable enough for safe public use?


Comments

Dennis Faas

All bugs and quirks aside, I'm looking forward to when self-driving car technology makes its way into the mainstream, where users can switch between manual and automated modes as needed. This will of course be especially useful for individuals who are simply unable to drive due to a disability, or those who are impaired and shouldn't be behind the wheel.

Don Cook

If the Google car was in front of any vehicle on a single marked lane of road, then the vehicle behind must give way to the vehicle in front, OK?

Commenter

Nope. The vehicle that moves over must yield right of way. The Google car was entirely at fault here.

I'm sick and tired of drivers cutting in front of me who believe it's my duty to slow down so they can cut me off! Wrong! It's my job to keep a steady speed and it's their job to merge into an existing spot where they will not cause other traffic to need to slow down or speed up.

Does no one know the rules of the road anymore?

matt_2058

This says it all: "...one of its cars has driven into the side of a bus that was travelling in the same direction." The bus was ahead of the car. Otherwise the bus would have driven into the side of the car...or its rear end. Admitting full responsibility is what should happen.

I'd bet that a diagram would clearly show the Google car at fault. No matter how you look at it, the only way for the bus driver to be responsible is if the bus was overtaking the car improperly. Then there is the question of the sandbags being in the roadway, if the above description is correct.

I don't believe the error described. They say the car assumed the bus would slow down to let the car get in. I say BS. The car is supposed to be making driving safer by NOT thinking like humans. And you have to program that logic in. I bet the error was the car not calculating the speed of the bus correctly. Or better yet, some anomaly with the bus length (and maybe speed) calculation.

The scariest thing mentioned here was the car being pulled over for going too slow...VERY dangerous. I live in an area with a lot of elderly people who should not be driving, because they drive 20 under the limit and can't see far enough at highway speeds. The only thing worse is the texters in the left lane going 15 under with cruise control on so they do not have to respond to traffic.

gdday_6551

I think these vehicles are wishful thinking on the part of the manufacturers.
There is no way they can cope with the formidable rules and regulations, let alone the driving conditions on poor-quality road systems.
Here is a road rule in NSW Australia.

Drivers must give bicycle riders at least a metre of space
From 1 March 2016, drivers who pass a bicycle rider must allow a distance of at least:
1 metre when the speed limit is 60km/h or less
1.5 metres when the speed limit is more than 60km/h
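The rule itself reduces to a simple threshold; the real difficulty the commenter is pointing at is perception — reliably detecting the rider and the posted limit in the first place. A minimal sketch of the rule as stated (the function name is illustrative, not from any real system):

```python
def minimum_passing_gap_m(speed_limit_kmh: float) -> float:
    """Minimum lateral gap when passing a bicycle rider under the NSW rule:
    1 m at speed limits of 60 km/h or less, 1.5 m above 60 km/h."""
    return 1.0 if speed_limit_kmh <= 60 else 1.5
```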

How is a vehicle going to differentiate that one?
Also here in Sydney the road speed limits change multiple times on the same stretch of road there are no electronic signals for a self drive car to pick up on. Our bureaucratic revenue raising government would not want to do this as they loose all the money derived from fines for speeding.
I could raise many more problems, but I am sure most drivers are aware of them already.
Put self-driving cars on their own roads and it might work.