Driverless Shuttle Crashes On First Day Of Operation

nsaspook

Joined Aug 27, 2009
13,312
Hi,

Ironic, but they say it was not the fault of the driverless vehicle...

https://www.usnews.com/news/nationa...11-09/driverless-shuttle-crashes-in-las-vegas
It just stopped. :D Just like a computer. We all know the big rig drivers expect cars to give way or move even if you, in that little car, have the right of way.

What about blowing your horn and yelling obscenities like a human to get the truck driver's attention before you get hit?

Mod edit: removed obscene gesture.
 
Last edited by a moderator:

GopherT

Joined Nov 23, 2012
8,009
Hi,

Ironic, but they say it was not the fault of the driverless vehicle...

https://www.usnews.com/news/nationa...11-09/driverless-shuttle-crashes-in-las-vegas

Ironic? Most accidents with driverless cars have been the fault of either the other driver or the engineer/'operator' onboard the driverless vehicle in charge of oversight.

Also, the Las Vegas vehicle had a top speed of 25 mph and a typical speed of 15 mph. Hitting it is almost like hitting a stationary object, and hitting a stationary or very slow-moving object is almost always blamed on the moving vehicle.

I see no irony unless the autonomous vehicle's computer was distracted by a text message from a hot-looking CPU.
 

Thread Starter

MrAl

Joined Jun 17, 2014
11,494
It just stopped. :D Just like a computer. We all know the big rig drivers expect cars to give way or move even if you, in that little car, have the right of way.

What about blowing your horn and yelling obscenities like a human to get the truck driver's attention before you get hit?
Hi,

You mean the driverless vehicles don't have a finger to extend? (he he) :)
 
Last edited by a moderator:

Thread Starter

MrAl

Joined Jun 17, 2014
11,494
Ironic? Most accidents with driverless cars have been the fault of either the other driver or the engineer/'operator' onboard the driverless vehicle in charge of oversight.

Also, the Las Vegas vehicle had a top speed of 25 mph and a typical speed of 15 mph. Hitting it is almost like hitting a stationary object, and hitting a stationary or very slow-moving object is almost always blamed on the moving vehicle.

I see no irony unless the autonomous vehicle's computer was distracted by a text message from a hot-looking CPU.
Hi,

No, ironic in that it was only on the road for a few hours before there was an accident. It's kind of funny, really :)
 

dl324

Joined Mar 30, 2015
16,943
Ironic, but they say it was not the fault of the driverless vehicle...
Generally, about half the people involved in motor vehicle accidents weren't at fault. In some cases, the at-fault driver does something to cause an accident and they just proceed on their merry way without ever noticing the mayhem they caused...
 

WBahn

Joined Mar 31, 2012
30,075
There's another facet of this that I think will become more evident over time.

The question of who is at fault is one thing. But there is a very different issue of whether the accident would have happened at all if the car had had a human driver.

People make mistakes when driving all the time. Yet the vast majority of the time no accident results, because of the way all of the other drivers respond to avoid the collision. How well do driverless vehicles perform this task? It's not a clear-cut answer because certainly, at least some of the time, the driverless cars do better than human drivers typically do. But when they don't, it's simply pointed out that the other driver was at fault, not that the driverless car failed to avoid the collision. So what about on average?

Put another way, imagine a driver that ALWAYS follows the rules of the road EXACTLY. Now someone else does something wrong and an accident results. Sure, it's easy to say that the crash was the result of the other driver. But what if the always-lawful driver could have avoided the crash by doing something simple, even if that something was a technical violation of the law? Most people (and most judges) would find that person partly responsible for the accident even though they did nothing illegal, simply because the driver of a vehicle is expected to exercise good judgment, including knowing when to deviate from the law in the interest of safety.

We are constantly being told that the accident record for driverless vehicles is about the same as or somewhat better than for human-driven vehicles. Even if that's the case, the ranges of conditions that driverless cars and humans operate in are different. This never seems to be reflected in these statements, and what they seem to never take into account is that driverless vehicles are currently operating in relatively low-risk conditions. As far as I know, they are not driving during blizzards, on single-lane mountain roads in winter, through flooded regions, or in any of the many other non-typical, high-risk situations in which human drivers are often operating and which result in a disproportionate number of accidents that skew the overall rate upwards.
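To make that last point concrete, here's a toy calculation. The numbers are completely made up; only the structure of the comparison matters:

```python
# Invented crash rates (crashes per million miles), split by conditions.
# These are NOT real statistics -- they only illustrate the skew argument.
human_rate  = {"good": 2.0, "bad": 12.0}
av_rate     = {"good": 2.5}                  # AVs simply don't drive in bad conditions
human_miles = {"good": 0.85, "bad": 0.15}    # fraction of miles in each condition
av_miles    = {"good": 1.0}

human_overall = sum(human_rate[c] * human_miles[c] for c in human_rate)  # 3.5
av_overall    = sum(av_rate[c] * av_miles[c] for c in av_rate)           # 2.5

print(f"Human overall: {human_overall:.1f}, AV overall: {av_overall:.1f}")
# The AV fleet looks ~30% "safer" overall, yet in the only conditions where
# both actually drive (good ones) it is worse: 2.5 vs 2.0.
```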
 

GopherT

Joined Nov 23, 2012
8,009
There's another facet of this that I think will become more evident over time.

The question of who is at fault is one thing. But there is a very different issue of whether the accident would have happened at all if the car had had a human driver.

People make mistakes when driving all the time. Yet the vast majority of the time no accident results, because of the way all of the other drivers respond to avoid the collision. How well do driverless vehicles perform this task? It's not a clear-cut answer because certainly, at least some of the time, the driverless cars do better than human drivers typically do. But when they don't, it's simply pointed out that the other driver was at fault, not that the driverless car failed to avoid the collision. So what about on average?

Put another way, imagine a driver that ALWAYS follows the rules of the road EXACTLY. Now someone else does something wrong and an accident results. Sure, it's easy to say that the crash was the result of the other driver. But what if the always-lawful driver could have avoided the crash by doing something simple, even if that something was a technical violation of the law? Most people (and most judges) would find that person partly responsible for the accident even though they did nothing illegal, simply because the driver of a vehicle is expected to exercise good judgment, including knowing when to deviate from the law in the interest of safety.

We are constantly being told that the accident record for driverless vehicles is about the same as or somewhat better than for human-driven vehicles. Even if that's the case, the ranges of conditions that driverless cars and humans operate in are different. This never seems to be reflected in these statements, and what they seem to never take into account is that driverless vehicles are currently operating in relatively low-risk conditions. As far as I know, they are not driving during blizzards, on single-lane mountain roads in winter, through flooded regions, or in any of the many other non-typical, high-risk situations in which human drivers are often operating and which result in a disproportionate number of accidents that skew the overall rate upwards.

Do you really think that crash avoidance is not part of the logic programmed into these things? Oh, three paragraphs prove you do.

You should come up to Pittsburgh for a few days, attend the conferences, and read the press releases, interviews, press conferences, and academic papers from CMU, the University of Pittsburgh, and the companies here like Uber, Delphi, Google, Ford (Argo AI), and several others. Not only are they talking about crash avoidance, they are talking about ethical crash avoidance: "what decisions could/should a self-driving car make if a collision is inevitable?" Does it crash into the vehicle with the fewest identifiable occupants? Does it crash into an identifiable "at fault" vehicle? What if the at-fault vehicle has the fewest occupants?

The developers are so far ahead of you; don't worry, it has been thought of.
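To be clear about what "ethical crash avoidance" even means mechanically: nobody publishes their real code, but the choice being debated reduces to a cost function over the available maneuvers. A deliberately oversimplified sketch (every name and weight here is invented, and the weights are exactly where the ethics live):

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupants_at_risk: int      # people in whatever this maneuver would hit
    target_at_fault: bool       # did that vehicle cause the situation?

def harm_cost(m: Maneuver) -> float:
    """Toy "least harm" score. The 0.5 fault discount IS the controversy:
    should being at fault make you a preferred target at all?"""
    cost = 10.0 * m.occupants_at_risk
    if m.target_at_fault:
        cost *= 0.5
    return cost

options = [
    Maneuver("brake straight into at-fault SUV", occupants_at_risk=4, target_at_fault=True),
    Maneuver("swerve into innocent sedan",       occupants_at_risk=1, target_at_fault=False),
]
print(min(options, key=harm_cost).name)   # picks the innocent sedan: 10 < 20
```

Note that with these invented weights the car hits the vehicle that did nothing wrong. That is precisely the kind of outcome the conference debates are about.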
 

JoeJester

Joined Apr 26, 2005
4,390
I guess stopping and failing to back up, as a human driver would have, makes everything better. The other driver was at fault for hitting a stationary vehicle.

The autonomous vehicle did not recognize an 18-wheeler backing up. It did recognize that something came within the specified distance, and it stopped, ensuring the human-driven vehicle would be at fault and receive any citation.
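Put the reported behavior in code form and the gap is obvious (thresholds and function names are hypothetical, not the vendor's actual logic):

```python
SAFE_DISTANCE_M = 3.0   # hypothetical stopping threshold

def shuttle_step(obstacle_distance_m: float) -> str:
    """What the shuttle reportedly did: stop and hold, full stop."""
    return "proceed" if obstacle_distance_m > SAFE_DISTANCE_M else "stop_and_hold"

def human_step(obstacle_distance_m: float, closing: bool, rear_clear: bool) -> str:
    """What a human likely does when a truck keeps backing toward them."""
    if obstacle_distance_m > SAFE_DISTANCE_M:
        return "proceed"
    if closing and rear_clear:
        return "reverse_slowly"     # get out of the way
    return "stop_and_honk"          # at least try to get the driver's attention
```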
 

nsaspook

Joined Aug 27, 2009
13,312
https://www.theatlantic.com/technology/archive/2013/10/the-ethics-of-autonomous-cars/280360/
But there are important differences between humans and machines that could warrant a stricter test. For one thing, we’re reasonably confident that human drivers can exercise judgment in a wide range of dynamic situations that don’t appear in a standard 40-minute driving test; we presume they can act ethically and wisely. Autonomous cars are new technologies and won’t have that track record for quite some time.

Moreover, as we all know, ethics and law often diverge, and good judgment could compel us to act illegally. For example, sometimes drivers might legitimately want to, say, go faster than the speed limit in an emergency. Should robot cars never break the law in autonomous mode? If robot cars faithfully follow laws and regulations, then they might refuse to drive in auto-mode if a tire is under-inflated or a headlight is broken, even in the daytime when it’s not needed.
...
What kinds of abuse might we see with autonomous cars? If the cars drive too conservatively, they may become a road hazard or trigger road-rage in human drivers with less patience. If the crash-avoidance system of a robot car is generally known, then other drivers may be tempted to “game” it, e.g., by cutting in front of it, knowing that the automated car will slow down or swerve to avoid an accident.
The general AI question is how we program in ethics and judgment. Once we program in exceptions to legal driving rules, who is responsible when laws are stretched to the limit or broken due to the car's programming and people are hurt?
 

joeyd999

Joined Jun 6, 2011
5,287
But what if the always-lawful driver had simply done something simple to avoid the crash...
Here in S. Florida we have a lot of drivers who will try to run you into the guardrail as you attempt to merge onto the freeway. They purposefully adjust their speed to match yours so that you cannot merge. In these cases, you have three choices: accelerate hard, brake hard, or crash.

I wonder what a headless car would do in such a situation?
 

WBahn

Joined Mar 31, 2012
30,075
The general AI question is how we program in ethics and judgment. Once we program in exceptions to legal driving rules, who is responsible when laws are stretched to the limit or broken due to the car's programming and people are hurt?
Consider an example that occurs probably thousands of times a day, but only very occasionally for a given driver. You hear a siren but can't tell where it is coming from or whether it is coming your way. Every time you are in that situation, you tend to behave differently because of the specifics of the situation. No doubt autonomous vehicles will as well. Also, just as humans can be tricked into drawing the wrong conclusion about where the siren is coming from and where it is headed, so too can automated systems (though they will generally do better than humans at estimating that information).

So what do you do? Do you stop where you are until the situation resolves itself? Do you continue normally until you determine that some different action is needed? Do you get off the road or turn onto a side street until the situation becomes clear? At different times, each of those (among others) is a reasonable thing to do. Now let's say that you determine that the siren is coming up behind you. What do you do? Ideally, you would pull off onto the shoulder. But what if that isn't an option? What if you are stopped at a traffic light with cars in all lanes? The common solution in this case is to go through the red light once you determine it is safe and proceed far enough so that others behind you can come through, so as to clear a lane for the emergency vehicle to get through. Is this what TODAY's autonomous vehicles would do?

I recall one time when a fire truck was trying to make its way through heavy traffic that was backed up badly enough that cars were having to wait several cycles to get through (and at least, more often than not, American drivers will tend to leave the intersection navigable by not pulling into it unless they have a chance of getting through it -- don't expect to see that in Taiwan!). So the fire truck used its loudspeaker to say something like, "Will the blue car in the westbound right-turn lane on Fillmore pull into the intersection far enough for us to get in behind you?" The truck was trying to turn eastbound on Fillmore, and about the only way was to go against traffic on westbound Fillmore and then switch over about half a block east of the intersection. So the blue car pulled forward enough for the fire truck to sneak in behind her and the light pole and go the wrong way on Fillmore, driving on the shoulder (where, fortunately, none of the other cars had pulled off, since they had chosen to stay in the right-turn lane, and which the driver of the fire truck was able to easily observe from his height), and they had no problems.

What if that blue car had been an autonomous vehicle?
 

WBahn

Joined Mar 31, 2012
30,075
Here in S. Florida we have a lot of drivers who will try to run you into the guardrail as you attempt to merge onto the freeway. They purposefully adjust their speed to match yours so that you cannot merge. In these cases, you have three choices: accelerate hard, brake hard, or crash.

I wonder what a headless car would do in such a situation?
My guess is that most of the time it will try to brake hard.

You will also have situations in which the autonomous protocols will contend, just like in communication networks. Humans do this, too. You pull up to a four-way stop and two cars at right angles arrive at nearly the same time. But "nearly" can easily mean that each driver (or car) legitimately thought it was first -- or that it wasn't first. This is more often than not resolved with one driver signaling to the other to go ahead -- and sometimes both drivers signal for the other to go. Or both cars start to go, and one or both recognize that the other is going too, then stop again. Sometimes this back and forth goes through two or three cycles before it gets resolved. What is the comparable protocol for similar interactions between two autonomous vehicles, or between an autonomous vehicle and a human driver? Does some variant of random backoff work in this type of system, or are the constraints sufficiently different that this type of approach causes more problems than it solves?
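The networking analogy can be made literal, for whatever that's worth. Here's a toy sketch of truncated exponential backoff applied to a four-way stop; all the perception and actuation calls are stubs I made up, and whether packet-style backoff delays make any sense for two-ton vehicles is exactly the open question:

```python
import random
import time

# Stubs standing in for the perception/actuation stacks (purely hypothetical).
def other_vehicle_moving() -> bool: return random.random() < 0.5
def creep_forward():  print("creeping forward")
def stop():           print("stopping")
def proceed():        print("proceeding through")

def contend_for_intersection(max_rounds: int = 5) -> None:
    """Ethernet-style contention at a 4-way stop: try, detect a "collision"
    (both cars moved), back off a random delay, double the window, retry."""
    for attempt in range(max_rounds):
        if not other_vehicle_moving():
            creep_forward()
            if not other_vehicle_moving():   # still clear after committing?
                proceed()
                return
            stop()                           # both moved at once: back off
        time.sleep(random.uniform(0, 0.5 * (2 ** attempt)))
    proceed()   # after enough rounds, somebody simply has to go

contend_for_intersection()
```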
 

dl324

Joined Mar 30, 2015
16,943
They purposefully adjust their speed to match yours so that you cannot merge.
People do that in Oregon too, and we have a law that says it's the responsibility of drivers to allow traffic to merge. At least as far as I recall. I generally try to open up a gap to let merging traffic in, or I get in the other lane.
 

markdem

Joined Jul 31, 2013
113
So what do you do? Do you stop where you are until the situation resolves itself? Do you continue normally until you determine that some different action is needed? Do you get off the road or turn onto a side street until the situation becomes clear? At different times, each of those (among others) is a reasonable thing to do. Now let's say that you determine that the siren is coming up behind you. What do you do? Ideally, you would pull off onto the shoulder. But what if that isn't an option? What if you are stopped at a traffic light with cars in all lanes? The common solution in this case is to go through the red light once you determine it is safe and proceed far enough so that others behind you can come through, so as to clear a lane for the emergency vehicle to get through. Is this what TODAY's autonomous vehicles would do?
This one has already been solved.
I can't remember where the testing was done (from memory, somewhere in Austria maybe), but emergency services control the traffic lights. Not directly as such, but the system knows that there is a firetruck driving up the road and will force all the lights to go green to clear the path.

What if that blue car had been an autonomous vehicle?
In a perfect world, this would be easy too. Just like the example above, the firetruck and its path would be announced to the autonomous cars, and they would get out of the way and stop before the firetruck got there.
Unfortunately this would require wireless control over the cars and that would be a security nightmare...
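Just to make the idea concrete, the announcement itself could be as simple as this (message format completely invented; the hard part is the security, not the payload):

```python
from dataclasses import dataclass

@dataclass
class PreemptionMsg:
    vehicle_id: str          # e.g. "firetruck_7"
    route_segments: list     # road segments the truck will traverse, in order

def react(msg: PreemptionMsg, my_segment: str, i_am_a_light: bool) -> str:
    """How a traffic light or an autonomous car on the route might respond."""
    if my_segment not in msg.route_segments:
        return "carry_on"
    if i_am_a_light:
        return "force_green_along_route"   # clear the truck's path
    return "pull_over_and_stop"            # car: yield well before the truck arrives

msg = PreemptionMsg("firetruck_7", ["main_st_100", "main_st_200"])
print(react(msg, "main_st_200", i_am_a_light=False))   # pull_over_and_stop
```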
 

WBahn

Joined Mar 31, 2012
30,075
People do that in Oregon; and we have a law that says it's the responsibility of drivers to allow traffic to merge. At least as far as I recall. I generally try to open up a gap to let merging traffic in or get in the other lane.
The laws here, as far back as I can remember (i.e., back when my dad was pre-teaching me how to drive when I was about twelve or so), have always expected drivers to let joining traffic merge, preferably by moving over a lane where possible. But, of course, that doesn't eliminate all of the instances where people can't get merged before running out of merge lane (and even the shoulder beyond), even when you don't have jerks intentionally trying to prevent them from getting in. I don't see that behavior very often any more because, a couple decades ago, the cops and courts got pretty heavy-handed with people who do that (it's lumped into the "aggressive driver" category, which can get you significantly harsher penalties).

One thing that has become common on the busier on-ramps (in Denver; Colorado Springs doesn't have any that I'm aware of yet) is signals that, during heavy-traffic times, allow only one car per lane to go. These metered on-ramps really seem to have helped. The pacing generally allows one car to join for every two to three cars already on, and as a result, the drivers already on the highway seem much more tolerant and willing to let one car in ahead of them, because they aren't looking at having three or four cars force their way in if they do.

I don't know which city/state/country did this first. They've been in use in Denver for at least 30 years, maybe pushing 40, but I know they were used in many other cities quite a bit before that. They aren't a 100%, no-downside solution, but they do seem to help out a lot at least at certain types of on-ramps (combination of physical characteristics and traffic characteristics).
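The pacing logic in those meters doesn't have to be fancy, either. Published feedback controllers like ALINEA are essentially one line of math; here's a sketch with invented gains and targets (not any agency's real settings):

```python
def metered_rate(prev_rate_vph: float, mainline_occupancy_pct: float,
                 target_occupancy_pct: float = 20.0, gain: float = 70.0,
                 min_rate: float = 240.0, max_rate: float = 900.0) -> float:
    """One update of an ALINEA-style ramp meter: release more cars per hour
    when the highway is under its occupancy target, fewer when it's over.
    All the numbers here are illustrative only."""
    rate = prev_rate_vph + gain * (target_occupancy_pct - mainline_occupancy_pct)
    return max(min_rate, min(max_rate, rate))

# Mainline at 26% occupancy against a 20% target -> meter slows the ramp down:
print(metered_rate(600.0, 26.0))   # 600 + 70*(-6) = 180, clamped to 240
```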
 

Sinus23

Joined Sep 7, 2013
248
We in Iceland only recently learned the zipper-merge way of handling traffic (as a community, that is). We live in a small country where the capital smells worse than many cities in Europe, even ones many times larger, because every person has to drive their own car.

I kid you not: over twenty years ago, if you did not have a driver's licence by the age of 18, people assumed there had to be something wrong with you in the head...
 

WBahn

Joined Mar 31, 2012
30,075
This one has already been solved.
I can't remember where the testing was done (from memory, somewhere in Austria maybe), but emergency services control the traffic lights. Not directly as such, but the system knows that there is a firetruck driving up the road and will force all the lights to go green to clear the path.
That helps, but it doesn't solve it. Just because a light is green does NOT mean that it is legal to go through it when an emergency vehicle is approaching.

In a perfect world, this would be easy too. Just like the example above, the firetruck and its path would be announced to the autonomous cars, and they would get out of the way and stop before the firetruck got there.
Unfortunately this would require wireless control over the cars and that would be a security nightmare...
And therein lies a huge part of the problem -- "in a perfect world". Neither humans nor autonomous vehicles operate in a perfect world. Consider that it is not at all uncommon for multiple emergency vehicles to be responding to the same event but coming from different directions, requiring drivers to take different actions for each one -- actions that differ from what they might have done given just one or the other, or even both of them separated by a different amount of time.

And let's say that we were willing to let the powers that be exercise wireless control over all the cars (which I would be very much opposed to). Where is the entity going to be that takes in all this information, makes the decisions, and exercises control? It can't be the emergency vehicles themselves, because they can't be counted on to even be aware of the other emergency vehicles coming into the area (the classic hidden-node problem). So what happens when a vehicle coming from the left and one coming from the right both try to take over the same car with conflicting commands? And then this assumes that all of the cars are always able to be taken over. Perhaps that might be a reasonable assumption in New York City (although then you are still assuming that the communication hardware and software at all of the many, many actors involved never hiccups), but what about 20 miles southeast of some town in Kansas with a population of 123?
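And even if we could guarantee delivery, each car would still need a local tie-break rule for exactly that conflict. About the only defensible one is to fail safe -- something like this (scheme entirely hypothetical):

```python
def resolve_commands(commands: list) -> str:
    """Local arbitration for conflicting external takeover commands.
    With hidden-node senders that can't hear each other, the car can't
    know which order "wins", so contradictory orders mean: stop."""
    actions = {c["action"] for c in commands}
    if len(actions) == 1:
        return actions.pop()        # all senders agree
    return "stop_and_hold"          # conflicting orders: fail safe, don't guess

print(resolve_commands([{"source": "ladder_truck", "action": "pull_left"},
                        {"source": "ambulance",    "action": "pull_right"}]))
# -> stop_and_hold
```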
 

markdem

Joined Jul 31, 2013
113
That helps, but it doesn't solve it. Just because a light is green does NOT mean that it is legal to go through it when an emergency vehicle is approaching.
Not sure what you mean. Remember, I am talking about a system where ALL the cars (not the firetruck) are autonomous. Therefore, if the light is green (and you would not really need lights anyway), the path is clear. If there is a child, deer, or elephant on the road, the normal crash avoidance would get involved.
But you are right, the system fails now because there are still humans involved today...

And let's say that we were willing to let the powers that be exercise wireless control over all the cars
We sort of do anyway; that's what traffic lights and the sirens on a firetruck do now.
But again, I agree. This would be a nightmare to implement currently. I, however, have no doubt that it will happen.
 