First driverless fatality

WBahn

Joined Mar 31, 2012
30,058
Published on May 25, 2016
Just to make it clear: The Tesla Model S is the absolute best car in the world at the moment. Nothing comes close.
Whenever I see a claim like this, I pretty much ignore whatever else is said afterward, because the writer has demonstrated such a high level of non-objectivity that it is unlikely they are capable of writing anything remotely objective in the lines that follow.
 

nsaspook

Joined Aug 27, 2009
13,272
http://www.latimes.com/business/technology/la-fi-hy-tesla-google-20160701-snap-story.html
Once behind the wheel of the modified Lexus SUVs, the drivers quickly started rummaging through their bags, fiddling with their phones and taking their hands off the wheel — all while traveling on a freeway at 60 mph.

“Within about five minutes, everybody thought the car worked well, and after that, they just trusted it to work,” Chris Urmson, the head of Google’s self-driving car program, said on a panel this year. “It got to the point where people were doing ridiculous things in the car.”

After seeing how people misused its technology despite warnings to pay attention to the road, Google has opted to tinker with its algorithms until they are human-proof. The Mountain View, Calif., firm is focusing on fully autonomous vehicles — cars that drive on their own without any human intervention and, for now, operate only under the oversight of Google experts.
...
The NHTSA ranks self-driving cars based on the level of control they cede to the vehicle, with 1 being the lowest and 5 the highest.
...
Tesla’s autopilot feature is classified as level 2, which means it is capable of staying in the center of a lane, changing lanes and adjusting speed according to traffic.
...
The problem with level 2, critics say, is that it’s just autonomous enough to give drivers the false sense that the vehicle can drive itself, which can lead to careless behavior.

Tesla disputes this — its owner’s manual details the feature’s limitations — and it says drivers are actually clamoring for the product. Tesla executive Jonathan McNeil said in a February investor call that the autopilot feature is “one of the core stories of what’s going on here at Tesla.”
Beta-test software with low levels of real functionality (0 to 4 are the actual NHTSA levels) combined with hype about hands-off driving is asking for trouble. The drivers are still totally responsible for accidents, but Tesla is pushing the limits of the car maker's responsibility.

http://www.nhtsa.gov/About+NHTSA/Pr...eases+Policy+on+Automated+Vehicle+Development
 

MrAl

Joined Jun 17, 2014
11,480
Who's the "them" that "legally that may get them out of it"? Tesla and such? Only the laws that specifically shield them will get them out of it. History has shown time and time again that juries will look right past abject stupidity on the part of the person and assign blame to the manufacturer (and whomever else they can find with deep pockets). My dad's company, which manufactured industrial air compressors, was sued by the families of two guys who died while diving because the air being pumped down to them had oil in it. The guy who was the divemaster had built the compressor using junkyard compressors that my dad's company had built some thirty years earlier for a tire shop. The compressors were not oil-free to begin with (as is the case with nearly all compressors), but the suit claimed that since the pumps (which had been sitting in a junkyard for over a decade, mind you) hadn't been labeled as unsuitable for human breathing air, the company was nonetheless liable. The jury agreed.

Hi,

Sorry to hear about that. It is strange that a jury with probably no knowledge of pumps had some say in that matter. I happen to have experience with pumps because I built my own air compressor with a 1965 Cadillac air-conditioning compressor. I can tell you that the thing needed regular oiling, and an oil filter to take the excess oil out of the air stream. The filter looks like an inverted container made of somewhat clear plastic, with a release valve at the bottom to release the excess once it builds up. Without that there would be a lot of oil in the paint when using it to spray paint, but anyone who would breathe the air that came out of that thing has to be just plain stupid!

As to the 'them', yes, the company that makes the car. I was speculating that they can use the car manual to get out of any liability, because it addresses the case in question, namely the actions of the user. A company may not be responsible when the user uses a product in a way it was not designed for. That, plus the more modern trend for companies to get out of legal responsibility for things, especially high-tech stuff, even things they are truly responsible for, which is not like your dad's case.
What may hold this back, however, is the fact that even after the first crash the car didn't know enough to stop, so it caused some other damage too. That's just plain nuts.

But my pain point is...
Back when I was thinking about this myself after hearing about driverless vehicles, I could not help but wonder how a photo could tell a computer that there is a recognizable object in the field of view when that object is the same color as the background. It sometimes takes even a human a little longer to recognize this.
But the main image I get when I think about this is that common joke about the painting from a new, unknown artist: just a canvas covered completely in white, with no objects in the picture. The punch line is that it is a "white rabbit in a snowstorm".
Unfortunately, it's not always a joke, because that's what a real white rabbit would look like against a background of white snow. It would take a very, very high-resolution camera to catch any sign of a rabbit, which the car may want to try to avoid hitting.
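To put a rough number on that intuition, here is a minimal sketch (all intensities and the noise figure are invented for illustration, not taken from any Tesla sensor spec) of why a near-white object against a near-white background is nearly invisible to a camera pipeline:

```python
# Invented numbers: a bright white object against a bright background,
# expressed as 8-bit grayscale pixel intensities (0-255).
background = 240.0    # sunlit sky / snow
obj = 245.0           # white trailer side / white rabbit

# Michelson contrast between the object and its surroundings.
contrast = (obj - background) / (obj + background)
print(f"Michelson contrast: {contrast:.4f}")    # ~0.01, nearly invisible

# An edge detector only fires when the intensity step clears the
# sensor noise. With a noise sigma of a few counts, a 5-count step
# sits barely above the noise floor, so no reliable edge is found.
noise_sigma = 3.0
print(f"Step-to-noise ratio: {(obj - background) / noise_sigma:.1f}")  # ~1.7
```

A range sensor such as radar or lidar sidesteps the brightness problem because it measures distance rather than color, though it has failure modes of its own.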
What else surprises me, though, is that this must never have been tested, or else there was only minimal testing of objects that blend into the background scene. It would surprise me if they never saw a failure; or did they perhaps let it go anyway? Sometimes companies on a tight schedule let things go because there isn't time to fix them yet.

In my opinion the company should be liable at least in part, because the car could not recognize that a situation had come up that was dangerous to human life. I guess it depends partly on the wording of the manual, though.

After all is said and done, we have to wait for any trial, if there is one, to find out what will happen for sure. That's the reality of it.
 

jpanhalt

Joined Jan 18, 2008
11,087
In my opinion the company should be liable at least in part, because the car could not recognize that a situation had come up that was dangerous to human life. I guess it depends partly on the wording of the manual, though.

After all is said and done, we have to wait for any trial, if there is one, to find out what will happen for sure. That's the reality of it.
Just an FYI, here is a link to the Model S Owner's Manual: https://www.teslamotors.com/sites/default/files/Model-S-Owners-Manual.pdf Pages 50 through 61 seem to cover the assist and "autopilot" functions. I see plenty of warnings there.

1) Considering only the driver of the Model S who was killed, why should the company be responsible for someone driving a complex car who is flagrantly (apparently) ignoring the clear warnings?
2) Suicide must be ruled out in all single-vehicle fatal accidents as a high percentage of them are considered probable suicide. Has that been ruled out? Is the company at all liable if it was suicide?

Based on what we know today, the only fault I can attribute to Tesla is the lack of a kill switch on impact. However, that absence did not contribute to the accident or the death. It apparently did contribute to the subsequent property damage.

John
 

MrAl

Joined Jun 17, 2014
11,480
Just an FYI, here is a link to the Model S Owner's Manual: https://www.teslamotors.com/sites/default/files/Model-S-Owners-Manual.pdf Pages 50 through 61 seem to cover the assist and "autopilot" functions. I see plenty of warnings there.

1) Considering only the driver of the Model S who was killed, why should the company be responsible for someone driving a complex car who is flagrantly (apparently) ignoring the clear warnings?
2) Suicide must be ruled out in all single-vehicle fatal accidents as a high percentage of them are considered probable suicide. Has that been ruled out? Is the company at all liable if it was suicide?

Based on what we know today, the only fault I can attribute to Tesla is the lack of a kill switch on impact. However, that absence did not contribute to the accident or the death. It apparently did contribute to the subsequent property damage.

John

Hi John, thanks for the reply.

It's called "speculative license": "I am, therefore I speculate".

There is no way to be absolutely sure until after the court date, and even then it's not always definite because there may be an appeal. So what this means is that no amount of arguing (in a debating sense, of course) will yield any useful results.

What did prompt my response, however, was the fact that AFTER the original crash the car kept going and going, as if it had no mind whatsoever about what had just happened. After any crash I would expect it to stop, whether because of the driver OR the automated system. The only thing I don't know for sure yet is when exactly the driver lost consciousness, which would answer the question: did the driver keep it going, or did the car system keep it going?
But again, this and any other guess at this point can only be pure speculation. Although we could argue the merits of any given perspective, it's still up to the "courts" to decide.

Just to note, I would probably lean more toward the driver being at fault if the car had stopped right after the crash. Maybe the driver was at fault to begin with, but the system should have done something right after that.

It's also hard to ask questions about this because we don't have the entire story, and we don't have the forensic evidence that would be needed to understand this case in its entirety. This makes any guess at this point just about as speculative as it gets :)

For example, asking about possible suicide is only one out of many questions that would have to be answered. Even if it was suicide, we'd have to know what happened after that first crash. Whether it was suicide or an accident, the car still should not have kept going. I think you would agree here.
Another example: we are still assuming that the driver ignored the warnings. What if the system really went haywire and the driver had no control at that point?
 

nsaspook

Joined Aug 27, 2009
13,272
http://www.wsj.com/articles/teslas-autopilot-vexes-some-drivers-even-its-fans-1467827084
After his Tesla Model S had driven itself for 17 straight minutes on Interstate 66 in Virginia last November, Carl Bennett, sitting in the driver’s seat, began looking at a printed document.

Seconds later, he glanced up and saw a truck parked in the road ahead. His car’s Autopilot technology didn’t react the way he expected, Mr. Bennett said. He slammed on the brakes, swerved and hit the truck. He wasn’t hurt, but the $106,000 electric car was totaled.
...
Interviews with drivers and engineers suggest that enthusiasm for autonomous driving has raced ahead of the technology’s capabilities, deepening concerns about road safety.
...
Arianna Simpson, a venture capitalist in San Francisco, said the Autopilot in her Model S “did absolutely nothing” when the car she was following on Interstate 5 near Los Angeles changed lanes, revealing another car parked on the highway.
...
Tesla responded that the April crash was her fault because she hit the brakes right before the collision, disengaging Autopilot. Before that, the car sounded a collision warning as it should have, the car’s data show.

“So if you don’t brake, it’s your fault because you weren’t paying attention,” said Ms. Simpson, 25. “And if you do brake, it’s your fault because you were driving.”
Tesla is IMO playing a dangerous game with the autopilot plus lane-keeping assist. All the official operational information is filled with disclaimers, but the driver's impression when the system executes the usual lane-following operation is that of a complete collision-warning and avoidance system.

Autopilot-related accidents are still pretty rare, so I don't think we'll see Tesla cars used as IEDs anytime soon.

 

nsaspook

Joined Aug 27, 2009
13,272
Hi,

Sounds like what you are saying is that we should buy one for our worst enemies (har har har) :)
I hope not.:)

Sadly, the story of the Ford Pinto is one where a simple calculation of 'harm' as money meant leaving the car as-is: over the years, paying for the deaths, injuries, and destroyed cars actually saved Ford millions of 1970s dollars versus fixing the gas tanks before selling the car to the public.

https://philosophia.uncg.edu/phi361...oes-business-need-ethics/case-the-ford-pinto/
 

WBahn

Joined Mar 31, 2012
30,058
I hope not.:)

Sadly, the story of the Ford Pinto is one where a simple calculation of 'harm' as money meant leaving the car as-is: over the years, paying for the deaths, injuries, and destroyed cars actually saved Ford millions of 1970s dollars versus fixing the gas tanks before selling the car to the public.

https://philosophia.uncg.edu/phi361...oes-business-need-ethics/case-the-ford-pinto/
Though a relevant question is just when such a calculation crosses the line into culpability. What is the threshold? I can guarantee that every single car model ever built could have been made safer (in the sense that fewer people would have died while driving or riding in it), and often at a fairly marginal increase in production cost.
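For concreteness, here is the Pinto calculation as it is usually reported from Ford's internal cost-benefit memo; this is a sketch using the commonly cited figures, quoted here for illustration and not verified against the linked case study:

```python
# Commonly cited figures from Ford's 1973 internal cost-benefit memo
# (reported values; treat them as the textbook numbers).
deaths, injuries, burned_vehicles = 180, 180, 2100
cost_per_death, cost_per_injury, cost_per_vehicle = 200_000, 67_000, 700

expected_payouts = (deaths * cost_per_death
                    + injuries * cost_per_injury
                    + burned_vehicles * cost_per_vehicle)

units_sold = 12_500_000   # cars plus light trucks
fix_per_unit = 11         # dollars per vehicle to fix the fuel tank

total_fix_cost = units_sold * fix_per_unit

print(f"Expected payouts: ${expected_payouts:>11,}")  # $ 49,530,000
print(f"Cost of the fix:  ${total_fix_cost:>11,}")    # $137,500,000
```

By that arithmetic the fix would have cost nearly three times the expected payouts, which is exactly the kind of calculation the culpability question above is about.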

FWIW: Overall, the Ford Pinto actually had a lower fatality rate than other cars in the subcompact class, and only a slightly higher fatality rate involving fire.
 

MrAl

Joined Jun 17, 2014
11,480
I hope not.:)

Sadly, the story of the Ford Pinto is one where a simple calculation of 'harm' as money meant leaving the car as-is: over the years, paying for the deaths, injuries, and destroyed cars actually saved Ford millions of 1970s dollars versus fixing the gas tanks before selling the car to the public.

https://philosophia.uncg.edu/phi361...oes-business-need-ethics/case-the-ford-pinto/
Hi,

I couldn't help but read the whole story. Very sad. That's what the USA is about now.
It is becoming more and more clear that it is no longer a government by the people and for the people. They do too many things that are obviously not what the 'people' want anymore, and they ignore the desires of the people in favor of some idealistic view of what a society should be like. There is a saving grace, however: they become increasingly brazen with time, which means that eventually they will become less and less aware of how they appear, and they will start to act in a manner that every American in the US will recognize as totalitarianism, and then people will start to act to do something about it. It's just a matter of time. What comes after that is either admission of the fault or total dictatorship.
 

Thread Starter

cmartinez

Joined Jan 17, 2007
8,253

WBahn

Joined Mar 31, 2012
30,058
This is a detail I didn't know:

"The high ride height of the trailer, combined with its positioning across the road and the extremely rare circumstances of the impact, caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S," the statement said.

I wonder if the poor fellow was decapitated...
He was very likely disassembled at some level -- whether that level was at the neck (to qualify as a true decapitation) or not is an open question.

The original post showed the track of the car passing through (i.e., under) the trailer. Since the guy was killed but the car was still sufficiently intact to keep going any significant distance at all, the impact pretty much had to be at windshield level. So I assumed from the get-go that he was disassembled, though I still allow for the possibility that either the impact was high enough not to cut him in two, or he ducked (resulting in the same effect), and that the impact instead caused shrapnel or crushing that killed him without actually disassembling him. Whichever it was, I suspect death was nearly instantaneous, though of course not even that is guaranteed.
 

nsaspook

Joined Aug 27, 2009
13,272
FWIW: Overall, the Ford Pinto actually had a lower fatality rate than other cars in the subcompact class, and only a slightly higher fatality rate involving fire.
Maybe that's true in spite of the poor engineering and corporate negligence, but maybe the fact that people knew they were driving a potential fire-bomb made them actually drive a bit more safely. In the long run Ford started building safer cars after this cold-blooded corporate calculation, so there was a plus side.
 

nsaspook

Joined Aug 27, 2009
13,272
Another 'autopilot' incident. This one seems to demonstrate how completely some people believe the autopilot hype.

http://abcnews.go.com/Business/wireStory/feds-seek-autopilot-data-tesla-crash-probe-40515954
The company said the Model X alerted the driver to put his hands on the wheel, but he didn't do it. "As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway," the statement said.

The car negotiated a right curve and went off the road, traveling about 200 feet on the narrow shoulder, taking out 13 posts, Shope said.

The trooper did not cite the driver, saying he believed any citation would be voided because of the driver's claim that the car was on Autopilot.
 

WBahn

Joined Mar 31, 2012
30,058

nsaspook

Joined Aug 27, 2009
13,272
http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/
Regaining Control
Research shows that humans are notoriously bad at re-engaging with complex tasks after their attention has been allowed to wander. According to a 2015 NHTSA study (PDF), it took test subjects anywhere from three to 17 seconds to regain control of a semi-autonomous vehicle when alerted that the car was no longer under the computer's control. At 65 mph, that's between 100 feet and quarter-mile traveled by a vehicle effectively under no one's control.

This is what’s known by researchers as the “Handoff Problem.” Google, which has been working on its Self-Driving Car Project since 2009, described the Handoff Problem in a 2015 monthly report (PDF). "People trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax,” said the report. “There’s also the challenge of context—once you take back control, do you have enough understanding of what’s going on around the vehicle to make the right decision?"

...

Consumer Reports calls for Tesla to do the following:
  • Disable Autosteer until it can be reprogrammed to require drivers to keep their hands on the steering wheel
  • Stop referring to the system as “Autopilot” as it is misleading and potentially dangerous
  • Issue clearer guidance to owners on how the system should be used and its limitations
  • Test all safety-critical systems fully before public deployment; no more beta releases
The Tesla 'Autopilot' operates in the man-machine control loop at the point (the physical driving process) where the brain normally works primarily below full awareness, in the land of instincts and impulses. We learn to trust our internal human 'autopilot' to evaluate the environment and warn the fully aware driving brain of danger in time to avoid problems.

The problem with the Tesla system, IMO, is trust. Tesla has managed to create a system (the entire human-machine interface) that seems so good it can be trusted to drive even when the manual says NO.

https://hbr.org/2016/07/tesla-autopilot-and-the-challenge-of-trusting-machines
 

jpanhalt

Joined Jan 18, 2008
11,087
1) There are plenty of warnings in the Tesla manual.
2) 65 mph = 95 fps; 3 seconds = 286 feet; 17 seconds = 1,620 feet (0.31 mile; see the quick check below). What system of measure is the NHTSA using?
3) The pilot of an airplane "on autopilot" is 100% responsible for what happens to the airplane. Why should some rich and dumb motorist be any less responsible?
4) Yes, the response to an emergency signal takes a lot longer than simple reaction time. I do not see how that has any mitigating value on who is responsible -- the person behind the steering wheel.
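A quick arithmetic check of the figures in item 2 (this just redoes the unit conversion; the 3-to-17-second range comes from the NHTSA study quoted above):

```python
# Distance traveled at 65 mph during a 3 s to 17 s handoff delay,
# to check the figures in item 2 against the Consumer Reports quote.
MPH = 65
FPS = MPH * 5280 / 3600   # 95.3 feet per second

for seconds in (3, 17):
    feet = FPS * seconds
    print(f"{seconds:2d} s -> {feet:7,.0f} ft ({feet / 5280:.2f} mi)")

#  3 s ->     286 ft (0.05 mi)
# 17 s ->   1,621 ft (0.31 mi)
# Neither end matches the "100 feet to a quarter-mile" range quoted
# by Consumer Reports; item 2's numbers are the correct conversion.
```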
 

nsaspook

Joined Aug 27, 2009
13,272
They are not less responsible drivers. The semi-autonomous Tesla pseudo-autopilot is making them less capable drivers. It is simply impossible to change the attention behavior of human beings as a group with warnings in a manual when they see the system work in their hands.

The equivalent driving skill of the Tesla car is really close to that of a kid with a learner's permit (I don't think the car could pass an on-the-road driver's test), with you in the passenger seat ready to take over by slamming the brakes and/or yanking the wheel at the last moment. The responsibility is totally in the driver's hands, but it would IMO be foolish to hand the keys of a sports car to a kid who can only handle less-than-basic traffic conditions and then let them drive at 80+ mph in typical traffic while they are still learning to drive.

 