Lots of research is going on for the self-driving car, some of it happening right here in Pittsburgh. My co-worker's nephew is the wrench for one of the projects here.
Will we see a real self-driving car in our lifetime? I just don't see it outside of experimentation under controlled conditions. When humans drive, it is amazing how many seemingly simple decisions we make that would be extremely complex for a computer to perform.
Encounters with flagmen or police directing traffic, for example. The police might direct you into a lane that would normally be an oncoming lane. Or perhaps a road has an unscheduled closure: a sign is posted, the human reads the sign and makes a simple adjustment to his route.
I just can't see the computer being able to recognize these cues and then react to them appropriately.
Perhaps it would work in a closed system like a campus, but I just don't see the self-driving car going out into the real world. At least not for some time to come.
A self-driving flying car might actually be easier to implement.
Pratt explained what researchers already know but perhaps others don't: Autonomous cars look great in controlled environments but soon fail when faced with tasks that human drivers find simple.
In a blog post, the Mountain View Police Department said the officer noticed traffic backing up behind a slow-moving car in the eastbound No. 3 lane, near Rengstorff Avenue.
The vehicle was traveling at 24 mph in a 35 mph zone.
This brings up an interesting topic. Assuming the occupants have no control over the vehicle other than giving it a destination, who will be responsible for traffic violations? A common one will most likely be speed issues, but I imagine other violations will be possible. Who will be at fault? The manufacturer? Who will be at fault in the event of an accident?
The CEO.
The lab rats who tampered with the software...
Actually, in the end, the shareholders. Figuratively and literally.
Speaking of which... lately there have been lots of adverts on the radio for very attractive terms on approved used VW cars.
And the rest of us. In the end it will be all about insurance, and Warren Buffett will make sure that HE isn't the one to lose out. Those of us who pay premiums will absorb the risk of whatever is on the road.
It will have disadvantages because drivers will lose their jobs.
My suggestion to whoever is developing automatic drive control is to sell their design and patents to the automakers, who already carry a huge amount of product liability insurance.
Wow. It's impressive that you are in a position to make such a suggestion. What'd they say?
The glitch?
How do you program self-driving cars to break the law just a little bit to increase safety, or at least drive like humans who take small risks daily while driving? How can a machine cope with the imperfections of humans in an uncontrolled environment?
They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up -- all minor scrape-ups for now -- the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?
“It’s a constant debate inside our group,” said Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh. “And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”
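The trade-off Rajkumar describes can be boiled down to a single policy knob. Here is a minimal, purely illustrative sketch (not code from any real autonomous-driving project; the function name, tolerance value, and parameters are all made up) showing the difference between a strict stick-to-the-limit policy and one that allows a small tolerance to blend with faster traffic:

```python
def target_speed(speed_limit_mph: float,
                 traffic_flow_mph: float,
                 allow_tolerance: bool = False,
                 tolerance_mph: float = 5.0) -> float:
    """Pick a cruise speed given the posted limit and observed traffic flow.

    Strict policy: never exceed the posted limit, even if traffic is faster.
    Tolerant policy: allow a small overage so the car matches traffic flow.
    (Hypothetical example; thresholds are illustrative only.)
    """
    cap = speed_limit_mph + (tolerance_mph if allow_tolerance else 0.0)
    # Match the surrounding traffic where possible, but never exceed the cap.
    return min(traffic_flow_mph, cap)

# Strict car in a 35 mph zone with traffic doing 45: it holds 35 and gets passed.
print(target_speed(35, 45))                         # 35.0
# Tolerant car in the same situation: it creeps up to 40 to reduce the speed delta.
print(target_speed(35, 45, allow_tolerance=True))   # 40.0
```

With the strict policy, the car is always the slowest vehicle on a fast highway, which is exactly the rear-end-collision pattern described below.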
...
They’re usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.
“It’s a dilemma that needs to be addressed,” Rajkumar said.
Google is working to make the vehicles more “aggressive” like humans -- law-abiding, safe humans -- so they “can naturally fit into the traffic flow, and other people understand what we’re doing and why we’re doing it,” Dolgov said. “Driving is a social game.”
I would love to see all law-abiding, safe humans driving, but we all know there is a sizable percentage of drivers that most of us can spot subconsciously as cars to avoid or watch for stupid moves in traffic. Bad-driver detection would seem to be an important area to research.
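The "bad driver detection" idea above could start as something as simple as scoring observed behaviors of nearby vehicles and flagging high scorers for extra following distance. This is a toy heuristic sketch, not anything from a real perception stack; the event names, weights, and threshold are all invented for illustration:

```python
# Hypothetical risk scoring for nearby vehicles based on observed events.
# Weights and threshold are made-up values for illustration only.
RISK_WEIGHTS = {
    "hard_brake": 2,
    "sudden_lane_change": 3,
    "tailgating": 2,
    "speeding": 1,
}

def risk_score(events: list[str]) -> int:
    """Sum the weights of observed risky behaviors for one vehicle."""
    return sum(RISK_WEIGHTS.get(e, 0) for e in events)

def is_risky(events: list[str], threshold: int = 4) -> bool:
    """Flag a vehicle as one to avoid or give extra room."""
    return risk_score(events) >= threshold

# A car seen swerving across lanes and tailgating scores 3 + 2 = 5: flagged.
print(is_risky(["sudden_lane_change", "tailgating"]))  # True
# A car merely going a bit fast scores 1: not flagged.
print(is_risky(["speeding"]))                          # False
```

A real system would of course learn such weights from data rather than hand-tune them, but the point stands: the subconscious pattern-matching human drivers do could in principle be made explicit.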