
Uber ~ Automated Driving Death


Teken


"Did you say your son actually drives  car?....like manually?" "I would be too nervous"
"Is he a historic vehicle collector or a racecar driver?"
"How did he pass all those exams to get licenced? I hear that is a long process."
"Doesn't manually driving your car make *you* responsible for any accidents??"

Sent from my SM-N950U using Tapatalk

Link to comment
20 hours ago, asbril said:

Tesla Blames Driver in Fatal Car Crash

Tesla Blames Driver in Fatal Car Crash - WSJ.pdf

 

18 hours ago, larryllix said:

Thanks! This may blow up and resolve some of the issues and doubts. The article addresses many of the concerns expressed by users here. Tesla emphasises that these cars are NOT self-driving.

The interesting part is the implication that the car didn't crash headlong into a barrier, but rather clipped it and was then hit by two other cars, killing the driver. The photo gives the impression the car drove straight into the barrier, perpendicular to it. Tesla, in their defence, seems to be trying to perpetuate that impression, stating...

     "driver took no action despite having five seconds and about 500 feet of unobstructed view of the concrete highway divider"

It would be interesting to see an aerial view of this piece of road.

 

The plot thickens.

This article proves my point to a T.

No matter how many *Do Not, Should Not, Will Not, Could Not* warnings there are, some stupid SOB will ignore what the maker states is the intended function. Bottom line: the DRIVER was not DRIVING and is now a smear on the ground.

Then you've got companies like Google's *Waymo* <-- What a stupid name . . .

Saying the following, which I can tell you shows they have no freaking clue, because they have NEVER had their vehicles tested in a real winter city like those in Canada. 

Quote

Others, like Alphabet Inc.’s Waymo, believe there should be no need for a human to take control in driving situations.

As I stated, no matter how many If's, And's, or But's you put out there, some SOB will sue you, just like Huang's family here. :wacko: Take some freaking ownership already: the driver was not driving and was picking his nose or doing something else, like watching a movie, surfing, reading a book, or sending emails.

I just can't wait to see all the millions of dead people sitting in these cars with no steering wheel . . . :D 

Link to comment
16 minutes ago, Teken said:

 

This article proves my point to a T.

No matter how many *Do Not, Should Not, Will Not, Could Not* warnings there are, some stupid SOB will ignore what the maker states is the intended function. Bottom line: the DRIVER was not DRIVING and is now a smear on the ground.

Then you've got companies like Google's *Waymo* <-- What a stupid name . . .

Saying the following, which I can tell you shows they have no freaking clue, because they have NEVER had their vehicles tested in a real winter city like those in Canada. 

As I stated, no matter how many If's, And's, or But's you put out there, some SOB will sue you, just like Huang's family here. :wacko: Take some freaking ownership already: the driver was not driving and was picking his nose or doing something else, like watching a movie, surfing, reading a book, or sending emails.

I just can't wait to see all the millions of dead people sitting in these cars with no steering wheel . . . :D 

Yeah. The "This is not a self-driving vehicle" line doesn't cut it, and is being used as an excuse to hide behind rather than take responsibility.

An analogy:
When the kids were young they played Tee-ball or some other form of kids' baseball. The kids that don't play well get stuffed into the outfield, where they never see the ball for weeks at a time. Then one game, at the end of the season, a ball gets cracked out to little Johnny, who is now entertaining himself picking dandelions in the field. The crowd and parents scream at Johnny
...."Get the ball!!!", "Wake Up!!! Get the ball!!!"
as Johnny wonders why the long grass is ruffling in a fast-running streak. Even if Johnny does get the ball, he doesn't even know how to hold it, let alone throw it to some baseman. He has hardly ever touched a ball in a game before.

After all the yelling and shaming, the poor kid settles down and states, "I don't want to play anymore!"

 

IOW: You can't tell people they must be responsible for driving the car when they haven't needed to touch the wheel for the last two months. That's BS. If they do touch it in an emergency, will they even know what to do?
..."What foot do I use to put on the brake pedal, and which one was it again?" "Are you sure the brake stops the car?" "WTF is a gas pedal?"

Now, when the human does freak out and "takes control", he or she will get blamed for interfering with a system that works better than a human does, and will be at fault.

It is going to be either the human drives the car, with some assists or overrides/backups, or the car totally drives itself, without any emergency assists from the human. Flashing screens and beepers just get ignored after a while. Ever been in a noisy environment and felt like you could go to sleep and just ignore it all?

 

I don't know how we are going to get past the "sticking point". Other people's lives are not a testing ground. 

Link to comment
2 minutes ago, larryllix said:

Yeah. The "This is not a self-driving vehicle" line doesn't cut it and is being used as an excuse to hide behind rather than take responsibility.

An analogy:
When the kids were young they played Tee-ball or some form of kids' baseball. The kids that don't play well get stuffed into the outfield, where they never see the ball for weeks at a time. Then one game, at the end of the season, a ball gets cracked out to little Johnny, who is now entertaining himself picking dandelions in the field. The crowd and parents scream at Johnny
...."Get the ball!!!", "Wake Up!!! Get the ball!!!"

as Johnny wonders why the long grass is ruffling in a running streak. Even if Johnny does get the ball, he doesn't know how to hold it, let alone throw it to some baseman. He has hardly ever touched a ball in a game before.

After all the yelling and shaming, the poor kid states, "I don't want to play anymore!"

 

IOW: You can't tell people they must be driving the car when they haven't needed to touch the wheel for the last two months. If they do touch it, will they even know what to do?
..."What foot do I use to put on the brake pedal, and which one was it again?" "Are you sure the brake stops the car?" "WTF is a gas pedal?"

It is going to be either the human drives the car, with some assists or overrides/backups, or the car totally drives itself, without any emergency assists from the human. Flashing screens and beepers just get ignored after a while. Ever been in a noisy environment and felt like you could go to sleep and just ignore it all?

 

I don't know how we are going to get past the "sticking point". Other people's lives are not a testing ground.

That's the problem with society these days: *Just Show Up* and you get a prize, medal, pin, or trophy. :wacko: 

Link to comment

Tesla, Safety Agency Duel Over Fatal-Crash Probe

and the story continues......

But I still absolutely do not understand why some extrapolate from the current state of self-driving cars that they will not be safe in the future. Let us agree to meet in 15 years and you will see that there will be far fewer car accidents, much less traffic, fewer people getting their driving licenses, and fewer cars owned by people. 

Right now I am playing with my 2-year-old grandson (taking a minute to post this) and I don't think that he will get a driver's license. In high school and college he will be using Lyft or Uber, and then later it will be self-driving cars......

Tesla, Safety Agency Duel Over Fatal-Crash Probe - WSJ.pdf

Link to comment





Ewww, Teken, a smiley next to that phrase?!?

Sent from my SM-N950U using Tapatalk



I had a weak moment and that was indeed in bad taste. ☹️ I would not wish to see anyone be injured, much less killed, even if the fault lay with the owner.

Regardless, the technology is evolving and getting better. Perhaps one day more safety measures will be in place, to the point where I can say it's worthwhile.


Sent from my iPhone using Tapatalk
Link to comment


I had a weak moment and that was indeed in bad taste. ☹️ I would not wish to see anyone be injured, much less killed, even if the fault lay with the owner.

Regardless, the technology is evolving and getting better. Perhaps one day more safety measures will be in place, to the point where I can say it's worthwhile.


Sent from my iPhone using Tapatalk
I have many such moments!

Sent from my SM-N950U using Tapatalk

Link to comment




But I still absolutely do not understand why some extrapolate from the current state of self-driving cars that they will not be safe in the future.


If I remember correctly, the Human Genome Project sequenced something like 90% of the genome in the last year of a 10-year project, building on all the perfected methods developed over the previous 9 years. So I'm watching for a sudden tipping point when progress reaches critical mass.

Sent from my SM-N950U using Tapatalk

Link to comment
3 hours ago, asbril said:

Tesla, Safety Agency Duel Over Fatal-Crash Probe

and the story continues......

But I still absolutely do not understand why some extrapolate from the current state of self-driving cars that they will not be safe in the future. Let us agree to meet in 15 years and you will see that there will be far fewer car accidents, much less traffic, fewer people getting their driving licenses, and fewer cars owned by people. 

Right now I am playing with my 2-year-old grandson (taking a minute to post this) and I don't think that he will get a driver's license. In high school and college he will be using Lyft or Uber, and then later it will be self-driving cars......

Tesla, Safety Agency Duel Over Fatal-Crash Probe - WSJ.pdf

Wow! Only in California! It almost sounds like Tesla went out of their way to tamper with the legal process. Trouble there.

Now we have another Tesla that ran into the back of a firetruck? That should have been caught by a basic driver-assistance mechanism, which just plain failed.

This plot is getting really thick!

Thanx asbril (I think :) )

Link to comment
  • 5 weeks later...
  • 2 weeks later...

Seems that the technology worked after all.  It was the human operators that chose to turn it off. The car knew it needed to stop 1.3 seconds before it struck the woman.  From 39 mph it would have come close to a complete stop in 1.3 seconds (unlike humans, the car didn't need any reaction time once it determined a collision was imminent).  This really made no sense to me that the car didn't even try to stop since I was quite certain that the lidar would have recognized the obstacle.  Now it makes sense.
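
Quick back-of-the-envelope check for anyone who wants the numbers. The braking deceleration here is my own assumption (roughly 0.8 to 1.0 g for a hard stop on dry pavement), not a figure from the report, so treat the results as ballpark only:

```python
# Rough check of how much speed hard braking sheds in 1.3 seconds from 39 mph.
# ASSUMPTION: emergency braking decelerates the car at about 0.8 to 1.0 g;
# the actual figure for that vehicle and road surface isn't in the article.

MPH_TO_MS = 0.44704   # metres per second per mph
G = 9.81              # gravitational acceleration, m/s^2

def speed_after_braking(initial_mph, brake_time_s, decel_g):
    """Speed (mph) remaining after braking for brake_time_s at decel_g."""
    v0 = initial_mph * MPH_TO_MS
    v = max(0.0, v0 - decel_g * G * brake_time_s)
    return v / MPH_TO_MS

for decel_g in (0.8, 1.0):
    remaining = speed_after_braking(39, 1.3, decel_g)
    print(f"At {decel_g:.1f} g, 39 mph drops to about {remaining:.0f} mph in 1.3 s")
```

Even at the conservative end that is a huge reduction in impact speed, and impact energy falls with the square of the speed, so braking at all would have made an enormous difference.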

Link to comment
Seems that the technology worked after all.  It was the human operators that chose to turn it off. The car knew it needed to stop 1.3 seconds before it struck the woman.  From 39 mph it would have come close to a complete stop in 1.3 seconds (unlike humans, the car didn't need any reaction time once it determined a collision was imminent).  This really made no sense to me that the car didn't even try to stop since I was quite certain that the lidar would have recognized the obstacle.  Now it makes sense.
Was it the human operator, e.g. the 'driver', or the development team that turned it off? Can you share a link/reference? I've not kept up - Thanks!

Sent from my SM-N950U using Tapatalk

Link to comment

https://www.reuters.com/article/us-uber-crash/uber-disabled-emergency-braking-in-self-driving-car-u-s-agency-idUSKCN1IP26K

Turns out a combination of errors from three or four factors was in play.

  • The pedestrian had traces of methamphetamines and marijuana in her system. The safety driver claimed/was determined to have been looking at system information and not texting.
  • Uber had turned off the emergency braking system due to erratic experiences with it. The sensors detected the pedestrian many seconds before impact.
  • The woman with the bicycle was either drunk, stoned, or suicidal. Note she didn't even flinch just before the Uber hit her.

It looks like they have worn out their welcome in Arizona and are moving to other states to try out their weapons of mush destruction.

This looks like it should/may  get real dirty before it's done.

EDIT: Corrected as per Apostolakisl post below.

Link to comment
2 hours ago, larryllix said:

https://www.reuters.com/article/us-uber-crash/uber-disabled-emergency-braking-in-self-driving-car-u-s-agency-idUSKCN1IP26K

Turns out a combination of errors from three or four factors was in play.

  • The driver had traces of methamphetamines and marijuana in her system. Claimed/determined she was looking at system information and not texting.
  • Uber had turned off the emergency braking system due to erratic experiences with it. The sensors detected the pedestrian many seconds before impact.
  • The woman with the bicycle was either drunk, stoned, or suicidal. Note she didn't even flinch just before the Uber hit her.

It looks like they have worn out their welcome in Arizona and are moving to other states to try out their weapons of mush destruction.

This looks like it should/may  get real dirty before it's done.

The Uber driver was not on meth; it was the cyclist.

Reading that description in the article, the car was totally capable of not hitting that woman at all if defensive action had been taken at first notice of her. Had it taken evasive action when it knew impact was imminent, it would have slowed to single-digit mph, which I read as the most correct action the car should have taken. It would also have been reasonable to take the speed down a few mph at the 6-second mark, but a panic stop at that point would clearly have pissed off any occupants in the car. As a driver, you have to assume that every object that could move into your path is not going to move into your path; otherwise we could never move at more than about 1 mph.

Again, it would seem that the automatic "panic stop" system just needs to not be turned off, and perhaps refined a bit so that it doesn't make for "erratic" driving, which I read as excessive panic stops. This article, to me, is quite reassuring that the technology is very good. If you read what the car knew about its surroundings at the various points in time, it is very impressive, better than any human would have known driving at night.
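
Just to illustrate the kind of refinement I mean, here is a sketch of one generic approach (not Uber's actual logic; every threshold in it is made up): require the hazard to be confirmed over a few consecutive sensor frames before acting, and scale the braking to the time-to-collision, so the car isn't slamming the brakes for every one-frame blip.

```python
# Sketch of a confirmation filter for an automatic emergency-braking trigger.
# Purely illustrative -- not Uber's actual system. All thresholds are made up.

from collections import deque

class EmergencyBrakeFilter:
    def __init__(self, confirm_frames=3, hard_brake_ttc_s=1.5, soft_brake_ttc_s=4.0):
        self.confirm_frames = confirm_frames      # consecutive detections required
        self.hard_brake_ttc_s = hard_brake_ttc_s  # slam the brakes below this time-to-collision
        self.soft_brake_ttc_s = soft_brake_ttc_s  # brake gently below this one
        self.history = deque(maxlen=confirm_frames)

    def update(self, obstacle_detected, time_to_collision_s):
        """Return 'none', 'soft' or 'hard' braking for this sensor frame."""
        self.history.append(obstacle_detected)
        confirmed = len(self.history) == self.confirm_frames and all(self.history)
        if not confirmed:
            return "none"        # ignore one-frame blips -> fewer false panic stops
        if time_to_collision_s <= self.hard_brake_ttc_s:
            return "hard"
        if time_to_collision_s <= self.soft_brake_ttc_s:
            return "soft"
        return "none"

# Example: single noisy detections are ignored; three in a row trigger braking.
f = EmergencyBrakeFilter()
print(f.update(True, 1.2))   # none (not confirmed yet)
print(f.update(True, 1.1))   # none
print(f.update(True, 1.0))   # hard (confirmed, and time-to-collision is short)
```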

Link to comment
6 minutes ago, apostolakisl said:

The Uber driver was not on meth; it was the cyclist.

Reading that description in the article, the car was totally capable of not hitting that woman at all if defensive action had been taken at first notice of her. Had it taken evasive action when it knew impact was imminent, it would have slowed to single-digit mph, which I read as the most correct action the car should have taken. It would also have been reasonable to take the speed down a few mph at the 6-second mark, but a panic stop at that point would clearly have pissed off any occupants in the car. As a driver, you have to assume that every object that could move into your path is not going to move into your path; otherwise we could never move at more than about 1 mph.

Again, it would seem that the automatic "panic stop" system just needs to not be turned off, and perhaps refined a bit so that it doesn't make for "erratic" driving, which I read as excessive panic stops. This article, to me, is quite reassuring that the technology is very good. If you read what the car knew about its surroundings at the various points in time, it is very impressive, better than any human would have known driving at night.

Mine has three levels. It issues a flashing visual and an audible warning. If the driver does not react within X seconds, then gentle braking is applied to slow the car down. If the dangerous situation does not go away and becomes urgent, then panic-stop braking is applied.
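
In case it helps to picture the escalation, here is how I would sketch it in code. The timings are placeholders, since I don't actually know the "X seconds" my car uses:

```python
# Rough model of the three-stage escalation described above:
# warning -> gentle braking -> panic-stop braking.
# The 2.0 second figure is a placeholder for the real (unknown) "X seconds".

def escalation_stage(seconds_since_warning, hazard_still_present, hazard_urgent):
    """Return the action the system would be taking at this point in time."""
    if not hazard_still_present:
        return "no action"                        # hazard cleared, stand down
    if hazard_urgent:
        return "panic-stop braking"               # collision is imminent
    if seconds_since_warning < 2.0:               # placeholder for "X seconds"
        return "flashing visual + audible warning"
    return "gentle braking"

# Example timeline: driver ignores the warning and the hazard becomes urgent.
for t, urgent in [(0.5, False), (1.5, False), (2.5, False), (3.0, True)]:
    print(t, escalation_stage(t, hazard_still_present=True, hazard_urgent=urgent))
```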

Of course, all the disclaimers are apparent everywhere, but in the end it tells you it is not designed to always prevent collisions, only to minimise the damage of the impact. That may be just another disclaimer. I have never tested it. That bothers me some, as I don't trust things I haven't tested (34 years proving systems), but it's not likely I would ever be given that opportunity, especially using my own vehicle. :)

Link to comment

Well, after all this discussion, it is time to add the DIYer driverless car. Just like when they started making DIY electric cars in the 60's using jet-engine starter motors to power the cars (Mother Earth News - I still have the plans), we are now at the stage where the everyday DIY person can get into the driverless car action.

Check out this clip using a Raspberry Pi 3 and an Nvidia Jetson TX2 board and a little programming. PHASE 1

https://youtu.be/WsWCAi7tkv8

Next is installing the necessary mechanical devices to turn, stop, and monitor distance.

So, in the future, we may also have to contend with DIY driverless cars on the road. 
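
If anyone wonders what "a little programming" amounts to in these hobby builds, here is a bare-bones sketch of the usual camera-to-steering idea. This is my own illustration, not code from the video, and the threshold and gain values are placeholders you would tune on the actual car:

```python
# Minimal camera-to-steering sketch in the spirit of the DIY builds above.
# Grab a frame, find the bright lane marking in the lower half of the image,
# and turn its offset from centre into a steering command in [-1, 1].

import cv2          # pip install opencv-python
import numpy as np

STEERING_GAIN = 0.5  # how aggressively to correct toward the marking (placeholder)

def steering_from_frame(frame):
    """Return a steering value in [-1, 1] from the lane marking's offset."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    lower_half = gray[gray.shape[0] // 2:, :]             # road ahead of the car
    _, mask = cv2.threshold(lower_half, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return 0.0                                         # no marking found: hold straight
    lane_x = m["m10"] / m["m00"]                           # centroid of the bright pixels
    offset = (lane_x - mask.shape[1] / 2) / (mask.shape[1] / 2)   # -1 .. 1
    # steer toward the detected marking (the sign convention is arbitrary here)
    return float(np.clip(STEERING_GAIN * offset, -1.0, 1.0))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                              # Pi camera / USB webcam
    ok, frame = cap.read()
    if ok:
        print("steering command:", steering_from_frame(frame))
    else:
        print("no camera frame available")
    cap.release()
```

The steering, throttle, and braking would then go out to the servos and controllers mentioned above, with the Jetson presumably carrying the heavier vision work.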

 

 

Link to comment
5 minutes ago, Mustang65 said:

Well, after all this discussion, it is time to add the DIYer driverless car. Just like when they started making DIY electric cars in the 60's using jet-engine starter motors to power the cars (Mother Earth News - I still have the plans), we are now at the stage where the everyday DIY person can get into the driverless car action.

Check out this clip using a Raspberry Pi 3 and an Nvidia Jetson TX2 board and a little programming. PHASE 1

https://youtu.be/WsWCAi7tkv8

Next is installing the necessary mechanical devices to turn, stop, and monitor distance.

So, in the future, we may also have to contend with DIY driverless cars on the road. 

 

 

There was a guy who was putting together kits and selling them. I was thinking of adding one to the pickup, but found it was only for a specific car model or two, like Honda Civics and some other small import. I can't find the article now (I guess it didn't work out). That was about 5 years ago.

Anyway, I would have faith in the tech, as long as you have a well-marked road, but not so much in the added servos and controllers for the steering and brakes. I sure as hell wouldn't run it on a Pi.

Link to comment

Archived

This topic is now archived and is closed to further replies.

