
Tesla ~ Second Automated Death


Teken

Recommended Posts

What they did not mention in the article is that there is a worldwide internet study in progress that asks you specific questions about how you would handle a last-minute decision: what would be your option, hit a tree... a group of kids, or a group of older individuals? That is just one of the questions to be answered, according to the news this morning. They implied, but did not admit, that they are going to use the data in possible programming of cars in the distant future.

Link to comment
5 hours ago, Mustang65 said:

What they did not mention in the article is that there is a worldwide internet study in progress that asks you specific questions about how you would handle a last-minute decision: what would be your option, hit a tree... a group of kids, or a group of older individuals? That is just one of the questions to be answered, according to the news this morning. They implied, but did not admit, that they are going to use the data in possible programming of cars in the distant future.

It is awful to think about, but it has to be done. When a no-win situation hits you, you have to choose something. They are trying to decide what society thinks is the least bad option when only bad outcomes can be had. For example, a kid jumps out in front of your car, there's a sidewalk to your right with an adult pedestrian, and a dump truck is in the oncoming lane. Your choices are 1) run over the kid and kill him/her, 2) run into the dump truck and kill yourself/your passengers, or 3) jump the curb and kill the adult pedestrian.
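Purely as a hypothetical sketch of how such survey answers might ever be wired into software (the weights below are invented; nothing here reflects any manufacturer's actual logic), the whole question boils down to assigning each unavoidable outcome a cost and steering toward the lowest one:

```python
# Hypothetical illustration only: survey-derived "least bad" preferences expressed
# as costs, with the controller picking the lowest-cost unavoidable outcome.

OUTCOME_COST = {                              # made-up weights, for illustration only
    "run over the kid": 1.00,
    "hit the oncoming dump truck": 0.80,      # occupants at risk instead
    "jump the curb, hit the adult": 0.90,
}

def least_bad(outcomes):
    """Return the available outcome with the lowest assigned cost."""
    return min(outcomes, key=OUTCOME_COST.get)

print(least_bad(OUTCOME_COST))   # -> 'hit the oncoming dump truck' with these weights
```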

Link to comment
2 hours ago, apostolakisl said:

It is awful to think about, but it has to be done. When a no-win situation hits you, you have to choose something. They are trying to decide what society thinks is the least bad option when only bad outcomes can be had. For example, a kid jumps out in front of your car, there's a sidewalk to your right with an adult pedestrian, and a dump truck is in the oncoming lane. Your choices are 1) run over the kid and kill him/her, 2) run into the dump truck and kill yourself/your passengers, or 3) jump the curb and kill the adult pedestrian.

And then, after filling out the research document and submitting it, the research company would send you an email that says, "Thank you for your selection. Now place yourself into the category that you selected, as the pedestrian. Do you still want to keep your selection or change it?" I would bet some would change their choice to the driver/passenger(s) biting the dust.

 

Link to comment
  • 2 months later...

Autonomous vehicles don’t need to be perfect. They just need to be better than meatbags who are prone to inebriation and fits of rage, among other undesirable driving characteristics.

 

https://www.tesla.com/blog/q3-2018-vehicle-safety-report

 

“Here’s a look at the data we’re able to report for Q3:

 

Over the past quarter, we’ve registered one accident or crash-like event for every 3.34 million miles driven in which drivers had Autopilot engaged.

For those driving without Autopilot, we registered one accident or crash-like event for every 1.92 million miles driven. By comparison, the National Highway Traffic Safety Administration’s (NHTSA) most recent data shows that in the United States, there is an automobile crash every 492,000 miles. While NHTSA’s data includes accidents that have occurred, our records include accidents as well as near misses (what we are calling crash-like events).”

 

Granted, we should take Tesla's self-reported numbers with a grain of salt, recognize that Tesla drivers probably aren't representative of all drivers, and that Autopilot miles probably aren't the same as all miles. But even if they're off by a lot, that's still impressive.
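To put the quoted figures side by side, a quick back-of-the-envelope check (numbers taken straight from the quote above):

```python
# Nothing fancy, just the quoted figures side by side. Note Tesla counts near misses
# ("crash-like events") while NHTSA counts crashes, so the last two ratios are not a
# clean apples-to-apples comparison.
autopilot_miles_per_event = 3.34e6
manual_tesla_miles_per_event = 1.92e6
nhtsa_miles_per_crash = 0.492e6

print(autopilot_miles_per_event / manual_tesla_miles_per_event)   # ~1.7x fewer events with Autopilot on
print(autopilot_miles_per_event / nhtsa_miles_per_crash)          # ~6.8x vs. the NHTSA average
print(manual_tesla_miles_per_event / nhtsa_miles_per_crash)       # ~3.9x for Teslas even without Autopilot
```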

 

This was right up the road from me recently. Pretty sure this guy and/or some number of random people are alive/uninjured today because of autonomous vehicle capabilities. Again, autonomous cars don’t need to be 100% safe. They just need to kill people less often than random drunks do to be considered a success.

 

https://www.losaltosonline.com/news/sections/news/59030-los-altos-commission-chairman-arrested-for-dui-in-palo-alto-early-friday?fbclid=IwAR085Xidf1YStsrJcPDOtmYMScyRw1nhQgLps4aE5pyGmmvzU2We9VP5LsA

 

Link to comment
4 hours ago, builderb said:

Autonomous vehicles don’t need to be perfect. They just need to be better than meatbags who are prone to inebriation and fits of rage, among other undesirable driving characteristics.

I hope you are right.

In another life, I was involved with autonomous aerial vehicles.  Gaining access to the national airspace was, and is, an ongoing concern.  Unfortunately, the standard they were trying to apply to these vehicles was perfection.  They were, in my mind, trying to demonstrate the impossible: that there is ZERO chance a UAV would strike another aircraft.  They were placing far higher standards on UAVs than could ever be met by a human-piloted vehicle.

While zero collisions would be good, I always felt that insisting on perfect got in the way of better.

Link to comment

Tesla reports the obvious, but tries to deceive the public with their "autonomous driving" implications. At best, Tesla has produced an "assisted driving" vehicle. With a human driving and in full control, of course the crash-like figures should improve with a second opinion from the car's computers and superior sensing systems. Two heads are better than one.

From Tesla owners I have talked to, Tesla is no further along in this technology than, say, Honda or other vehicle manufacturers. Owning a vehicle with these features myself, I like the assisted-driving features, but they do not drive the car; I am still the main driver 95% of the time. Steering control depends on clearly seeing the highway lines and doesn't function in Canada once snow is on the road or the lines are dusty. It doesn't function properly with a dusty camera lens, either.

Automatic cruise control is great for following other vehicles on the highway, but the one-dimensional radar system gets confused at times, not seeing around curves and not "letting go" of vehicles that have travelled down side roads for over 100 metres/yards.
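As a rough illustration of why that "letting go" can take so long (assumed numbers and behaviour, not Honda's or Tesla's real algorithm): a forward radar that reports mostly range and closing speed has to infer whether the lead car is still in your lane, and a crude tracker might only drop the target after it has sat outside an assumed lane corridor for a few seconds, which at highway speed is on the order of 100 metres:

```python
LANE_HALF_WIDTH_M = 1.8     # assumed corridor the tracker treats as "my lane"
RELEASE_AFTER_S = 3.0       # assumed dwell time before the tracker drops the target

def update_lead(track, lateral_offset_m, dt_s):
    """Keep or drop the current lead target based on a (noisy) lateral-offset estimate."""
    if abs(lateral_offset_m) > LANE_HALF_WIDTH_M:
        track["out_of_lane_s"] += dt_s
    else:
        track["out_of_lane_s"] = 0.0
    track["is_lead"] = track["out_of_lane_s"] < RELEASE_AFTER_S
    return track

track = {"out_of_lane_s": 0.0, "is_lead": True}
for _ in range(40):                                   # lead car has turned onto a side road
    track = update_lead(track, lateral_offset_m=3.5, dt_s=0.1)
print(track["is_lead"])   # False, but only ~3 s later - roughly 90-100 m at highway speed
```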

The system sees vehicles slowing down well before I can detect it, since I would otherwise be waiting to see brake lights half a kilometre ahead. The lane deviation system gives a wake-up before I would know it, when it sees a line getting too close.

Other manufacturers state that Tesla can never become fully autonomous without a LIDAR system. Tesla claims they can. After owning another brand, I don't believe Tesla. Looking at their past and current deceptive claims to the public only furthers my doubt about any Tesla claims. Without tracking transmitters in the roadways, I doubt it will be done in the next 50-100 years. 100% visual plus 1D radar will never work.

Once the fully autonomous stage is attempted and humans are eliminated, the "crash-like accidents" will dramatically escalate when only the machine is at the wheel, without the current 95% "human assistance". Unfortunately, manufacturers and some of the public are pushing full autonomy based on current statistics from human-plus-assist driving records. It's a scam, attempting to confuse the public and regulators with unrelated statistics.

Link to comment

Curious what the miles-per-accident rates are when controlled for driver characteristics.  The owner/driver of a $100k car is going to be different from the general population.  You would probably get a better idea if you compared it to other four-door sedans that cost $80k to $120k, maybe something like a BMW 7 Series or Mercedes S-Class.

But those numbers are very impressive, and I expect some benefit to persist even when controlling for a similar demographic of people behind the wheel.

Curious also to know what happens when we get 5G going.  My understanding is that GPS is going to be better when 5G is implemented, potentially making lane stripes irrelevant.  Obviously, you would then need to keep very close track of re-striping events.  The thing with Tesla is that all of that data is going back to Tesla, so they are collecting a lot of info on where cars are located, times many millions of miles.

Link to comment
11 minutes ago, apostolakisl said:

Curious what the miles-per-accident rates are when controlled for driver characteristics.  

More than the cost of the vehicle, what about good drivers versus bad?  While it could be argued that reducing accidents per mile for the general population is a good thing, hopefully that won't come at the expense of higher risk for the best drivers.

Link to comment
9 minutes ago, oberkc said:

More than the cost of the vehicle, what about good drivers versus bad?  While it could be argued that reducing accidents per mile for the general population is a good thing, hopefully that won't come at the expense of higher risk for the best drivers.

The type and cost of the car is going to be a pretty good way to pick out a very similar demographic of owners, and that would include their driving record.  The insurance companies to a large extent do that.  Think about it: 50-year-old professionals and successful business people own $100k four-door sedans.  They mostly all drive the same whether they choose a Tesla or a Beemer.

Link to comment
Tesla reports the obvious, but tries to deceive the public with their "autonomous driving" implications. At best, Tesla has produced an "assisted driving" vehicle. With a human driving and in full control, of course the crash-like figures should improve with a second opinion from the car's computers and superior sensing systems. Two heads are better than one.
From Tesla owners I have talked to, Tesla is no further along in this technology than, say, Honda or other vehicle manufacturers. Owning a vehicle with these features myself, I like the assisted-driving features, but they do not drive the car; I am still the main driver 95% of the time. Steering control depends on clearly seeing the highway lines and doesn't function in Canada once snow is on the road or the lines are dusty. It doesn't function properly with a dusty camera lens, either.
Automatic cruise control is great for following other vehicles on the highway, but the one-dimensional radar system gets confused at times, not seeing around curves and not "letting go" of vehicles that have travelled down side roads for over 100 metres/yards.
The system sees vehicles slowing down well before I can detect it, since I would otherwise be waiting to see brake lights half a kilometre ahead. The lane deviation system gives a wake-up before I would know it, when it sees a line getting too close.
Other manufacturers state that Tesla can never become fully autonomous without a LIDAR system. Tesla claims they can. After owning another brand, I don't believe Tesla. Looking at their past and current deceptive claims to the public only furthers my doubt about any Tesla claims. Without tracking transmitters in the roadways, I doubt it will be done in the next 50-100 years. 100% visual plus 1D radar will never work.
Once the fully autonomous stage is attempted and humans are eliminated, the "crash-like accidents" will dramatically escalate when only the machine is at the wheel, without the current 95% "human assistance". Unfortunately, manufacturers and some of the public are pushing full autonomy based on current statistics from human-plus-assist driving records. It's a scam, attempting to confuse the public and regulators with unrelated statistics.


Honda's (and others') autopilot systems are nothing like Tesla's, and I've used / owned several. Simply nothing comes close to what I have in my Model 3 (including Super Cruise). Everyone I know who owns a Tesla across versions 1-2.5 of the Autopilot hardware would agree with me - and none of them would take the position that a competitor approaches Tesla's capability. Right now, Tesla's Enhanced Autopilot is a driver-assist system, but as has been pointed out, people are dumb and will use things outside of their purpose in unsafe ways. Aside from relying on Autopilot to completely drive the car, they'll do things like drive standard vehicles while exhausted or intoxicated. They'll drive vehicles their mechanic says are unsafe, etc. I don't know what the point of this thread is, but I see a lot of typical FUD being expressed here. For a community of early tech adopters, I'm pretty disappointed with the discussion - especially since many have no first-hand experience either developing or using the technology.

With the proliferation of any new technology, there are going to be problems in the early stages. However, if the change is still a net improvement at the macro level, that does not justify holding these technologies back to the point where they cannot make significant advancements. For these systems to operate at the safety levels we all want, they need to be tested (used) at large scale and pushed incrementally towards full autonomy. This cannot be done in a lab - it just wouldn't happen.
Link to comment

Here is a comparison video of the Tesla Autopilot and Cadillac Super Cruise systems' performance. Neither of these systems offers much advantage, if any, over Honda, Mazda, or other manufacturers' recently released vehicle systems.

https://insideevs.com/tesla-autopilot-vs-cadillac-super-cruise-comparison/

The performance of these two systems is very similar, but the demands placed on the drivers are based on different criteria. It should be noted that driver-attention requirements are not related to autonomous driving, only to policing the driver into safer driving habits.

The self-steering systems are all based on clear road lane markings and do not work properly where the markings are not clear. This means that where there is snow, dust on the road markings, or no lane markings at all, every system lacks decent performance. Honda, Mazda, and Tesla all have these same limitations.

After my discussion with two Tesla owners at a car show last summer, several friends using different manufacturers' systems didn't see any differences in capability, only differences in the interpretation of what the features mean, the manufacturers' terminology, the style of engaging them, and the price tags.

However, Tesla makes a lot of claims about what their vehicles are capable of, yet the cars are not capable of many of those things as the public perceives them.

Wikipedia has a decent discussion of the terms and definitions:
https://en.wikipedia.org/wiki/Self-driving_car

Link to comment

Tesla has made a lot of, shall we say, over-exuberant claims about the capabilities of their "Autopilot" system that they've had to temper, for sure. At the same time, there's nobody else out there with the sheer volume of real-world miles driven under Autopilot. They are also, AFAIK, the only ones developing their own in-house optimized silicon rather than relying on manufacturers like NVIDIA, and they are planning for upgrade paths for current vehicles. How that translates into a timeline for Level 5 autonomy, and whether currently produced vehicles will ever be truly capable of it, is anyone's guess. I know a lot of people think it won't happen, and I tend to agree. I think we're still at least a decade away from widespread autonomous passenger vehicles.

Clearly, Tesla's data is to be taken with a grain or more of salt. I look forward to government testing agencies or someone like Consumer Reports developing ways to compare data in a more apples-to-apples fashion as we head into the autonomous era. People have mentioned the age / income factor. That's huge. There's a big difference between all miles driven and all Tesla miles driven. The Tesla sample group will certainly skew toward older, more educated, more experienced drivers with better-maintained vehicles, etc. And miles driven under Autopilot are most likely skewed heavily towards highway / freeway driving, which is clearly different from all miles. I suspect that most people don't set Autopilot to 80 mph or above (which they quite often happily will do with their foot). I'd also wager that Autopilot miles are skewed towards being driven with a greater average following distance between vehicles than meatbag miles. All these things make the Tesla data suspect to some degree, or at least make it difficult to directly compare Autopilot to non-Autopilot miles. That complexity goes way up trying to compare different flavors of driver-assistance technology.

But I wasn't really trying to make a point about whose version of the autonomous car is better, just that it's not the case that autonomous cars have to be flawless, or near-flawless, before they are a better solution than humans driving. Humans, as a group, are not great drivers. Assuming Tesla isn't outright lying about their data, they're already seeing a safety improvement by removing the meatbag from the decision-making process in certain situations. 

Obviously it's hard to prove something specific was avoided in that article about the city councilman getting busted for DUI, but it's sure easy to imagine that one or more people could have ended up dead or seriously injured as a result of that guy's behavior. But because of driver assistance, and some savvy cops, the only real harm was to this guy's reputation. That kind of stuff needs to get weighed against the people who die with Autopilot active.

You could be the best driver in the world in complete control of your own vehicle at all times, and still get whacked by a drunk or someone who was distracted or fell asleep or was driving recklessly. 

Full disclosure: I've had a Model 3 for about 9 months now, which is why I'm using it as an example; it's the one I'm most familiar with. I also think that the fact Tesla vehicles are electric is way more important than whether they can drive themselves, and as BEVs, they deliver in spades.

Link to comment

You could be the best driver in the world in complete control of your own vehicle at all times, and still get whacked by a drunk or someone who was distracted or fell asleep or was driving recklessly. 


Interesting discussion. Couldn't this accident still happen with a similar or the same probability while using any driver-assist technology? What evasive action would a Tesla take, for instance, if another vehicle crossed the center yellow lines and was barreling towards it? Would it even know to slow down? I'd likely pull over if possible, or in some scenarios could even imagine myself going into the "oncoming" lane if needed, assuming no other traffic and an inability to stop in time.

Another thing I've wondered: will a driver-assist car cross over the center yellow lines, assuming no oncoming traffic, to give cyclists or pedestrians comfortable room?





Link to comment
27 minutes ago, TrojanHorse said:

 


Interesting discussion. Couldn't this accident still happen with a similar or the same probability while using any driver-assist technology? What evasive action would a Tesla take, for instance, if another vehicle crossed the center yellow lines and was barreling towards it? Would it even know to slow down? I'd likely pull over if possible, or in some scenarios could even imagine myself going into the "oncoming" lane if needed, assuming no other traffic and an inability to stop in time.

Another thing I've wondered: will a driver-assist car cross over the center yellow lines, assuming no oncoming traffic, to give cyclists or pedestrians comfortable room?






 

Yeah, couldn't tell you what it would do in situations like that. I'm sure there are situations where it would be worse to be in an autonomous vehicle. Clearly there are situations where an autonomous system has killed someone where a human driver would totally have avoided an accident. But again, as long as that happens less frequently than human-caused accidents, autonomy is better. Staggering numbers of people die on the roads every year. If autonomy can put a dent in that number, it'll be a net win for society. Plus, there are significant mobility benefits to disabled people, elderly drivers, and other groups that currently can't drive themselves around. Not to mention, it opens up the possibility of affordable shared vehicle access similar to the current scooter craze. It might cost more per mile, but someone who can't afford a car may well be able to afford to summon a car to use for a couple hours. And think about how city designs could change if public parking could be located up to several miles away from the city core, or from significant attractions.

Link to comment

It’ll be interesting to watch this space. Put me in the near-term doubter camp. My biggest issue is who gets to play God when programming these vehicles. It’s one thing to imagine a world of all autonomous vehicles working together. But there will be simple old cars on the road for a while. I could even see them becoming more attractive to some.

 

Reading the story about that douche in Palo Alto, he likely wouldn't have gotten very far without self-driving technology (maybe not even out of the parking lot), so I'm inclined to put that incident in the cons column for this technology rather than the pros. Unfortunately, I've heard that many people use their Teslas as a sober ride home, which doesn't fix the drunk (or distracted - see the AZ Uber incident) driving problem given the current state of the technology. It could actually make it worse, IMO, at least in the coming decade while vehicles are unlikely to be fully autonomous.

 

 


 

Link to comment

Autopilot won't take you out of a parking lot and onto the street without intervention. It also won't take you onto the freeway by itself. It can take you from on-ramp to off-ramp across freeway interchanges, but drunk dude had to have driven himself at least to the freeway. There are plenty of examples of VERY drunk people getting into cars and out of parking lots before getting into an accident.

It's definitely tough to quantify things that didn't happen versus things that do, but I still say there is a good chance without Autopilot this guy hurts himself and possibly others.

Link to comment
2 hours ago, builderb said:

Autopilot won't take you out of a parking lot and onto the street without intervention. It also won't take you onto the freeway by itself. It can take you from on-ramp to off-ramp across freeway interchanges, but drunk dude had to have driven himself at least to the freeway. There are plenty of examples of VERY drunk people getting into cars and out of parking lots before getting into an accident.

It's definitely tough to quantify things that didn't happen versus things that do, but I still say there is a good chance without Autopilot this guy hurts himself and possibly others.

I guess this is the problem I was trying to expose. Calling a combination system "Autopilot" implies you can give it the address and it takes you there while you enjoy a drink in the back seat or read a book. I have been told by many people that this is the case. It is not exactly claimed, but the implication has been promoted, so the public is left with the misconception.

Other manufacturers name the individual systems:

  • LKAS (Lane Keeping Assist System): watches the marked lines on the road and keeps you centred, or at a fixed distance from the only line found.
  • ACC (Adaptive Cruise Control): sets a maximum speed as well as a safe distance from the vehicle in front of you, even when stopping at a stop light.
  • Lane deviation warning: notifies the driver when a marked line gets too close to the car; flashing lights and/or steering-wheel shaking are used.
  • Others.

In reality, most of these systems only warn or correct the driver when they are driving/behaving badly; they are not capable of actually driving the car. The inebriated driver's vehicle should have shut down and pulled off the road after several warnings, so something stinks in Denmark there. Between the experimental attempts at automatically parking the car in a garage (almost) and the new "ramp to ramp" capability Tesla claims (it can change lanes and pass another car, but the driver has to initiate it?), many think these cars drive themselves from garage to destination without driver intervention.

None of these systems, to my knowledge, will drive the car for any extended period. Mine warns you about a lack of steering resistance, then shuts down, and you would go off the road. These systems are only designed to keep the driver operating the vehicle in a safer manner.
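
A minimal sketch of that "warn or correct, then give up" behaviour (the thresholds and the 10 Hz loop rate are assumptions, not any vendor's numbers):

```python
WARN_AFTER_S = 15.0        # assumed hands-off time before the first warning
DISENGAGE_AFTER_S = 30.0   # assumed hands-off time before the assist shuts itself off
DT_S = 0.1                 # assumed 10 Hz control loop

def lkas_step(lane_offset_m, driver_torque_nm, hands_off_s, gain=0.5):
    """One control tick: nudge toward lane centre, warn the driver, or disengage."""
    hands_off_s = 0.0 if abs(driver_torque_nm) > 0.3 else hands_off_s + DT_S
    if hands_off_s >= DISENGAGE_AFTER_S:
        return 0.0, hands_off_s, "disengaged"          # the driver is on their own again
    status = "hands-on-wheel warning" if hands_off_s >= WARN_AFTER_S else "assisting"
    return -gain * lane_offset_m, hands_off_s, status
```
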
Level 5 autonomy may only ever happen on railway tracks, dedicated pathways, or electronically guided and restricted paths. However, then it will not be called "autonomous" but rather "automatic".

Fun to watch the technology, and interesting to watch how people swallow the propaganda.

Link to comment
5 hours ago, larryllix said:

I guess this is the problem I was trying to expose. Calling a combination system "Autopilot" implies you can give it the address and it takes you there while you enjoy a drink in the back seat or read a book. I have been told by many people that this is the case. It is not exactly claimed, but the implication has been promoted, so the public is left with the misconception.

Other manufacturers name the individual systems:

  • LKAS (Lane Keeping Assist System): watches the marked lines on the road and keeps you centred, or at a fixed distance from the only line found.
  • ACC (Adaptive Cruise Control): sets a maximum speed as well as a safe distance from the vehicle in front of you, even when stopping at a stop light.
  • Lane deviation warning: notifies the driver when a marked line gets too close to the car; flashing lights and/or steering-wheel shaking are used.
  • Others.

In reality, most of these systems only warn or correct the driver when they are driving/behaving badly; they are not capable of actually driving the car. The inebriated driver's vehicle should have shut down and pulled off the road after several warnings, so something stinks in Denmark there. Between the experimental attempts at automatically parking the car in a garage (almost) and the new "ramp to ramp" capability Tesla claims (it can change lanes and pass another car, but the driver has to initiate it?), many think these cars drive themselves from garage to destination without driver intervention.

None of these systems, to my knowledge, will drive the car for any extended period. Mine warns you about a lack of steering resistance, then shuts down, and you would go off the road. These systems are only designed to keep the driver operating the vehicle in a safer manner.
Level 5 autonomy may only ever happen on railway tracks, dedicated pathways, or electronically guided and restricted paths. However, then it will not be called "autonomous" but rather "automatic".

Fun to watch the technology, and interesting to watch how people swallow the propaganda.

This technology is obviously evolving.  As someone who has a car with lane keeping assist . . . it is a pointless piece of sh . . .  As someone who does not own a Tesla but has driven them a bit, the car actually drives in the lane and speeds up or slows down according to traffic.  It does not observe stop lights/signs, so you still have to pay attention and hit the brake.  On the highway it will change lanes when directed.  I read a police report of a guy who was sound asleep in his Tesla on the interstate.  The cop couldn't get the guy to wake up with his sirens and whatnot, so he got in front of the Tesla and just slowed down to a stop, and so did the Tesla.

If you fully unleashed the Tesla Autopilot, it could drive all over town.  However, it would screw up on occasion, which would be very unacceptable.  They are only allowing Autopilot to function within its well-proven abilities.  Slowly, they will keep turning on the more advanced features, one at a time as they can.  This is what they have already done, and I am quite certain they will continue to do it.  

Link to comment
24 minutes ago, apostolakisl said:

This technology is obviously evolving.  As someone who has a car with lane keeping assist . . . it is a pointless piece of sh . . .  As someone who does not own a Tesla but has driven them a bit, the car actually drives in the lane and speeds up or slows down according to traffic.  It does not observe stop lights/signs, so you still have to pay attention and hit the brake.  On the highway it will change lanes when directed.  I read a police report of a guy who was sound asleep in his Tesla on the interstate.  The cop couldn't get the guy to wake up with his sirens and whatnot, so he got in front of the Tesla and just slowed down to a stop, and so did the Tesla.

If you fully unleashed the Tesla Autopilot, it could drive all over town.  However, it would screw up on occasion, which would be very unacceptable.  They are only allowing Autopilot to function within its well-proven abilities.  Slowly, they will keep turning on the more advanced features, one at a time as they can.  This is what they have already done, and I am quite certain they will continue to do it.  

I really like the ACC a lot, and it works well, but it has some minor quirks. Originally it would really freak me out when it stops behind a stopped car, but then the car ahead makes a right turn, clearing out of the radar image. Suddenly my car wants to resume its 100 kph setting across the intersection! Of course it has an auto-cancel after a few seconds, but other than that timer, there's no real intelligence for that one. 
On a crowded highway it is a big jump up from regular cruise control: it is much more relaxing knowing the car will slow down as needed, and it takes quite a while at 20 kph under the speed limit before you notice you should be impatient and attempt a pass. For me, that's a big drop in stress during driving, not feeling the impatience behind a slow driver so soon. The radar can detect the guy in front of me letting his foot off the brake about 4-5 seconds before I can even notice it, without his brake lights coming on. That's a great thing, especially just after you've glanced to the side.

The LKAS works well, but it takes some getting used to: letting the system steer the car yet keeping a good enough grip on the steering wheel to "be ready". On long runs that is more relaxing, but it took some time to adapt to the feel. The trouble with the Honda is that it doesn't sense your touch on the wheel, but rather senses you resisting the auto-steering mechanism. Mine complains frequently, and it takes a slight wiggle of the steering wheel to make the timer start over. Of course, this is all recorded in the "flight recorder" and will likely be used against you in an accident. Driver-attention stats are available on the dash from this, so I know it is there. This system requires a light touch to allow it to work, but not so light that it complains you are not doing anything.

The LKAS has other quirks to get familiar with. On a lane with two clearly marked lines, I assume it tries to keep you centred between them. Along comes an intersection or a dirty stretch of road that eliminates the shoulder markings. Now the system needs to gauge its distance from the centre line only, and this distance may not be the same as the distance while centred. Having your car suddenly feel like it is veering into oncoming traffic is more than a little scary!! It may never actually take you into oncoming traffic, but it sure scares the pants off the driver!
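
That veering sensation can be illustrated with a toy version of the targeting logic (assumed numbers; the real system is certainly more sophisticated): with two lines the target is the midpoint, with one line it can only hold a fixed offset, and the two targets need not agree:

```python
LANE_WIDTH_M = 3.7          # assumed nominal lane width
SINGLE_LINE_OFFSET_M = 1.6  # assumed fixed offset held from a lone centre line

def lateral_target(left_line_m=None, right_line_m=None):
    """Return the desired lateral position given whichever line positions are visible."""
    if left_line_m is not None and right_line_m is not None:
        return (left_line_m + right_line_m) / 2.0        # centre between both lines
    if left_line_m is not None:
        return left_line_m + SINGLE_LINE_OFFSET_M        # hold an offset from the centre line
    if right_line_m is not None:
        return right_line_m - SINGLE_LINE_OFFSET_M
    return None                                          # no lines: hand back control

print(lateral_target(left_line_m=0.0, right_line_m=LANE_WIDTH_M))  # 1.85 with both lines
print(lateral_target(left_line_m=0.0))   # 1.6 once the shoulder line vanishes -> felt as a sudden veer
```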

Another system, the lane deviation warning, is great and I like that one too, but it can be too late at times. It detects getting close to a line, or line crossings. The wiggle on the steering wheel is similar to rumble strips and wakes you up from a wandering mind, as well as steering you back across the line, unless you signal, meaning the move was intentional.

The second opinion on lane changes is awesome! It forces you to signal without fail, and warns you loudly if a collision is imminent by comparing the positions and speed differences of cars in the next lane. You still have to look, but a second opinion that never forgets is a great feature.

These systems all keep drivers driving more safely and should improve the accident statistics. However, they are forcing humans to drive more safely; they are not safe by themselves.
If the woman crossing the street with the bicycle had triggered any braking system, that same system would have hammered on the brakes for every car approaching an intersection from a side street. It will be a long time before a computer can read the other driver's mind.
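
As a hedged sketch of that trade-off (invented thresholds, not the Uber system's or anyone else's actual logic), emergency braking usually keys off a predicted time-to-collision plus a judgement about whether the object will actually be in the car's path:

```python
def time_to_collision(range_m, closing_speed_mps):
    """Seconds until impact if nothing changes; infinite if the gap is not closing."""
    return range_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

def should_brake(range_m, closing_speed_mps, in_path_probability,
                 ttc_threshold_s=1.5, in_path_threshold=0.7):
    """Brake only when impact is close AND the object is judged likely to stay in our path.

    Raise in_path_threshold and the car stops slamming the brakes for vehicles merely
    waiting at side streets - but it also stops braking for some pedestrians who really
    are about to cross. That is the trade-off described above.
    """
    ttc = time_to_collision(range_m, closing_speed_mps)
    return ttc < ttc_threshold_s and in_path_probability >= in_path_threshold
```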

Link to comment
I guess this is the problem I was trying to expose. Calling a combination system "Autopilot" implies you can give it the address and it takes you there while you enjoy a drink in the back seat or read a book. I have been told by many people that this is the case. It is not exactly claimed, but the implication has been promoted, so the public is left with the misconception.

Other manufacturers name the individual systems:
  • LKAS (Lane Keeping Assist System): watches the marked lines on the road and keeps you centred, or at a fixed distance from the only line found.
  • ACC (Adaptive Cruise Control): sets a maximum speed as well as a safe distance from the vehicle in front of you, even when stopping at a stop light.
  • Lane deviation warning: notifies the driver when a marked line gets too close to the car; flashing lights and/or steering-wheel shaking are used.
  • Others.
In reality, most of these systems only warn or correct the driver when they are driving/behaving badly; they are not capable of actually driving the car. The inebriated driver's vehicle should have shut down and pulled off the road after several warnings, so something stinks in Denmark there. Between the experimental attempts at automatically parking the car in a garage (almost) and the new "ramp to ramp" capability Tesla claims (it can change lanes and pass another car, but the driver has to initiate it?), many think these cars drive themselves from garage to destination without driver intervention.
None of these systems, to my knowledge, will drive the car for any extended period. Mine warns you about a lack of steering resistance, then shuts down, and you would go off the road. These systems are only designed to keep the driver operating the vehicle in a safer manner.
Level 5 autonomy may only ever happen on railway tracks, dedicated pathways, or electronically guided and restricted paths. However, then it will not be called "autonomous" but rather "automatic".
Fun to watch the technology, and interesting to watch how people swallow the propaganda.

I’m a Tesla driver, and I fully agree they are overhyping their system by calling it “Autopilot” at this point. Although, to be fair, that name was previously used in the airline industry where it also debuted as the flying version of lane keeping assistance.

As far as the passed-out guy driving goes, speculation is that he fell asleep with his hand on the wheel, thus leading the car to think he was alert. One of the valid critiques of Tesla’s system is that it doesn’t use eye monitoring to determine alertness, like other systems do. It relies on pressure on the steering wheel, which doesn’t seem like as reliable an indicator.
Link to comment
  • 3 weeks later...

What will be interesting is when they add vehicle-to-vehicle communications into the process (already being tested). Granted, this will not be fully effective until 10+ years after they start, as older cars without the technology will still be on the road, but your car will know that the other car is not as smart and will take the necessary actions based on that information. 

Emergency vehicles will tell your car that they are approaching from behind, and your vehicle pulls over to the side of the road to let them pass. A police officer's car instructs your car to pull over after informing it that you are being stopped for going 15 mph over the speed limit; your car informs you (over the speaker) of the situation and responds by pulling over, putting itself in park, and turning on the emergency flashers. In the case of an unmarked police car, your car may contact the police department with the police car's data to verify that it is actually an unmarked police car.

A car getting onto a congested expressway pulls up next to your car and asks your car to let it enter the expressway in front of you; your car allows it without your approval (only because it was another bright red Chevy and a member of the opposite sex was driving - is that politically correct?). Your car may get some vibes from you that you were unhappy with its decision to allow the other car in, apologize, and discuss its decision the next time the same thing happens. Two cars approach an intersection from different directions, out of the lidar/radar's sight; your car tells the other car that you have the right-of-way, and the other car stops at the intersection. Or the traffic light at that intersection tells the car that there is a stop light in 150 feet, so start slowing down and stop, which it does.
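
Speculating a little about what those exchanges might look like in code (the message names and fields are invented here; real vehicle-to-vehicle efforts such as DSRC or C-V2X define their own message formats), each scenario reduces to a small message plus an agreed response:

```python
# Speculative sketch only: cars exchanging small, authenticated messages and
# agreeing on an action. This is a toy policy, not any standard.

from dataclasses import dataclass, field

@dataclass
class V2VMessage:
    sender_id: str
    kind: str          # e.g. "EMERGENCY_APPROACHING", "MERGE_REQUEST", "RIGHT_OF_WAY_CLAIM"
    payload: dict = field(default_factory=dict)

def respond(msg: V2VMessage) -> str:
    """Decide this car's response to an incoming message."""
    if msg.kind == "EMERGENCY_APPROACHING":
        return "pull over to the shoulder and stop"
    if msg.kind == "MERGE_REQUEST":
        return "open a gap" if msg.payload.get("congested") else "maintain speed"
    if msg.kind == "RIGHT_OF_WAY_CLAIM":
        return "yield at the intersection"
    return "ignore"

print(respond(V2VMessage("car-42", "MERGE_REQUEST", {"congested": True})))   # open a gap
```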

Wait, the cars will also be talking to your Apple Watch, smartphone, bicycle computer... there will be no end to this.

Just additional code for the programmers to insert...

Don

Link to comment

Archived

This topic is now archived and is closed to further replies.

