
Oddities & Idiocies In Using Autonomous / Self-Driving Vehicles

Another crash involving an emergency vehicle with flashing lights

Tesla on autopilot crashes into police car and nearly hits officer

A Tesla on partial autopilot smashed into a police car as an officer stopped to help with a breakdown in the US.

Florida Highway Patrol said the Tesla Model 3 hit its state trooper’s Dodge Charger police vehicle and the broken down Mercedes GLK 350.

The agency said the trooper was ‘extremely lucky’ to avoid being struck. The Mercedes driver suffered minor injuries, according to reports.
The Tesla driver told officers at the scene that she was using the car's partially automated driving system.
 
'Partially Automated' in this application means it automatically drove straight to the scene of the accident.
 
People on a dead-end street are getting loads of Waymo vehicles visiting their road.

Self-driving Waymo cars clog up dead-end San Francisco street
Autonomous-driving firm Waymo's cars have been going up and down the cul-de-sac at all hours "for weeks", according to local news station KPIX.
Residents say vehicles sometimes have to queue before making multi-point turns to leave the way they came.


https://www.bbc.co.uk/news/technology-58928706
 
Surely it would be simpler just to have a driver.

A driver deserves to be treated as a human being. As one who, IRL, hates socialisation, I cannot wait for a driverless car.
 
US sends team to probe fatal Tesla crash with no driver
Federal safety regulators have sent a team to investigate the fatal crash of a Tesla in a Houston suburb in which local authorities say no one was behind the wheel.
Update ... National Transportation Safety Board investigators have concluded there was indeed someone in the driver's seat during last spring's fiery Tesla crash in Texas.
NTSB: Driver's seat was occupied during fatal Tesla crash

The driver and passenger seats were occupied during a fiery Tesla crash that killed two in Texas earlier this year, the National Transportation Safety Board said Thursday.

The NTSB's findings contradict initial statements by local police that no one was driving the Model S P100D electric car when it veered off a road in Spring, Texas. ...

Footage from the residence of the vehicle owner shows two people entering the car, in the driver and passenger seats, the NTSB said. ...
FULL STORY: https://www.upi.com/Top_News/US/2021/10/21/Texas-Tesla-crash-driver-involved/6791634835010/
 
"Footage from the residence of the vehicle owner shows two people entering the car, in the driver and passenger seats, the NTSB said. ."

Because there is no way they could have possibly stopped and moved to a rear seat, saying "hey buddy, watch this" as the car drives off autonomously. :thought:
 
Because there is no way they could have possibly stopped and moved to a rear seat, saying "hey buddy, watch this" as the car drives off autonomously. :thought:

That's right - there's no way. For one thing, they never stopped ...
The crash trip originated at the owner's residence near the end of a cul-de-sac. Footage from the owner's home security camera shows the owner entering the car's driver's seat and the passenger entering the front passenger seat. The car leaves and travels about 550 feet before departing the road on a curve, driving over the curb, and hitting a drainage culvert, a raised manhole, and a tree.

Neither could they have been using the 'autopilot' self-driving system ...
Using Autopilot requires both the Traffic Aware Cruise Control and the Autosteer systems to be engaged. NTSB tests of an exemplar car at the crash location showed that Traffic Aware Cruise Control could be engaged but that Autosteer was not available on that part of the road.
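
For what it's worth, the engagement rule the NTSB describes is just a conjunction of two subsystem states. Here's a minimal Python sketch of that logic; all the names (RoadSegment, tacc_available, autosteer_available, has_lane_lines) are invented for illustration and have nothing to do with Tesla's actual software:

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    has_lane_lines: bool  # assumption: Autosteer needs trackable lane markings

def tacc_available(segment: RoadSegment) -> bool:
    # Traffic Aware Cruise Control (speed keeping) works without lane markings
    return True

def autosteer_available(segment: RoadSegment) -> bool:
    # Autosteer (lane keeping) does not
    return segment.has_lane_lines

def autopilot_engaged(segment: RoadSegment) -> bool:
    # Per the NTSB description, Autopilot needs BOTH subsystems engaged
    return tacc_available(segment) and autosteer_available(segment)

crash_site = RoadSegment(has_lane_lines=False)
print(tacc_available(crash_site))       # True  - could be engaged
print(autosteer_available(crash_site))  # False - not available on that road
print(autopilot_engaged(crash_site))    # False - so Autopilot couldn't drive
```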

The results from data recovery and testing since May indicate a driver was driving the whole time ...

Event Data Recorder

With the assistance of the EDR module manufacturer, the NTSB Recorders Laboratory repaired and downloaded the fire-damaged EDR. Data from the module indicate that both the driver and the passenger seats were occupied, and that the seat belts were buckled when the EDR recorded the crash. The data also indicate that the driver was applying the accelerator in the time leading up to the crash; application of the accelerator pedal was found to be as high as 98.8 percent.
SOURCE: https://www.ntsb.gov/investigations/Pages/HWY21FH007.aspx
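
As an aside, the EDR findings quoted above boil down to a few decoded fields. Here's a toy Python sketch of how one might summarise them; the record layout and every field name are made up, since real EDR formats are proprietary and were decoded with the manufacturer's help:

```python
# Hypothetical decoded EDR record - field names are invented for illustration
edr_record = {
    "driver_seat_occupied": True,
    "passenger_seat_occupied": True,
    "seat_belts_buckled": True,
    "accelerator_pedal_pct_max": 98.8,  # peak application before the crash
}

def summarise(rec: dict) -> str:
    seats = ("both front seats occupied"
             if rec["driver_seat_occupied"] and rec["passenger_seat_occupied"]
             else "front seat(s) empty")
    return (f"{seats}; belts buckled: {rec['seat_belts_buckled']}; "
            f"accelerator up to {rec['accelerator_pedal_pct_max']}%")

print(summarise(edr_record))
# both front seats occupied; belts buckled: True; accelerator up to 98.8%
```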
 
That's right - there's no way. For one thing, they never stopped ...


Neither could they have been using the 'autopilot' self-driving system ...


The results from data recovery and testing since May indicate a driver was driving the whole time ...

SOURCE: https://www.ntsb.gov/investigations/Pages/HWY21FH007.aspx
If the whole incident from start to finish was on camera, why was there any speculation about whether someone was in the driver's seat at the time of the crash?
 
If the whole incident from start to finish was on camera, why was there any speculation about whether someone was in the driver's seat at the time of the crash?
Good question ... It seems the local authorities who originally investigated the crash jumped to that conclusion based on the state of the burned-out wreckage.
 
San Francisco police intercepted a driverless / riderless Cruise autonomous car, which seemed to try to escape the cops.
Autonomous Car Pulled Over By Cops, Makes a Run For It

... It remains something of an open question who is at fault or held liable in an accident, for example—the carmaker? The owner of the car? What happens if or when an autonomous car commits a moving violation on public roads? The San Francisco Police Department unwittingly became party to this thought exercise when one of its units tried pulling over a GM Cruise autonomous vehicle ...

The Cruise vehicle, which is based on a Chevrolet Bolt electric hatchback, initially stops for the officers attempting the traffic stop. Amusingly, not long after one of the officers exits the police car to approach the self-driving Bolt, the Chevy, ahem, bolts—slowly driving through an intersection before pulling over again and activating its flashers. The police catch up to the Bolt, which presumably pulled over the second time because the cops activated their emergency lights, and proceed to walk around the now-parked Chevy while making calls on their phones. ...

Apparently, there is a special phone number officers must call when they're interacting with one of the Cruise vehicles. That could have been what the officer seen on the phone ... was doing, meaning the officers were less "confused" than bemused and milling around waiting for direction, checking out the Bolt up close while their colleague contacted the relevant resources. And the brief "chase," such as it was, between the police and the car? Well, per the Cruise tweet above, it was the Bolt merely finding a new place to pull over, though it's amusing to think it was making a break for it ...
FULL STORY (With Video): https://www.motortrend.com/news/gm-cruise-av-self-driving-car-pulled-over-police/
 
‘One dead after self-driving BMW swerved into oncoming traffic’

I bet the human will get the blame even if it was in self-drive mode. But there have been incidents before where the car has bottled it and handed back control seconds before a collision.

A lot of humans drive like a**e holes but I don't trust computers either. How many programmes or apps have just suddenly closed while you were in the middle of using them? And they trust that to drive? Also, I would say a large percentage of driving is trying to guess what people are going to do. Do these cars do that, or just react to what is already happening? (See the sketch below.)

https://metro.co.uk/2022/08/16/one-...g-bmw-swerved-into-oncoming-traffic-17194772/
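
On the "guess vs. react" question above: real AV stacks do carry prediction modules alongside reactive safety checks. A minimal sketch of the difference, assuming a simplified one-dimensional world with made-up distances and thresholds:

```python
def reactive_brake(gap_m: float, safe_gap_m: float = 10.0) -> bool:
    # React only to the current state: brake once the gap is already small
    return gap_m < safe_gap_m

def predictive_brake(gap_m: float, closing_speed_mps: float,
                     horizon_s: float = 2.0, safe_gap_m: float = 10.0) -> bool:
    # Extrapolate the other road user's motion a couple of seconds ahead
    # and brake if the *predicted* gap will be too small
    predicted_gap_m = gap_m - closing_speed_mps * horizon_s
    return predicted_gap_m < safe_gap_m

# 25 m gap, closing at 10 m/s: the reactive rule does nothing yet,
# while the predictive rule is already braking
print(reactive_brake(25.0))           # False
print(predictive_brake(25.0, 10.0))   # True
```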
 
I don't trust a computer to drive a car either. But the computers in aircraft and cars are a completely different architecture to a PC. For instance, in the entire service life of the space shuttle, the computers on it didn't screw up once.
 
I think you have to draw a distinction about the level of autonomy you're talking about, though.
An aircraft, although obviously much more complex than a car, has far fewer things to be concerned about once it's in the air. The 'autopilot' essentially only has to maintain level flight in the direction required. It doesn't have to be concerned with whether or not the old lady in the Volvo coming from a junction on the left has noticed your approach.
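
To put some numbers on how simple "maintain level flight" can be, here's a toy proportional controller in Python. The gain and the one-line "plant model" are invented purely for illustration and bear no relation to any real autopilot:

```python
def altitude_hold(target_ft: float, current_ft: float, kp: float = 0.002) -> float:
    # Pitch command proportional to the altitude error - one target, one error
    return kp * (target_ft - current_ft)

altitude = 34_500.0
for _ in range(50):
    pitch_cmd = altitude_hold(35_000.0, altitude)
    altitude += pitch_cmd * 100.0  # crude stand-in for aircraft dynamics
print(round(altitude))  # 35000 - settles on the target with no surprises
```

No old ladies in Volvos anywhere in the loop.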
 
the entire service life of the space shuttle,
135 missions and only 2 catastrophically explosive, unplanned dismantlement episodes.
That's a 1.48% failure rate, which is not great.
I wouldn't get on a plane if they told me it had a 1.48% chance of destroying itself and everyone on board in a fireball.
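
Checking that arithmetic:

```python
losses, missions = 2, 135
print(f"{losses / missions:.2%}")  # 1.48%
```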
 
135 missions and only 2 catastrophically explosive, unplanned dismantlement episodes.
That's a 1.48% failure rate, which is not great.
I wouldn't get on a plane if they told me it had a 1.48% chance of destroying itself and everyone on board in a fireball.
Dude, the computers didn't control the seal that failed or the tile that fell off, or whatever it was.
 
Yeah I know.
But the computers were of no help preventing both Challenger and Columbia having sub-optimal flight characteristics.
 
135 missions and only 2 catastrophically explosive, unplanned dismantlement episodes.
That's a 1.48% failure rate, which is not great.
I wouldn't get on a plane if they told me it had a 1.48% chance of destroying itself and everyone on board in a fireball.
Fusspot! What do you want, the Moon on a stick?
 