
Self-Driving Vehicles

Old 05-15-22, 07:39 PM
  #1021  
SW17LS
Lexus Fanatic
 
Join Date: May 2012
Location: Maryland
Posts: 57,294
Received 2,731 Likes on 1,956 Posts

Your local area has been largely the same for 10 years.

Originally Posted by mmarshall
You also have had some variety of vehicles in your own family, at least in recent years
Not really... we've always had one multipurpose vehicle: first the Prius, then the Jeeps, then minivans once we had the kids. And I have always driven a luxury sedan; the only difference in the sedan has been my financial wherewithal.
SW17LS is offline  
Old 05-15-22, 08:02 PM
  #1022  
mmarshall
Lexus Fanatic
 
Join Date: Oct 2003
Location: Virginia/D.C. suburbs
Posts: 91,293
Received 87 Likes on 86 Posts

Originally Posted by SW17LS
Your local area has been largely the same for 10 years.
It has always been crowded, but except for a period in the spring of 2020, when the pandemic kept most traffic off the road, conditions have generally gotten worse each year. Hopefully there will be an improvement when that super-mess of the I-66 construction is finally done. That has been a complete and royal pain in the a**.

mmarshall is offline  
Old 05-16-22, 03:17 AM
  #1023  
bitkahuna
Lexus Fanatic
Join Date: Feb 2001
Location: Present
Posts: 74,913
Received 2,442 Likes on 1,601 Posts

back to self-driving cars please.
bitkahuna is offline  
Old 06-09-22, 05:20 PM
  #1024  
mmarshall
Lexus Fanatic
 
Join Date: Oct 2003
Location: Virginia/D.C. suburbs
Posts: 91,293
Received 87 Likes on 86 Posts

The federal investigation into Tesla's Autopilot system has just been expanded. This, from today's New York Times:

https://www.nytimes.com/live/2022/06...-investigation

Federal safety agency expands its investigation of Tesla’s Autopilot system.




[Image: A Tesla Model 3 on the road in California. It is one of the models being investigated by the National Highway Traffic Safety Administration. Credit: Roger Kisby for The New York Times]



By Neal E. Boudette

The federal government’s top auto-safety agency is significantly expanding an investigation into Tesla and its Autopilot driver-assistance system to determine if the technology poses a safety risk.

The agency, the National Highway Traffic Safety Administration, said Thursday that it was upgrading its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny that is required before a recall can be ordered.

The analysis will look at whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.

“We’ve been asking for closer scrutiny of Autopilot for some time,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates state efforts to promote safe driving.

NHTSA has said it is aware of 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. But it said Thursday that it had not determined whether Autopilot has defects that can cause cars to crash while it is engaged.

The wider investigation covers 830,000 vehicles sold in the United States. They include all four Tesla cars — the Models S, X, 3 and Y — in model years from 2014 to 2021. The agency will look at Autopilot and its various component systems that handle steering, braking and other driving tasks, and a more advanced system that Tesla calls Full Self-Driving.

Tesla did not respond to a request for comment on the agency’s move.

The preliminary evaluation focused on 11 crashes in which Tesla cars operating under Autopilot control struck parked emergency vehicles that had their lights flashing. In that review, NHTSA said Thursday, the agency became aware of 191 crashes — not limited to ones involving emergency vehicles — that warranted closer investigation. They occurred while the cars were operating under Autopilot, Full Self-Driving or associated features, the agency said.

Tesla says the Full Self-Driving software can guide a car on city streets but does not make it fully autonomous and requires drivers to remain attentive. It is also available to only a limited set of customers in what Tesla calls a “beta” or test version that is not completely developed.

The deepening of the investigation signals that NHTSA is more seriously considering safety concerns stemming from a lack of safeguards to prevent drivers from using Autopilot in a dangerous manner.

“This isn’t your typical defect case,” said Michael Brooks, acting executive director at the Center for Auto Safety, a nonprofit consumer advocacy group. “They are actively looking for a problem that can be fixed, and they’re looking at driver behavior, and the problem may not be a component in the vehicle.”

Tesla and its chief executive, Elon Musk, have come under criticism for hyping Autopilot and Full Self-Driving in ways that suggest they are capable of piloting cars without input from drivers.

“At a minimum they should be renamed,” said Mr. Adkins of the Governors Highway Safety Association. “Those names confuse people into thinking they can do more than they are actually capable of.”

Competing systems developed by General Motors and Ford Motor use infrared cameras that closely track the driver’s eyes and sound warning chimes if a driver looks away from the road for more than two or three seconds. Tesla did not initially include such a driver monitoring system in its cars, and later added only a standard camera that is much less precise than infrared cameras in eye tracking.
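
A minimal sketch of the look-away timer logic such eye-tracking systems imply (the 2.5-second threshold, class name, and interface are assumptions for illustration, not GM's or Ford's actual implementation):

```python
import time

# Hypothetical eyes-off-road alert timer. The 2.5-second threshold sits
# inside the "two or three seconds" range described in the article.
LOOK_AWAY_LIMIT_S = 2.5

class AttentionMonitor:
    def __init__(self):
        self.eyes_off_since = None  # timestamp when gaze left the road

    def update(self, eyes_on_road: bool, now: float) -> bool:
        """Return True if a warning chime should sound this cycle."""
        if eyes_on_road:
            self.eyes_off_since = None
            return False
        if self.eyes_off_since is None:
            self.eyes_off_since = now
        return (now - self.eyes_off_since) >= LOOK_AWAY_LIMIT_S

# Example: driver looks away for three seconds straight.
monitor = AttentionMonitor()
start = time.monotonic()
print(monitor.update(eyes_on_road=False, now=start))        # False
print(monitor.update(eyes_on_road=False, now=start + 3.0))  # True -> chime
```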

Tesla tells drivers to use Autopilot only on divided highways, but the system can be activated on any streets that have lines down the middle. The G.M. and Ford systems — known as Super Cruise and BlueCruise — can be activated only on highways.

Autopilot was first offered in Tesla models in late 2015. It uses cameras and other sensors to steer, accelerate and brake with little input from drivers. Owner manuals tell drivers to keep their hands on the steering wheel and their eyes on the road, but early versions of the system allowed drivers to keep their hands off the wheel for five minutes or more under certain conditions.

Unlike technologists at almost every other company working on self-driving vehicles, Mr. Musk insisted that autonomy could be achieved solely with cameras tracking their surroundings. But many Tesla engineers questioned whether relying on cameras without other sensing devices was safe enough.

Mr. Musk has regularly promoted Autopilot’s abilities, saying autonomous driving is a “solved problem” and predicting that drivers will soon be able to sleep while their cars drive them to work.



[Image: Damage to the Tesla involved in a 2016 crash in Florida. Credit: National Transportation Safety Board, via Associated Press]

Questions about the system arose in 2016 when an Ohio man was killed when his Model S crashed into a tractor-trailer on a highway in Florida while Autopilot was activated. NHTSA investigated that crash and in 2017 said it had found no safety defect in Autopilot.

But the agency issued a bulletin in 2016 saying driver-assistance systems that fail to keep drivers engaged “may also be an unreasonable risk to safety.” And in a separate investigation, the National Transportation Safety Board concluded that the Autopilot system had “played a major role” in the Florida crash because while it performed as intended, it lacked safeguards to prevent misuse.

Tesla is facing lawsuits from families of victims of fatal crashes, and some customers have sued the company over its claims for Autopilot and Full Self-Driving.

Last year, Mr. Musk acknowledged that developing autonomous vehicles was more difficult than he had thought.

NHTSA opened its preliminary evaluation of Autopilot in August and initially focused on 11 crashes in which Teslas operating with Autopilot engaged ran into police cars, fire trucks and other emergency vehicles that had stopped and had their lights flashing. Those crashes resulted in one death and 17 injuries.

While examining those crashes, it discovered six more involving emergency vehicles and eliminated one of the original 11 from further study.

At the same time, the agency learned of dozens more crashes that occurred while Autopilot was active and that did not involve emergency vehicles. Of those, the agency first focused on 191, and eliminated 85 from further scrutiny because it could not obtain enough information to get a clear picture of whether Autopilot was a major cause.

In about half of the remaining 106, NHTSA found evidence that suggested drivers did not have their full attention on the road. About a quarter of the 106 occurred on roads where Autopilot is not supposed to be used.
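
The arithmetic behind that winnowing, reconstructed from the figures in the article (the "about half" and "about a quarter" fractions are approximate, so the derived counts are estimates):

```python
# Reconstructing NHTSA's winnowing of the Autopilot crash reports
# from the counts given in the article.
initial = 191     # crashes flagged for closer study
dropped = 85      # eliminated for lack of information
remaining = initial - dropped
print(remaining)        # 106
print(remaining / 2)    # ~53 showed signs of driver inattention ("about half")
print(remaining / 4)    # ~26.5 on out-of-scope roads ("about a quarter")
```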

In an engineering analysis, NHTSA’s Office of Defects Investigation sometimes acquires vehicles it is examining and arranges testing to try to identify flaws and replicate problems they can cause. In the past it has taken apart components to find faults, and has asked manufacturers for detailed data on how components operate, often including proprietary information.

The process can take months or even a year or more. NHTSA aims to complete the analysis within a year. If it concludes a safety defect exists, it can press a manufacturer to initiate a recall and correct the problem.

On rare occasions, automakers have contested the agency’s conclusions in court and prevailed in halting recalls.
mmarshall is offline  
Old 06-09-22, 08:30 PM
  #1025  
LeX2K
Lexus Fanatic
 
Join Date: Sep 2010
Location: Alberta
Posts: 20,219
Received 2,938 Likes on 2,474 Posts

Originally Posted by mmarshall
..snip..
I can practically hear you salivating through my screen.
LeX2K is offline  
Old 06-09-22, 09:03 PM
  #1026  
xjokerz
Racer
 
Join Date: Mar 2019
Location: WA
Posts: 1,535
Received 69 Likes on 54 Posts

I'll never trust self-driving cars. Besides, I like driving. It's going to be a sad day when my girls won't be able to drive and will instead have a car drive them around.
xjokerz is offline  
Old 06-10-22, 09:15 AM
  #1027  
SW17LS
Lexus Fanatic
 
Join Date: May 2012
Location: Maryland
Posts: 57,294
Received 2,731 Likes on 1,956 Posts

Originally Posted by xjokerz
I'll never trust self-driving cars. Besides, I like driving. It's going to be a sad day when my girls won't be able to drive and will instead have a car drive them around.
They will still be able to drive. We are a long, long way away from being unable to drive ourselves. These systems will remain optional driver aids. Perhaps their children won't drive themselves...

I like driving too, but driving a car down a straight highway isn't much fun, nor is driving in bumper-to-bumper traffic. Having the car just drive itself in traffic while you relax or send an email or text is a beautiful thing. Having it do most of the driving on the highway on a long trip is also a beautiful thing.
SW17LS is offline  
Old 06-10-22, 10:25 AM
  #1028  
xjokerz
Racer
 
Join Date: Mar 2019
Location: WA
Posts: 1,535
Received 69 Likes on 54 Posts

Originally Posted by SW17LS
They will still be able to drive. We are a long, long way away from being unable to drive ourselves. These systems will remain optional driver aids. Perhaps their children won't drive themselves...

I like driving too, but driving a car down a straight highway isn't much fun, nor is driving in bumper-to-bumper traffic. Having the car just drive itself in traffic while you relax or send an email or text is a beautiful thing. Having it do most of the driving on the highway on a long trip is also a beautiful thing.
Good, I look forward to getting my girls behind the wheel and teaching them how to drive.

Not for me. I enjoy cruising along a scenic highway. I've done it for so long (I don't need everyone here to say they've been driving longer than I've been living... 18 years of driving is quite a while with the mileage I put on my cars) that it would feel awkward to be driven around automatically. I don't think I could ever get used to that.

I don't need to email or text someone that badly. Lol, it can always wait until I get home.
xjokerz is offline  
Old 06-10-22, 10:29 AM
  #1029  
mmarshall
Lexus Fanatic
 
Join Date: Oct 2003
Location: Virginia/D.C. suburbs
Posts: 91,293
Received 87 Likes on 86 Posts

Originally Posted by LeX2K
I can practically hear you salivating through my screen.

First, this has nothing to do with me or my saliva, thank you. Second, this is a federal investigation, which is far above anything either you or I could do or say.
mmarshall is offline  
Old 06-10-22, 10:47 AM
  #1030  
SW17LS
Lexus Fanatic
 
Join Date: May 2012
Location: Maryland
Posts: 57,294
Received 2,731 Likes on 1,956 Posts

Originally Posted by xjokerz
Good, I look forward to getting my girls behind the wheel and teaching them how to drive.

Not for me. I enjoy cruising along a scenic highway. I've done it for so long (I don't need everyone here to say they've been driving longer than I've been living... 18 years of driving is quite a while with the mileage I put on my cars) that it would feel awkward to be driven around automatically. I don't think I could ever get used to that.

I don't need to email or text someone that badly. Lol, it can always wait until I get home.
I used to say the same thing. You've never used this technology, though. It's not driving for you; you are still driving, it's just helping you drive. It makes driving on a scenic highway even better, because you can watch the scenery more than you could if it wasn't helping you drive.

If you looked at me sitting in the car you would never know the car was doing most of the driving.
SW17LS is offline  
Old 06-15-22, 11:12 AM
  #1031  
mmarshall
Lexus Fanatic
 
Join Date: Oct 2003
Location: Virginia/D.C. suburbs
Posts: 91,293
Received 87 Likes on 86 Posts

The latest data shows that the number of crashes involving Tesla's driver-assistance systems is much higher than previously thought.


https://www.washingtonpost.com/techn...-kT67mY-ty9Rx0

Teslas running Autopilot involved in 273 crashes reported since last year

Regulators released the first batch of data since mandating that companies such as Tesla report on serious crashes involving their driver-assistance systems.


Updated June 15, 2022 at 1:42 p.m. EDT | Published June 15, 2022 at 9:08 a.m. EDT

SAN FRANCISCO — Tesla vehicles running its Autopilot software have been involved in 273 reported crashes over roughly the past year, according to regulators, far more than previously known and providing concrete evidence regarding the real-world performance of its futuristic features.

The numbers, which were published by the National Highway Traffic Safety Administration for the first time Wednesday, show that Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems reported since last July, and a majority of the fatalities and serious injuries — some of which date back further than a year. Eight of the Tesla crashes took place prior to June 2021, according to data released by NHTSA Wednesday morning.
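
A quick arithmetic check on the headline share, using only the totals quoted above:

```python
tesla = 273      # reported crashes involving Teslas running Autopilot
all_adas = 392   # all reported driver-assistance crashes since last July
print(f"{tesla / all_adas:.1%}")   # 69.6% -> "nearly 70 percent"
```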

Previously, NHTSA said it had probed 42 crashes potentially involving driver assistance, 35 of which included Tesla vehicles, in a more limited data set that stretched back to 2016.

Of the six fatalities listed in the data set published Wednesday, five were tied to Tesla vehicles — including a July 2021 crash involving a pedestrian in Flushing, N.Y., and a fatal crash in March in Castro Valley, Calif. Some dated as far back as 2019.

Tesla Autopilot is a suite of systems that allows drivers to cede physical control of their electric vehicles, though they must pay attention at all times. The cars can maintain speed and safe distance behind other cars, stay within their lane lines and make lane changes on highways. An expanded set of features, called the “Full Self-Driving” beta, adds the ability to maneuver city and residential streets, halting at stop signs and traffic lights, and making turns while navigating vehicles from point to point.

But some transportation safety experts have raised concerns about the technology’s safety, since it is being tested and trained on public roads with other drivers. Federal officials have targeted Tesla in recent months with an increasing number of investigations, recalls and even public admonishments directed at the company.


The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. Tesla‘s vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

The NHTSA order required manufacturers to disclose crashes where the software was in use within 30 seconds of the crash, in part to mitigate the concern that manufacturers would hide crashes by claiming the software wasn’t in use at the time of the impact.
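
The rule amounts to a simple window test: if the system was engaged at any point in the 30 seconds before impact, the crash is reportable, so a shutoff one second before impact does not exempt it. A minimal sketch (the function and parameter names are hypothetical; only the 30-second window comes from the order as described above):

```python
from typing import Optional

REPORTING_WINDOW_S = 30.0  # from the NHTSA order described above

def adas_crash_reportable(disengaged_s_before_impact: Optional[float]) -> bool:
    """True if driver assistance was in use within 30 s of impact.

    None means the system was still engaged at the moment of impact.
    """
    if disengaged_s_before_impact is None:
        return True
    return disengaged_s_before_impact <= REPORTING_WINDOW_S

print(adas_crash_reportable(1.0))   # True: a shutoff 1 s before impact still counts
print(adas_crash_reportable(45.0))  # False: disengaged well outside the window
```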

“These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” NHTSA’s administrator, Steven Cliff, said in a call with media about the full data set from manufacturers.

Tesla did not immediately respond to a request for comment. Tesla has argued that Autopilot is safer than normal driving when crash data is compared. The company has also pointed to the vast number of traffic crash deaths on U.S. roadways annually, estimated by NHTSA at 42,915 in 2021, hailing the promise of technologies like Autopilot to “reduce the frequency and severity of traffic crashes and save thousands of lives each year.”

Data pitting normal driving against Autopilot is not directly comparable because Autopilot operates largely on highways. Tesla CEO Elon Musk, however, had described Autopilot as “unequivocally safer.”
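
The comparability problem is a denominator problem: raw crash counts say little without exposure, i.e. how many miles were driven with each system active and on what kinds of roads. A sketch of the normalization the public data cannot support (every mileage figure below is invented purely for illustration):

```python
# NOTE: the mileage numbers here are made up; neither Tesla nor NHTSA
# publishes the per-mode exposure needed to compute real rates.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / (miles / 1_000_000)

autopilot_rate = crashes_per_million_miles(273, 3_000_000_000)          # assumed AP miles
overall_rate = crashes_per_million_miles(5_200_000, 3_200_000_000_000)  # assumed US totals

print(f"{autopilot_rate:.3f} vs {overall_rate:.3f} crashes per million miles")
# Even with real denominators, the rates would not be directly comparable:
# Autopilot miles are overwhelmingly highway miles, which are safer per mile.
```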


Musk said as recently as January that there had been no crashes or injuries involving the Full Self-Driving beta software, which has been rolled out to a more limited number of drivers for testing. NHTSA officials said their data was not expected to specify whether Full Self-Driving was active at the time of the crash.

The reports present a new window into systems like Autopilot, but the database remains a work in progress, with many unknowns even in the raw data and questions left outstanding. The data does not lend itself easily to comparisons between different manufacturers, because it does not include information such as how many vehicle miles the different driver-assistance systems were used across, or how widely they are deployed across carmakers' fleets.

Still, the information gives regulators a more complete look than they had before. Previously, regulators relied on a piecemeal collection of data from media reports, manufacturer notifications and other sporadic sources to learn about incidents involving advanced driver-assistance.

“It revealed that more crashes are happening than NHTSA had previously known,” said Phil Koopman, an engineering professor at Carnegie Mellon University who focuses on autonomous vehicle safety. He noted that the reports may omit more minor crashes, including fender benders.

The data set doesn’t include every piece of information that would be helpful to know, but it could be an early indication of a focus on gathering more information and using that to improve technologies and safety regulations, said Bryant Walker Smith, a law professor at University of South Carolina who studies emerging transport technologies.

“The promise of these, the potential of these is ultimately to make driving safer,” he said of the driver assistance technologies. “It’s an open question whether these systems overall or individual systems have accomplished that.”

Companies such as Tesla collect more data than other automakers, which might leave them overrepresented in the data, according to experts in the systems as well as some officials who spoke on the condition of anonymity to candidly describe the findings. Tesla also pilots much of the technology, some of which comes standard on its cars, putting it in the hands of users who become familiar with it more quickly and use it in a wider variety of situations.

Several lawmakers weighed in on the report Wednesday, with some calling for greater investigation and possible safety standards for cars with the technology. Sen. Richard Blumenthal (D-Conn.) called the findings “cause for deep alarm.”

“It is a ringing alarm bell affirming many of the warnings that we’ve made over the years,” he said. “The frequency and severity of these crashes is a cause for yellow lights flashing and maybe red lights flashing on some of this technology.”

Blumenthal and Sen. Edward J. Markey (D-Mass.) have previously criticized Tesla for putting software on the roads “without fully considering its risks and implications.” On a call with media Wednesday, Markey called out Tesla’s assertion that Autopilot technology makes cars safer.

“This report provides further evidence slamming the brakes on those claims by Tesla,” he said.

The senators plan to send a letter to NHTSA requesting the regulator “take additional steps in order to protect safety,” Markey said.

Driver-assistance technology has grown in popularity as owners have sought to hand over more of the driving tasks to automated features, which do not make the cars autonomous but can offer relief from certain physical demands of driving. Automakers such as Subaru and Honda have added driver-assistance features that act as a more advanced cruise control, keeping set distances from other vehicles, maintaining speed and following marked lane lines on highways.

But none of them operate in as broad a set of conditions, such as residential and city streets, as Tesla’s systems do. NHTSA disclosed last week that Tesla’s Autopilot is on around 830,000 vehicles dating back to 2014.

Autopilot has spurred several regulatory probes, including into crashes with parked emergency vehicles and the cars’ tendency to halt for imagined hazards.

As part of its probe into crashes with parked emergency vehicles, NHTSA has said it is looking into whether Autopilot “may exacerbate human factors or behavioral safety risks.”

Autopilot has been tied to deaths in crashes in Williston and Delray Beach, Fla., as well as in Los Angeles County and Mountain View, Calif. The driver-assistance features have drawn the attention of NHTSA, which regulates motor vehicles, and the National Transportation Safety Board, an independent body charged with investigating safety incidents.


Federal regulators last year ordered car companies including Tesla to submit crash reports within a day of learning of any incident involving driver assistance that resulted in a death or hospitalization because of injury, or that involved a person being struck. Companies are also required to report crashes involving the technology that included an air bag deployment or cars that had to be towed.
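
Stated as a predicate, the order's triggers look roughly like this (a sketch; the field names are mine, the criteria come from the paragraph above):

```python
from dataclasses import dataclass

@dataclass
class AdasCrash:
    death: bool
    hospitalization: bool   # hospitalization because of injury
    person_struck: bool     # e.g. a pedestrian or cyclist was hit
    airbag_deployed: bool
    towed: bool

def report_within_one_day(c: AdasCrash) -> bool:
    """Incidents that must be reported within a day of the company learning of them."""
    return c.death or c.hospitalization or c.person_struck

def reportable(c: AdasCrash) -> bool:
    """All driver-assistance crashes the order requires companies to report."""
    return report_within_one_day(c) or c.airbag_deployed or c.towed

# Example: an airbag deployment alone is reportable, but not on the one-day clock.
crash = AdasCrash(False, False, False, airbag_deployed=True, towed=False)
print(report_within_one_day(crash), reportable(crash))  # False True
```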

The agency said it was collecting the data because of the “unique risks” of the emerging technology, to determine whether manufacturers are making sure their equipment is “free of defects that pose an unreasonable risk to motor vehicle safety.”


Carmakers and hardware-makers reported 46 injuries from the crashes, including five serious injuries. But the total injury rate could be higher — 294 of the crashes had an “unknown” number of injuries.

One additional fatality was reported, but regulators noted it wasn’t clear if the driver-assistance technology was being used.

Honda reported 90 crashes during the same time period involving advanced driver-assistance systems, and Subaru reported 10.

In a statement, Honda spokesman Chris Martin urged caution when comparing companies’ crash report data, noting that the firms have different ways to collect information. Honda’s reports “are based on unverified customer statements regarding the status of ADAS systems at the time of a reported crash,” he said.

Some systems appear to disable in the moments leading up to a crash, potentially allowing companies to say they were not active at the time of the incident. NHTSA is already investigating 16 incidents involving Autopilot where Tesla vehicles slammed into parked emergency vehicles. On average in those incidents, NHTSA said: “Autopilot aborted vehicle control less than one second prior to the first impact.”

Regulators also released data on crashes reported by automated driving systems, which are commonly called self-driving cars. These cars are far less common on roads, loaded with sophisticated equipment and not commercially available. A total of 130 crashes were reported, including 62 from Waymo, a sister company to Google.

Waymo spokesman Nick Smith said in a statement that the company sees the value in collecting the information and said “any reporting requirements should be harmonized across all U.S. jurisdictions to limit confusion and potentially enable more meaningful comparisons, and NHTSA’s effort is a step toward achieving that goal.”

The automated driving systems report shows no fatalities and one serious injury. There was also one report of an automated driving crash involving Tesla, which has tested autonomous vehicles in limited capacities in the past, though the circumstances of the incident were not immediately clear.

In the crashes where advanced driver assistance played a role, and where further information on the collision was known, vehicles most frequently collided with fixed objects or other cars. Among the rest, 20 hit a pole or tree, 10 struck animals, two crashed into emergency vehicles, three struck pedestrians and at least one hit a cyclist.

When the vehicles reported damage, it was most commonly to the front of the car, which was the case in 124 incidents. Damage was more often concentrated on the front left, or driver’s side, of the car, rather than the passenger’s side.

The incidents were heavily concentrated in California and Texas, the two most populous states and also the U.S. locations Tesla has made its home. Nearly a third of the crashes involving driver assistance, 125, occurred in California. And 33 took place in Texas.
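
Tabulating the breakdown above (only the categories the article enumerates; they do not sum to the full 392 because many reports lacked collision details):

```python
struck = {
    "pole or tree": 20,
    "animal": 10,
    "emergency vehicle": 2,
    "pedestrian": 3,
    "cyclist": 1,          # "at least one"
}
front_damage = 124         # incidents with damage to the front of the car
by_state = {"California": 125, "Texas": 33}
total_reports = 392

for state, n in by_state.items():
    print(f"{state}: {n} ({n / total_reports:.0%} of all reports)")
# California: 125 (32%) -> "nearly a third"; Texas: 33 (8%)
```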

mmarshall is offline  
Old 06-15-22, 11:20 AM
  #1032  
LeX2K
Lexus Fanatic
 
Join Date: Sep 2010
Location: Alberta
Posts: 20,219
Received 2,938 Likes on 2,474 Posts

Citing limitations, such as information needed to contextualize totals, the NHTSA said that the data can’t be used to compare the safety of manufacturers against one another. It notes that access to crash data may affect reporting and that initial reports may reflect incomplete or unknown information. Tesla did not immediately respond to a request for comment.
source

In other words, the data is useless.
LeX2K is offline  
Old 06-15-22, 01:20 PM
  #1033  
Och
Lexus Champion
Join Date: Feb 2003
Location: NY
Posts: 16,436
Likes: 0
Received 14 Likes on 13 Posts

Originally Posted by mmarshall
The latest data shows that the number of crashes involving Tesla's driver-assistance systems is much higher than previously thought.


https://www.washingtonpost.com/techn...-kT67mY-ty9Rx0

Teslas running Autopilot involved in 273 crashes reported since last year

Regulators released the first batch of data since mandating that companies such as Tesla report on serious crashes involving their driver-assistance systems.

Mike, this actually gets far worse. A lot of crashes happen when the Tesla is in FSD mode, and then at the last moment, when the crash is unavoidable, the software panics and cancels, or the driver panics and presses the brake, cancelling FSD - so on paper it is not an FSD crash, but it is the result of using FSD.

This garbage needs to be banned ASAP. Good thing we don't have any of that nonsense in NYC.
Och is offline  
Old 06-15-22, 02:02 PM
  #1034  
SW17LS
Lexus Fanatic
 
Join Date: May 2012
Location: Maryland
Posts: 57,294
Received 2,731 Likes on 1,956 Posts

I really think so much of this stems from how Tesla markets this system.
SW17LS is offline  
Old 06-19-22, 09:21 AM
  #1035  
LeX2K
Lexus Fanatic
 
Join Date: Sep 2010
Location: Alberta
Posts: 20,219
Received 2,938 Likes on 2,474 Posts

Tesla is getting close, but so far edge cases still trip it up. Even so, this is impressive.

LeX2K is offline  

