Self-Driving Vehicles
#766
Lexus Fanatic
iTrader: (20)
Looks like it weighed the white line on the left too heavily, considered it the left edge of the lane, and proceeded... not good, but fixable.
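For context on what "weighed the white line too heavily" might mean: lane-keeping systems typically score every detected marking and pick the highest-scoring pair as the lane edges, so a stray high-contrast line can out-score the true edge. A toy sketch of that failure mode (the scoring function, weights, and marking data here are all hypothetical, not Tesla's actual algorithm):

```python
# Toy illustration of lane-edge selection by weighted scoring.
# Markings are (lateral_offset_m, contrast) pairs, offsets relative to the car.

def pick_lane_edges(markings, expected_half_width=1.8):
    """Score each detected marking and return (left, right) edge offsets.

    A marking scores higher when it is high-contrast and near where a
    lane edge is expected. Over-weighting contrast (as the post speculates)
    lets a bright stray line beat the true edge.
    """
    def score(offset, contrast, side):
        expected = -expected_half_width if side == "left" else expected_half_width
        position_term = 1.0 / (1.0 + abs(offset - expected))
        return 0.7 * contrast + 0.3 * position_term  # contrast-heavy weighting

    left = max((m for m in markings if m[0] < 0),
               key=lambda m: score(m[0], m[1], "left"))
    right = max((m for m in markings if m[0] >= 0),
                key=lambda m: score(m[0], m[1], "right"))
    return left[0], right[0]

# A bright stray line at -0.4 m out-scores the faded true edge at -1.8 m:
markings = [(-1.8, 0.5), (-0.4, 0.9), (1.8, 0.6)]
print(pick_lane_edges(markings))  # → (-0.4, 1.8): the stray line wins the left edge
```

With a fainter stray line (or a weighting that favors position over contrast), the true edge at -1.8 m wins instead, which is why this kind of bug is "not good, but fixable."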
#767
Lexus Champion
iTrader: (3)
Tesla's Autopilot is just a primitive gimmick that is only going to make idiot drivers even more negligent and distracted.
#768
Lexus Fanatic
iTrader: (20)
#769
Lexus Fanatic
iTrader: (20)
#770
Lexus Champion
In addition to the safety concern of not having a human backup driver (although that proved pointless in the Uber AZ crash), there's the burgeoning question of liability: who would be liable for damages caused by a completely driverless car? The software maker? In light of this unknown and as-yet-untested ground, I give Waymo credit; it takes ***** to put themselves out there as a possible test case.
If we use the recent Uber collision as an example, we should hold Uber responsible: not the driver, not the provider of the lidar transmitter-receiver, not the provider of the software that interprets the lidar data, not the provider of the software that controls the throttle (using information from the lidar), not the provider of the software that controls the steering (using information from the lidar), ... Each piece of software, controlling a different driving system in the car, may come from a different supplier.
All of the hardware and software must work together to drive the car, and blaming any one unit out of the many runs the risk of blaming the wrong provider. But it is Uber that put it all together; Uber was responsible for ensuring that each little piece works with another piece and they all work together to drive the car. Uber was responsible for using only one lidar unit -- rather than many as an earlier car had used -- that may have given that car blind spots so that the pedestrian was not detected.
The provider of that automated vehicle made the big decisions and took the big risks. The provider of that whole vehicle must be held responsible, not the driver or any one provider of the many interacting systems.
#771
Lexus Champion
iTrader: (3)
One more nail into the coffin of self driving lunacy.
https://arstechnica.com/cars/2018/04...-in-18-months/
It is no secret that Tesla's Autopilot project is struggling. Last summer, we covered a report that Tesla was bleeding talent from its Autopilot division. Tesla Autopilot head Sterling Anderson quit Tesla at the end of 2016. His replacement was Chris Lattner, who had previously created the Swift programming language at Apple. But Lattner only lasted six months before departing last June.
Now Lattner's replacement, Jim Keller, is leaving Tesla as well.
#772
Lexus Fanatic
#773
First-line responsibility rests with the provider of the product or the service, not with the provider of one part of the whole system.
#774
finally, ty Toyota: https://www.autoblog.com/2018/05/03/...higan-1791826/
This should have been done first and foremost, prior to any road tests, and it's probably cheaper than a 2D/3D simulator.
Autonomous testing is taking off around the country, and Michigan is not missing out on the action. Notably, the University of Michigan has its MCity testing grounds, while another self-driving test facility, the American Center for Mobility, has broken ground at the historic former Willow Run manufacturing site. Now Toyota will have its own autonomous testing facility in southeastern Michigan, near Toledo, Ohio. Today, the Toyota Research Institute (TRI) has announced it will open a 60-acre testing site at the Michigan Technical Resource Park in Ottawa Lake.
The automated driving facility will be built within the existing 1.75-mile oval track, to which TRI will also have access. Research will primarily focus on re-creating "edge case" scenarios, which will push autonomous driving systems to their limits in ways that would be too hazardous for testing on public roads. It will replicate highway on- and off-ramps, four-lane highways, city traffic and slick surfaces.
"By constructing a course for ourselves, we can design it around our unique testing needs and rapidly advance capabilities, especially with Toyota Guardian automated vehicle mode," said TRI senior VP of automated driving Ryan Eustice. "This new site will give us the flexibility to customize driving scenarios that will push the limits of our technology and move us closer to conceiving a human-driven vehicle that is incapable of causing a crash."
TRI's automated vehicle test facility is expected to open in October.
#775
#776
Lexus Champion
iTrader: (3)
Tesla hits stationary car:
http://bgr.com/2018/05/29/tesla-auto...ch-police-car/
#777
Lexus Fanatic
This one should really open some eyes.
......... it totalled a police vehicle.
IMO, Och is being proved correct. There are some serious problems with this technology.....and it doesn't seem to be getting any better.
![EEK!](https://www.clublexus.com/forums/images/smilies/eek1.gif)
Last edited by mmarshall; 05-29-18 at 05:43 PM.
#778
SAN FRANCISCO — Police in Tempe, Arizona said evidence showed the "safety" driver behind the wheel of a self-driving Uber was distracted and streaming a television show on her phone right up until about the time of a fatal accident in March, deeming the crash that rocked the nascent industry "entirely avoidable."
A 318-page report from the Tempe Police Department, released late on Thursday in response to a public records request, said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up just a half second before the car hit 49-year-old Elaine Herzberg, who was crossing the street at night.
According to the report, Vasquez could face charges of vehicle manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.
Police obtained records from Hulu, an online service for streaming television shows and movies, which showed Vasquez's account was playing the television talent show "The Voice" the night of the crash for about 42 minutes, ending at 9:59 p.m., which "coincides with the approximate time of the collision," the report says.
It is not clear if Vasquez will be charged, and police submitted their findings to county prosecutors, who will make the determination. The Maricopa County Attorney's Office referred the case to the Yavapai County Attorney's office because of a conflict and that office could not be reached late Thursday.
Vasquez could not immediately be reached for comment and Reuters could not locate her attorney.
The Uber car was in autonomous mode at the time of the crash, but Uber, like other self-driving car developers, requires a back-up driver in the car to intervene when the autonomous system fails or a tricky driving situation occurs.
Vasquez looked up just 0.5 seconds before the crash, after keeping her head down for 5.3 seconds, the Tempe Police report said. Uber's self-driving Volvo SUV was traveling at just under 44 miles per hour.
Uber declined to comment.
Last month, an Uber spokeswoman said the company was undergoing a "top-to-bottom safety review," and had brought on a former federal transportation official to help improve the company's safety culture. The company prohibits the use of any mobile device by safety drivers while the self-driving cars are on a public road, and drivers are told they can be fired for violating this rule.
Police said a review of video from inside the car showed Vasquez was looking down during the trip, and her face "appears to react and show a smirk or laugh at various points during the times that she is looking down." The report found that Vasquez "was distracted and looking down" for close to seven of the nearly 22 minutes prior to the collision.
Tempe Police Detective Michael McCormick asked Hulu for help in the investigation, writing in a May 10 email to the company that "this is a very serious case where the charges of vehicle manslaughter may be charged, so correctly interpreting the information provided to us is crucial." Hulu turned over the records on May 31.
According to a report last month by the National Transportation Safety Board, which is also investigating the crash, Vasquez told federal investigators she had been monitoring the self-driving interface in the car and that neither her personal nor business phones were in use until after the crash. That report showed Uber had disabled the emergency braking system in the Volvo, and Vasquez began braking less than a second after hitting Herzberg.
Herzberg, who was homeless, was walking her bicycle across the street, outside of a crosswalk on a four-lane road, the night of March 18 when she was struck by the front right side of the Volvo.
The police report faulted Herzberg for "unlawfully crossing the road at a location other than a marked crosswalk."
In addition to the report, police released on Thursday a slew of audio files of 911 calls made by Vasquez, who waited at the scene for police, and bystanders the night of the crash; photographs of Herzberg's damaged bicycle and the Uber car; and videos from police officers' body cameras that capture the minutes after the crash, including harrowing screams in the background.
The crash dealt Uber a major setback in its efforts to develop self-driving cars, and the company shuttered its autonomous car testing program in Arizona after the incident. It says it plans to begin testing elsewhere this summer, although in some cities it will have to first win over increasingly wary regulators.
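The report's figures (44 mph, head down for 5.3 seconds, eyes up only 0.5 seconds before impact) put the inattention in perspective. A back-of-the-envelope calculation of the distances involved:

```python
# Distance covered during the reported head-down interval, at ~44 mph,
# using the figures from the Tempe police report quoted above.
speed_mph = 44
speed_ms = speed_mph * 1609.344 / 3600   # mph to m/s, ≈ 19.7 m/s

head_down_s = 5.3    # seconds spent looking down before the crash
look_up_s = 0.5      # seconds between looking up and impact

print(f"speed: {speed_ms:.1f} m/s")
print(f"head-down distance: {speed_ms * head_down_s:.0f} m")   # ≈ 104 m
print(f"look-up distance:   {speed_ms * look_up_s:.1f} m")     # ≈ 9.8 m
```

In other words, the car covered roughly the length of a football field while the driver was looking down, and under ten meters remained once she looked up, far less than the stopping distance at that speed.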
#779
Lexus Fanatic
If this crash happened because the driver was streaming and not paying attention, then that proves that "self-driving" cars are not truly self driving.
![Wink](https://www.clublexus.com/forums/images/smilies/wink.gif)
#780
The fact that a "self-driving" car needs a "safety driver" behind the wheel in the first place means that the car maker does not believe that its "self-driving" car can fully self-drive just yet, so nothing needs proving here. The safety driver did not do her job, that's all.
Last edited by ydooby; 06-22-18 at 04:59 PM.