First Tesla Fatality Using the Autopilot.
#46
Moderator
Even Toyota's Safety Sense system wouldn't be able to sense a trailer ahead of the car; you'd need lidar or longer-range radar for that. As they say in aviation, sometimes the holes in the cheese line up and an accident happens.
The human brain is being pushed to its limits when you're driving at highway speeds. Recognizing objects, assigning potential vectors, and taking evasive action all take time, and being distracted adds seconds to that. I don't know whether the driver could have braked in time had he not been watching a movie, but an adequately equipped and programmed computer system could have reacted much faster.
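To put rough numbers on that reaction-time gap, here's a quick back-of-the-envelope sketch. All the speeds, reaction times, and deceleration figures below are illustrative assumptions, not measured data:

```python
# Rough stopping-distance comparison at highway speed.
# Stopping distance = distance covered during the reaction time
# plus the braking distance v^2 / (2a).

def stopping_distance(speed_mps, reaction_s, decel_mps2=7.0):
    """Distance travelled during reaction time plus braking distance (metres)."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

speed = 29.0  # roughly 65 mph, in m/s

attentive = stopping_distance(speed, reaction_s=1.5)   # alert driver
distracted = stopping_distance(speed, reaction_s=4.0)  # eyes off the road
computer = stopping_distance(speed, reaction_s=0.2)    # sensing + actuation lag

print(f"attentive:  {attentive:.0f} m")
print(f"distracted: {distracted:.0f} m")
print(f"computer:   {computer:.0f} m")
```

Even a couple of extra seconds of distraction adds the length of a football field's worth of travel before the brakes ever engage.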
#47
Lexus Fanatic
How are you ever going to know if the system is "ready"? Automakers can test their tech all they want, but until it is released into the wild there is no way to be certain the software is robust enough. When other automakers have autopilot systems in production, you can bet they will experience issues as well; for all we know, they will be much worse than what Tesla has.
#48
Any sensor mounted on the lower grille probably won't be able to see a tall obstacle like a trailer. You'd need a puck on the roof or infrared stereo cameras just ahead of the rearview mirror. Inter-car comms would be the best method but that could take decades to implement.
#49
Inter-car comms - a virtual network where every vehicle within a certain radius provides location and activity status updates - is the end solution. Everything else is just popcorn along the way.
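For what it's worth, here's a toy sketch of what that kind of broadcast could look like. The message fields, radius, and status strings are all invented for illustration; real V2V standards (DSRC, C-V2X) define their own message formats:

```python
# Toy model of inter-car comms: each vehicle broadcasts a small status
# message, and a listener filters for vehicles within its radius.
import math
from dataclasses import dataclass

@dataclass
class V2VMessage:
    vehicle_id: str
    x: float          # position, metres
    y: float
    heading_deg: float
    speed_mps: float
    status: str       # e.g. "cruising", "braking", "turning"

def nearby(me, others, radius_m=300.0):
    """Return messages from other vehicles within the broadcast radius."""
    return [m for m in others
            if m.vehicle_id != me.vehicle_id
            and math.hypot(m.x - me.x, m.y - me.y) <= radius_m]

car = V2VMessage("car-1", 0, 0, 90, 29, "cruising")
truck = V2VMessage("truck-7", 120, 0, 270, 20, "turning")
far = V2VMessage("car-9", 5000, 0, 90, 30, "cruising")

print([m.vehicle_id for m in nearby(car, [truck, far])])  # → ['truck-7']
```

The hard parts in practice aren't this filtering logic but latency, trust, and getting every vehicle on the road to participate, which is why it could take decades.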
#52
Moderator
While the name Autopilot does imply certain things, MB just released their "Drive Pilot", so this may be a non-issue if big automakers are using similar naming conventions.
#53
Originally Posted by Lexus Fanatic
How are you ever going to know if the system is "ready"? Automakers can test their tech all they want, but until it is released into the wild there is no way to be certain the software is robust enough. When other automakers have autopilot systems in production, you can bet they will experience issues as well; for all we know, they will be much worse than what Tesla has.
It starts by taking a professional, safety-first approach. Hire the proper team, starting with experienced, senior engineers and managers who have a safety-critical automation background. I am not convinced that Tesla has taken this approach.
Then determining whether the system is ready starts at the very beginning. Carefully plan and determine what you want your automated system to do -- limit the functionality, if you have to, for your first product -- and then take an extremely disciplined approach to designing a system that implements your required functionality. This is the approach followed for decades in aerospace.
Once you have a product developed, have it tested by professional test engineers and test drivers (just as the aerospace industry tests with professional test engineers and test pilots). Be ready to refine the product during the testing rounds. Only when the professional test engineers and drivers give their approval do you release it to the general public. Never, ever release a safety-critical system for beta testing by unqualified testers (i.e. everyday drivers).
I work in the safety-critical system engineering industry. Neither I nor my colleagues are convinced that Tesla has followed this disciplined approach.
#54
Even if the Tesla automated driving system does not have full hands-off capability, the name "Autopilot" implies that it does.
#55
Lexus Fanatic
#56
Originally Posted by Sulu
This type of certification has been done for years in the aerospace and medical device industries.
It starts by taking a professional, safety-first approach. Hire the proper team, starting with experienced, senior engineers and managers who have a safety-critical automation background. I am not convinced that Tesla has taken this approach.
Then determining whether the system is ready starts at the very beginning. Carefully plan and determine what you want your automated system to do -- limit the functionality, if you have to, for your first product -- and then take an extremely disciplined approach to designing a system that implements your required functionality. This is the approach followed for decades in aerospace.
Once you have a product developed, have it tested by professional test engineers and test drivers (just as the aerospace industry tests with professional test engineers and test pilots). Be ready to refine the product during the testing rounds. Only when the professional test engineers and drivers give their approval do you release it to the general public. Never, ever release a safety-critical system for beta testing by unqualified testers (i.e. everyday drivers).
I work in the safety-critical system engineering industry. Neither I nor my colleagues are convinced that Tesla has followed this disciplined approach.
Here you have a very low bar to entry. You get some basic driver training and you're unleashed on the world. The size of your bank account determines the scope of what you can do.
But to take a page from aviation, and from what other replies here say, you could eventually create a sort of TCAS (Traffic Collision Avoidance System) for cars to prevent this from happening again.
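The core TCAS-style calculation, closest point of approach, is simple enough to sketch. The separation distance and time horizon below are arbitrary placeholders, not values from any real spec:

```python
# Closest-point-of-approach check for two vehicles on straight-line
# paths: compute when they will be nearest and how close that is,
# and flag a conflict if it's too close, too soon.

def closest_approach(p1, v1, p2, v2):
    """Return (time_s, distance_m) of closest approach, assuming constant velocity."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    vv = vx * vx + vy * vy
    t = 0.0 if vv == 0 else max(0.0, -(rx * vx + ry * vy) / vv)
    dx, dy = rx + vx * t, ry + vy * t
    return t, (dx * dx + dy * dy) ** 0.5

def conflict(p1, v1, p2, v2, min_sep_m=10.0, horizon_s=8.0):
    """Alert if the two paths pass within min_sep_m inside the time horizon."""
    t, d = closest_approach(p1, v1, p2, v2)
    return d < min_sep_m and t <= horizon_s

# Car heading east at 29 m/s; trailer crossing its path southbound.
print(conflict((0, 0), (29, 0), (120, 40), (0, -10)))  # → True
```

Real TCAS adds altitude bands, coordinated advisories between aircraft, and a lot of filtering; this only shows the geometric heart of the idea.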
Ultimately there is no way to create a completely idiot-proof system. Some people are just determined to get themselves killed.
Last edited by MattyG; 07-03-16 at 04:20 PM.
#57
Lexus Champion
https://www.theguardian.com/technolo...-car-elon-musk
I doubt a human driver could have done anything differently; the person might well have reacted the same way if they had been driving the car themselves.
Condolences to the families.
#58
Very well put, and certainly the standards that any sort of automotive self-drive system will need to meet. But in the aerospace and medical sectors, the end users are often highly educated and extremely skilled operators of the technology.
Here you have a very low bar to entry. You get some basic driver training and you're unleashed on the world. The size of your bank account determines the scope of what you can do.
Just as important as safe operation is safe failure: if any sensors fail, or there are conflicting signals that make the system unable to operate, there must be backups, or it must fail in such a way that the driver and passengers, or bystanders outside the vehicle, are not too badly harmed. This concept, too, comes from aerospace and medical devices.
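A minimal sketch of that fail-safe idea: if redundant sensors disagree or drop out, degrade gracefully instead of acting on bad data. The sensor names, spread threshold, and mode names are invented for illustration:

```python
# Fuse redundant range readings, failing safe on disagreement or dropout.

def fuse_range_readings(readings, max_spread_m=2.0):
    """Fuse redundant range readings into (mode, value).

    readings: dict of sensor name -> distance in metres, or None if that
    sensor has dropped out. Returns ("ok", fused distance) when sensors
    agree, ("degraded", nearest distance) when they conflict, or
    ("handover", None) when there's too little data to act on.
    """
    live = [r for r in readings.values() if r is not None]
    if len(live) < 2:
        return "handover", None        # not enough redundancy: alert the driver
    if max(live) - min(live) > max_spread_m:
        return "degraded", min(live)   # conflict: assume the nearest obstacle
    return "ok", sum(live) / len(live)

print(fuse_range_readings({"radar": 40.0, "camera": 41.0}))  # → ('ok', 40.5)
print(fuse_range_readings({"radar": 40.0, "camera": 80.0}))  # → ('degraded', 40.0)
print(fuse_range_readings({"radar": None, "camera": 40.0}))  # → ('handover', None)
```

The design choice worth noting is the conflict case: when sensors disagree, the safe assumption is the nearest obstacle, not an average.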
Agreed. But you cannot build a system that is absolutely perfect. The best you can hope to do is to design for the 99% who are not idiots.
#59
We have TCAS and ADS-B on planes and they still crash into the ground, into each other, or disappear completely. For ground vehicles, there's a big difference between what a vehicle plans to do and what it ends up doing, even if it's controlled by a computer.
I think we need autonomous driving systems that can take evasive action quickly when they detect something abnormal. If the system doesn't know what to do, it should immediately hand control back to the driver - who hopefully isn't asleep or watching a movie. You can only make truly driverless cars when every car on the road is controlled by a centralized computer. Until then, you still need a driver if you use independent autonomous systems.
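That handover rule can be boiled down to a tiny decision function. The confidence score, threshold, and mode names here are all hypothetical, just to show the shape of the logic:

```python
# Decide who drives, based on the system's self-reported confidence
# and whether the driver is paying attention.

def control_decision(confidence, driver_attentive, threshold=0.8):
    """Return 'autonomous', 'handover_to_driver', or 'emergency_stop'."""
    if confidence >= threshold:
        return "autonomous"
    if driver_attentive:
        return "handover_to_driver"
    return "emergency_stop"   # nobody is fit to drive: stop safely instead

print(control_decision(0.95, driver_attentive=False))  # → autonomous
print(control_decision(0.40, driver_attentive=True))   # → handover_to_driver
print(control_decision(0.40, driver_attentive=False))  # → emergency_stop
```

The third branch is the crucial one: a handover to a driver who is asleep or watching a movie is no handover at all, so the system needs its own last-resort safe stop.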
#60
I agree. From a marketing standpoint, "Autopilot" implies a completely hands-off (hands off the steering wheel, accelerator, and brake pedal) approach to automation.
Even if the Tesla automated driving system does not have full hands-off capability, the name "Autopilot" implies that it does.