Tesla "full self driving" discussion
#151
Tesla "summon" feature drives the car into a pole, and Tesla won't take any responsibility. Hilarious.
https://kmph.com/news/local/mans-tes...-valet-feature
#152
Full self driving is turning out to be a great source of comedy. It mistakes the moon for a yellow traffic light.
https://www.autoweek.com/news/green-...m=social-media
#154
My son's RX does the same thing. It has definitely beaten my response time more than once.
#156
I couldn’t agree more. With any of these driver-assistance systems, the driver just can’t expect to let the car do everything; it requires the driver to override as necessary. Really, the only thing I think Tesla is at fault for is the name of their ASSISTANCE package, because there are dumb people everywhere who will really think it’s an autopilot and hands-off. I’m sure other carmakers are too risk-averse to call their systems something as dumb as this, because they know people are dumb. I also don’t like it because it’s such a marketing ploy for those dumb people.
#157
#158
This raises a thought, but I don't think the name makes a difference. When I let new people try Autopilot, they usually micro-manage it like crazy and watch it like a hawk. The same was true with the radar cruise feature on my IS back in '06 (which, under a different name, operates on a similar principle). IME, people tend to have a healthy reluctance about letting the system handle itself until they develop enough confidence to let it do its thing.
The people who tend to do risky (stupid) things are the ones who actually have a good understanding of the system, which is why they (A) have enough knowledge to try and bypass it, and (B) feel comfortable enough to let it manage itself beyond its intended use, despite the alerts and warnings that would stop an otherwise newer/proper user. In other words, if we see someone climbing into their back seat, or whatever gets caught on video these days, they're definitely not confused about the name of the system or how it functions; it's deliberate, and they should be held accountable. Tesla could change the name to "not FSD" tomorrow and we'd see the same behaviors, because the intent is still there. It's a shame, because it's promising tech and the reporting rarely adds context, because clicks matter.
#160
#163
#164
No. Ultimately, that probably will not happen, for reasons of litigation... if that feature does in fact exist now, it won't last very long. If there is a crash and people inside are hurt or killed, those who produced software that allowed excessive speed will leave themselves wide open to lawsuits. Not only that, but in some states, 20 MPH over the posted limit is considered reckless driving, although, of course, in this case a machine, not a human, is doing the actual driving.
#165
Originally Posted by mmarshall
No. Ultimately, that probably will not happen, for reasons of litigation... if that feature does in fact exist now, it won't last very long. If there is a crash and people inside are hurt or killed, those who produced software that allowed excessive speed will leave themselves wide open to lawsuits. Not only that, but in some states, 20 MPH over the posted limit is considered reckless driving, although, of course, in this case a machine, not a human, is doing the actual driving.