View Poll Results: Tesla Model 3 Build Defects - Deal Breaker?
Yes: 31 (65.96%)
No: 11 (23.40%)
Maybe: 5 (10.64%)
Voters: 47.
Tesla Model 3 (merged megathread)
#556
Originally Posted by LexsCTJill
no it does not.
Last edited by EZZ; 05-10-21 at 06:31 PM.
#557
The cause wasn't determined. They say the cruise control could have been in operation, but they never announced a conclusion; it's still coming.
I think it was something else and it was full driver error.
But the car's cruise-control function could still have been in operation.
#558
Originally Posted by LexsCTJill
The cause wasn't determined. They say the cruise control could have been in operation, but they never announced a conclusion; it's still coming.
I think it was something else and it was full driver error.
#560
I think it implied the front. We know the two people got into the Tesla to test out the autonomous features, as reiterated by the family members. We know some aspect of the Autopilot safety system was on, but not full Autopilot (article linked below). We know one man was in the back and the front seat belt was buckled. I think the driver thought he was in Autopilot, left the seat belt buckled, and got into the back seat; then the car crashed because it wasn't on Autopilot but on cruise control. Doh...
https://thenextweb.com/news/tesla-vp...nt-fatal-crash
#565
"Tesla privately admits Elon Musk has been exaggerating about ‘full self-driving"
https://www.theverge.com/2021/5/7/22424592/tesla-elon-musk-autopilot-dmv-fsd-exaggeration
I don't think anyone is really surprised by this revelation (based on a lot of comments on the forum). What I find interesting is that a Tesla engineer has revealed this information.
Tesla CEO Elon Musk has been overstating the capabilities of the company’s advanced driver assist system, the company’s director of Autopilot software told the California Department of Motor Vehicles. The comments came from a memo released by legal transparency group PlainSite, which obtained the documents from a public records request.
It was the latest revelation about the widening gap between what Musk says publicly about Autopilot and what Autopilot can actually do. And it coincides with Tesla coming under increased scrutiny after a Tesla vehicle without anyone in the driver’s seat crashed in Texas, killing two men.
“ELON’S TWEET DOES NOT MATCH ENGINEERING REALITY PER CJ”
“Elon’s tweet does not match engineering reality per CJ. Tesla is at Level 2 currently,” the California DMV said in the memo about its March 9th conference call with Tesla representatives, including the director of Autopilot software CJ Moore. Level 2 technology refers to a semi-automated driving system, which requires supervision by a human driver.
In an earnings call in January, Musk told investors that he was “highly confident the car will be able to drive itself with reliability in excess of human this year.” (It would appear the DMV was referring to these January comments, which Moore misunderstood as a tweet from Musk.)
Last October, Tesla introduced a new product called “Full Self-Driving” (FSD) beta to vehicle owners in its Early Access Program. The update enabled drivers to access Autopilot’s partially automated driver assist system on city streets and local roads. The early access program is used as a testing platform to help iron out software bugs. In the DMV memo, Tesla said that as of March 9th there were 824 vehicles in the pilot program, including 753 employees and 71 non-employees.
Musk has said the company was handling the software update “very cautiously.” Drivers still are expected to keep their hands on the steering wheel and should be prepared to assume control of their Tesla at any time. But he has also offered lofty predictions about Tesla’s ability to achieve full autonomy that conflict with what his own engineers are saying to regulators.
Tesla is unlikely to achieve Level 5 (L5) autonomy, in which its cars can drive themselves anywhere, under any conditions, without any human supervision, by the end of 2021, Tesla representatives told the DMV.
The ratio of driver interaction would need to be in the magnitude of 1 or 2 million miles per driver interaction to move into higher levels of automation. Tesla indicated that Elon is extrapolating on the rates of improvement when speaking about L5 capabilities. Tesla couldn’t say if the rate of improvement would make it to L5 by end of calendar year.
This isn’t the first time that Tesla’s private communications with the DMV have contradicted Musk’s public declarations about his company’s autonomous capabilities. In March, PlainSite published communications from last December between Tesla’s associate general counsel Eric Williams and California DMV’s chief of the autonomous vehicles branch, Miguel Acosta. In it, Williams notes that “neither Autopilot nor FSD Capability is an autonomous system, and currently no comprising feature, whether singularly or collectively, is autonomous or makes our vehicles autonomous.” In other words, Tesla’s FSD beta is self-driving in name only.
“TESLA COULDN’T SAY IF THE RATE OF IMPROVEMENT WOULD MAKE IT TO L5 BY END OF CALENDAR YEAR”
(Al Prescott, acting general counsel at Tesla, was also involved in the December meeting with the DMV. Prescott has since left Tesla for LIDAR maker Luminar.)
Tesla and Musk have long been criticized for overstating the capabilities of the company’s Autopilot system, which in its most basic form can center a Tesla vehicle in a lane and around curves and adjust the car’s speed based on the vehicle ahead. The use of brand names like Autopilot and FSD has also helped contribute to an environment in which Tesla customers are misled into believing their vehicles can actually drive themselves.
There have been a number of fatal crashes involving Tesla vehicles with Autopilot enabled. The latest took place in Spring, Texas, in which two men were killed after their Tesla smashed into a tree. Local law enforcement said there was no one in the driver’s seat at the time of the crash, leading to speculation that the men were misusing Autopilot. Later, Tesla claimed that Autopilot was not in use at the time of the crash and that someone may have been in the driver’s seat after all.
The US National Highway Traffic Safety Administration and the National Transportation Safety Board are both investigating the crash, in addition to dozens of other incidents involving Tesla Autopilot. Tesla didn’t respond to a request for comment, likely because the company has dissolved its press office and typically doesn’t respond to media requests anymore.
#566
good find Hameed, and no surprise. Musk is a genius but like Steve Jobs before him, loves to project this distorted (exaggerated) reality.
this test (below) was very well done and informative about autopilot. they praise a lot of it, but show its shortcomings, straight up weirdness, and scary moments where if they weren't paying close attention the car may have killed them.
i love cars, i love the idea of cars driving themselves one day, but no way i'm spending $10K for the 'privilege' of testing mostly vaporware and features on 'beta' to drive my car.
#567
FSD isn't worth it. Only rich Tesla owners who want to use it as a toy buy it for ****s and giggles. Most people on the forums don't get it. They buy the car because it drives well, is cost-effective, and because they like the other tech.
I don't think we will see L5 autonomy for a good 10 years. Maybe even longer. Humans are far too unpredictable for computers to easily adapt to the current driving environment.