2009 J.D. Power Initial Quality Study-Lexus receives awards for IS, GS, GX, LS & LX
#32
Guest
Posts: n/a
Sorry, Mr. "Surveys Suck" says your Dad didn't get one either, and you're lying too
#33
Lexus Fanatic
iTrader: (2)
Initial quality means very little in my book. My Fusion was great for a few months, then all hell broke loose, and that POS rated great in initial quality.
Who purchases a car for 3 months? Initial quality is not a bulletproof indicator of long-term reliability.
#34
Lexus Champion
Join Date: Sep 2006
Location: Michigan
Posts: 2,025
Likes: 0
Received 0 Likes on 0 Posts
I have never seen a CR form
#37
Lexus Fanatic
iTrader: (2)
You're the same guy that balks at people who share their experiences as well?
So you want us to
1. Believe you
2. Not believe other people and their stories
3. Not believe JD Power
Wow
I strongly suggest reading this before people go from laughing at your posts to not paying any attention to them.
http://www.dobney.com/Research/MR_basics.htm
Here is a survey I received that I posted about a while back:
https://www.clublexus.com/forums/car...cle-study.html
Last edited by DASHOCKER; 06-23-09 at 03:03 AM.
#38
Since when does the average woman know anything about a car other than if it looks cute or not? I think you basically just admitted to being a badge *****.
Anyhow... IMO he asked a valid question (even though he didn't ask it very politely)... if you discredit JD Power and CR entirely, what other surveys do you give credit to, or do you just not believe anything you ever read and assume that your personal experience is the best metric of future experiences?
While I don't consider any survey to be the absolute word on quality, if CR, JDP, Consumer Guide, Wards, etc., plus the various mags/editorials, are all telling me similar things... then I'll definitely take note. I combine that with my own personal experience, but I tend to give my experience less priority than what I read, since I believe in statistics and the law of averages as a whole when the surveys are executed properly.
#39
#40
Lexus Fanatic
iTrader: (2)
Let's see what J.D. Power is all about...
The Truth About J.D. Power’s IQS
By Michael Karesh
June 13, 2006
Another year, another J.D. Power survey. Since the non-profit Consumer Reports organization prohibits carmakers from using its ratings in their ads, “ranked highest in initial quality by J.D. Power and Associates” should start flooding the airwaves and Internet any minute now, with print sure to follow. But does all of this noise signify anything? Should those seeking trouble-free wheels be sure to buy one of J.D.’s winners? Hardly.
First, note the “initial” that qualifies “quality.” Power surveys car owners on “problems” encountered within their first 90 days of ownership. Most people understand that a car that’s reliable for 90 days isn’t necessarily reliable beyond that. But there’s a bigger issue. J.D. Power’s IQS has been redesigned (for the second time) to encompass a larger number of potential defects. And the more the IQS includes, the less it measures what most people want to know: vehicle reliability.
The previous redesign doubled the average number of reported problems per car by extending the IQS beyond defects (that can be fixed) to designed-in annoyances (that must be endured). For example, cupholder dissatisfaction famously slammed MINI’s score. The 2006 IQS report takes a step in the right direction by including subscores for "design quality" and "production quality." Combining two very different elements into a single score makes it unclear what the number represents. Yet this score receives 99 percent of the press coverage and 100 percent of the ad citations.
If you compare the rankings based on production quality alone, the brands’ relative positions change dramatically. BMW bounds 24 places to third; Buick jumps 14 to eighth; MINI ascends 13 to 16th; Mercedes-Benz climbs nine also to 16th; Subaru also gains nine to 19th. At the same time, Dodge drops eight to 27th; GMC plummets 13 rungs to 22nd; Nissan plunges ten, also to 22nd. Eight others change position by at least five slots. These include Chrysler, which shares many models with Dodge yet moves up five places, to fifth. Out of 37 brands, 16 rankings are heavily affected by the inclusion of design quality.
Beyond the cloudiness of the revised methodology, the way the results are reported and spun continues to put too much emphasis on relative rankings. In fact, absolute differences are often minuscule. Looking at defect rates alone, 22 out of 37 brands fall within one-tenth of a Problem per Car (PPC) of the 0.64 average. Thirty of 37 brands fall within two-tenths. Of the seven beyond this range, only one, Lexus, is on the top, and it only betters the average by 0.22 problems per car.
Stay with me here. The best brand, Lexus, has 0.42 problems per car, while the worst, Isuzu, has 1.10: a best-to-worst difference of 0.68 problems per car. Even this range results from a few especially low-scoring brands. The difference between number three (Toyota) and number 32 (Hummer) is a scant 0.27 problems per car. It’s ironic, since brands at the bottom of the chart receive the least attention in J.D. Power’s press releases. For years they didn’t even publicly release below-average scores.
Put another way, a Toyota compared to a Hummer has a one in four chance of having a single additional problem. Even comparing a car from Isuzu with one from Lexus, only two in three cars will have a single additional problem. What’s more, this additional problem is likely to be the only problem. Folks, we're talking about a single trip to the dealer for a single problem–which you still face nearly even odds of taking if you buy the best brand.
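The per-car arithmetic above can be sanity-checked with a toy model. This sketch assumes (my assumption, not the article's or J.D. Power's) that defects occur independently, so per-car problem counts follow a Poisson distribution with the reported PPC figure as its mean; the 0.42 and 1.10 values are the Lexus and Isuzu numbers quoted in the article.

```python
import math

# Toy model: treat problems-per-car (PPC) as the mean of a Poisson
# distribution of independent defects. This is a simplifying assumption
# for illustration only; the IQS publishes a single PPC figure, not a
# distribution.

def p_at_least_one(ppc):
    """P(a car has >= 1 reported problem) under a Poisson(ppc) model."""
    return 1 - math.exp(-ppc)

best, worst = 0.42, 1.10  # Lexus and Isuzu figures from the article

# Expected additional problems, worst brand vs. best brand.
print(f"Expected extra problems: {worst - best:.2f}")        # 0.68, as in the article
print(f"P(>=1 problem), best:    {p_at_least_one(best):.2f}")
print(f"P(>=1 problem), worst:   {p_at_least_one(worst):.2f}")
```

Under this toy model even the best brand leaves roughly a one-in-three chance of at least one dealer trip, which is the article's broader point: the absolute gaps between brands are small compared to the baseline chance of having some problem at all.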
The reason why J.D. Power lumps design quality into the IQS is clear: without it, the differences between brands are rarely worth debating. And the smaller the differences, the less people care about IQS. And the less the public cares about IQS, the less automakers will pay to advertise IQS scores, and hire Power consultants to help improve them. This would truly be a problem – for J.D. Power.
J.D. needs to re-think their methodology and reporting. They should keep problems that require repair separate from other issues. Forget the brands, they don’t vary enough. Instead, emphasize model scores. Next, focus less on rankings and who is the best and more on the size of the differences and who is within spitting distance of the best. Finally, J.D. Power needs to shift their emphasis away from “initial quality” towards long-term durability. Manufacturers won’t like that a longer-term study keeps new models off J.D.-branded consumer radar, but anything less is, well, less.
Heck, J.D. Power might even rake in more cash this way. Anyone reasonably near the top—and not just those at the top—could advertise "ranked good enough in quality that you should focus on other criteria by J.D. Power and Associates." No, it's not punchy. Just the truth.
http://www.thetruthaboutcars.com/the...jd-powers-iqs/
Happy shopping
#41
Pole Position
I guess Ferraris and Lambos are the most reliable and proven quality vehicles on the road... my gawd, how stupid and biased can you get?
#42
Pole Position
I had to stoop to the level of a child for a moment. No, I am not a badge blah blah blah. I never let magazines, or surveys that have not changed since the days of the high-top fade, dictate my buying decision. These sample surveys are not accurate. Did any of you get a survey from J.D. Power? I've bought 3 new cars to date, going on 4, and I never got a survey from J.D. Power. I base my quality assessment on experiencing the product on my own, or in my case buying them. That is more accurate than a study-hall set of survey participants in my book.
#43
Pole Position
Let's see what J.D. Power is all about...
The Truth About J.D. Power’s IQS
By Michael Karesh
June 13, 2006
J.D. needs to re-think their methodology and reporting. They should keep problems that require repair separate from other issues. Forget the brands, they don’t vary enough. Instead, emphasize model scores. Next, focus less on rankings and who is the best and more on the size of the differences and who is within spitting distance of the best. Finally, J.D. Power needs to shift their emphasis away from “initial quality” towards long-term durability. Manufacturers won’t like that a longer-term study keeps new models off J.D.-branded consumer radar, but anything less is, well, less.
Happy shopping
So your whole rationale for discrediting JD Power is a rant by a guy who runs a company that competes with JD Power? Got any other random articles you want to pull off the internet to discredit CR as well?
Last edited by ST430; 06-23-09 at 05:57 AM.
#44
Lexus Champion
In other words JD Power is the giant they are trying to topple... the guy they're trying to replace.
In other words this article is about as credible as listening to your local bakery owner tell you why Walmart is the devil. He may have some good points, but he has every reason in the world to be biased against them, too.
#45
Super Moderator
Thread Starter
I did the JDPower survey for the LS last year after I bought it, but never received their survey forms for any of the new cars I bought before that, including the SC430 I bought from the same dealer 6 years ago.
I don't think they can survey EVERYONE who bought new cars for their studies, though, so maybe they just send out their forms randomly to get a big enough sample?
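The sampling question above has a standard statistical answer: a random sample can estimate a brand's PPC quite precisely without surveying everyone, because the standard error of the estimate shrinks as 1/sqrt(n). A rough sketch, again assuming (my assumption) Poisson-distributed problem counts and using the 0.64 industry-average PPC quoted in the Karesh article:

```python
import math

# Rough illustration of why survey firms only need a sample: the
# standard error of a mean PPC estimate from n randomly surveyed owners
# falls off as 1/sqrt(n). Assumes Poisson problem counts, where the
# variance equals the mean.

def ppc_standard_error(ppc, n):
    """Approximate std. error of a mean PPC estimate from n owners."""
    return math.sqrt(ppc / n)

for n in (100, 1_000, 10_000):
    se = ppc_standard_error(0.64, n)
    print(f"n={n:>6}: PPC estimate roughly +/- {2 * se:.3f} (2 std. errors)")
```

With a few thousand responses per brand the estimate is already tight to within a few hundredths of a problem per car, which is why a randomly mailed subset of buyers is enough; it also means most owners, like the posters here, will never see a form.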