I'm just wondering how long until the default quality settings on nVidia cards have similar optimisations to claw back the advantage, and around it will go again.
That is what I am worried about
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I wish someone would explain to me how it can be reducing the IQ and yet be unnoticeable at the same time. Very confusing.
I think the problem is that some see it as a reduction in IQ, while others see no difference (or a very, very small one) along with better fps, and you've got two companies and their groups of supporters fighting over it.
As has been said before, reviewers don't use default settings for benchmarks — well, the good ones don't.
So how does this change anything? It certainly doesn't invalidate the benchmarks.
If this were the only setting you could have, and no one had known about it, maybe this would be a big issue, but neither of those is true. I think this is a storm in a teacup and not something the majority of users are going to care about or spend any time worrying about. Are there any 6xxx series owners on here who are worried about this? Have you noticed any difference in real-world game playing or anything else? I would like to hear from the people who actually have the cards rather than a bunch of people who don't but are telling everyone how bad it is and how much of an issue it is.
Only the Mipmap detail level defaults to High Quality for the 5000 series and CCC 10.11. The Catalyst A.I. Texture Filtering quality and Enable Surface Format Optimization settings are not even in CCC 10.11; this feature is only in the 10.10e unsupported driver and control panel.

The HD5 series defaults to the highest settings anyway; nvidia defaults to Quality, which is, or at least was, comparable.
Because the graphics vendors are supposed to be selling us hardware that is superior to the previous generation, not cutting corners in the workload rendered in order to artificially inflate performance.
It's not about the owners, it's about dodgy benchmark results from review sites.
This is where I could not disagree with you more. To me, and apparently to ATI/AMD, it is all about the owners.
THIS. The onus should be on review sites to examine the technology and settings before critical analysis. If card makers want to provide information that makes this easier, good on them. Transparency benefits everyone.

It's the reviewer who must justify their techniques and processes in order to produce a report that is knowledgeable and relevant.
Read the performance reviews and the IQ reviews, then come to your own conclusion on how to spend your money.