Nvidia says ATI are cheating again

Weren't Nvidia whinging about the same thing a few months back, and didn't some random Russian site pick up on it?
 
I find it quite funny the way people are saying "now that they've been caught with their pants down" with any sort of conviction, as if just because it's being said it must be true. And the same people going on about double standards are, I'm pretty sure, the ones who denied nVidia's dirty tricks with the likes of Batman: AA. I wouldn't expect any less, to be honest.

What's important here is whether this reduces image quality. If it does, then of course it's not good; if it doesn't, there really is no issue.

Most of you complaining about "ATi" fans being hypocrites welcome the gross overuse of tessellation regardless of whether it makes any difference to image quality, and go on about it being a display of raw power despite there being literally no visual difference. The fundamentals of both seem pretty similar to me, really.
 
This thread is going to make for some quality quotes the next time Nvidia is caught with their hand in the cookie jar.

Except that a lot of people won't accept that nVidia ever does anything wrong; otherwise you wouldn't get nearly as many people going on about how nVidia "just works" and "never has any problems".
 
Isn't building firmware that lowers the clock speed for specific benchmark and stress-testing tools, to stop the card from killing itself by producing way too much heat and drawing well over the 300 W limit the card claims to use, also considered cheating then? :rolleyes:
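For anyone wondering what that behaviour amounts to, here's a toy sketch of application-detection throttling. To be clear, this is not Nvidia's actual driver or firmware logic; the process names, the 500 MHz cap, and the psutil polling approach are all assumptions made up for the illustration (772 MHz is the GTX 580's reference core clock):

```python
# Toy sketch of application-detection throttling, as described above.
# NOT Nvidia's real driver/firmware code; the process names, reduced
# clock figure, and polling approach are assumptions for illustration.
import psutil  # third-party: pip install psutil

STRESS_TOOLS = {"furmark.exe", "occt.exe"}  # assumed detection list
NORMAL_CLOCK_MHZ = 772                      # GTX 580 reference core clock
THROTTLED_CLOCK_MHZ = 500                   # assumed reduced clock

def target_clock() -> int:
    """Return a lowered clock target if a known stress tool is running."""
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() in STRESS_TOOLS:
            return THROTTLED_CLOCK_MHZ
    return NORMAL_CLOCK_MHZ

if __name__ == "__main__":
    print(f"Target core clock: {target_clock()} MHz")
```

The point being: whether you call this "protection" or "benchmark detection" depends entirely on whose card is doing it.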
 
Yeah, the lower IQ is reported to be with the default settings now; obviously AMD's IQ at default must be lower than Nvidia's at default, or it wouldn't be reported by all these sites. So it looks like they introduced this in the 10.10s to edge out extra performance over Nvidia at the sacrifice of image quality. I don't mind changing settings in the CP to lower IQ and get better FPS, but I don't want lower IQ as the default.
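For the sake of concreteness: one classic way drivers trade filtering quality for speed is to skip part of the trilinear blend between mip levels (so-called "brilinear" filtering). The sketch below illustrates that general technique only; it is not AMD's actual driver code, and the 0.2 threshold is an assumption chosen for the example:

```python
# Toy illustration of a texture-filtering shortcut that trades image
# quality for speed. Not AMD's actual driver code; the 0.2 threshold
# is an assumption chosen for the example.
def trilinear(mip_a: float, mip_b: float, frac: float) -> float:
    """Full-quality linear blend between two adjacent mip-level samples."""
    return mip_a * (1.0 - frac) + mip_b * frac

def optimised(mip_a: float, mip_b: float, frac: float,
              threshold: float = 0.2) -> float:
    """Cheaper variant: skip the blend when close to either mip level,
    saving texture work at the cost of visible banding between mips."""
    if frac <= threshold:
        return mip_a
    if frac >= 1.0 - threshold:
        return mip_b
    return trilinear(mip_a, mip_b, frac)

# The two agree mid-range but diverge near the mip boundaries:
print(trilinear(1.0, 0.0, 0.1), optimised(1.0, 0.0, 0.1))  # 0.9 vs 1.0
```

Shortcuts like this are exactly why a benchmark gain with no visible IQ change is fine, but the same gain with banding or shimmering is not.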

And yet on the 6000 benchmark thread you happily told us all..

Makes no difference using performance settings over the default in the CCC for Vantage, in my experience.

So what's the issue? If it makes no difference to the benchmarks where has this magical 10% gain come from? :rolleyes:
 
Who really cares? I do like the ATI fire brigade though; come on lads, fight that fire!

I've noticed lately that while you act like you're impartial, you only seem to acknowledge the presence of "ATi Fanboys" and ignore anyone who looks like an "nVidia fanboy".

Am I just seeing things, or are you actually doing that? The "ATi Fire Brigade" is no different to the "nVidia flamers".
 
+1
 
Can anyone tell me what settings I need to change to reproduce these effects? I'm currently using the default CCC profile (with Edge Detect under the AA tab) on AMD Catalyst™ Accelerated Parallel Processing (APP) Technology Edition 10.11.

[attached image: 34977612.png]
 
Both of them have been pulling stunts like this for so long that I don't know how anyone can defend or oppose either on this topic.
 