Nvidia says ATI are cheating again

I've never used any of the software you chaps are talking about, and I've barely touched monitor calibration either; haven't needed to. It's merely a difference between some games in darker, heavier colours, that's all. Gears of War is the best example that springs to mind. If others see it differently then that's all well and good. :)

Most monitors slowly kill my eyes without calibration (even if it's manual, using the Windows calibration tool or Lagom's website).

Have we really driven it off topic now?

Oh, ATI are cheaters, fair enough. I hope some more renowned websites will write about it soon.
 
What software did you test it in?

BTW, I have never seen any serious graphic designers or artists who would recommend one card over another, as it's a myth that they would reproduce a different image. They might need different settings to match, but the only thing that will matter is the display.

It may differ in 3D applications that don't make use of ICM/ICC profiles.
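For anyone curious what "making use of an ICM/ICC profile" actually means in practice, here's a minimal Win32/C sketch of the query a colour-managed 2D app can make; full-screen 3D games typically never do this, which is one reason desktop calibration doesn't always carry over into games. Purely illustrative, not taken from any real application:

```c
/* Sketch: ask GDI which ICC profile is assigned to the primary display.
 * Build with: cl icmquery.c user32.lib gdi32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC screen = GetDC(NULL);   /* device context for the primary display */
    WCHAR path[MAX_PATH];
    DWORD size = MAX_PATH;      /* in: buffer size, out: length used */

    /* GetICMProfile reports the profile file currently associated
     * with this device context, if any. */
    if (GetICMProfileW(screen, &size, path))
        wprintf(L"Active ICC profile: %s\n", path);
    else
        wprintf(L"No ICC profile associated with this display.\n");

    ReleaseDC(NULL, screen);
    return 0;
}
```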

Well, I'm not really an expert on this, but they were calibrated in 2D, which produced similar results, yet in games there was a noticeable difference in brightness and contrast.

Not really sure on the specifics of the software, it was a while ago. Had a friend come and do my projector and the difference is astounding.
 
That's down to drivers/control panels afaik. You can always adjust the settings under the Display settings tabs, but I can see how they might differ in games at default. Most games can use calibrated settings though :)
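To put a bit of flesh on the "differ in games at default" point: in-game brightness/contrast sliders usually work by handing the driver a whole new gamma ramp, which replaces whatever ramp the control panel or a calibration tool loaded. A minimal Win32/C sketch of that mechanism, with a made-up gamma value of 1.2 standing in for a slider setting:

```c
/* Sketch: apply a brightness-style gamma ramp the way many games do.
 * Build with: cl gammaramp.c user32.lib gdi32.lib */
#include <windows.h>
#include <math.h>

int main(void)
{
    HDC screen = GetDC(NULL);
    WORD ramp[3][256];       /* one 256-entry ramp each for R, G and B */
    double gamma = 1.2;      /* hypothetical in-game brightness setting */

    for (int i = 0; i < 256; i++) {
        /* Map 0..255 through a power curve into the 16-bit ramp range. */
        WORD v = (WORD)(65535.0 * pow(i / 255.0, 1.0 / gamma));
        ramp[0][i] = ramp[1][i] = ramp[2][i] = v;
    }

    /* This replaces the current ramp, calibrated or not. */
    SetDeviceGammaRamp(screen, ramp);
    ReleaseDC(NULL, screen);
    return 0;
}
```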
 
At the end of the day it wasn't a hugely scientific test, and it was done more out of casual interest.

I highly recommend anyone with a projector getting it done though; I'm still blown away by the difference in movies.
 
Have to say I often noticed the ATI default colours to be more vibrant than the Nvidia counterparts. I RMA'd an 8800 GTX one time and swapped over to a 2900 XT 1GB for a few days; the difference in TF2 was immediately noticeable, it looked a lot more colourful.
 
Well, it's actually different for me: going from a 4870X2 to a 470, the colours on the 470 are a lot more vibrant.
 
Surprise, surprise, a graphics card company uses dirty tricks to sell cards; they are as bad as each other :(
The thing is, it's us enthusiasts that get duped (some even going as far as "sticking up" for the offending company out of brand loyalty).
Not being that bothered about graphics (I used to own a 48K Spectrum so I'm easily impressed), only about decent frame rates so I can shoot people in the face, this current "storm" is a little bit irrelevant to me.

As for cards producing different colours!! Doesn't it just reproduce what it is told to?
 
I've still yet to see an IQ reduction. I'll point out again that the 10.10s look BETTER in the screenshots linked, that on both the older drivers and on Fermi there's a noticeable difference in the brightness of the AF, and there's "banding" at various distances where the colour is inaccurate.

It seems people have decided to call this "better". When the new drivers offer a different IQ, it's not surprising that 10.9, which doesn't support the 6870 and the new quality AF, can't produce the same quality AF. In the newer drivers there's UNIFORM lighting from near to far; this cannot be said for the 10.9 drivers or for Fermi.

I do love that Nvidia are spinning better lighting uniformity as "worse IQ", just because AMD's last-gen cards and Nvidia's current-gen cards seem to get the lighting incorrect.

Seriously, look at the pictures: both the old-driver and Fermi pictures show differing levels of brightness on the ground, while the 10.10 on 68xx cards shows more uniform lighting.


AMD increase IQ, and Nvidia cry foul just because it's DIFFERENT to theirs, so they insist it's worse.
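It's worth remembering how little of this the application controls. All a game does is request an anisotropic filtering level; the sample patterns, angle optimisations and any resulting banding are entirely the driver's business, which is why the same request can look different across vendors and driver releases. A minimal OpenGL sketch of the request side, assuming a context already exists and the anisotropic filtering extension is exposed:

```c
/* Sketch: request 16x anisotropic filtering on a texture (OpenGL). */
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void request_16x_af(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    glBindTexture(GL_TEXTURE_2D, texture);

    /* Ask what the hardware supports, then request 16x clamped to that.
     * How the driver honours this is vendor-specific. */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_aniso < 16.0f ? max_aniso : 16.0f);
}
```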


As for Rroff again, I also find it hilarious that he was being "savaged". I asked a simple question, where he got his numbers from, that's it; I'm not sure how that's being savaged. As said, there are several prediction threads on various forums, and the one on SemiAccurate has a LOT of talk about sticking to a 16-shader-groups-per-cluster formation; it was WIDELY discussed for WEEKS before Rroff posted his "guess". 1536 is predicted by several people for the 6950, with 1920 for the 6970, mostly by people assuming a cluster size of 64 shaders, in the week before Rroff made his bold pronouncement.

As said, I was questioning the "it's 7 to 14% faster" stuff you manage to come up with for every new card, which really is never correct, and neither are your guesses. In PhysX discussions you have this habit of claiming victory in a past thread and then using it as fact months later in a new thread to try and win some new argument.

Well, yet again, all the leaks in the past few days suggest a 1920-shader 6970 and a 3840-shader 6990, making Rroff wrong, again.

I'd also say I'm not surprised if AMD go for a lower count on the 6950 vs the 5850. The 5850 only had a 10% drop in shaders but was capable of the same clocks, so the 33% cheaper 5850, when overclocked, was not more than 10% behind, often less, making the 5870 bad value. This time around I've suggested a bigger shader drop but a smaller drop in clock speed, which gives a similar overall drop in speed as last gen but less overclocking headroom and more difference between the cards, meaning there's more reason to spend extra on the 6970 than there was with the 5870. I also won't be surprised to see 1920/1792 or 1664/1536 for the 6970 and 6950, with a 6930 probable at some point.
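Just to put rough numbers on that value argument, here's the back-of-envelope sum in C. The figures are the ones from the post itself (10% fewer shaders, roughly a third cheaper, same clocks achievable when overclocked), and the linear shaders-to-performance scaling is a deliberately crude assumption:

```c
/* Sketch: crude perf-per-pound comparison under the assumptions above. */
#include <stdio.h>

int main(void)
{
    double shader_ratio = 0.90;  /* 5850 has ~10% fewer shaders than 5870 */
    double price_ratio  = 0.67;  /* ~33% cheaper, per the post */

    /* If overclocking equalises clocks, assume performance tracks
     * shader count linearly. */
    printf("Relative performance when overclocked: %.0f%%\n",
           shader_ratio * 100.0);
    printf("Performance per pound vs the 5870: %.0f%%\n",
           shader_ratio / price_ratio * 100.0);
    return 0;
}
```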
 
If you're not an ATI fanboy, kylew, why do you get so upset when people say something negative about ATI/AMD?

^This.

The thing is, it's like being at school: if people are calling you names then you should just ignore them. Responding and trying to defend yourself at every angle just leaves you mentally tired, and it's not worth it.

With regard to Sys22, I think his posts are funny. I would think it's fairly obvious to anyone that he is on the wind-up, so replying to him, let alone acknowledging him, means he's achieved his objective.
 
My ATI X19xx XT card had noticeably more vibrant and sharper colours and graphics in games compared to every Nvidia card I have owned. I noticed it during the summer when my 8800 GTS died and I put in my ATI card so I could play Deus Ex again (for the 20th time), and Deus Ex looked way better than it did on any of my Nvidia cards.

I thought at the time it was just that my ATI card might have had something different set up for the colours etc in the Catalyst Driver.

Hmm.
 
I don't get upset; I just find it annoying the way certain people constantly spread misinformation. You just don't seem to get many people spreading misinformation about nVidia, so it's less of an issue. But the misinformation about AMD/ATi is always along the lines of "the drivers don't work/are crap" and "nVidia just works and never has problems". Both are untrue, and it gets tiresome seeing people who actually believe this. You should know yourself that people do this; there are so many people around here who pretend (or even believe) that AMD drivers simply don't work, full stop, while nVidia's are perfect.

I still don't get how disliking nVidia = AMD fanboy, though. Is there no such thing as simply disliking something and it being just that? I've pointed out quite a few times now that there don't actually seem to be that many AMD/ATi fanboys, but rather people who dislike nVidia for similar reasons. I don't see people going on about how great AMD are, how much they love them and how amazing they are, whereas you actually do get people gushing over nVidia on these forums.

I agree with this 100%
And before I'm accused of fanboyism, I've said before that I don't actually buy graphics cards; I get them given to me when my brother is done with them, so I don't really have a preference as I get them for free. I personally do not like Nvidia as a company, and my opinion of them is down to the stuff they did to Batman: AA with PhysX while I still had my 8800 Ultras in SLI. They went so far with cutting out certain features for the other brand of cards that a driver update completely ruined a game which had run fine before on my system. That being said, I don't hate ATI or Nvidia; I just simply don't hear as many stories about ATI/AMD pulling dirty tricks or whatever. If my bro was getting rid of an Nvidia card which was better than my 4870X2, I'd happily take it off his hands if it would benefit my system :)

And from someone who has owned many ATI and Nvidia cards, believe me when I say:
ATI drivers have problems now and then, and so do Nvidia's drivers.
 