Will GTX570 be caught in new app detection cheat?

Well, the zoomed-in shots show a very noticeable difference on my 30" 2560x1200 NEC.

At the end of the day, both ATI & NV should have an option to turn off all deliberate IQ-degrading performance optimisations.
 
What a crock, this kind of thing really gets my goat.

And people can say what they like about each game having driver optimisations, but it's funny how those 'optimisations' tend to show higher frame rates with worse IQ in commonly benchmarked games.
 
What a crock, this kind of thing really gets my goat.

And people can say what they like about each game having driver optimisations, but it's funny how those 'optimisations' tend to show higher frame rates with worse IQ in commonly benchmarked games.

You make a good point & more randomness is needed really.
 
Nvidia has been caught cheating in benchmarks before, so there is no point appearing bemused or whatever. It's a documented fact that Nvidia cheats in benchmarks, and when you're rumbled, you're rumbled. :p
 
You say scandal, but nothing actually came of it. :confused:

http://www.guru3d.com/article/radeon-hd-6850-6870-review/9

Check out those BFBC2 screenshots; the ground texture is quite noticeably sharper on the 6800s than on the nVidia card, which is quite strange, no?

So the difference at default driver settings between AMD and NVIDIA is, as far as we are concerned, NIL.

And yeah, the dodgy IQ on the AMD cards is more apparent when you see it in action rather than in stills.
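
For what it's worth, you don't have to squint at a review's compressed crops - if you can grab your own captures, a few lines will tell you how different two screenshots actually are. A rough sketch (needs Pillow; the filenames are placeholders, and the two captures must be the same resolution):

```python
# Rough sanity check for screenshot comparisons. Requires Pillow
# (pip install Pillow). Filenames are placeholders; both captures
# must be the same resolution and colour mode.
from PIL import Image, ImageChops

a = Image.open("capture_6870.png").convert("RGB")
b = Image.open("capture_gtx.png").convert("RGB")

diff = ImageChops.difference(a, b)

# Mean per-channel difference on a 0-255 scale, via the histogram.
hist = diff.histogram()  # 768 bins: 256 per R, G, B channel
pixels = a.size[0] * a.size[1]
mean = sum((i % 256) * count for i, count in enumerate(hist)) / (pixels * 3)
print(f"Mean per-channel difference: {mean:.2f}/255")

diff.save("diff.png")  # bright pixels = where the two cards disagree
```

The saved difference image makes it obvious whether the cards disagree everywhere or only on a few textures, which a zoomed crop can't tell you.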
Shimmering? On the GeForce 400 series it's more or less non-existent. You can actually hardly make out any difference here between the right and left sides. So the GeForce handles this clearly better than the Radeon cards.

And people can say what they like about each game having driver optimisations, but it's funny how those 'optimisations' tend to show higher frame rates with worse IQ in commonly benchmarked games.

Such as? Name all these benchmarks with worse IQ and provide some proof.

Nvidia has been caught cheating in benchmarks before, so there is no point appearing bemused or whatever. It's a documented fact that Nvidia cheats in benchmarks, and when you're rumbled, you're rumbled. :p

ATI Quack, anyone?
 
A couple of things scream out to me here.

Pre-release card.

Second-rate website (I'd personally never even heard of it before) with no original uncompressed screenshots. Most sites like Guru3D or HardOCP, when they focus on IQ, have uncompressed images for all to see, not small GIFs or reduced-size screenshots.

Anyway, nvidia's reply has been posted on the site now, albeit via a small one-line addition to the page with a link right at the bottom in small text lol.

http://www.kitguru.net/components/g...ick-stam-addresses-hawx-cheating-allegations/

Before anyone says I'm some Nvidia fanboy: I own a 5850 at the moment, which I love, and had a nice GTX 280 before that.
 
What is being experienced is not an “Antialiasing cheat” but rather a HawX bug that is fixed by our driver using an application specific profile.

In a nutshell, the HawX application requests the highest possible AA “sample quality” at a particular AA level from our driver. Without our driver fix, the game would be running 16xCSAA instead of standard 4xAA when you select 4xAA in-game. It runs the proper 4xAA with the driver fix. You defeat the fix by changing the .exe name, causing it to run at 16xCSAA.

Assuming it's not just PR spin (it's something that's easy to check, so I dunno why they would BS about it and risk being caught out), it fits with what I said: there's a pre-defined AA compatibility profile for HAWX rather than something hidden away.
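
If that statement is accurate, the mechanism is pretty mundane. Here's a minimal sketch (all the names are invented for illustration - this is obviously not NVIDIA's actual driver code) of how an executable-name profile like that would behave, and why renaming the .exe defeats the fix:

```python
# Minimal sketch of an app-detection profile, assuming the behaviour
# NVIDIA describes above. All names here are invented; this is not
# NVIDIA's actual driver code.

# Per-application fixes keyed on the executable name.
AA_PROFILES = {
    # HAWX requests the highest "sample quality" at a given AA level,
    # which would otherwise promote in-game 4xAA to 16xCSAA.
    "hawx.exe": {"force_standard_aa": True},
}

def resolve_aa_mode(exe_name: str, requested_aa: int) -> str:
    """Return the AA mode the driver would actually run."""
    profile = AA_PROFILES.get(exe_name.lower(), {})
    if profile.get("force_standard_aa"):
        # Profile fix applies: run the standard mode selected in-game.
        return f"{requested_aa}xAA"
    # No profile match (e.g. the .exe was renamed), so the app's
    # request for maximum sample quality stands: 4xAA becomes 16xCSAA.
    return "16xCSAA" if requested_aa == 4 else f"{requested_aa}xAA"

print(resolve_aa_mode("HAWX.exe", 4))   # 4xAA    (fix applies)
print(resolve_aa_mode("hacks.exe", 4))  # 16xCSAA (fix defeated)
```

Note the irony if the statement is true: the renamed .exe would actually be running *more* anti-aliasing, not less.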
 
As much as I'd love this to be true, I'm sure it would have been implemented in a far more subtle way.

'Oh look, my HAWX is now called hacks! How odd. Well, I'm sure it's perfectly fine...'
 
LOL... The image is zoomed in about 200% and you still can hardly see a difference :rolleyes:

Maybe they need to post some images with it zoomed in 500% :p :D


Stupid thread...
 
It is zoomed in more like 500%. Yeah, it's stupid IMO - no original uncompressed high-res images, and you can't even see any difference in the original images without zooming in massively.

Add to that the fact that it took me less than two minutes of playing around to find it was a documented compatibility fix (so less likely to be an intentional cheat), and it just furthers my view that KitGuru is a distinctly sub-par tech site.
 
Personally I can only see the quality difference in the massively zoomed-in bit, so I don't really see it as an issue - ATI/AMD have been using similar selective optimisations in Catalyst AI for years.

This. I dare anyone to spot the difference at 60fps @ 1080p.
 
It is zoomed in more like 500%. Yeah, it's stupid IMO - no original uncompressed high-res images, and you can't even see any difference in the original images without zooming in massively.

Add to that the fact that it took me less than two minutes of playing around to find it was a documented compatibility fix (so less likely to be an intentional cheat), and it just furthers my view that KitGuru is a distinctly sub-par tech site.

This. I dare anyone to spot the difference at 60fps @ 1080p.

Doesn't this prove what I have said for ages? That anything above 4xAA is just not noticeable while playing games?

Yet there are many people on here who swear blind that 16xAA and 32xAA make a big difference.

If people need to zoom in by 200% or 500% on a still to spot the difference between 4xAA and 16xAA, then 4xAA is enough IMO.
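
There's simple geometry behind that, too: with n coverage samples, an edge pixel can only take n+1 distinct shades, so each doubling of the sample count buys a smaller and smaller step. A quick back-of-envelope (ignoring CSAA's decoupled coverage samples and other per-mode details):

```python
# Back-of-envelope for diminishing AA returns: n coverage samples per
# pixel give at most n+1 distinct edge shades, so the step between
# adjacent shades shrinks as 1/n. Real MSAA/CSAA modes differ in
# detail; this is just the rough intuition.
for samples in (2, 4, 8, 16, 32):
    shades = samples + 1
    step = 100.0 / samples  # % coverage change between adjacent shades
    print(f"{samples:2d}x: {shades:2d} shades, {step:5.2f}% per step")
```

Going from 4x to 16x only shrinks the step between shades from 25% to about 6% - exactly the kind of difference you need a 500% zoom on a still to see.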
 
I don't personally see any difference between 4x and 16x AA unless I examine a screenshot. However, I do notice a difference between 4x and 32x2 TRAA or 64x4 TRAA - with those two higher levels the whole screen looks cleaner and things like overhead cables don't have strange running edges on them.
 
Some site no one's ever heard of posts images zoomed in at a massive 500%, and people are running about calling some company **** or ******* lol.

There's nothing in this thread at all, move along...
 
Doesn't this prove what I have said for ages? That anything above 4xAA is just not noticeable while playing games?

Yet there are many people on here who swear blind that 16xAA and 32xAA make a big difference.

If people need to zoom in by 200% or 500% on a still to spot the difference between 4xAA and 16xAA, then 4xAA is enough IMO.

I agree with this; 4xAA is enough for me, but I use 32x just because I can, with hardly any performance hit.
 