QA Consultants concludes that AMD has the most stable graphics driver in the industry.

Classic GPU section thread right here; my favourite bit was the revelation that "commissioned" is a fancy way of saying "paid for". :D

It's funny to see the reactions on this forum to threads like this, funny and sad.

Why does nearly every discussion on this forum always provoke the same responses?

Classic General Discussion.
Classic Graphics Forum.
Classic fanboys.
Classic leftists.
Classic rightys.
Classic Brexiteers.
Classic Remoaners.
Classic Trumpsters.


Anything else I'm missing? Apart from an actual discussion, rather than just getting down to the labelling...
 
That quote is pure nonsense and meaningless; that's why I wouldn't even care to read it again.
You need to drop the whole 'AMD has better IQ' thing, because it's not true and it makes you appear overly biased and superficially knowledgeable. Reducing or cheating on image quality is scandalous and is not tolerated in the GPU world. Neither AMD nor NVIDIA tolerates the other doing it, and they both actively seek to expose each other.

In the past, NVIDIA actually exposed AMD for FP16 render target demotion, AF quality reduction on HD 5000 and HD 6000 cards, and the tessellation factor reduction. And AMD exposed NVIDIA to the press over the GPP matter. If NVIDIA were engaging in any foul play, AMD would have at least "commissioned" another QA firm to expose that image-quality cheating.

Obviously none of this has happened, and countless other independent outlets have compared both in dozens of games and found nothing.
 
In the past, NVIDIA actually exposed AMD for FP16 render target demotion, AF quality reduction on HD 5000 and HD 6000 cards, and the tessellation factor reduction. And AMD exposed NVIDIA to the press over the GPP matter. If NVIDIA were engaging in any foul play, AMD would have at least "commissioned" another QA firm to expose that image-quality cheating.


Oh god, don't even start on tessellation; NVIDIA were overdoing it to stupid levels just because it hurt AMD's performance more than it hurt their own.
 
Regardless of who commissioned the report, it shouldn't be dismissed purely on the basis of who paid for it, which is what has happened in here.
It should when it is riddled with inconsistencies like using the wrong driver for Quadros.

LegitReviews wrote about this: they found that AMD supplied the QA company with the NVIDIA cards, hand-picking some OC'ed NVIDIA GPUs to pass along while using reference AMD GPUs for their own side. OC'ed cards are less stable than reference cards. The GTX 1060, for example, failed 10 times because it was factory OC'ed, while the 1080 Ti failed only once because it was a reference design.
 
and used reference AMD GPUs for their own side. OC'ed cards are less stable than reference cards. The GTX 1060, for example, failed 10 times because it was factory OC'ed, while the 1080 Ti failed only once because it was a reference design.

Factory OC'd cards should be as stable as reference cards if the AIBs are doing their job. It would be a good idea to use like-for-like in a comparison though - when they don't get details like that right, you wonder what else is inconsistent.
 
It should when it is riddled with inconsistencies like using the wrong driver for Quadros.

LegitReviews wrote about this: they found that AMD supplied the QA company with the NVIDIA cards, hand-picking some OC'ed NVIDIA GPUs to pass along while using reference AMD GPUs for their own side. OC'ed cards are less stable than reference cards. The GTX 1060, for example, failed 10 times because it was factory OC'ed, while the 1080 Ti failed only once because it was a reference design.

So are you suggesting that because AMD supplied the cards that they somehow tampered with them or something to cause them to fail?

Or that factory OC'd cards are inherently unstable? They shouldn't sell them if that's the case. Shame really, they are very popular, those factory OC'd cards :P

How do you know it failed 10 times because it was factory OC'd? Or is that something you've assumed or are claiming because it fits your argument?
 
So are you suggesting that because AMD supplied the cards that they somehow tampered with them or something to cause them to fail?

Or that factory OC'd cards are inherently unstable? They shouldn't sell them if that's the case. Shame really, they are very popular, those factory OC'd cards :p

How do you know it failed 10 times because it was factory OC'd? Or is that something you've assumed or are claiming because it fits your argument?

Starting to wonder if he's a returning banned member trying to push an agenda, or one of the Beyond3D armchair GPU engineers.

OC'ed cards are less stable than reference cards. The GTX 1060, for example, failed 10 times because it was factory OC'ed, while the 1080 Ti failed only once because it was a reference design.

Sorry, but that is utter BS; if that were the case, these cards would be crapping out left, right and centre in games as well.
 
You need to drop the whole 'AMD has better IQ' thing, because it's not true and it makes you appear overly biased and superficially knowledgeable. Reducing or cheating on image quality is scandalous and is not tolerated in the GPU world. Neither AMD nor NVIDIA tolerates the other doing it, and they both actively seek to expose each other.

In the past, NVIDIA actually exposed AMD for FP16 render target demotion, AF quality reduction on HD 5000 and HD 6000 cards, and the tessellation factor reduction. And AMD exposed NVIDIA to the press over the GPP matter. If NVIDIA were engaging in any foul play, AMD would have at least "commissioned" another QA firm to expose that image-quality cheating.

Obviously none of this has happened, and countless other independent outlets have compared both in dozens of games and found nothing.

If AMD does nothing, the only logical explanation is that they are blind.
I have swapped a Radeon card for a GeForce card and I do not like the image quality of the GeForce. It is missing details like leaves on the trees, it is missing 10-bit colour, and by default it is missing even full 8-bit colour.
Others here have repeated it multiple times as well. Aren't you able to hear them?
 
How do you know it failed 10 times because it was factory OC'd? Or is that something you've assumed or are claiming because it fits your argument?

Almost two-thirds of NVIDIA’s GeForce failures came from one card. Would the results be the same if they tested another card with the same part number?
http://www.legitreviews.com/amd-cla...phics-cards-tested_206705#I0Qh4rdKCPJkDAAB.99

That card happens to be an OC'ed one, unlike the 1080 Ti, for example, which is not OC'ed.
Why would you use OC'ed cards from NVIDIA and reference cards from AMD?

The whole test is structured in an inconsistent and shady manner IMO: reference vs OC'ed cards, the wrong driver, cards supplied by AMD rather than bought from a shop, and so on.
Sorry, but that is utter BS; if that were the case, these cards would be crapping out left, right and centre in games as well.
In my experience across dozens of cards: if a card is factory OC'ed to the limit, it is much less stable than a reference or moderately OC'ed card - especially in a torture test that lasts 12 days like this one.
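If anyone actually wants to test that claim instead of arguing about it, a crude version of this kind of torture run is trivial to script. This is only a rough sketch, not the QA Consultants methodology: it assumes a hypothetical ./stress_workload binary that loops a heavy render load and exits non-zero on a crash or device-removed error.

```python
import subprocess
import time

# Hypothetical stress binary: anything that loops a heavy render workload
# and returns a non-zero exit code on a crash, hang or device-removed error.
STRESS_CMD = ["./stress_workload"]       # assumption, not a real tool name
RUN_SECONDS = 12 * 24 * 60 * 60          # 12-day torture run, like the report
ITERATION_TIMEOUT = 30 * 60              # treat a 30-minute hang as a failure

failures = 0
iterations = 0
deadline = time.time() + RUN_SECONDS

while time.time() < deadline:
    iterations += 1
    try:
        result = subprocess.run(STRESS_CMD, timeout=ITERATION_TIMEOUT)
        if result.returncode != 0:
            failures += 1
            print(f"iteration {iterations}: workload crashed "
                  f"(exit code {result.returncode})")
    except subprocess.TimeoutExpired:
        failures += 1
        print(f"iteration {iterations}: workload hung past the timeout")

print(f"{iterations} iterations, {failures} failures")
```

Run the same script on a reference card and on a heavily factory-OC'ed one and you at least have a like-for-like number to argue over.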
 
http://www.legitreviews.com/amd-cla...phics-cards-tested_206705#I0Qh4rdKCPJkDAAB.99

That card happens to be an OC'ed one, unlike the 1080 Ti, for example, which is not OC'ed.
Why would you use OC'ed cards from NVIDIA and reference cards from AMD?

In my experience across dozens of cards: if a card is factory OC'ed to the limit, it is much less stable than a reference or moderately OC'ed card - especially in a torture test that lasts 12 days like this one.


And who exactly are you mr "dozens of cards", some undercover reviewer or what?

And as for overclocks, a lot of cards have pretty conservative factory OCs; not sure where you're getting this "clocked to the limit" line of thinking from.
 
And who exactly are you mr "dozens of cards", some undercover reviewer or what?

Depending on how literally you mean "dozens" - you don't really need to be a reviewer - between my own use, some work in electronics retail, and building and maintaining systems for friends and family, I've had long-term experience with easily more than a dozen factory-overclocked cards in the last few years alone.
 
If AMD does nothing, the only logical explanation is that they are blind.
As I pointed out, they are not blind. They actively seek to expose NVIDIA over any wrongdoing: GameWorks, GPP, the MSAA vendor lock in Batman: Arkham Asylum, and other issues.
I have swapped a Radeon card for a GeForce card and I do not like the image quality of the GeForce. It is missing details like leaves on the trees, it is missing 10-bit colour, and by default it is missing even full 8-bit colour.
There are no details missing in any comparison made by any independent or professional outlet. AMD does not speak of any missing details either.
 
As I pointed out, they are not blind. They actively seek to expose NVIDIA over any wrongdoing: GameWorks, GPP, the MSAA vendor lock in Batman: Arkham Asylum, and other issues.

There are no details missing in any comparison made by any independent or professional outlet. AMD does not speak of any missing details either.

What about the differences in this, then? I clearly see richer foliage in the first image:

Radeon:
https://www.ixbt.com/video3/images/rv870-quality/ssaa8x_2_5870.png

GeForce:
https://www.ixbt.com/video3/images/rv870-quality/ssaa8x_2_285.png

Yes, it is severe stuttering. Who knows what is running in the background - and by the way, the two sides of the screen are different; it isn't one and the same part of the game. WTH?



It is the NVIDIA graphics drivers. They make the mess and render the images wrong.
You can't say the game engine somehow recognises it is being run on a GeForce and intentionally makes the image go wrong.

It is quite possible NVIDIA doesn't have the know-how and the patents for the best image quality.
Hence all these results:

[images: circle rendering quality test results]
For members who don't understand the images: this is a test with circles where the goal is an ideal circle - any deviation means the graphics card is rendering the image with worse quality.

And Radeon:
https://www.ixbt.com/video3/images/rv870-quality/ssaa8x_2_5870.png

GeForce:
https://www.ixbt.com/video3/images/rv870-quality/ssaa8x_2_285.png

https://www.ixbt.com/video3/rv870-quality.shtml#p2
Research into the rendering quality of AMD Radeon HD 5000 series video cards
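If you want to quantify it rather than eyeball screenshots, the circle-test idea is easy to approximate yourself. A minimal sketch, assuming Pillow is installed and a hypothetical circle_test.png screenshot whose circle is centred with a known radius - this is not the ixbt tool itself:

```python
from PIL import Image   # Pillow; assumed to be installed
import math

# Load a screenshot of the circle test (hypothetical filename) and measure
# how far the drawn edge pixels stray from a perfect circle of known
# centre and radius. Larger average deviation = worse rendering accuracy.
img = Image.open("circle_test.png").convert("L")   # greyscale
width, height = img.size
cx, cy = width / 2, height / 2                     # assume the circle is centred
expected_radius = min(width, height) * 0.4         # assumed test geometry

pixels = img.load()
deviations = []
for y in range(height):
    for x in range(width):
        if pixels[x, y] < 128:                     # dark pixel = part of the circle line
            r = math.hypot(x - cx, y - cy)
            deviations.append(abs(r - expected_radius))

if deviations:
    print(f"edge pixels: {len(deviations)}")
    print(f"mean deviation from ideal circle: {sum(deviations)/len(deviations):.2f} px")
    print(f"worst deviation: {max(deviations):.2f} px")
```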
 
Very old news and long since no longer the case AFAIK, plus fairly niche settings. AMD has generally been the better bet if you are a fan of supersampling.

FC2 isn't really a great example BTW if we want to quibble about issues like that - for a long time it didn't render shadows correctly on ATI/AMD GPUs, amongst other image quality issues, some of them fixed by the hotfix that was put out fairly quickly after launch.
 
IMO - and I can't pass over any opportunity to say this - this is more a highlight that Windows 10 is **** than anything else :p I bet half the problems or more, from both sides, are purely down to Windows 10 issues, the system being busy, or update activity in the background. I wonder if they took that into account? A Windows 10 system will quite happily kick off updates and even restart itself when left running a benchmark, which can cause all kinds of problems.
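If anyone wanted to rule that out in their own runs, logging how busy the machine is alongside the benchmark is enough to spot update activity. A rough sketch, assuming psutil is installed; note that it logs total system load, so the benchmark's own CPU use is included - the point is just to catch spikes, heavy disk writes and restarts you didn't ask for:

```python
import time
import psutil   # third-party; assumed installed (pip install psutil)

LOG_INTERVAL = 10          # seconds between samples
BUSY_CPU_THRESHOLD = 25.0  # % CPU that would look suspicious during an idle phase

# Sample overall CPU use and disk writes while a benchmark runs in another
# window, so update activity or other background work shows up in the log.
# Stop with Ctrl+C.
prev_disk = psutil.disk_io_counters()
while True:
    cpu = psutil.cpu_percent(interval=LOG_INTERVAL)   # blocks for LOG_INTERVAL
    disk = psutil.disk_io_counters()
    written_mb = (disk.write_bytes - prev_disk.write_bytes) / 1_048_576
    prev_disk = disk
    flag = "  <-- busy?" if cpu > BUSY_CPU_THRESHOLD else ""
    print(f"{time.strftime('%H:%M:%S')}  cpu={cpu:5.1f}%  "
          f"disk writes={written_mb:7.1f} MB{flag}")
```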
 
Do I have this right? Is an argument being made above for a Radeon 5870 rendering greenery better than a GTX 285?

Is that what is actually going on? I mean, my 5850 was a legendary card, I'd be the first to admit that. But I'm not going to sit here and argue that it was better than the green team's offerings of the time :confused:

:eek:

:D
 
Do I have this right? Is an argument being made above for a Radeon 5870 rendering greenery better than a GTX 285?
Is that what is actually going on?

No, it happens in more titles and with recent cards as well.
Not only Far Cry 2 but also:

Counter-Strike Source (images provided by me),
World of Tanks (info provided by member @Panos ),
3DMark Time Spy (info provided by member @Panos ),
Final Fantasy XIV (info provided by member @Panos ),
Rise of the Tomb Raider (info provided by the YouTube video in the other thread - you can see the links to comments from there and find the video).

I switched last week from a 1080 Ti to a V64. After clearing the cache and settings, I went to play World of Tanks with everything on max.
My first observation was that all the trees and bushes looked thicker, with more foliage, to the point that it was difficult to aim through them, yet very beautiful to look at.
Then on the Overlord map I was stunned to see big, thick black smoke on the horizon, as if dozens of oil tankers were literally on fire. And on the Paris map, the thousands of small mirrored windows caught my attention for the first time because they popped out.

Thinking I was probably wrong, I went to the laptop (GTX 1060 6GB), ran the game with maxed-out settings and went into training mode.
The bushes and trees in the same places as above had half the foliage, while the smoke described above looked more like someone had set a car on fire than ships burning off the Normandy landings. As for Paris, the mirrored windows looked washed out, not reflecting the light and the scene around them.

True, the fps on the V64 is lower by around 60 (110 versus 170-175), but the game details are far better.
An added bonus now is that I do not have tearing halfway across the scene (FreeSync monitor) - something that annoyed me with the 1080 Ti, along with the inability to cap the fps in the driver settings.

If I remember correctly, the case with FFXIV was similar, yes? AMD was rendering more than the NV card.

Which is the same case in Time Spy, with Graphics Test 2. It has been known since 2016 that NVIDIA cuts back graphics rendering due to its separate execution path in that benchmark.
And this explains why in Graphics Test 1 a Vega 64 @ 1742 has the same performance as a GTX 1080 @ 2190, and then all the performance tanks by 12% in Test 2.

That looks similar to World of Tanks... The Vega 64 renders twice as much foliage on the same settings, to the point that it is more difficult to aim from behind bushes than before.
But I prefer the look of it.
 
So I get from this thread that no-one should buy graphics cards with a factory overclock, which means literally all the AIB partner cards out there. Aw shucks, we need to change this website's name to stockclockers.co.uk! :(

Why does nearly every discussion on this forum always provoke the same responses?

Classic General Discussion.
Classic Graphics Forum.
Classic fanboys.
Classic leftists.
Classic rightys.
Classic Brexiteers.
Classic Remoaners.
Classic Trumpsters.


Anything else I'm missing? Apart from an actual discussion, rather than just getting down to the labelling...

A middle ground?? What heresy is this - everything is black and white, surely??
 
And who exactly are you mr "dozens of cards", some undercover reviewer or what?
Those are very old DX9 cards, and the comparison is made on the basis of SSAA 8X, which is an extremely taxing and niche form of AA. It's irrelevant for DX10-era cards and beyond, as the issue no longer stands, especially with the advent of TAA, SMAA and the like.

No, it happens in more titles and with recent cards as well.
Not only Far Cry 2 but also:
That's all anecdotal talk with no evidence or accurate apples-to-apples comparisons. If you want something documented, here are some examples from DX11-era cards:

FP16 Demotion: A Trick Used by ATI/AMD to Boost Benchmark Score Says NVIDIA
AMD has admitted that performance optimizations in their driver alter image quality in the above applications. The specific change involves demoting FP16 render targets to R11G11B10 render targets, which are half the size and less accurate.
http://www.geeks3d.com/20100916/fp1...-by-ati-to-boost-benchmark-score-says-nvidia/
https://www.pcauthority.com.au/feat...rading-game-quality-says-nvidia--232215/page6
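For anyone unsure why that demotion matters: FP16 RGBA is 16 bits per channel over four channels (8 bytes per pixel), while R11G11B10 packs the whole pixel into a single 32-bit word (4 bytes per pixel), dropping alpha and precision. A quick back-of-the-envelope check - the 1080p resolution is just an example, not from the articles:

```python
# Bytes per pixel for the two render target formats mentioned above.
FP16_RGBA_BPP = 4 * 16 // 8           # 4 channels x 16-bit float = 8 bytes/pixel
R11G11B10_BPP = (11 + 11 + 10) // 8   # packed into one 32-bit word = 4 bytes/pixel

width, height = 1920, 1080            # example resolution, not from the articles
fp16_mb = width * height * FP16_RGBA_BPP / 1_048_576
packed_mb = width * height * R11G11B10_BPP / 1_048_576

print(f"FP16 RGBA target:  {fp16_mb:.1f} MiB")
print(f"R11G11B10 target:  {packed_mb:.1f} MiB")
print(f"footprint/bandwidth saved: {100 * (1 - packed_mb / fp16_mb):.0f}%")
```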

ATI/AMD reduce 16X AF quality in their default driver settings
So the morally right thing to do for AMD/ATI is to make the High Quality setting the standard default. But again we have to acknowledge that this remains a hard-to-recognize and hard-to-detect series of optimizations - but it is there and it can be detected.
http://www.guru3d.com/articles-pages/exploring-ati-image-quality-optimizations,1.html

AMD admits defeat in Tessellation
AMD is going for the driver hack to alleviate their tessellation deficiencies. Basically what they’ve said above is that the driver can artificially limit the tessellation factors. This classifies as a driver cheat for the simple reason that it breaks the DirectX 11 and OpenGL 4.0 APIs. Namely, the application explicitly passes tessellation factors to the tessellation units via the shaders. What AMD is doing here is short-changing the application.

It’s pretty much equivalent to reducing texture quality by silently using 16-bit textures instead of 32-bit ones, for example… or by using lower resolution textures than what the application is actually using (eg mipmap biasing). So basically it’s driver cheating. Trading in image quality for performance.


https://scalibq.wordpress.com/2011/01/19/catalyst-hotfix-11-1a-amd-admits-defeat-in-tessellation/
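To make "artificially limiting the tessellation factors" concrete: the application requests a factor, the driver silently clamps it, and the triangle count per patch (which grows roughly with the square of the factor) drops accordingly. A rough illustration of the principle only - the quadratic model is a simplification and the clamp value is just an example, not AMD's actual slider setting:

```python
def driver_clamped_factor(app_factor: float, driver_limit: float) -> float:
    """What the driver hands to the hardware instead of the app's request."""
    return min(app_factor, driver_limit)

def approx_triangles_per_patch(factor: float) -> int:
    """Very rough model: triangle count grows roughly with the square of the factor."""
    return int(2 * factor * factor)

app_factor = 64.0      # e.g. an over-tessellated GameWorks-style workload
driver_limit = 16.0    # example clamp value, not an actual Catalyst setting

requested = approx_triangles_per_patch(app_factor)
delivered = approx_triangles_per_patch(driver_clamped_factor(app_factor, driver_limit))
print(f"app asked for ~{requested} triangles/patch, "
      f"driver delivers ~{delivered} ({100 * delivered / requested:.0f}%)")
```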
 