Nvidia and AMD graphics comparison.

Almost literally. :p

To do a fair comparison you're going to need static lighting, or to turn lighting off completely, assuming it's not possible to set the same time of day. The Fury X definitely looks more contrasty (less dull) in that last scene, but then there is more light hitting everything and highlighting it.

If you pause the video at 2:16 and 3:15 you'll see that the 980 Ti side looks brighter and less washed out. Different lighting conditions are being used throughout, so it's a bit misguided to focus on a single scene and claim one GPU has better IQ than the other.

The textures at 3:15 look horrendous on the 980 Ti in that scene. It's a little brighter, yes, but overall image quality looks very poor. Again, the textures on the rocks and on the ground look a bit Xbox 360-ish. The gun looks blurry too.
 
I thought both sets of images looked rubbish, to be honest.

I hope the person who made the video was running the monitor at its native resolution.

I am also concerned about the difference in the amount of memory used by the cards, as this hints at something not being right somewhere.
 
Pause it at 3:22 and look at the rocks in the middle background. That is like the difference between the game on my 780 and on a 980 Ti (my 780 has that lack of shadow definition; the 980 Ti I've seen it on didn't). Something odd is going on in that engine, IMO.

I thought both sets of images looked rubbish, to be honest.

I hope the person who made the video was running the monitor at its native resolution.

I am also concerned about the difference in the amount of memory used by the cards, as this hints at something not being right somewhere.

I'm going to take a guess that, like the earlier builds of BF4 just after release, some graphics settings aren't sticking and/or aren't reflecting what settings are actually being used; that was fixed in a branch newer than the one SWBF is using.
 
I am also concerned about the difference in the amount of memory used by the cards, as this hints at something not being right somewhere.

Not really; it hints at one card having more memory. You'll see in every game that memory usage differs from card to card. AMD and nVidia both use different amounts even with the same amount of memory, from what I've seen.
 
Not really; it hints at one card having more memory. You'll see in every game that memory usage differs from card to card. AMD and nVidia both use different amounts even with the same amount of memory, from what I've seen.

Yeah, games will often cache more data if there is more free VRAM and/or be more lax about freeing data that they "might" need later, as sometimes it's better for performance not to be too aggressive about freeing up no-longer-used resources.
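To illustrate that point, here is a toy sketch (not based on any real engine, and the numbers are made up) of a texture cache that only evicts when free VRAM gets low. Run against the same workload, a card with more memory simply reports higher usage:

```python
# Toy model only: a cache that frees textures lazily, keeping a fixed
# headroom of free VRAM rather than evicting as soon as data is unused.
class TextureCache:
    def __init__(self, vram_mb, headroom_mb=512):
        self.vram_mb = vram_mb          # total VRAM on the card
        self.headroom_mb = headroom_mb  # free space to preserve before evicting
        self.resident = {}              # texture name -> size in MB (insertion-ordered)

    def used_mb(self):
        return sum(self.resident.values())

    def request(self, name, size_mb):
        budget = self.vram_mb - self.headroom_mb
        # Evict the oldest textures only when the new one would not fit.
        while self.resident and self.used_mb() + size_mb > budget:
            self.resident.pop(next(iter(self.resident)))
        self.resident[name] = size_mb

# Identical workload on a 6 GB and a 12 GB card: the smaller card shows
# lower "usage" only because it is forced to evict sooner.
for total_mb in (6144, 12288):
    cache = TextureCache(total_mb)
    for i in range(40):
        cache.request(f"tex_{i}", 256)
    print(f"{total_mb} MB card -> {cache.used_mb()} MB resident")
```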
 
I'm going to take a guess that, like the earlier builds of BF4 just after release, some graphics settings aren't sticking and/or aren't reflecting what settings are actually being used; that was fixed in a branch newer than the one SWBF is using.

Maybe. My resolution setting is very stubborn and goes back down by itself sometimes.
 
I thought both sets of images looked rubbish, to be honest.

I hope the person who made the video was running the monitor at its native resolution.

I am also concerned about the difference in the amount of memory used by the cards, as this hints at something not being right somewhere.

Running at native resolution is irrelevant when recording from the GPU buffer rather than with a camera pointed at the screen.
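For what it's worth, here is a minimal sketch of what "recording from the GPU buffer" means. It assumes the glfw and PyOpenGL packages, and it just clears a hidden 2560x1440 framebuffer instead of rendering a real game, but the readback resolution is set by the render target, not by whatever monitor happens to be attached:

```python
# Sketch: read pixels back from an off-screen framebuffer. The capture size
# is the framebuffer size, independent of the display's native resolution.
import glfw
import numpy as np
from OpenGL import GL

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # hidden window: no monitor scaling involved
win = glfw.create_window(2560, 1440, "capture", None, None)
glfw.make_context_current(win)

GL.glClearColor(0.2, 0.3, 0.4, 1.0)          # stand-in for a rendered game frame
GL.glClear(GL.GL_COLOR_BUFFER_BIT)

data = GL.glReadPixels(0, 0, 2560, 1440, GL.GL_RGB, GL.GL_UNSIGNED_BYTE)
frame = np.frombuffer(data, dtype=np.uint8).reshape(1440, 2560, 3)
print(frame.shape)                           # (1440, 2560, 3) whatever the screen is set to

glfw.terminate()
```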
 
Not really; it hints at one card having more memory. You'll see in every game that memory usage differs from card to card. AMD and nVidia both use different amounts even with the same amount of memory, from what I've seen.

In most games, memory usage between the brands does not vary by as much as it did in the video in the OP.

The only two examples which spring to mind where it did vary by as large an amount were:

1. Running BF4 at 2160p, where the NVidia cards used a good 1 GB less with DX11 than the AMD cards using Mantle.

2. Comparing a Titan X to a Fury X in AOTS at the new ultra settings at 2160p, where again the NVidia card used a good 1 GB less.

What is interesting about the two examples I have given is that both were highlighting situations where something was not right with the game/software.
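If someone wants actual numbers rather than overlay screenshots, a sketch along these lines would log VRAM usage on the NVidia side while the game runs (it assumes the pynvml package; the AMD card would need a different tool, e.g. GPU-Z or vendor utilities):

```python
# Sketch: sample GPU memory usage once a second via NVML (NVIDIA only).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

try:
    for _ in range(60):                         # roughly one minute of samples
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"used {info.used / 2**20:.0f} MiB of {info.total / 2**20:.0f} MiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```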
 
Running at native resolution is irrelevant when recording from the GPU buffer rather than with a camera pointed at the screen.

One weakness NVidia cards have is that if you do not use the native resolution, sometimes the image quality can look like total garbage. I think this is the problem Gregster found with his original video where people commented.
 
From the sound of some people, I'm glad they aren't police officers (I hope). You could be inspecting a crime scene with one person bleeding all over and the other fine, and they'd conclude that the different hair colour and a slightly different standing position meant there was no way of assessing which one looked worse for wear, or whether they always looked that way, lol. Silly excuses.

Far more likely, though, as others pointed out, there could just be an issue or glitch with some settings not running quite right. I'd be surprised to see any huge difference in image quality, considering most of the rest of the video didn't show that level of distinction, even if the Ti did look worse in general. I'd be interested in seeing more info on this; whether it's debunked or confirmed, there's no harm in inspecting the idea, so I'd love to get some tests from people.

One weakness NVidia cards have is that if you do not use the native resolution, sometimes the image quality can look like total garbage. I think this is the problem Gregster found with his original video where people commented.

Wow, really? That's kind of poor, to be honest. If you bought a 4K monitor and later dropped a game down to 1440p because it had issues or needed too much performance, so you had to go down some resolution to get a decent framerate, then that is quite a disadvantage in my eyes.
 
One weakness NVidia cards have is that if you do not use the native resolution, sometimes the image quality can look like total garbage. I think this is the problem Gregster found with his original video where people commented.

Could well be it if that's the case. I'll give nVidia the benefit of the doubt :p
 
If things are still as they were when I had my 8800 GT 512, Nvidia drivers default to a lower texture quality setting that enables various image quality optimisations, including anisotropic filtering ones.
 
Far more likely, though, as others pointed out, there could just be an issue or glitch with some settings not running quite right. I'd be surprised to see any huge difference in image quality, considering most of the rest of the video didn't show that level of distinction, even if the Ti did look worse in general. I'd be interested in seeing more info on this; whether it's debunked or confirmed, there's no harm in inspecting the idea, so I'd love to get some tests from people.

Looks to be two different issues to me, but I don't have the tools to check the density of compression artefacts, etc. The 980 Ti side has a bit of overall softening of the image and pixel-value bleed on edges, which doesn't "usually" happen during rendering and is more commonly due to being run through lossy compression too many times.

There are definitely some differences in the lighting, though; the shadow definition and contrast levels are completely different in some bits.
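One crude way to put a number on that softening, assuming you can export matched paused frames from each half of the video (the filenames below are hypothetical), is a per-pixel comparison such as PSNR. It won't separate lighting differences from compression damage, but a much lower score on otherwise matched frames is a hint:

```python
# Sketch: PSNR between two screenshots of the same paused frame.
# Higher means closer to identical; heavy recompression usually drags it down.
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Hypothetical exported frames from the 3:15 pause point on each side.
print(psnr("fury_x_0315.png", "980ti_0315.png"))
```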
 