
AMD Vs. Nvidia Image Quality - Old man yells at cloud

Do you really believe this? Would you say the Watch Dogs 2 image is more vibrant on the Vega? Because they are near identical, bar that weird floor tile texture being different.

Idk, I don't have an Nvidia GPU on hand to do a head-to-head. I was more curious about the performance aspect he mentioned. So if anyone could test those settings...
 
The difference between Quality (top) and High Quality (bottom) in The Division.

[image: 106VqjG.jpg]


Less than 1 fps in it. Interestingly, it shifted the load slightly higher onto the GPU and took a little load off the CPU - presumably with the optimisations the CPU is doing a bit more work before handing things off to the GPU.

I tried to get some images for a quality comparison, but the game is so dynamic that it's hard to get exactly the same lighting and conditions in a static position between restarts, and I don't have the tools to hand to capture the exact same frame from the benchmark.
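
For anyone who does want to compare captures, a rough sketch of how the two settings could be diffed once you have two screenshots of the same frame (the filenames are placeholders; needs Pillow and NumPy):

from PIL import Image
import numpy as np

# Placeholder filenames - point these at your own Quality / High Quality
# captures of the same frame at the same resolution.
a = np.asarray(Image.open("quality.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("high_quality.png").convert("RGB"), dtype=np.int16)

# Per-pixel maximum channel difference between the two settings.
diff = np.abs(a - b).max(axis=2)
print("pixels differing by more than 8/255:", int((diff > 8).sum()), "of", diff.size)

# Amplified difference map so subtle texture filtering changes stand out.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_map.png")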
 
Some dude on Reddit on this topic:

If someone with an Nvidia GPU could test it...

From your quote:

"...the first is to gain FPS comparable to AMD, the second is to use less vram..."

Why would Nvidia target equal FPS to what AMD gets? :confused:
And why would they need texture compression in order to save VRAM? What would happen if game developers started releasing high-res (4K or even 8K) texture packs for the 16 GB framebuffer of the Radeon VII?
Actually, they should do it.
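
For a rough sense of scale on the VRAM point, some back-of-the-envelope numbers (assumes a full mip chain and BC7 at 1 byte per texel versus 4 bytes for uncompressed RGBA8; real engines stream and pack textures differently):

# Back-of-the-envelope VRAM cost of a single texture, with and without
# block compression (BC7 is 1 byte per texel vs 4 bytes for raw RGBA8).
def texture_mib(side, bytes_per_texel):
    base = side * side * bytes_per_texel
    return base * 4 / 3 / (1024 ** 2)  # a full mip chain adds roughly a third

for side in (4096, 8192):
    print(f"{side}x{side}: raw RGBA8 ~{texture_mib(side, 4):.0f} MiB, "
          f"BC7 ~{texture_mib(side, 1):.0f} MiB")

So a single uncompressed 8K texture is in the region of 340 MiB on its own, which is why even a 16 GB card still benefits from compression.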

 
The difference between Quality (top) and High Quality (bottom) in The Division.


Less than 1 fps in it. Interestingly, it shifted the load slightly higher onto the GPU and took a little load off the CPU - presumably with the optimisations the CPU is doing a bit more work before handing things off to the GPU.

I tried to get some images for a quality comparison, but the game is so dynamic that it's hard to get exactly the same lighting and conditions in a static position between restarts, and I don't have the tools to hand to capture the exact same frame from the benchmark.

Good stuff, did you use the rest of the settings as well?
 
What would happen if game developers started releasing high-res (4K or even 8K) texture packs for the 16 GB framebuffer of the Radeon VII?
Actually, they should do it.

They will probably lock off options, so without enough VRAM you'll be limited to lower detail, as has already happened in a few games.
 
Good luck matching this on an Nvidia card (CS: Source, map Militia, resolution 1920 x 1080, all settings maxed out both in-game and in Radeon Settings, Ryzen 5 2500U + Radeon RX 560X):

This is in the house, behind the front door. What you should look for is the multiple dust particles visible in the air (the same can be seen in the Dust and Dust 2 tunnels, as well as in darker areas of other maps):



Nice dithering.

[image: ak7o0ql.jpg]


No dithering.
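
If anyone wants to put a number on the dithering rather than eyeballing the screenshots, one rough approach is to measure how much a capture differs from a slightly blurred copy of itself (the filename is a placeholder; needs Pillow and NumPy):

from PIL import Image, ImageFilter
import numpy as np

# Placeholder filename - point it at your own capture of the scene.
img = Image.open("militia_door.png").convert("L")
blur = img.filter(ImageFilter.GaussianBlur(radius=2))

# Dithered gradients show up as high-frequency noise, i.e. a larger
# difference between the capture and its blurred copy.
noise = np.abs(np.asarray(img, dtype=np.int16) - np.asarray(blur, dtype=np.int16))
print("mean high-frequency noise:", round(float(noise.mean()), 2))

Run it on both captures; the dithered one should score noticeably higher.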
 
On the Radeon, in its settings, you can also lower the quality settings so they match the maximum the GeForces can provide. The Nvidia panel has no option for quality texture filtering; it all goes through performance texture filtering.

Why would I want to worsen the image quality? It's madness; all those years we wanted the best graphics card to push the best image quality at high framerates.
 
Why would I want to worsen the image quality? It's madness; all those years we wanted the best graphics card to push the best image quality at high framerates.

More FPS.

AMD has performance, standard and high modes for texture filtering. I'm guessing Nvidia only supports the "performance" or "standard" equivalent.
 
More FPS.

AMD has performance, standard and high modes for texture filtering. I'm guessing Nvidia only supports the "performance" or "standard" equivalent.

7-8 years ago I used to do it in order to get higher 3DMark scores.
But I've since realised it's not worth the effort, so I just always go for the best image quality possible. Unless it's unplayable, of course.
Also, the Radeon Settings option for texture filtering quality actually gives more FPS with the setting on High, rather than Standard or Performance.
 
Why would I want to worsen the image quality? It's madness; all those years we wanted the best graphics card to push the best image quality at high framerates.

To get the Nvidia framerate with equal image quality. Then, suddenly, the benchmark charts will look different.
 
To get the Nvidia framerate with equal image quality. Then, suddenly, the benchmark charts will look different.
More FPS.

AMD has performance, standard and high modes for texture filtering. I'm guessing Nvidia only supports the "performance" or "standard" equivalent.

Let's be a bit realistic, shall we?
We have FreeSync/G-Sync monitors, so there's barely any visible difference between 120fps and 150fps. Even from 100fps to 130fps there's no difference in the experience.
Why we would want "more fps" at the cost of worse image quality is beyond me.

I don't give a tosh about benchmarks, because I play games with my graphics cards - that's the purpose of spending all that money, not running benchmarks once in a blue moon.
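
To put numbers on how small those gaps are in frame-time terms:

# Frame-time saved between the FPS figures mentioned above.
for lo, hi in ((100, 130), (120, 150)):
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo} fps -> {hi} fps is only {saved_ms:.2f} ms less per frame")

That's roughly 2.3 ms and 1.7 ms per frame respectively, which adaptive sync smooths over anyway.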
 
Let's be a bit realistic, shall we?
We have FreeSync/G-Sync monitors, so there's barely any visible difference between 120fps and 150fps. Even from 100fps to 130fps there's no difference in the experience.
Why we would want "more fps" at the cost of worse image quality is beyond me.

I don't give a tosh about benchmarks, because I play games with my graphics cards - that's the purpose of spending all that money, not running benchmarks once in a blue moon.

Because the dumb masses look at benchmarks and see Nvidia cards usually get slightly better FPS, so they buy them. Even though the difference is pretty irrelevant and after about 80fps picture quality matters more.
 
Because the dumb masses look at benchmarks and see Nvidia cards usually get slightly better FPS, so they buy them. Even though the difference is pretty irrelevant and after about 80fps picture quality matters more.
I'm never fond of benchmarks anyway, to be honest. If there's only a few FPS in it between an NV and an AMD card, both options would be a good choice, for example. Small changes to image quality, however, are something nobody will notice. And I suspect AMD may win some and NV win others (different games).
Gaming experience to me is a combination of realism (hence RT is good), image quality and also FPS. It's also a reason I won't go 4K, because realism and image quality to me also mean setting everything to "Ultra", or as much as possible anyway. Using all of the effects the game offers also means getting my £'s worth from the R&D that went into it all too :)
 
Because the dumb masses look at benchmarks and see Nvidia cards usually get slightly better FPS, so they buy them. Even though the difference is pretty irrelevant and after about 80fps picture quality matters more.

@LtMatt huh, we've found the solution for AMD to get a speed boost in "reviewers'" benchmarks: worsen the default graphics quality like Nvidia does :D
Free marketing advice......
 
More FPS.

AMD has performance, standard and high modes for texture filtering. I'm guessing Nvidia only supports the "performance" or "standard" equivalent.

Standard = Quality on nVidia, Performance = Performance and High = High Quality. AFAIK AMD doesn't have an equivalent of nVidia's High Performance option. If there was any significant difference in it, one or the other company wouldn't hesitate to take the other to task, as it would be a huge PR win.
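
Summarising that mapping as data, per the description above (names only - treat it as the thread's working assumption, not an official equivalence table):

# Driver texture filtering presets, AMD name -> rough nVidia counterpart,
# as described in the post above.
AMD_TO_NVIDIA = {
    "Performance": "Performance",
    "Standard": "Quality",
    "High": "High Quality",
}
# nVidia also exposes "High Performance"; AMD has no direct equivalent.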
 
The difference between Quality (top) and High Quality (bottom) in The Division.

[image: 106VqjG.jpg]


Less than 1 fps in it. Interestingly, it shifted the load slightly higher onto the GPU and took a little load off the CPU - presumably with the optimisations the CPU is doing a bit more work before handing things off to the GPU.

I tried to get some images for a quality comparison, but the game is so dynamic that it's hard to get exactly the same lighting and conditions in a static position between restarts, and I don't have the tools to hand to capture the exact same frame from the benchmark.

In the top image it looks ever so slightly like the near-blacks are being crushed.
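
One way to sanity-check the black crush rather than eyeballing it would be to count the very dark pixels in each capture (placeholder filenames; needs Pillow and NumPy):

from PIL import Image
import numpy as np

# Placeholder filenames - use your own Quality / High Quality captures.
for name in ("quality.png", "high_quality.png"):
    lum = np.asarray(Image.open(name).convert("L"))
    share = (lum < 8).mean() * 100
    print(f"{name}: {share:.2f}% of pixels darker than 8/255")

If the Quality capture has a noticeably larger share of near-black pixels for the same scene, something is being crushed.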
 
In the top image it looks ever so slightly like the near-blacks are being crushed.

The lighting, etc. is never quite the same on the menu there - it pulses and changes a bit over time - it isn't related to the image settings.
 