AMD will need a higher ROP count for Navi 2X GPUs for 4K gaming

ROPs (Render Output Units) determine how many pixels a GPU can output per clock, so they're very important for rendering at higher resolutions.

To my knowledge, there are only 2 consumer graphics cards with more than 64 ROPs: the GTX 1080 Ti and the RTX 2080 Ti.

To double the pixel rate (GPU clock × ROP count) of my current graphics card (R9 390), I'd need to buy either a heavily overclocked Navi RX 5700 XT or an (expensive) Nvidia card such as the GTX 1080 Ti or RTX 2080 Ti.

My old R9 390 has the same number of ROPs (64) as most modern GPUs.

Alternatively, AMD could just use much higher GPU core clocks, which, in my view, would be the worse option.

EDIT - just noticed that the Xbox Series X GPU will have 80 ROPs, with a slightly lower clock than the Navi RX 5700 XT, so pixel rates of ~152.4 GPixel/s (1905 MHz × 80 ROPs) should be achievable for Navi 2X GPUs.
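As a quick back-of-envelope check, here is that clock × ROPs arithmetic in Python (a minimal sketch; the clocks are approximate boost clocks, and the 80-ROP entry is my hypothetical Navi 2X configuration, not an announced card):

```python
# Peak pixel fill rate = core clock (MHz) x ROP count, in MPixel/s.
# Clocks are approximate boost clocks; the 80-ROP entry is a
# hypothetical Navi 2X configuration, not an announced card.
gpus = {
    "R9 390 (64 ROPs @ ~1050 MHz)":     (1050, 64),
    "RX 5700 XT (64 ROPs @ ~1905 MHz)": (1905, 64),
    "Navi 2X? (80 ROPs @ ~1905 MHz)":   (1905, 80),
}

for name, (clock_mhz, rops) in gpus.items():
    gpix = clock_mhz * rops / 1000  # convert MPixel/s to GPixel/s
    print(f"{name}: {gpix:.1f} GPixel/s")
```

On those numbers, the 80-ROP configuration lands at the ~152.4 GPixel/s figure above, more than double the R9 390's ~67 GPixel/s.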
 
Apparently my Titan X Pascal has 96 ROPs, so I assume the Titan Xp has the same.

Just looked, and even the 980 Ti has 96 ROPs, so I don't think ROP count is the be-all and end-all of performance.
 
Nope, but a lower pixel rate is a big disadvantage, and games will focus more and more on higher resolutions with the new console releases this year.

Also, the 980 Ti has a fairly low GPU clock of ~1000 MHz, which reduces its pixel rate quite a bit.
 
Apparently my Titan X Pascal has 96 ROPs, so I assume the Titan Xp has the same.

Just looked, and even the 980 Ti has 96 ROPs, so I don't think ROP count is the be-all and end-all of performance.

He is right, and it's the same reason why I doubt the next-gen consoles will have any 8K games.

Even if you have the raw processing power, if the pipeline isn't built for it you can get a bottleneck.

There are some 4K vs 8K 2080 Ti benchmarks out there, and in some games the performance drop at 8K is massive because the GPU pipeline is bottlenecked. What I mean is that there are games that can run at a solid 60 fps at 4K on the 2080 Ti, but when you change the resolution to 8K you get 15-20 fps, not the 30 fps that you would logically expect.

edit: Here are some other examples:

Far Cry 5 on a 2080 Ti NVLink, 4K High settings: 105 fps
Same settings but now at 8K: 37 fps

Shadow of the Tomb Raider on a 2080 Ti NVLink, 4K High settings: 126 fps
Same settings but now at 8K: 40 fps

And then take F1 2018, a game obviously less reliant on ROPs:
F1 2018 on a 2080 Ti NVLink, 4K High settings: 140 fps
Same settings but now at 8K: 80 fps

https://www.tweaktown.com/articles/...vlink-8k-60fps-gaming-now-reality/index6.html

So for F1 2018, it's very close to the 50% performance loss I would expect with perfect scaling of 4K to 8K pixels. But for Tomb Raider and Far Cry 5 the performance loss is close to 70%, indicating the GPU is suffering from a bottleneck in its pipeline.
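Those loss figures are easy to verify from the numbers quoted above (a minimal Python sketch; the fps pairs are the ones in this post):

```python
# 4K vs 8K fps pairs quoted above (2080 Ti NVLink, High settings).
results = {
    "Far Cry 5":                 (105, 37),
    "Shadow of the Tomb Raider": (126, 40),
    "F1 2018":                   (140, 80),
}

for game, (fps_4k, fps_8k) in results.items():
    loss = 1 - fps_8k / fps_4k  # fraction of 4K performance lost at 8K
    print(f"{game}: {loss:.0%} performance loss at 8K")
```

That prints roughly 65%, 68%, and 43%, matching the "close to 70%" and "close to 50%" figures above.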
 
I suppose I'm not surprised: 8K resolution is 33.18 million pixels, while 4K is 8.29 million pixels. Rendering 4x the number of pixels should logically need a GPU with about 4x the pixel rate.
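The pixel counts are easy to check (a minimal Python sketch):

```python
# Pixels per frame at common resolutions.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} million pixels")

print(f"8K / 4K ratio: {pixels['8K'] / pixels['4K']:.0f}x")        # 4x
print(f"8K / 1080p ratio: {pixels['8K'] / pixels['1080p']:.0f}x")  # 16x
```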

That's a lot of ROPs...

I think that's why AMD has mentioned aiming for 4K in games, not 8K.
 
ROPs (Render Output Units) determine how many pixels a GPU can output per clock, so they're very important for rendering at higher resolutions.

To my knowledge, there are only 2 consumer graphics cards with more than 64 ROPs: the GTX 1080 Ti and the RTX 2080 Ti.

To double the pixel rate (GPU clock × ROP count) of my current graphics card (R9 390), I'd need to buy either a heavily overclocked Navi RX 5700 XT or an (expensive) Nvidia card such as the GTX 1080 Ti or RTX 2080 Ti.

My old R9 390 has the same number of ROPs (64) as most modern GPUs.

Alternatively, AMD could just use much higher GPU core clocks, which, in my view, would be the worse option.

EDIT - just noticed that the Xbox Series X GPU will have 80 ROPs, with a slightly lower clock than the Navi RX 5700 XT, so pixel rates of ~152.4 GPixel/s (1905 MHz × 80 ROPs) should be achievable for Navi 2X GPUs.

ROPs were a real issue in the run-up to Polaris, but not so much any more. The lack of ROPs showed badly with Fiji, when cutting a load of shaders going from the Fury X to the Fury had such a minor impact on performance. After that point, memory bandwidth was AMD's GPU bottleneck.
 
He is right, and it's the same reason why I doubt the next-gen consoles will have any 8K games.

Even if you have the raw processing power, if the pipeline isn't built for it you can get a bottleneck.

There are some 4K vs 8K 2080 Ti benchmarks out there, and in some games the performance drop at 8K is massive because the GPU pipeline is bottlenecked. What I mean is that there are games that can run at a solid 60 fps at 4K on the 2080 Ti, but when you change the resolution to 8K you get 15-20 fps, not the 30 fps that you would logically expect.

Btw 8K is four times the pixels of 4K, not double:

https://uk.pcmag.com/tvs/93009/what-is-8k-should-you-buy-a-new-tv-or-wait

8K is a higher resolution than 4K—and that's it. 1080p screens have a resolution of 1,920 by 1,080 pixels. 4K screens double those numbers to 3,840 by 2,160 and quadruple the number of pixels. 8K doubles the numbers again, to a resolution of 7,680 by 4,320. That's four times the number of pixels as 4K, which means it's 16 times that of a 1080p TV.
 
Nah, unfortunately Sony / AMD have (reportedly) agreed on the 'standard' 64 ROPs for the PS5 GPU:

Specs here:
https://www.techpowerup.com/gpu-specs/playstation-5-gpu.c3480

However, the important thing is that the overall pixel rate is very similar to the Xbox Series X GPU's (both over 140 GPixel/s), thanks to the PS5's much higher GPU clock of 2233 MHz.

If you combine that clock rate with 80 ROPs, an impressive pixel rate of ~178.6 GPixel/s (2233 MHz × 80 ROPs) is possible (maybe for higher-end Navi 2X GPUs).
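Putting those numbers side by side, using the same clock × ROPs arithmetic as earlier in the thread (a quick Python sketch; the 80 ROPs @ 2233 MHz row is the hypothetical higher-end Navi 2X, not an announced part):

```python
# Pixel fill rate = clock (MHz) x ROPs, shown in GPixel/s.
# The Navi 2X row is hypothetical, not an announced part.
configs = {
    "PS5 (64 ROPs @ 2233 MHz)":           (2233, 64),
    "Xbox Series X (80 ROPs @ 1825 MHz)": (1825, 80),
    "Navi 2X? (80 ROPs @ 2233 MHz)":      (2233, 80),
}

for name, (mhz, rops) in configs.items():
    print(f"{name}: {mhz * rops / 1000:.1f} GPixel/s")
```

That gives ~142.9 and ~146.0 GPixel/s for the two consoles (very close, despite the different ROP counts), and ~178.6 GPixel/s for the hypothetical 80-ROP part.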
 
I'm on a 28" 4K monitor and was wondering, how does VSR / DSR on a 1080p monitor, rendering at 4K resolution, compare to native 4k resolution?

Has anyone compared it? Does VSR / DSR @ 4K on a small 1080p monitor get rid of the aliasing / jagged edges in games?
 
I'm on a 28" 4K monitor and was wondering, how does VSR / DSR on a 1080p monitor, rendering at 4K resolution, compare to native 4k resolution?

Has anyone compared it? Does VSR / DSR @ 4K on a small 1080p monitor get rid of the aliasing / jagged edges in games?

It will help, but 4K resolution by itself is not enough to get rid of all jaggies; some form of anti-aliasing is still needed. Only at 8K does the need for anti-aliasing really go away.
 
I tried 8K on my 4K monitor in GTA V (frame scaling enabled) and it looked even better than 8x MSAA. Surprisingly, I could get around 20 fps even on an old GPU.

It should be possible to run some games at 8K with next-gen cards, I think, even if not with all settings at max.
 