
Radeon Resizable Bar Benchmark, AMD & Intel Platform Performance

Yeah, that's what I was thinking at first too, as I noticed the same kind of issue with my Ryzen 2600 when getting into crowded areas, even with medium crowd density. However, that doesn't happen with the stock game on my Ryzen 5600, i.e. when the 4-8K texture packs are removed, the problem goes away. Perhaps it's just a combination of everything going on in addition to the 4-8K texture packs, i.e. overloading the CPU so it can't feed the rest of the system quickly enough? But then if you look at the stats, CPU usage is much the same overall throughout; it's mostly just the VRAM usage changing.

Good read, that link, too.

Could be related to this part:
28.2.2 Texture Bandwidth
Texture bandwidth is consumed any time a texture fetch request goes out to memory. Although modern GPUs have texture caches designed to minimize extraneous memory requests, they obviously still occur and consume a fair amount of memory bandwidth.

Modifying texture formats can be trickier than modifying frame-buffer formats as we did when inspecting the ROP; instead, we recommend changing the effective texture size by using a large amount of positive mipmap level-of-detail (LOD) bias. This makes texture fetches access very coarse levels of the mipmap pyramid, which effectively reduces the texture size. If this modification causes performance to improve significantly, you are bound by texture bandwidth.

Texture bandwidth is also a function of GPU memory clock.
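The LOD-bias trick from that excerpt is easy to reason about numerically: each mip level down quarters the texel count, so a positive bias of +2 cuts the data the texture units pull through memory by roughly 16x. A rough sketch (purely illustrative, assuming an uncompressed RGBA8 texture; real games use block compression, but the 4x-per-level scaling is the same):

```python
# Rough model of how positive mipmap LOD bias shrinks texture traffic.
# Assumes an uncompressed RGBA8 (4 bytes/texel) square texture.

def mip_bytes(base_size: int, level: int, bytes_per_texel: int = 4) -> int:
    """Bytes in one mip level of a square texture (level 0 = full size)."""
    size = max(1, base_size >> level)
    return size * size * bytes_per_texel

base = 4096  # e.g. a "4K" texture from one of the HD packs
for bias in range(4):
    mb = mip_bytes(base, bias) / (1024 * 1024)
    print(f"LOD bias +{bias}: sampling a {max(1, base >> bias)}^2 level, {mb:.1f} MiB")
```

If frame rate jumps when you apply the bias, you were texture-bandwidth bound, which is exactly the diagnostic the quoted passage describes.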
 
Didn't @Poneros highlight that CP2077 was notoriously heavy on the CPU with RT enabled? And that's before you start chucking mods/texture packs into the mix.
 

Nice one shanks, finally some actual proper substance to this topic now ;) :) :D

Didn't @Poneros highlight that CP2077 was notoriously heavy on the CPU with RT enabled? And that's before you start chucking mods/texture packs into the mix.

Yup, I do recall that, and someone else or another site stating the same. IIRC, I had all RT maxed, including lighting set to Psycho.
 
Ok, since you seem to know a lot on this topic @humbug, you should be able to answer a question I have been asking for a long time now. No one else has attempted to answer it, but you seem to be up to the job :)

So, let's get to it:

1. Can you explain why, in the case of tommy's 3080 in FC6, when his FPS drops to single digits, it does not return to normal if he wanders off to a new area or looks elsewhere, or at the very least go back up when he reduces the settings? As would happen in 99% of games where VRAM symptoms manifest.

Yet, as per my videos, you can see the VRAM exceeding 10GB, yet it doesn't stutter and doesn't drop to single digits. Why is that?

Whereas in my showcase of CP 2077, when using several 4-8K texture packs, you can clearly see frame latency issues and FPS drops as it gets closer to 10GB; however, once leaving the area, the FPS returns to normal figures and frame latency goes back to normal:


Can you explain?

Genuinely would love to know about this. As I said, I'm not a game developer, so educate me please.

In your case the VRAM usage is Allocated vs Dedicated.

Allocated is the game filling the VRAM with what it thinks it might need later.
Dedicated is what the GPU is actually streaming; if you look at that, it never actually goes over 8.5GB.

So 8.5GB is what's actually needed; the other 1.5GB is just something the GPU has pre-loaded but isn't actually using at the time. So while yes, your VRAM is full, it doesn't actually need to be; it's just making use of spare capacity, just in case.
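To make the allocated-vs-in-use distinction concrete, here's a toy model (the class, names and figures are all invented for illustration, not read from any real GPU or overlay tool): the game reserves a big chunk up front, but only a subset of it is actually touched per frame.

```python
# Toy model: "allocated" VRAM vs the working set actually in use.
# Purely illustrative; the numbers are invented, not from any real API.

class VramPool:
    def __init__(self, capacity_gb: float):
        self.capacity = capacity_gb
        self.allocated = 0.0   # reserved by the game "just in case"
        self.in_use = 0.0      # actually streamed/sampled this frame

    def allocate(self, gb: float) -> None:
        self.allocated = min(self.capacity, self.allocated + gb)

    def touch(self, gb: float) -> None:
        self.in_use = min(self.allocated, self.in_use + gb)

pool = VramPool(capacity_gb=10.0)
pool.allocate(10.0)   # an overlay reporting allocation says "VRAM full!"
pool.touch(8.5)       # but the per-frame working set is smaller
print(f"allocated {pool.allocated} GB, actually in use {pool.in_use} GB")
```

The point of the sketch: a tool reporting 10GB "usage" can be reading the first number while performance is governed by the second.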
---------

This is an example of VRAM buffer overspill to system RAM. The game at the given settings (changed from 1440p to 4K) needs more than the card has, so it has shunted data to system RAM, which is much slower, and that created a massive performance bottleneck. Just as you describe with tommy's 3080, it's unrecoverable. I don't know why that is; there could be all sorts of reasons, most of which I probably don't understand. Only the devs of the game know what is going on there, and maybe it's something they need to look into. My guess is that once system RAM has been employed as a buffer, it can't revert back to its previous state without a game restart.
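The "much slower" part is easy to ballpark. A 3080's GDDR6X is in the region of 760 GB/s, while assets spilled to system RAM come back across PCIe 4.0 x16 at roughly 32 GB/s; both are approximate theoretical figures, but the ratio is the story:

```python
# Ballpark comparison of local VRAM bandwidth vs fetching spilled assets
# over PCIe. Figures are approximate public specs, used only for the ratio.

VRAM_GBPS = 760.0       # RTX 3080 GDDR6X, approx theoretical peak
PCIE4_X16_GBPS = 32.0   # PCIe 4.0 x16, approx theoretical peak

ratio = VRAM_GBPS / PCIE4_X16_GBPS
print(f"reading a spilled asset is ~{ratio:.0f}x slower than from VRAM")
```

Even if real-world transfers land nowhere near the theoretical numbers, an order-of-magnitude gap like that is enough to explain single-digit FPS once the renderer is waiting on spilled assets every frame.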

Edit: one other thing. When running close to buffer capacity, the GPU has to swap files a lot; it can't just store unused files that are either streamed out or pre-loaded and scrub or use them when convenient. It needs to load and stream only what it actually needs at a given time.
Over time that can lead to render stalls, where assets aren't scrubbed out fast enough before it needs to fetch the next file. That causes a stall as the renderer waits for the task to complete; you would see that as stutter.
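That swap-and-stall behaviour can be sketched as a cache problem: when the residency budget is smaller than the set of assets a scene keeps touching, every frame forces evictions and re-fetches, and each re-fetch is a potential render stall. A minimal, hypothetical model (asset names and budgets are made up):

```python
from collections import OrderedDict

# Minimal LRU model of texture residency. A "stall" here is any request
# for an asset that is not currently resident. Hypothetical asset names.

def count_stalls(requests: list, budget: int) -> int:
    resident = OrderedDict()
    stalls = 0
    for asset in requests:
        if asset in resident:
            resident.move_to_end(asset)       # recently used
        else:
            stalls += 1                       # must fetch: renderer may wait
            resident[asset] = None
            if len(resident) > budget:
                resident.popitem(last=False)  # evict least-recently-used
    return stalls

frame = ["rock", "road", "npc", "sky", "rock", "road"]  # assets touched per frame
print("stalls with headroom:", count_stalls(frame * 10, budget=4))
print("stalls just under the working set:", count_stalls(frame * 10, budget=3))
```

With one asset of headroom the cache settles down after the first frame; one asset short of the working set, and it thrashes on every single frame — which is the "spare capacity = smoother game" argument in miniature.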

It's why I hate the "you only need" argument; it's a sort of "that'll do" argument. It will do, but it's better to have spare capacity so the system can swap files; you get a smoother game.

Edit 2: when he changes from 1440p to 4K, you see his VRAM go from 7.5GB to 7.8GB and system RAM from 11.8GB to 12.8GB.
I would expect to see that, but when he reduces the graphics settings to try and recover frame rates, his system RAM stays at near 13GB, which suggests what I said earlier: the system isn't able to reverse the system RAM buffer allocation without a game restart.

https://youtu.be/lbQBmuYrzOM?t=487
 
ugh... straight to 8GB, and:

[screenshot]
 
Performance is perfectly good, 60+ FPS at 1440p with everything on Ultra. The GPU has the grunt; it is lacking VRAM for what it is. I said this already when it was launched; the fact that they did it again with the 3070 is astonishing to me.

R9 390 (8GB) $329, June 2015
GTX 1070 (8GB) $379 June 2016
RTX 2070 (8GB) $499 < #### me :o July 2019
RTX 3070 (8GB) $499 October 2020

 
Looks like not all the 4K textures are rendering properly there either, aside from the stuttering and eventual crash. This is from the patch notes released before Christmas.
HD Texture Pack — Some assets appearing blurry
Developer comment: We have made some changes for the HD Texture Pack on PC that should decrease the blurriness that appeared for some players when using the HD Texture Pack. When looking into these reports, we are seeing players using graphics cards with less than 12 GB of VRAM available. When using the HD Texture Pack with less than the minimum required VRAM available, the performance and the look of the game can be worse than without the pack.

Does this look familiar @tommybhoy? Looks like we have another local system issue. :p

[gif] << Debunker rn

What we were saying, but ohhhh no. :cry::cry:
I'm actually quite surprised the 2070 can even run it at playable FPS at 1440P max settings, even if it does crash and stutter due to 8GB.
 
You guys need to learn the difference between 8GB and 10GB ;) :cry:

Surprised you could even get that running at those settings with a 2070 tbh. What happens if you enable FSR?


Just to let you know, I will read this when I am less tired; had fun shenanigans getting the QD-OLED monitor order in until 3am :p :(
 
Surprised you could even get that running at those settings with a 2070 tbh. What happens if you enable FSR?

Why? And why would I run FSR when the GPU is perfectly capable of 60+ at native?
-----------

You know why I hate this the most? The "you don't need more VRAM" argument is the same as the "you don't need more than 4 cores" argument.

There is no such thing as an arbitrary limit on what is enough VRAM; game developers work within the limits of what is available. In the case of VRAM, more = higher resolution textures, less = lower resolution textures.
With that, Nvidia are actively stagnating game development, just as Intel were, but what really gets me is when consumers defend this male bovine manure.

Stop it!
 
Performance is perfectly good, 60+ at 1440P with everything on Ultra, the GPU has the grunt, it is lacking VRam for what it is, i said this already when it was launched, the fact that they did it again with the 3070 is astonishing to me.

If the 'performance is perfectly good', why would it need more VRAM?

R9 390 (8GB) $329, June 2015
GTX 1070 (8GB) $379 June 2016
RTX 2070 (8GB) $499 < #### me :o July 2019
RTX 3070 (8GB) $499 October 2020

Colour depth hasn't changed, nor have the standard resolutions; 32-bit 1440p back in June 2015 is the same as 32-bit 1440p today. 2017's Shadow of War 4K texture pack ran fine on a 6GB 980 Ti; 2022's FC6 requires 12GB for the same, with quarter-resolution effects.
 
If the 'performance is perfectly good', why would it need more VRAM?
:cry:.
 
If the 'performance is perfectly good', why would it need more VRAM?
Are you trying to wind me up? I'm not starting from the beginning of all this again.

Colour depth hasn't changed, nor have the standard resolutions; 32-bit 1440p back in June 2015 is the same as 32-bit 1440p today. 2017's Shadow of War 4K texture pack ran fine on a 6GB 980 Ti; 2022's FC6 requires 12GB for the same, with quarter-resolution effects.

Do you know what a texture is?

Image one is the 2K albedo for a rock asset; it's 4.38MB.
Image two is the normal map for the same asset; it's 7.54MB.

In the Mega link below I have included those, plus the same textures in 4K form, an "HD texture pack" if you like: the albedo is 13.4MB, the normal map is 28.4MB.

Each asset contains at least 3 such textures (albedo, normal and specular); sometimes there are also ambient occlusion, dirt maps, etc. The point being: the 2K asset texture pack is around 25MB, the "HD pack" around 100MB.

You can have them, no licensing, they are useless without the 3D asset anyway.

https://mega.nz/file/hEs3ER7A#PbWkch0IZLNs3nMVXQh1Tyoe_OcBhXtiuN475mgNnDY
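The 2K-to-4K jump in those figures is just texel count: doubling each dimension quadruples the data, so a whole pack scales by roughly 4x before compression differences. A quick sanity check against the numbers above (the files are compressed, so the on-disk sizes land near, not exactly on, the 4x ratio):

```python
# Going from a 2K to a 4K texture set quadruples the texel count.
# Quoted file sizes (4.38MB -> 13.4MB, 7.54MB -> 28.4MB) are compressed,
# so they land near, not exactly on, the 4x ratio.

def texels(size: int) -> int:
    """Texel count of a square texture of the given edge length."""
    return size * size

ratio = texels(4096) / texels(2048)
print(f"4K has {ratio:.0f}x the texels of 2K")

pack_2k_mb = 25  # approximate per-asset 2K pack size quoted above
print(f"expected HD pack size: ~{pack_2k_mb * ratio:.0f}MB")
```

Which matches the ~100MB "HD pack" figure, and is why stacking several 4-8K packs on a game eats through a 10GB buffer so quickly.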

[Image one: 2K albedo]

[Image two: normal map]
 