
Radeon Resizable Bar Benchmark, AMD & Intel Platform Performance

If you look back you will notice I referred to streaming data. You thought I was talking about streaming textures. That was your mistake. You also took the attitude that streaming only refers to textures which should be loaded. The only reason you load textures is that your game engine does all the grunt work for you, in the same way as you load a Word document. There are far more processes going on in the background involving many packets of data, also known as streaming. It's up to you if you wish to start again.
 


You know what, let's try something different: why don't you explain to me why my GPU can't run the HD texture pack?
 
'Trying' being the best descriptor.



Apart from jibing in to defend random chums, from what I have read he knows very little. Much like not knowing what hardware is in consoles; just digging that hole wrongly. :cry:


The more this goes on and the more people get involved, the more I'm feeling like I'm having to explain the obvious.

At that point it's difficult to know if I'm being trolled or not.
 
Given the context of the previous posts, I think this is a valid question and I am looking forward to seeing the answer.
And can you explain why other games with just as good quality or better textures run fine with 10GB of VRAM?
And can you also explain why other games with just as good quality or better textures can run with 8GB of VRAM too @humbug?
 
Right, exactly @LtMatt. Because not all games are created equal. That's the obvious answer.

Just because a game looks just as good (to a subjective eye), that doesn't make said game comparable in any way, shape or form, including resource intensity.

sO tHen Wy cAN NOt aLl rUn 8Gij BiTTEIs
 
In your case the VRAM usage is allocated vs dedicated.

Allocated is the VRAM being filled with what the game thinks it might need later.
Dedicated is what the GPU is actually streaming; if you look at that, it never actually goes over 8.5GB.

So 8.5GB is what's actually needed; the other 1.5GB is just something the GPU has pre-loaded but isn't actually using at the time. So while yes, your VRAM is full, it doesn't actually need to be; it's just making use of spare capacity, just in case.
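To make the distinction concrete, here's a toy model of the two numbers (all asset names and sizes are hypothetical, nothing is read from a real card or monitoring tool):

```python
# Toy model of VRAM accounting: "allocated" counts everything resident on
# the card, "dedicated" counts only the assets the current frame is using.

class VramBudget:
    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.resident = {}    # asset name -> size in GB (everything loaded)
        self.in_use = set()   # assets actually referenced by the current frame

    def load(self, name, size_gb):
        if self.allocated() + size_gb > self.capacity_gb:
            raise MemoryError(f"no room for {name}")
        self.resident[name] = size_gb

    def use(self, *names):
        self.in_use = set(names)

    def allocated(self):
        # What a default overlay typically reports: all resident data.
        return sum(self.resident.values())

    def dedicated(self):
        # What the frame actually needs right now.
        return sum(self.resident[n] for n in self.in_use)

vram = VramBudget(capacity_gb=10.0)
vram.load("textures_area_A", 6.0)
vram.load("textures_area_B", 2.5)   # pre-loaded "just in case"
vram.load("geometry", 1.5)
vram.use("textures_area_A", "geometry")

print(vram.allocated())  # 10.0 -> the card looks full
print(vram.dedicated())  # 7.5  -> what's actually being used
```

The gap between the two numbers is harmless headroom: the card looks full, but only the dedicated figure has to fit for the frame to render normally.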
---------

This is an example of VRAM buffer overspill to system RAM. The game at the given settings (changed from 1440p to 4K) needs more than the card has, so it's shunted data to system RAM, which is much slower, and that created a massive performance bottleneck. Just as you explain with tommy's 3080, it's unrecoverable. I don't know why that is; there could be all sorts of reasons for it, most of which I probably don't understand. Only the devs of the game know what is going on there, and maybe it's something they need to look into. My guess is that once the system RAM has been employed as a buffer, it can't revert back to its previous state without a game restart.
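As a rough illustration of why overspill hurts so badly, here's a back-of-the-envelope bandwidth model (the bandwidth figures are illustrative assumptions, not measurements from this game or any particular card):

```python
# Rough model: effective bandwidth for a working set when part of it
# spills over PCIe into system RAM. All figures are illustrative only.

def effective_bandwidth_gbs(working_set_gb, vram_gb,
                            vram_bw_gbs=448.0,   # assumed on-card bandwidth
                            pcie_bw_gbs=16.0):   # assumed PCIe 3.0 x16 rate
    """Total transfer time = VRAM portion at VRAM speed + spill at PCIe speed."""
    in_vram = min(working_set_gb, vram_gb)
    spilled = max(0.0, working_set_gb - vram_gb)
    time = in_vram / vram_bw_gbs + spilled / pcie_bw_gbs
    return working_set_gb / time

fits = effective_bandwidth_gbs(7.5, vram_gb=8.0)
spills = effective_bandwidth_gbs(9.0, vram_gb=8.0)
print(round(fits))    # 448: everything streams from VRAM at full speed
print(round(spills))  # 112: even 1GB over PCIe collapses the average
```

Even a single gigabyte forced over the PCIe bus drags the effective streaming rate down to a fraction of on-card bandwidth, which is the shape of the bottleneck seen in the video.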

Edit: one other thing. When running close to buffer capacity the GPU is having to swap files a lot; it can't just store unused files that are either streamed out or pre-loaded and scrub or use them when convenient, it needs to load and stream only what it actually needs at a given time.
Over time that can lead to render stalls, where assets aren't scrubbed out fast enough before it needs to fetch the next file; that will cause a stall as the renderer waits for that task to be completed, and you would see that as stutter.

It's why I hate the 'you only need' argument; it's a sort of 'that'll do' argument. It will do, but it's better to have spare capacity so the system can swap files; you get a smoother game.
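The render-stall mechanism described above can be simulated crudely. In this sketch the asset names, frame costs and eviction policy are all invented for illustration:

```python
# Crude frame-loop simulation: when the needed asset is already resident,
# a frame costs only render time; when it must be fetched first, the
# renderer stalls for the fetch and the frame time spikes (stutter).
# All timings are invented for illustration.

RENDER_MS = 8.0   # baseline cost to render a frame
FETCH_MS = 40.0   # extra cost to pull an evicted asset back in

def frame_times(needed_per_frame, resident_capacity):
    resident, times = [], []
    for asset in needed_per_frame:
        t = RENDER_MS
        if asset not in resident:
            t += FETCH_MS                      # render stall: wait for the fetch
            resident.append(asset)
            if len(resident) > resident_capacity:
                resident.pop(0)                # scrub the oldest asset out
        times.append(t)
    return times

# Plenty of headroom: each asset stalls once, then stays resident.
smooth = frame_times(["a", "b", "a", "b", "a", "b"], resident_capacity=4)
# Constrained: "a" and "b" keep evicting each other, every frame stalls.
stutter = frame_times(["a", "b", "a", "b", "a", "b"], resident_capacity=1)
print(smooth)   # spikes only on the first touch of each asset
print(stutter)  # spikes on every single frame
```

With spare capacity the spikes disappear after warm-up; without it, the same workload stutters on every frame, which is exactly the 'that'll do' trap.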

Edit 2: when he changes from 1440p to 4K you see his VRAM go from 7.5GB to 7.8GB and system RAM from 11.8GB to 12.8GB.
I would expect to see that, but when he reduces the graphics settings to try and recover frame rates his system RAM stays at near 13GB, which suggests what I said earlier: the system isn't able to reverse the system RAM buffer allocation without a game restart.

https://youtu.be/lbQBmuYrzOM?t=487

Just read this now, thanks.

So pretty much what I thought, hence why the developers have said the same on the Ubi forums, just like they did with the texture rendering/loading issue.

I get what you're saying with regards to allocated and dedicated VRAM. In the CP video, it does seem once ded. hits 8.4/8.5GB, that's when the frame latency goes to **** and fps drops; then once the ded. VRAM drops below 8.4/8.3GB, things return to normal.

Not sure if you've seen my FC 6 video as it was just hyperlinked to my channel, but this one here:


You can see the ded. VRAM is comfortably above 8.6GB there, but it is still a pretty smooth frame latency line and fps is OK too. Not sure what tommy's ded. usage was, as I can't recall if he had that set up on his overlay?

Does seem that shankly is onto something with the pipeline and texture bandwidth bit, with regards to CP anyway.
 

tommy's OSD didn't show dedicated vs allocated. Usually, if you're using something like the defaults on MSI AB, it's reading how much of the VRAM has been allocated, not dedicated, i.e. what's actually being used to render the scene. How are you showing dedicated and allocated; is this an option unique to Ampere?

There is some minor stutter at about 2:10 but other than that yes it looks pretty smooth.

RE: Shankly, yeah, if you haven't got spare capacity in the VRAM the GPU doesn't have the luxury of queueing files for swapping out, so my guess is instead it's queueing in a slower buffer. Fetching those files takes a lot longer than if they were taken from the VRAM; that's an increase in latency, which is a sudden increase in frame time, and that's stutter.

When you went under the bridge, the system didn't have the assets under the bridge in the queue, so when you jumped into the ditch the system was like "oh #### I don't have that ready" and had to fetch those assets from outside the VRAM, and there is your frame time stall.

Again, my guess is in my case when I crashed, my VRAM was so constrained there was no room for what it needed to render, so it just hangs while trying to cram something into a space that doesn't exist, throws a wobbler and crashes. Games crashing due to a lack of VRAM is quite common; I guess if it's full, that's it, there is nothing to be done about it other than crash.
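That dead-end failure mode can be sketched like this (a deliberately simplified allocator with made-up asset names and sizes; real drivers will try to evict or spill to system RAM long before giving up):

```python
# Simplified sketch of an out-of-VRAM failure: if a required allocation
# can't fit even after evicting whatever is allowed, the only outcome
# left is an error, which a game may surface as a hang or crash.

def try_allocate(name, size_gb, allocations, capacity_gb, evictable=()):
    free = capacity_gb - sum(allocations.values())
    if size_gb > free:
        for victim in evictable:          # scrub out what we're allowed to
            allocations.pop(victim, None)
        free = capacity_gb - sum(allocations.values())
    if size_gb > free:
        raise MemoryError(f"out of VRAM allocating {name!r}")
    allocations[name] = size_gb

live = {"render_targets": 3.0, "textures_A": 4.5}   # 7.5GB on an 8GB card
try_allocate("shadow_map", 0.4, live, capacity_gb=8.0)    # still fits
try_allocate("textures_B", 1.5, live, capacity_gb=8.0,
             evictable=("textures_A",))                   # evicts, then fits
try:
    try_allocate("huge_buffer", 9.0, live, capacity_gb=8.0)
except MemoryError:
    print("everything resident is still needed: the crash case")
```

As long as something is evictable the allocator limps along; once everything resident is genuinely needed at once, there's nothing left to do, which matches the full-VRAM crash described above.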
 

Don't think so, you need to faff with MSI AB/RTSS:

(screenshots: SRJ44V6.png, PSJEIyQ.png, FGJ0ZLi.png)
 
Didn't @Poneros highlight that CP2077 was notoriously heavy on the CPU with RT enabled, and that's before you start chucking mods/texture packs into the mix?
My old tests; much more demanding now but I haven't saved new ones. Cyberpunk is definitely an extremely CPU-demanding game once RT gets enabled. End of year can't come soon enough tbh, I can't wait to compare this to Zen 4 + DDR5.

https://i.imgur.com/Z7fTEAc.jpg
https://i.imgur.com/5TEp0bE.png
https://i.imgur.com/RdXteF5.png

zro60TQ.gif
 

It's sad behaviour; what makes it embarrassing is that it means more of them should reflect and engage their brains before posting.

Ugh, how I miss good salted popcorn. I used to go to the cinema just to buy it and leave, pre-covid. :D

No need for salt just collect some from the playbus gang. :cry:

Given the context of the previous posts, I think this is a valid question and I am looking forward to seeing the answer.

Indeed. I'm still waiting for an explanation of how Nvidia knows all about it when it comes to all their hardware in consoles! :cool:

You only need 4 cores.....

..and 8 Gb vram <choo choo>
 

Not that you'll see this since:

Generally people use ignore because:

1. they can't come back to any arguments
2. don't like to read/hear the truth
3. are clueless hence the 2 points above

But I think a few are starting to pick up that you're coming across as very bitter going by your posts in recent threads (more so than usual), and as a result your posts are starting to come across as even more silly now, adding even less to what have at long last been some good discussions.

I know it's a bitter pill to swallow, but move on if you can't add anything worthwhile to a discussion without coming across like you have a grudge; I imagine a couple will start reporting posts if it keeps up.


What res. was your testing done at? And what cpu?
 
Given the context of the previous posts, I think this is a valid question and I am looking forward to seeing the answer.

And can you also explain why other games with just as good quality or better textures can run with 8GB of VRAM too @humbug?

Oh, I can indeed answer that honestly, which many people on these forums sadly do not seem able to.
From what you are saying then, and kudos, it's actually the truth: some games are so much better programmed than others,
and also GPU vendors deliberately work with manufacturers to gimp games on the competition. Finally we actually get to the truth lmao.

PS: oh, the fuss on this thread about the site deliberately disabling Resizable BAR in testing was great. How short memories are, eh? As in, having a go at the site for not including all the results in one area, whereas just a few months back a certain benchmark thread had its RT disabled, as when there was a need for it then it would be added.

Pathetic really from the usual trio.
 