Radeon Resizable Bar Benchmark, AMD & Intel Platform Performance

With older CPUs like Ryzen 3000, sure. With Intel CPUs? Maybe. No negative with Ryzen 5000/RDNA2 though, as I mentioned in my post.

I'll play devil's advocate for you though. Let's imagine there are one or two games out there where FPS is 1-2 lower with SAM on when using Ryzen 5000/RDNA2; what about all the other games that are up to 29% faster (Forza 5) and the multiple games with 20% improvements, à la Forza 4, Halo, Horizon Zero Dawn, Valhalla etc.?

You don't need to do any homework with Ryzen 5000 and RDNA2, don't worry. ;) If you recall, support for Ryzen 3000 was added much later by motherboard manufacturers with updated BIOS.

I'll say it again, there is no logical reason to disable SAM with Ryzen 5000 and RDNA2. If you are using this configuration and can prove me wrong, please provide some clear examples backed up with data, Grim. I'll be waiting. :)

Did you watch that HUB video? :p He showed at the end a wide range of games with SAM on 6800 and 5950x:

[Image: HUB chart of SAM results across a range of games on a 6800 and 5950X]


i.e. it's not a "leave it on and forget it" setting.... as there is proof that it can harm performance and, most importantly, it's "random" as to which setup and/or game will see a benefit and which will see a decrease in performance, hence why HUB are leaving it off for the time being.
 
That's old data from when the product launched and they first tested it, I believe, using an old BIOS, GPU/chipset drivers etc. Look at the list of games they tested; it's from their launch testing of the feature.

Let's take the worst example, Apex: a 10% performance regression. Now look at how it runs on a newer BIOS and drivers.

No -10% performance drop anymore.
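
For anyone wanting to run that kind of before/after comparison themselves, here's a rough sketch of how you could quantify the on-vs-off delta from your own runs. The game names and FPS figures are placeholders rather than measured data, and the couple-of-percent noise threshold is just an assumption:

```python
# Minimal sketch: quantify a SAM on-vs-off delta from benchmark runs.
# Game names and FPS figures are placeholders, and the 2% "noise"
# threshold is an assumption, not anything taken from HUB's testing.

NOISE_THRESHOLD_PCT = 2.0  # treat anything within +/-2% as margin of error

def percent_delta(fps_off: float, fps_on: float) -> float:
    """Percentage change in average FPS with SAM/Resizable BAR enabled."""
    return (fps_on - fps_off) / fps_off * 100.0

def classify(delta_pct: float) -> str:
    if abs(delta_pct) <= NOISE_THRESHOLD_PCT:
        return "within margin of error"
    return "gain" if delta_pct > 0 else "regression"

# Placeholder numbers purely to show the shape of the comparison.
runs = {
    "Apex Legends": (220.0, 221.0),
    "Forza Horizon 5": (95.0, 118.0),
}

for game, (fps_off, fps_on) in runs.items():
    d = percent_delta(fps_off, fps_on)
    print(f"{game}: {d:+.1f}% ({classify(d)})")
```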
 
Oh god I’ve just seen that, only high settings too and that 3070 is choking. :cry:

Elephant in the room, that one, mate. It stands to reason: obvious beseeches to provide facts, and as an added bonus they are from upstanding sources (disclosed on another thread), and here we have the upside-down opinion asking the same questions..

I will drop another nugget from the same review:

..for the RTX 3070 here being that VRAM usage can be a problem, and it was clearly a problem in Far Cry 6. Disabling the HD texture pack would solve the performance-related issues there, but I feel when spending $500, or let's be honest a lot more than $500, on a GPU for that premium experience, disabling high-quality texture packs when available isn't something most gamers will want to do!


..of course there is still that little matter of VRAM, and although 8 gigabytes is still enough for the most part, we are seeing examples where it just isn't enough. That said, the Far Cry 6 results were certainly troubling given I was using the second-highest quality preset, but with HD textures enabled. Under these conditions, the 6700 XT was capable of 60 FPS at 4K while keeping the minimum frame rate above, well above, 50 FPS. The RTX 3070, on the other hand, was miles off playable performance, with constant stuttering below 10 FPS.
 
Hope they put it up to Ultra and test the 3080, Tommy will be vindicated. :p
 
That's old data from when the product launched and they first tested it, I believe, using an old BIOS, GPU/chipset drivers etc. Look at the list of games they tested; where are all the recent games I just mentioned?

Let's take the worst example, Apex: a 10% performance regression. Now look at how it runs on a newer BIOS and drivers.

No -10% performance drop anymore.

That's a 6900 XT though... Is there footage with the 6800? Which is kind of the point of that HUB video again, i.e. it's random depending on what CPU and/or GPU you use.


But weren't you agreeing with hums "opinion" of them being nvidia shills just yesterday? :cry:

I think everyone has acknowledged that 8GB of VRAM is "generally" not enough for "4K"; not only that, but the grunt of a 3070 isn't quite enough for 4K either unless using DLSS/FSR and/or reducing settings, the same way the 6700 doesn't have enough grunt for all the other titles they tested despite having 12GB of VRAM.

That video is quite funny though: as people even on the AMD reddit noted, the 3070 comfortably beats the 6700 in every game and resolution except that one title at 4K with no FSR (which both GPUs would need to use if you want a locked 60, let alone more FPS, but nope, "zOMG, the 3070/8GB VRAM is a fail GPU" :cry:)

Also find it amusing how no other title at 4K shows a similar FPS drop despite there being games that easily use more than 8GB at 4K, such as CP2077..... ;)

Hope they put it up to Ultra and test the 3080, Tommy will be vindicated. :p

They did:


:)
 
Hope they put it up to Ultra and test the 3080, Tommy will be vindicated. :p

This is the point. We, or well I at least, never bothered in the 3070 thread as I assume most didn't buy that card for 4K gaming. However, sites like HUB always bench across the board and it will highlight it. For the 3080 thread, which I was vocal in (as Jensen said it was the flagship and 4K capable), it was dismissed, sorry, debunked by a disreputable source, but it's only a matter of time till that noose drops. ;)
 
That's a 6900 XT though... Is there footage with the 6800? Which is kind of the point of that HUB video again, i.e. it's random depending on what CPU and/or GPU you use.



It’s old data from HUB, that’s why Apex shows -10%. It will make no difference if you use a 6900 XT or a 6800 on the latest BIOS and drivers. They both have the same memory capacity too. That 6800 data is old and taken from their launch data.
 

That might well be true but at the same time, you kind of need some new evidence to debunk that....

Anyone with a 6800 and 5950x to test? :)
 

But you must show me, no.. prove evidence, walls of text - links to many videos grrr.. :cry:
 
Look at the YouTube comments; I requested that video to prove the HUB data was old, and I asked that user to test it in one of his other videos. I can download Apex and prove it's the same for the 6800 XT and 6900 XT as I have those to hand. No more 10% performance regression in Apex.
 
This is the point. We, or well I at least, never bothered in the 3070 thread as I assume most didn't buy that card for 4K gaming. However, sites like HUB always bench across the board and it will highlight it. For the 3080 thread, which I was vocal in (as Jensen said it was the flagship and 4K capable), it was dismissed, sorry, debunked by a disreputable source, but it's only a matter of time till that noose drops. ;)

A 2080 Ti was also said to have been a flagship 4K-capable card on release..... Would you still classify it as a flagship 4K-capable card like you would a 3090?

Do you classify a 6800 XT as a 4K-capable card? If so, why not a 3080, based on HUB's latest video, since you now seem to hold them as a "reputable" source....

[Image: benchmark chart from HUB's latest video]


Also, define what you mean by "only a matter of time until that noose drops"? Until said card is no longer capable of 4K60? If so, we have that already: it's called Dying Light 2, and as shown, even the 3090 is not 4K60-capable there.....

But you must show me, no.. prove evidence, walls of text - links to many videos grrr.. :cry:

Still nothing of value to add to any discussion and dodging all kinds of questions I see.... How embarrassing :)

You do realise that it is Matt making this claim, with zero evidence to prove otherwise? Like I said, it might very well be true, but at the same time, given the whole point of the HUB video was how "random" the results are, how do you and Matt know that for certain? Hence why I am asking for "proof"....

Again, post something of value with something to back up your nonsense and people will take you more seriously.

Look at the YouTube comments, I requested that video to prove the HUB data was old and I asked that user to test it in one of his other videos. I can download Apex and prove it’s the same for 6800 XT and 6900 XT as I have those to hand.

I would be interested in seeing that if you have the time, and as I said, it is possibly true, but again, there are several other titles there all showing a decrease in performance, which once again..... ties in with the whole point of that HUB video: it's "random" and not a guaranteed uplift in performance across "every" game on "every" kind of setup.

It’s like some folk just don’t want to see HUB enable SAM as the overall performance charts will change. :p

I think a lot of people do want to see SAM/rebar enabled (me being one) as it can boost perf. substantially but at the same time, HUB have proved why it is an enigma.
 
you may be happy, but the average user can't be expected to do their own benchmarks, jumping in and out of the BIOS trying to figure out if they should keep it on or off for every game they may play

Just wanted to point out that SAM can be toggled from the Radeon control panel as long as it's been enabled in the BIOS; that was added to the drivers several months ago now.
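
If you want to sanity-check that it has actually taken effect rather than just trusting the toggle, one rough way (on Linux, at least) is to look at the GPU's BAR sizes: with Resizable BAR active, the card's big prefetchable memory region should be close to its VRAM size rather than the traditional 256MB window. A hedged sketch, assuming lspci is installed and the card shows up as a VGA/Display controller:

```python
# Rough check for Resizable BAR on Linux: with it active, the GPU's biggest
# prefetchable BAR is close to the card's VRAM size (e.g. 16G on a 6800 XT)
# rather than the traditional 256M window. Assumes `lspci` is installed;
# exact output formatting can vary between pciutils versions.
import re
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device = None
for line in out.splitlines():
    if line and not line[0].isspace():          # new PCI device header
        device = line.strip()
    elif device and ("VGA" in device or "Display" in device):
        m = re.search(r", prefetchable\).*\[size=(\d+[KMG])\]", line)
        if m:
            print(f"{device}\n  prefetchable BAR: {m.group(1)}")

# A 256M (or smaller) prefetchable BAR means Resizable BAR is off; one that
# matches the card's VRAM (8G/12G/16G) means it is on.
```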
 
@Nexus18 I can debunk World War Z and Battlefield V too and show there's no regression there with SAM on, unlike what's shown in their charts. If you want more proof the data is old, here's the video it originally came from. I don't have the other games, but they are only -1% anyway, so easily within margin of error. Still no logical reason to disable SAM with Ryzen 5000 and RDNA2. :)
 

I'm not disputing the age/source of the data :) Only enquiring as to how you/we know for "certain" that, using the same combo of hardware, SAM now provides a benefit in those games as of right now. Again, like I said, it is possible, especially since I stated this in response to Pete above:

It's called optimisation.

VRAM, CPU, GPU, RAM, storage drive etc. all work together, hence why driver updates of any kind, be it GPU, chipset etc., or even just a game patch, can improve performance. I expect there is a lot more to it than simply adding "resize_bar=1" or "resize_bar=0" to turn it on and off...

This image demonstrates how they interlink quite well, hence again why having access/control over both ends could allow one side, i.e. an all-AMD-powered system, to see more of a benefit:

[Image: diagram showing how the CPU, GPU, VRAM, system RAM and storage interlink]


The PCIe interface and its drivers also play a part in it too; some have said PCIe 4.0 sees more of a benefit than PCIe 3.0....
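
For a bit of context on that last point, the raw link bandwidth roughly doubles each generation. A quick back-of-the-envelope calculation of the theoretical per-direction maximums (real-world transfers will see less):

```python
# Theoretical per-direction bandwidth of an x16 link: transfer rate per lane
# times the 128b/130b encoding efficiency (used by PCIe 3.0 and 4.0),
# times 16 lanes, converted from gigatransfers/s to gigabytes/s.
LANES = 16
ENCODING = 128 / 130

for gen, gt_per_s in {"PCIe 3.0": 8, "PCIe 4.0": 16}.items():
    gb_per_s = gt_per_s * ENCODING * LANES / 8   # 8 bits per byte
    print(f"{gen} x16: ~{gb_per_s:.1f} GB/s per direction")

# Prints roughly 15.8 GB/s for PCIe 3.0 and 31.5 GB/s for PCIe 4.0.
```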

I suspect a large part of SAM/Resizable BAR seeing more of a benefit for AMD than for Nvidia also comes down to the games being optimised for consoles.

It is certainly something that I would like to see one of the companies provide a more in-depth look at, especially after that recent HUB video on it: why do some games see no difference at all, some see a decrease in performance, and some see a huge gain, regardless of whether it's an Intel, AMD or Nvidia system.....

Just want some proof, that's all ;)
 
I'm really only saying it won't provide a negative, i.e. a -10% performance deficit. That's why I maintain there's no logical reason to disable it. It helps, or does nothing. It does not hinder. I'm not saying there will be huge gains in those titles; I showed you that Apex example as proof that the -10% is wrong. Apex sees no gain, but no deficit either now.
 

None of this is a reason to ignore Resizable BAR. The fact that it's "inconsistent across many games" is the reason HUB are giving to continue to ignore it, and it's a cynical and contrived excuse to exclude something that benefits AMD more than Nvidia. Even his audience know it; they have been calling him out on it, which is why he made two polls on it, and despite over 70% saying yes to Resizable BAR he still won't do it.

It's free performance, sometimes as much as a whole step up in GPU tier.
 

It's still either on or off.

Yes, there will be driver optimisations, but there is no way Intel/AMD can make it better on their respective platform and hinder Nvidia/a competitor without serious repercussions. Apart from that, they would just be shooting themselves in the foot by doing it.
 
Each vendor is responsible for their own optimisations. No one is going to do any additional work for another vendor.
 