AMD fine wine

Just recently got an AMD card in April and I'm really enjoying the performance: a massive upgrade over my 60 Ti. Managed to OC it and get a better Time Spy score than a stock 4080 for half the price, so happy with that!

Wanted to gauge thoughts on whether driver updates (aka fine wine) might get another 10-15% out of the 7900 XT to beat the 4080 at raster?
Well, Nvidia haven't fixed the CPU overhead in a decade, and no one really buys the 4080 anyway.

Enjoy the upgrade
 
That’s true. 4080 will cost less in the long run with leccy.

Are you Martin Lewis?

Leccy has dropped twice already this year.

Energy companies thought they’d see how hard they could rinse it and it backfired. Instead of people using lots and paying out the backside they bought blankets.

As such there is now too much of it.

So you can stop worrying about it now.
 
Even now, with a bit of overclocking you can gain 6-7% performance at the cost of extra power draw, almost hitting the stock rasterisation performance of a 4080.

Hoping fine wine brings it to match the 4080 without overclocking.
I should add the 5700 XT went from ~RTX 2070 equivalent to 2070S-2080 territory in many games, so was in the overtaking lane against its current gen rivals too IIRC, not just the 1080 Ti.

Arguments about power draw come down to amount played and what you're playing. Something like the 7900 XT is powerful enough that, with some games on my 4K TV locked at its max refresh (60 Hz), the 7900 XT is drawing ~160W. My old 5700 XT couldn't lock that FPS, but would be drawing its max of 225W attempting to do so.

The main examples of power efficiency making a real difference in cost come from Nvidia graphs: something like, at 50p per kWh and 30 hours of gaming per week (on games that max your card to 100%), you'll be better off by around £40 per year on Nvidia than AMD (I don't recall the exact details and can't find the slide on it). I wish I could play AAA games that much, but sadly I can't.
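
For a rough sense of scale, here's a back-of-the-envelope sketch of that kind of calculation (my illustrative numbers, not Nvidia's actual slide): assuming a ~50 W average draw gap between cards, 50p per kWh, and 30 hours of gaming per week:

```python
# Back-of-the-envelope running-cost gap between two GPUs.
# All inputs are illustrative assumptions, not measured figures.
def annual_leccy_cost_diff(watts_gap: float, price_per_kwh: float,
                           hours_per_week: float) -> float:
    """Extra electricity cost per year (in £) of the hungrier card."""
    kwh_per_year = watts_gap / 1000 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

# 50 W gap at 50p/kWh, 30 h/week -> 78 kWh/year extra
print(round(annual_leccy_cost_diff(50, 0.50, 30), 2))  # → 39.0
```

Tweak the wattage gap to taste; at these figures the difference lands right around that £40-a-year mark.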

I think the 7900 XT is an easy win against the competition in pure rasterization, plus the juicy VRAM; the sticking point is FSR2 not being as good as DLSS, and the giant question mark over FSR3. It's not the even playing field of the 5700 XT vs 1080 Ti, where neither had an advantage in tech. Nvidia has an ace up its sleeve in DLSS that AMD have yet to match. Something similar could be said about ray tracing, but that's a headache to pick apart, especially with so much DLSS3 polluting the ray tracing stats and Nvidia's focus on what I assumed were obsolete / soon-to-be-obsolete resolutions. Plus there's a giant gulf between how RDNA3 handles lighter ray tracing workloads (RE4, Forza Horizon 5, Shadow of the Tomb Raider) versus something like Cyberpunk, which seems designed to crush everything except the 4090 (or limp through with DLSS3 on other 4000 series cards).

Then there's the whole UE5 situation coming up.
 
Then there's the whole UE5 situation coming up.
UE5 is purported to even the field, since RT doesn't have to be hardware path traced. Also, most new AAA games on UE5 require 12 GB upwards at 1440p and higher (more at 4K, I'd expect), benefiting AMD cards...
 
Then there's the whole UE5 situation coming up.

UE5 situation?
 
UE5 situation?
I expect the UE5 situation will fall in AMD's favour (favouring more VRAM and less dependence on hardware ray tracing), but that's speculation on my part. There's no guarantee UE5 will meet the hype it's set itself up for, and we might still have a future filled with competing engines that UE5 will get lost in. But as for now, there's still a distinct possibility UE5 will break all latest gen graphics cards, regardless of VRAM.
 
I expect the UE5 situation will fall in AMD's favour (favouring more VRAM and less dependence on hardware ray tracing), but that's speculation on my part. There's no guarantee UE5 will meet the hype it's set itself up for, and we might still have a future filled with competing engines that UE5 will get lost in. But as for now, there's still a distinct possibility UE5 will break all latest gen graphics cards, regardless of VRAM.

Thanks. What do you mean by break?
 
The fear is that all the best current graphics cards take a large nosedive in performance, and that it'll take a massive refresh of next-gen graphics cards to get on top of future engines like UE5.

Personally I'm hoping the next-gen challenges (including UE5) will revolve around consoles' superior DirectStorage, and that we can make up for it as PC gamers with more RAM and graphics cards with lots of VRAM, and that our NVMe / SATA SSDs will be okay enough, assuming there's plenty of capacity higher up the chain.
 
I expect it'll get that over the lifetime of the card, but the 4080 will likely improve too, so the biggest jump might come from VRAM-limited scenarios like in HUB's 3070 vs 6800 video.

That will be the day, lol. All those recommendations for 6xxx series since it has 16GB over lowly 12GB on the nvidia side :))

That’s true. 4080 will cost less in the long run with leccy.

:cry: How long per day, and over how many years, would you have to play games before the leccy savings from a 4080 outweigh the ~£450 extra it costs?
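
To put some numbers on that question (a sketch with assumed figures, not anyone's actual bill): with a ~100 W draw gap under load at 50p/kWh, the payback period on a ~£450 price premium works out roughly as:

```python
# Years of gaming until electricity savings cover a purchase-price premium.
# All inputs are illustrative assumptions.
def payback_years(price_premium_gbp: float, watts_gap: float,
                  price_per_kwh: float, hours_per_day: float) -> float:
    """Years before the efficient card's leccy savings cover its premium."""
    saving_per_year = watts_gap / 1000 * hours_per_day * 365 * price_per_kwh
    return price_premium_gbp / saving_per_year

# £450 premium, 100 W gap, 50p/kWh, 3 h/day -> roughly 8.2 years
print(round(payback_years(450, 100, 0.50, 3), 1))  # → 8.2
```

At those assumed figures you'd be gaming three hours a day for the better part of a decade before breaking even.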

It's not even about that, but about the heat! Nice if you're not living in a relatively warm area, but otherwise... good luck to you! I don't even want to think how a power-hungry overclocked Intel CPU does alongside an overclocked 7900 XTX!
But it will be nice during winter at least! :))
 
That will be the day, lol. All those recommendations for 6xxx series since it has 16GB over lowly 12GB on the nvidia side :))

It's not even about that, but about the heat! Nice if you're not living in a relatively warm area, but otherwise... good luck to you! I don't even want to think how a power-hungry overclocked Intel CPU does alongside an overclocked 7900 XTX!
But it will be nice during winter at least! :))

But the issue is that power draw has gone mad for most Nvidia and AMD dGPUs.

The RTX 4070 Ti has a higher TDP/board power than a GTX 980 Ti or GTX 1080 Ti, and it is meant to be decent. This is an RTX 3060 class chip too.

It makes me wonder what Navi 32 is going to look like too! :(
 
It's not even about that, but about the heat! Nice if you're not living in a relatively warm area, but otherwise... good luck to you!

And good luck to you too, sir. I've got a 4070 for those low-power, low-heat needs in one of the rooms upstairs; I need it to run as cool as possible, so it doesn't need tweaking, just runs out of the box for high-fps 1080p 240 Hz gaming, and the 12 GB shouldn't run out till the 50 series, surely? :p

However, I swapped out the struggling 3080 for an XTX for native 4K gaming on a 65" QD-OLED, tuned it directly through AMD's driver suite, and it's pulling the same watts my 3080 was with much higher fps, for 20% less than a 16 GB GPU that costs well over a grand.

Different needs for different uses, mate. ;):)
 
Fine wine will come from things like FSR3 (if you count that), and maybe a little from HAGS eventually.

AMD would stand to gain if more games like Warhammer 3 actually adopted DX12 or native Vulkan, so I'm sure that would have an effect on the aggregate results you see on websites like techpowerup.
 
UE5 is purported to even the field, since RT doesn't have to be hardware path traced. Also, most new AAA games on UE5 require 12 GB upwards at 1440p and higher (more at 4K, I'd expect), benefiting AMD cards...

The non-hardware render paths are not great really, especially as hardware solutions are coming for things like wave optics (https://ssteinberg.xyz/2023/03/27/rtplt/), along with some significant performance advances in hardware.
 
Fine wine will come from things like FSR3 (if you count that), and maybe a little from HAGS eventually.

AMD would stand to gain if more games like Warhammer 3 actually adopted DX12 or native Vulkan, so I'm sure that would have an effect on the aggregate results you see on websites like techpowerup.

Does techpowerup update their graphs to reflect improvements in driver performance?
 