*** The AMD RDNA 4 Rumour Mill ***

I'd want to keep GPU power consumption to a sensible amount - talking efficiency.

Just… why though? Heat? Noise? Both of those are fair enough concerns.

Random efficiency ratings… sort of meaningless, isn't it, unless they have an impact on your experience of using your PC? As @kieran_read has said, the actual cost is pretty nominal.

I’m willing to learn though. Why does it bother you?
 

Just a general cost per watt / cost per fps.

I expect a flagship card to use more power. Would I buy a 4060 if it used 400W? Nope.
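
If anyone wants to play with those two metrics, here's a minimal sketch; the fps, wattage, and price figures are made-up placeholders, not benchmark results:

```python
# Compare two hypothetical cards on fps-per-watt and price-per-fps.
# All figures below are invented placeholders, not real benchmark data.

def fps_per_watt(avg_fps: float, board_power_watts: float) -> float:
    return avg_fps / board_power_watts

def price_per_fps(price_gbp: float, avg_fps: float) -> float:
    return price_gbp / avg_fps

cards = {
    "Card A": {"avg_fps": 90, "watts": 250, "price_gbp": 550},
    "Card B": {"avg_fps": 100, "watts": 330, "price_gbp": 700},
}

for name, c in cards.items():
    print(f"{name}: {fps_per_watt(c['avg_fps'], c['watts']):.2f} fps/W, "
          f"£{price_per_fps(c['price_gbp'], c['avg_fps']):.2f} per fps")
```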
 
I think it was HUB who investigated something quite similar to this. What happens with Nvidia GPUs when they run out of VRAM is that the texture resolution is much reduced, the textures are visibly blurry in comparison to a GPU with more VRAM, so it doesn't necessarily manifest itself on the performance charts, and it can look like the GPU is using less VRAM, because it is, since it's running at a reduced texture quality. You would actually have to compare the visuals of the game to see it's not just as simple as 'number lower'.
Just looking at the numbers and concluding 'Nvidia is better at managing VRAM', in a sense yes, but the question is the side effects of that and whether it's a reasonable argument versus just having a higher VRAM pool. I would argue it isn't, but Nvidia always finds a way of giving you less and then somehow has its fans die on the hill of advocating that less is better.
It's quite amazing, actually.


Link to HUB?

Also, what I am talking about happens on GPUs with a lot more VRAM, such as a 4090/5090. But I am sure you will ignore that.

You know what else is amazing? DLSS, RT, FrameGen, but AMD always finds a way of giving you less and then somehow has its fans die on the hill of advocating that less is better. It is indeed quite amazing, actually.

Yea. I can do that too :p:cry:
 

RT, I mean RT on the 4070 is no more or less usable than it is on the RX 7800 XT.

No one is saying DLSS isn't good, it is... I wouldn't say that about FrameGen, and I have no doubt AMD will come up with a 4X FrameGen and it will be just as bad.

I don't put a 20%+ monetary premium on DLSS. I would only use upscaling tech as an absolute last resort, and so far I have never needed it.
 
I think it was HUB who investigated something quite similar to this. What happens with Nvidia GPUs when they run out of VRAM is that the texture resolution is much reduced, the textures are visibly blurry in comparison to a GPU with more VRAM, so it doesn't necessarily manifest itself on the performance charts, and it can look like the GPU is using less VRAM, because it is, since it's running at a reduced texture quality. You would actually have to compare the visuals of the game to see it's not just as simple as 'number lower'.
They're not just visibly blurry; they automatically downgrade or just don't load in at all. It's used as a fallback in some engines to prevent slideshow fps when you overshoot your VRAM pool, in games like Hogwarts, Forspoken, Halo, Forza, AFOP, SWO.

TLOU at launch totally dropped parts of the game with blacked-out textures, and this isn't exactly new news; it was highlighted back in 2023, although HUB was one of the first to explain it.

The Avatar devs (pre-release) explained that the Snowdrop engine has a built-in auto reduction of textures when you run settings higher than the available VRAM.

DF highlighted a VRAM fallback in Indy where the top half of the screen outputs lower-quality textures with the bottom half running normal textures.
 
I honestly find it hilarious that people go on about power draw.

Buys a £1000 graphics card but cries about spending an extra £20 a year on electricity. If you can't afford to power the card, maybe you shouldn't be buying cards in that tier in the first place (not directed at you, just a general thing).

https://www.theenergyshop.com/guides/electricity-cost-calculator - you can work it out so easily. 160 watts is £260 a year ASSUMING you spend 100% of the time at full power draw, every hour, every second, every millisecond, at 100% power draw at all times.

That's just not going to happen.

Now let's say someone plays 4 hours a day (28 hours a week; I should point out the average game time per week is 8 hours). That's £46.37 a year, again assuming that in that period of time you are drawing 100% power 100% of the time... which, again, does not happen in a gaming session; it depends on the game, what you're doing, the environment.

A generous assumption is that it costs £20 more a year to run, which is not worth even talking about when you're spending £1000 on a card. If you can't afford £20 on electricity, buy a cheaper card and set your priorities right.

It's just that at the level we are talking about, this shouldn't even be part of the conversation. The cost difference is negligible to non-existent if you are a light gamer and barely noticeable if you are a medium gamer. Even if you are a heavy gamer, you would need to be clocking 8 hours a day to warrant the difference, and EVEN THEN you would have to be someone who keeps the card for at least 4 years or more for the difference in price to add up to a saving.

This is all based on UK electricity costs, and we have some of the highest; go to America or Canada and their costs are even lower, so the difference is even narrower.

I wish people would move away from the focus on power; performance should be the selling point.
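
For anyone who wants to run the same back-of-the-envelope maths with their own numbers, here's a minimal sketch; the 160 W figure, the 4 hours a day, and the 28p/kWh tariff are just placeholder assumptions, so plug in your own:

```python
# Rough annual cost of a GPU's (extra) power draw.
# All inputs are placeholder assumptions - substitute your own tariff and habits.

def annual_cost_gbp(watts: float, hours_per_day: float, pence_per_kwh: float) -> float:
    """Yearly cost of drawing `watts` for `hours_per_day`, at a flat unit rate."""
    kwh_per_year = (watts / 1000) * hours_per_day * 365
    return kwh_per_year * pence_per_kwh / 100

# Example: a card drawing 160 W more than an alternative, gamed on for
# 4 hours a day, at an assumed 28p/kWh tariff.
extra = annual_cost_gbp(watts=160, hours_per_day=4, pence_per_kwh=28)
print(f"Extra electricity cost: ~£{extra:.0f} per year")  # ~£65 with these example inputs
```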


No one here actually cares about power draw; the only time you see someone bring it up is when they are trying to use it to win a fanboy argument, but outside of forum arguments, no one in the real world actually cares how many watts their GPU uses.


The only people who really care about power draw are miners and the person managing their data centre's power bill.
 

I hate this modern day nonsense where reviews only come out the day before launch. The average impatient person will mindlessly buy before checking enough reviews, which is what these companies are banking on, especially Nvidia. I guess the reviewers will have pricing info to help them do their comparisons; it will be interesting to see if reviewers slam these GPUs or are impressed by them.

So I assume that confirms they go on sale on the 6th, and then I am flying out of the country on the 8th. I hope they still have stock when I get back :(

If they remain in stock, that would be a bad indicator that nobody wants them (they aren't worth having) or, far less likely, that AMD prepared enough stock for launch. If they are good, most likely they will sell like hot cakes and the launch supply will be sold out. Even if there's more supply than the 5000 series, it just means they last longer before being sold out. Maybe 5 minutes instead of 1 minute :cry:
 
We need to accept that even if AMD price these cards reasonably well, poor stock flow can be “simulated” by retailers to allow price gouging. So expect large mark-ups and low stock at most, if not all, retailers.

It is the way.
 
RT, I mean RT on the 4070 is no more or less usable than it is on the RX 7800 XT.

No one is saying DLSS isn't good, it is... I wouldn't say that about FrameGen, and I have no doubt AMD will come up with a 4X FrameGen and it will be just as bad.

I don't put a 20%+ monetary premium on DLSS. I would only use upscaling tech as an absolute last resort, and so far I have never needed it.

So no link then :cry:

Nuff said really.

You were loving FSR 3 FrameGen not long ago too. Too funny.

Anyway, I replied to a post. Now we're being steered off topic. Going to be interesting to see how many 7000 series users upgrade to 9000 series soon :D
 
Going to be interesting to see how many 7000 series users upgrade to 9000 series soon
I think if you are on a 7800XT or 7900 series it is a negligible gain (unless you value RT and FSR), unless it is priced very keenly given the 16GB of RAM on the 9070.

6800XT/6900XT, I'm tempted to say the same. If I didn't need a GPU for another machine, and the new cards are priced £600+, I would certainly be waiting for the inevitable price drop, or even skipping completely. In fact I may do so anyway if it all gets a bit scalpy like the recent NV launch and go second hand (a first for me).

Edit - for context, I am currently on a 6900XT.
 
No one here actually cares about power draw; the only time you see someone bring it up is when they are trying to use it to win a fanboy argument, but outside of forum arguments, no one in the real world actually cares how many watts their GPU uses.


The only people who really care about power draw are miners and the person managing their data centre's power bill.
Objectively incorrect.
 
I hate this modern day nonsense where reviews only come out the day before launch.

AMD has learned from past mistakes and waits until the last possible moment to allow for driver and game patches to be available to the reviewers.

Day one performance has historically been bad for AMD. Look at the punch the 7900XTX is giving out now versus release day.

AMD are also very good at drip-feeding little performance nuggets, so we already have a good idea of what you are buying.

There is also a sense with this launch that AMD want to sync up their new CPU and motherboard launch as much as possible, so you can buy an all-AMD system in one purchase. It looks like board partners have been given final samples of the new CPU in the past week.
 
No one here actually cares about power draw; the only time you see someone bring it up is when they are trying to use it to win a fanboy argument, but outside of forum arguments, no one in the real world actually cares how many watts their GPU uses.


The only people who really care about power draw are miners and the person managing their data centre's power bill.

Wouldn't say I don't care. More power means more heat, more noise, and more overall system heat, not just electricity cost. If one card is 150W more, that's noticeable in my opinion.
 
No one here actually cares about power draw; the only time you see someone bring it up is when they are trying to use it to win a fanboy argument, but outside of forum arguments, no one in the real world actually cares how many watts their GPU uses.
Did you just compare the real world to enthusiast forums? Are you feeling OK? :D In the real world hardly anyone cares about anything above xx60-class cards. Power usage is largely irrelevant (as it's generally low) in that class of GPUs.
 
GPU sales data says otherwise
Yes, as GPU sales data shows a huge dominance of xx60-class GPUs, where power usage is very similar and rather low, so it's largely irrelevant. That said, I care a lot about the power usage of my 4090, and plenty of people here do too, for various reasons, which is why we power limit and undervolt it. In winter it's fine; in summer the heat is unbearable, as it's a space heater.
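
To put the space-heater point in rough numbers (essentially all of a GPU's board power ends up as heat in the room), here's a minimal sketch; the 450 W and 320 W figures and the 3-hour session are purely assumed examples, not measurements of any particular card:

```python
# Roughly all GPU board power ends up as heat in the room.
# The wattages and session length below are assumed examples, not measured values.

def session_heat_kwh(board_power_watts: float, hours: float) -> float:
    """Thermal energy (kWh) released into the room over one gaming session."""
    return board_power_watts / 1000 * hours

stock = session_heat_kwh(board_power_watts=450, hours=3)    # hypothetical stock power limit
limited = session_heat_kwh(board_power_watts=320, hours=3)  # hypothetical power-limited / undervolted draw

print(f"Stock:   {stock:.2f} kWh of heat per 3-hour session")
print(f"Limited: {limited:.2f} kWh of heat per 3-hour session")
print(f"Saved:   {stock - limited:.2f} kWh, roughly a 130 W heater switched off for the whole session")
```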
 
Personal view: iGPU and streaming will pretty much kill off this industry, so let's just embrace the 600-watt GPUs as a final stand for our community and way of life. Let's go down with the ship.

9090XTX Toxic - what a card!
 