The RX 7900 GRE as of 20/03/2024 - Not such a flawed product?

In short, I am happy to take a power hit if it means the image looks better.
I suspect it is a driver bug: the 7900 XTX recorded 88 watts when it was released, but in TPU's latest review that figure has halved. The 4070 Super will still be lower, I expect, but nowhere near the gap there is now.
 
If it's higher than expected then yes, it could be a bug. I think most people just went "OMG, AMD is higher than Nvidia", but really they ignore the image processing pipeline. Across the number of AMD and Nvidia cards I've owned, AMD has always looked nicer out of the box, and I would attribute that to higher power usage anyway.
 
I have about as much knowledge of video playback as I do about building a skyscraper, but my guess is that video playback power consumption behaves like idle and multi-monitor power consumption, i.e. it comes down to how much of the card you can turn off without crashing. I think cards with smaller dies, narrower buses and less memory capacity find it easier to power down than the big beasts.

I don't think the video playback hardware is substantially different between a low-power card and a high-power card within the same generation, and I would be surprised if any quality or visual differences were accounted for by higher power consumption, because cards within the same class tend to have similar power consumption. The exceptions are the 7900 GRE/XT/XTX, which seem to have had bugged drivers for a while now, and the Intel Arc cards, which don't seem to have the sophisticated power-down technology that AMD/Nvidia do without making a lot of BIOS changes.
 
I don't get the pricing on this card. OCUK has three cards between £530 and £550, which is a decent price for the performance. The problem is that the 7800 XT now appears to be terribly overpriced and needs a significant price cut. That brings other problems, in that the whole AMD line-up under the 7800 XT would be affected and also need large price cuts, so they have got themselves into a right pricing mess with the introduction of this new card. All GPUs are terribly overpriced these days, so for the money AMD has turned out a cracking card. It's a shame that they can't seem to lower the power draw though.
 
I have about as much knowledge of video playback as I do about building a skyscraper, but my guess is that video playback power consumption behaves like idle and multi-monitor power consumption, i.e. it comes down to how much of the card you can turn off without crashing. I think cards with smaller dies, narrower buses and less memory capacity find it easier to power down than the big beasts.

I don't think the video playback hardware is substantially different between a low-power card and a high-power card within the same generation, and I would be surprised if any quality or visual differences were accounted for by higher power consumption, because cards within the same class tend to have similar power consumption. The exceptions are the 7900 GRE/XT/XTX, which seem to have had bugged drivers for a while now, and the Intel Arc cards, which don't seem to have the sophisticated power-down technology that AMD/Nvidia do without making a lot of BIOS changes.
Playback does use power beyond idle, since the card is actively doing something, not to mention that each vendor's encode and decode engines differ in how they take an incoming image and produce a usable output.

Also, I believe all vendors have moved their media engines off the GPU itself and onto self-contained units, so between, say, a 7600 and a 7900 XT, media playback should be roughly the same since they should be using the same media engine (Nvidia and Intel as well).

But there should be a difference between vendors when it comes to power consumption, because each will have designed and programmed its unit to interpret colour, whites and blacks, and the 0-255 dynamic range very differently.
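
To illustrate the sort of thing I mean with the 0-255 range, here is a rough sketch of the limited-to-full range expansion step for 8-bit luma. This is only the textbook studio-swing formula; the actual maths each vendor uses (chroma handling, 10-bit paths, dithering) isn't public, so treat it as illustrative:

```python
def limited_to_full_8bit(y: int) -> int:
    """Expand an 8-bit limited-range ("TV", 16-235) luma value to full range (0-255).

    Standard studio-swing expansion only; real driver pipelines also handle
    chroma, 10-bit paths and dithering, which are left out here.
    """
    full = round((y - 16) * 255 / 219)
    return max(0, min(255, full))  # clamp "blacker than black" / "whiter than white" codes

# If one pipeline treats a source as limited range and another treats it as full,
# the same encoded value ends up at a different output level:
print(limited_to_full_8bit(16))   # 0   (reference black)
print(limited_to_full_8bit(235))  # 255 (reference white)
print(limited_to_full_8bit(128))  # 130 (mid grey shifts)
```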

There is also a reason why cameras all look different in their photos (when shooting RAW), and if you ever see someone who owns cards from two or more vendors making an image look the same, it's because they tweaked it to do so intentionally after the fact.

Another odd test you can try: play a 1080p or 4K video on your phone and see how much it warms up; if it's getting hot, it's processing something.
 
Playback does use power beyond idle, since the card is actively doing something, not to mention that each vendor's encode and decode engines differ in how they take an incoming image and produce a usable output.
Yeah, I don't mean it's similar to idle in the sense that nothing is happening, just that playback isn't using the whole GPU and doesn't need the full GPU/memory clocks. I believe that's how AMD have been able to slash power consumption for the 7900 cards so drastically between drivers: they can optimise them to reach the optimal power-down state while keeping enough of the GPU alive for playback purposes.

If they apply the same optimisations to the GRE, I'd guess the 64 watts will fall to 30-odd, unless there's something substantially different between the GRE and the other cards (I don't think so, but the GRE uses slower memory and this might have an impact, and there's also a difference between memory brands).

Also, I believe all vendors have moved their media engines off the GPU itself and onto self-contained units, so between, say, a 7600 and a 7900 XT, media playback should be roughly the same since they should be using the same media engine (Nvidia and Intel as well).
Yeah, they have, though there's still an interaction with the rest of the card to some degree.
 
I know AMD previously had bugs with their clocks going up and down for no reason, but perhaps it was badly tuned memory clock behaviour, or the card thinking a game was running because of the incoming workload.

I don't disagree that the rest of the card makes a difference, but it won't be by much in all honesty. Either way, I believe AMD's power usage for media playback is higher than Nvidia's, and I believe the reason for that is what I've said so far.

I mean, when I had my 3070 Ti, running a high bit rate HEVC file would set the decoder alight.
 
Either way, I believe AMD's power usage for media playback is higher than Nvidia's, and I believe the reason for that is what I've said so far.
We're going a bit off topic from the GRE here, but I don't believe that's the case, because if you look at TPU's results more generally, AMD cards have similar or lower video playback power usage than competing Nvidia cards, for example:

RX 6600: 12 watts
RTX 2060: 14 watts
RTX 3050: 11 watts

RX 6600 XT: 10 watts
RTX 3060: 15 watts

RX 6700 XT: 15 watts
RTX 3060 Ti: 17 watts

RTX 3090 Ti: 41 watts
RX 6900 XT: 42 watts

It isn't until the big beasts, and especially the higher-end RDNA 3 cards, that AMD have screwed up and are beaten by Ada, and that has been the case since the day 1 reviews, though they have definitely improved since then.

The GRE appears to follow that pattern of being messed up in day 1 reviews.
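
If it helps to see the gaps side by side, here are the figures quoted above paired up (just the numbers as posted from TPU's reviews, nothing re-measured):

```python
# The playback wattages quoted above (TPU figures as posted, not re-measured),
# paired AMD vs Nvidia so the deltas are easier to eyeball.
playback_watts = [
    ("RX 6600",    12, "RTX 2060",    14),
    ("RX 6600",    12, "RTX 3050",    11),
    ("RX 6600 XT", 10, "RTX 3060",    15),
    ("RX 6700 XT", 15, "RTX 3060 Ti", 17),
    ("RX 6900 XT", 42, "RTX 3090 Ti", 41),
]

for amd, amd_w, nv, nv_w in playback_watts:
    print(f"{amd}: {amd_w} W  vs  {nv}: {nv_w} W  (AMD {amd_w - nv_w:+d} W)")
```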
 
High power usage for multi-monitor and playback has, I think, a lot to do with VRAM clocks. That's why some reviews list the clocks.
TPU now does; I'm sure I first saw this elsewhere, though.
So yes, video playback runs the full VRAM clocks.

While drivers can hopefully fix that, sometimes a manual profile might work too.

Since all the new GPUs now have so much cache, I have wondered why the cache cannot be used for video playback, leaving the main VRAM idling at 10 MHz or so.
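
As a rough back-of-envelope for why the cache alone probably isn't enough (the surface format and reference-frame count below are my assumptions, not anything official):

```python
# Back-of-envelope: can 4K video playback fit entirely in Infinity Cache?
# Assumptions (mine): decode surfaces in P010 (10-bit 4:2:0 stored as 16-bit
# samples) and ~6 reference frames kept by the decoder; the GRE has 64 MB of
# Infinity Cache.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_SAMPLE = 2       # P010 keeps 10-bit values in 16-bit containers
SAMPLES_PER_PIXEL = 1.5    # 4:2:0 chroma subsampling
REF_FRAMES = 6             # stream dependent; HEVC/AV1 often keep several

frame_mb = WIDTH * HEIGHT * SAMPLES_PER_PIXEL * BYTES_PER_SAMPLE / 1e6
working_set_mb = frame_mb * (REF_FRAMES + 1)  # references + the frame being written

print(f"one 4K P010 surface: ~{frame_mb:.0f} MB")       # ~25 MB
print(f"decode working set:  ~{working_set_mb:.0f} MB") # ~174 MB, vs 64 MB of cache
```

So even a single stream's reference frames would overflow the cache, which is presumably one reason the VRAM can't simply idle during playback.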
 
The media engines are different in each generation, granted; the ones in the 7000 series are different to those in the 6000 series, but those big GPUs just look like outliers more than anything.

The RX 6600 and 6700 XT are different beasts in performance, but their wattage is about the same, if anything within the margin of error for video playback.

But on the topic of the GRE, yeah, the drivers do seem funky. It's rather obvious that AMD never wanted this segment covered, as it meant price corrections would have to take place.

I do dislike that AMD has always had a GPU at this point in the stack but never intended a wider audience to own it.
 
50 watts while watching a video, 35 watts idle. The idle seems a little high to me, but meh... it's not something I'm concerned about; that idle cost for me is about £2 a year and my computer is on all day.
 
This might be relevant: https://www.tomshardware.com/pc-com...king-limit-will-be-removed-in-a-future-driver

Overclocking the 7900 GRE is artificially limited. Allegedly AMD has now said that's due to a bug and that it will be fixed. Absolutely nothing to do with maintaining a larger gap between the 7900 GRE and the much more expensive 7900 XT.

I bought a 7800 XT on a whim a week or two ago. If I'd known that a 7900 GRE would be available at the same price, I would have waited a fortnight and bought that instead. It's not hugely better, but it is better.

Oh well. I've made purchasing decisions that turned out worse than that one. It happens sometimes with PC hardware.
 
You still have a great card; I wouldn't get too hung up on it. Side by side you probably couldn't tell them apart.
 
IMO, AMD has been putting out better products over the last few gens. I say that as someone who had a 3070 (well, a bunch of stuff in my mining rig) and currently has a 4080 and has been happy with it.

AMD is offering better value, which is kind of weird to me. With Nvidia's resources, they should be able to drown any competitor with unattainable performance per dollar. Instead, their new product stack barely beats the old one. What are they doing with their development time and resources? Maybe people are right: maybe they don't pay much attention to the GPU market anymore.
Or they're in such a strong position that they know they can release nonsense and people will buy it. I wish I had bought an XTX, tbh. At the time, I was trying very hard to get playable frame rates in VR in iRacing/AMS2/ACC/AC, couldn't find a 4090 for a month, and the XTX was said to have problems in VR.

Well, the Quest 3 finally offers enough clarity that I don't need to run 1.8x supersampling to see the car in front of me, so hopefully my next purchase will be a lot more discerning and I'll be able to wait 4-5 months for the full picture.
 
£520-550 is something I can buy (£1k+, never), so would this be a good buy? Gaming at 1080p/144 Hz.
Reviews look not bad. I've been out of the loop on buying PC parts until the last 2-3 months; recently upgraded to a 5800X3D & an MSI G271QPF 27-inch.
So I'm looking for a card to go with these two upgrades.

 
Looks good and will last you a fair while, especially considering your recent upgrades; a good solid card for the budget.
 