
The RX 7900 GRE as of 20/03/2024 - Not such a flawed product?


:D


Says there is a bug with overclocking that AMD are aware of and will address.
 
They do run better RT, you're right.

I would say I don't understand why it should only be 2% faster than the 7800 XT in 90% of rasterization games, or at least that's what I thought after watching the HUB review. Then I watched the GN review, and his results stand in stark contrast to HUB's: in every game Steve Burke tested, the GRE was around 10% faster than the 7800 XT, putting it right in the middle between the 7800 XT and the 7900 XT.
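Just to put numbers on what "right in the middle" means, here's a quick sketch with hypothetical average-FPS values (placeholders, not review data): a ~10% uplift over the 7800 XT lands roughly halfway to a 7900 XT that is ~20% ahead.

```python
# Hypothetical average-FPS figures (not measured data) showing how a
# ~10% uplift over the 7800 XT lands roughly halfway to a 7900 XT
# that is ~20% ahead.
fps = {
    "RX 7800 XT": 100.0,   # baseline (hypothetical)
    "RX 7900 GRE": 110.0,  # ~10% faster, as in the GN-style numbers discussed
    "RX 7900 XT": 120.0,   # ~20% faster (hypothetical)
}

baseline = fps["RX 7800 XT"]
for card, value in fps.items():
    uplift = (value / baseline - 1) * 100
    print(f"{card}: {value:.0f} fps, {uplift:+.0f}% vs 7800 XT")
```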

Yet again I find myself questioning HUB's results when it comes to AMD GPUs. :mad:

I _____ knew it....

HUB, just stick to reviewing the cards they send you. I looked in all the usual places, both UK and US, and I can't find the "reference model" anywhere; like the 7700 XT reference model, I don't think it actually exists in this global release.

It's a different card to the China-only model that you have. It's not that "board partners have given them a higher TDP"; it has a higher TDP than the China-only model, and looking at the reviews it seems they ALL DO.

You _____ up. You assumed the China model and the global release version are the same card, so you used your China model data and called it good; everyone else used the cards they were sent, and that made your results the outlier.


 
I _____ knew it....

HUB, just stick to reviewing the cards they send you. I looked in all the usual places, both UK and US, and I can't find the "reference model" anywhere; like the 7700 XT reference model, I don't think it actually exists in this global release.

The reference model is available on a UK site
 
So, the RX 7900 GRE Nitro+ compares favourably to the RTX 4070 Super?



That's something, at least. Good cost per frame on the GRE models as well.
 
So, the RX 7900 GRE Nitro+ compares favourably to the RTX 4070 Super?

That's something, at least.
TPU has the average gaming power consumption of the Nitro at 296 W and the 4070 Super at 218 W, so it is annoying HUB didn't include those numbers (unless I missed it).

The video playback number is weirdly high too at 64 W, though the Pulse got the same number.
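For a rough sense of what those draws mean for efficiency, here's a quick sketch: the gaming power figures are the TPU numbers above, while the FPS and price values are hypothetical placeholders just to show the fps-per-watt and cost-per-frame arithmetic.

```python
# Power draw (gaming average) is taken from the TPU figures quoted above;
# the FPS and price values are hypothetical placeholders.
cards = {
    "RX 7900 GRE Nitro+": {"watts": 296, "fps": 100.0, "price": 600.0},
    "RTX 4070 Super":     {"watts": 218, "fps": 100.0, "price": 600.0},
}

for name, c in cards.items():
    fps_per_watt = c["fps"] / c["watts"]
    cost_per_frame = c["price"] / c["fps"]
    print(f"{name}: {fps_per_watt:.2f} fps per watt, "
          f"{cost_per_frame:.2f} currency units per average frame")
```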
 
Yup, the RTX 4070 S is pretty power efficient. At least AMD got rid of those huge power spikes on RDNA3 (vs RDNA2) though.

I'd lean towards the GRE Nitro now if I was buying, because of the greater amount of VRAM.

Looks like they might need to re-review the GRE with fixed drivers, to show correct OC results :cry:

Maybe the current BIOS is locked down, as the GRE was intended for prebuilt systems originally?

AMD y u no let people overclock the memory on the GRE more than a little?
 
And it looks like AMD will fix overclocking on 7900 GRE:
 
And it looks like AMD will fix overclocking on 7900 GRE:

That would be great news. Does that mean the memory will go higher? Because if it does, it will push closer to the 7900 XT.
 
And it looks like AMD will fix overclocking on 7900 GRE:
I call BS on that. The card has been out since last July and only now they tell us the artificial OC limitation is a bug?!?
 
TPU has the average gaming power consumption of the Nitro at 296 W and the 4070 Super at 218 W, so it is annoying HUB didn't include those numbers (unless I missed it).

The video playback number is weirdly high too at 64 W, though the Pulse got the same number.
Can I be honest: every time I have gone back and forth between Nvidia and AMD, I have always noticed, far more than subtly, that video playback on AMD is far punchier in colours and saturation, and even in how the whites look.

Nvidia has always had a tinge to the whites, and that is without touching any of the monitor settings to adjust anything.

In short, I am happy to take a power hit if it means the image looks better.

But also the question is, why do people think video playback is nothing? Video playback processing does require power, especially when playing back high-bitrate videos.

4K Blu-rays really spin up whichever player you are using in contrast to Full HD ones; the PS5 really spins up when using 4K Blu-rays compared to non-4K ones.
 
Can I be honest: every time I have gone back and forth between Nvidia and AMD, I have always noticed, far more than subtly, that video playback on AMD is far punchier in colours and saturation, and even in how the whites look.

Nvidia has always had a tinge to the whites, and that is without touching any of the monitor settings to adjust anything.

In short, I am happy to take a power hit if it means the image looks better.

But also the question is, why do people think video playback is nothing? Video playback processing does require power, especially when playing back high-bitrate videos.

4K Blu-rays really spin up whichever player you are using in contrast to Full HD ones; the PS5 really spins up when using 4K Blu-rays compared to non-4K ones.

Placebo, or there is a setting you haven't adjusted somewhere on either the AMD or Nvidia side; this has been covered by a few people with the hardware required to measure whether there would be any differences:


Tim here from Hardware Unboxed.
Part of my monitor review workflow involves testing monitors on both Nvidia and AMD GPUs. Two separate test systems, both running default settings in their respective control panels.
Currently the Nvidia system has an RTX 3090 and the AMD system an RX 5700 XT
I've never spotted a difference between them in color reproduction. I've measured it using my tools in the desktop, web browsers, games. Taken side by side photos and captures. Never spotted any differences. They produce identical images.
Because this comes up every so often I did look into it to see if it was worth making a video on but the conclusion was there was no difference so it wasn't worth making a video. Since I can't reproduce it I have to assume it's some sort of configuration issue.
EDIT: Back in the day I used to see this occasionally when Nvidia would accidentally default to the wrong RGB range (limited instead of full), but in this particular case apparently that is not the problem, so I don't really know how the difference is happening here. And those limited/full range issues were a while ago; it would have to be several years now.
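To illustrate the limited-vs-full range mismatch Tim mentions, here's a small sketch of the standard 16-235 mapping (illustrative values only): if one end of the chain assumes the wrong range, blacks get lifted and whites get dulled, which is exactly the kind of "tinge" people report.

```python
# Sketch of the limited vs full RGB range mismatch described above.
# Limited ("TV") range maps black/white to 16/235 instead of 0/255;
# if one end of the chain assumes the wrong range, blacks get lifted
# and whites get dulled, which reads as a washed-out / tinged image.

def full_to_limited(value: int) -> int:
    """Map a full-range 0-255 code value into limited range 16-235."""
    return round(16 + value * (235 - 16) / 255)

for label, value in [("black", 0), ("mid grey", 128), ("white", 255)]:
    print(f"{label}: full-range {value} -> limited-range {full_to_limited(value)}")
```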
 
Placebo, or there is a setting you haven't adjusted somewhere on either the AMD or Nvidia side; this has been covered by a few people with the hardware required to measure whether there would be any differences:

Ah there we go, I wanted to say in before Nexus18.

Answer this: different levels of image processing and different desired outputs require different amounts of power. If the image is processed differently, then the output will look different, unless you think this isn't the case.
 
Ah there we go, I wanted to say in before Nexus18.

Answer this: different levels of image processing and different desired outputs require different amounts of power. If the image is processed differently, then the output will look different, unless you think this isn't the case.

Post some empirical evidence rather than anecdotal evidence, and then your point would be more valid.

Again, you should know that for most of these things there are ways to achieve the same or a better outcome with less power by being more efficient; not everything comes down to just hardware and power consumption....
 