• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

*** The AMD RDNA 4 Rumour Mill ***

@terley

I already posted these. If you believe these benchmarks, then I have a bridge to sell you.
Got proof that the benchmarks you're quoting without a source are running the same settings as the one reported by HKEPC?

Because I've done testing myself, and the 7900 XT is in no way only 3 fps higher than the 6800 XT you're quoting.


[attached screenshot: 20250213164610-1.jpg]


In fact it can only get close to that on low settings..

Btw, the 6800 XT in that link you edited into your previous reply... was running high, not ultra.

[attached screenshot: Screenshot-2025-02-16-002811.png]
 
Look, running a game at 1080p at hundreds of fps (with FSR and frame gen to boot) when it is CPU bound is not a GPU benchmark. That is why it is so close to the RTX 4090 whilst performing only a bit above the 6800 XT. AMD also benefits in CPU-bound scenarios due to having what is now widely accepted to be a more efficient driver when it comes to using CPU resources.


I don't understand why this kind of thing needs to be explained on an enthusiasts forum.
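To spell it out anyway, here's a toy sketch (every number below is invented purely for illustration): when the CPU side can only deliver so many frames per second, any GPU fast enough to hit that cap reports roughly the same result, so the score tells you almost nothing about the GPUs themselves.

```python
# Toy illustration only: invented throughput numbers, not real benchmark data.
def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """The slower of the two stages sets the frame rate you actually see."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limit = 210  # hypothetical cap set by the CPU-limited part of the pipeline at 1080p
for name, gpu_limit in [("RTX 4090", 400), ("9070 XT", 260), ("6800 XT", 195)]:
    print(f"{name}: {effective_fps(cpu_limit, gpu_limit):.0f} fps")
# The 9070 XT "ties" the 4090 and sits only a little above the 6800 XT,
# which is exactly the pattern you'd expect from a CPU-bound run.
```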
 
MSRP means nothing these days; Nvidia has cards with an MSRP of $999 retailing for $1500, and cards with a $2000 MSRP retailing for $3500.

There's just no logic here; MSRP is meaningless.
What I mean is that if AMD announce the MSRP as (say) $400 or $1000, then AIBs won't be $850, will they.

MSRP is the lowest possible sale price and it's the baseline for all other prices, so very meaningful. These leaked prices are meaningless.
 
Look, running a game at 1080p at hundreds of fps (with frame gen to boot) when it is CPU bound is not a GPU benchmark. That is why it is so close to the RTX 4090 whilst performing only a bit above the 6800 XT. AMD also benefits in CPU-bound scenarios due to having what is now widely accepted to be a more efficient driver when it comes to using CPU resources.


I don't understand why this kind of thing needs to be explained on an enthusiasts forum.
It doesn't.. and I explained the exact same thing a while back in this thread..

 
The HKEPC benchmark for the 9070 XT was at high, not ultra. Used both FSR and FG.


I bet your 7900 XT beats it now.
 
The HKEPC benchmark for the 9070 XT was at high, not ultra.


I bet your 7900 XT beats it now.
No, it wasn't..

高 is High
极高 is Extremely High (Ultra)

[attached screenshot: Untitled.png]
 
You can literally see for yourself by going into the benchmark and changing it to ultra settings, then changing the language to Chinese. The characters match.
 
If you read the article, it is only 4-8% slower than the 4090, but also only 14% faster than the 6800 XT.

The 6800 XT must be a beast. Should buy that.

All this discussion about this terrible benchmark at terrible settings. It's so poorly optimised that it makes old AMD GPUs look not too far off a 4090... perhaps the benchmark tool is just biased towards AMD and performs better on AMD GPUs, like some other games?

Rumours of $600+ 9070 pricing a few pages back, not sure what to believe on those. If they're right... then I should have bought a £600 7900 XT (while they were available) for more VRAM and similar performance. Otherwise, it will be a repeat of previous generations, where we wait half a year or a year after release for AMD GPUs to come down to a price where folks actually want to buy them. Or this could be the retailers feeling greedy after getting away with silly Nvidia pricing, thinking they can do the same for AMD GPUs (or it could even be tariff-related).

But yeah, a 9070 non-XT for way more than £500... not worth it when we've seen how cheap one could get 7900XTs for recently.
 
That Monster Hunter benchmark is so bogus. It's basically a CPU benchmark, with a 285k at that. You'll get very little info by comparing other cards, unless you're using a 285k (lol).
 
That Monster Hunter benchmark is so bogus. It's basically a CPU benchmark, with a 285k at that. You'll get very little info by comparing other cards, unless you're using a 285k (lol).

Already done; the 9070 XT scores 10k more than a 7800 XT at what we assume are the same settings. No info on how that 285K was set up in the leak though, so I set mine up at Intel baselines.
 
That Monster Hunter benchmark is so bogus. It's basically a CPU benchmark, with a 285k at that. You'll get very little info by comparing other cards, unless you're using a 285k (lol).
You would think so, but no. Running PresentMon and observing GPU Busy behaviour with a 4090, I am almost always GPU bound in that benchmark, not CPU bound. It's just very badly optimised code that hogs the GPU for no good visible reason, it seems.
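For anyone who wants to repeat that check, here's a minimal sketch of the same idea: capture a run with PresentMon to CSV, then count the frames where GPU busy time is close to the whole frame time. The column names (FrameTime, GPUBusy) and the 90% threshold are assumptions on my part; PresentMon versions label these differently, so adjust them to whatever your capture actually contains.

```python
# Minimal sketch: rough GPU-bound check over a PresentMon CSV capture.
# Column names below are assumed and vary by PresentMon version; adjust as needed.
import csv
import sys

def gpu_bound_fraction(path: str, threshold: float = 0.90) -> float:
    """Fraction of frames where the GPU was busy for at least `threshold`
    of the total frame time (a crude GPU-bound heuristic)."""
    gpu_bound = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                frame_ms = float(row["FrameTime"])  # assumed column name
                busy_ms = float(row["GPUBusy"])     # assumed column name
            except (KeyError, ValueError):
                continue  # skip rows without usable timing data
            if frame_ms > 0:
                total += 1
                gpu_bound += busy_ms / frame_ms >= threshold
    return gpu_bound / total if total else 0.0

if __name__ == "__main__":
    print(f"GPU bound on {gpu_bound_fraction(sys.argv[1]):.0%} of frames")
```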
 
All this discussion about this terrible benchmark at terrible settings. It's so poorly optimised that it makes old AMD GPUs look not too far off a 4090... perhaps the benchmark tool is just biased towards AMD and performs better on AMD GPUs, like some other games?

Rumours of $600+ 9070 pricing a few pages back, not sure what to believe on those. If they're right... then I should have bought a £600 7900 XT (while they were available) for more VRAM and similar performance. Otherwise, it will be a repeat of previous generations, where we wait half a year or a year after release for AMD GPUs to come down to a price where folks actually want to buy them. Or this could be the retailers feeling greedy after getting away with silly Nvidia pricing, thinking they can do the same for AMD GPUs (or it could even be tariff-related).

But yeah, a 9070 non-XT for way more than £500... not worth it when we've seen how cheap one could get 7900XTs for recently.
I just played Indy on Supreme/max settings at 1440p and get a respectable 70-90 fps. The more I think about it, there is no reason for me to upgrade.

BUT if the 9070 XT is a crazy £599, then that would tempt me for FSR4.
 
Affordable 4K gaming cards incoming.

Nvidia's 5000 series isn't gonna be in stores for months.

Time to go AMD Radeon.
A 9070 XT with 90 fps at 4K + AMD FG for Black Myth should be enough for most, but the problem is that this game will be in almost all of the benchmark reviews, yet it's not on the supported FSR 4.0 list.

Many top games are on the list, but I hope that specific example is not used; Nvidia has form in picking one game and getting reviewers to shill it to death in the launch reviews.

Control (2019), Cyberpunk (2020), Black Myth (2024/5).
 
A 9070 XT with 90 fps at 4K + AMD FG for Black Myth should be enough for most, but the problem is that this game will be in almost all of the benchmark reviews, yet it's not on the supported FSR 4.0 list.

Many top games are on the list, but I hope that specific example is not used; Nvidia has form in picking one game and getting reviewers to shill it to death in the launch reviews.

Control (2019), Cyberpunk (2020), Black Myth (2024/5).
I played Black Myth on a mate's 4090. Geez, that game is so demanding. He's getting 45 fps at native 4K with cinematic and RT max settings. Even with frame gen and DLSS Quality it barely hits 60 fps.

lol :D we had to drop settings to Ultra/Very High to get 90 fps.
 
I played Black Myth on a mate's 4090. Geez, that game is so demanding. He's getting 45 fps at native 4K with cinematic and RT max settings. Even with frame gen and DLSS Quality it barely hits 60 fps.

lol :D we had to drop settings to Ultra/Very High to get 90 fps.
Path tracing on its own knocks off about 20-30 fps.
 
Tin foil hat time: maybe these leaked prices are testing the waters. Whether intentional or not, you'd hope someone at AMD is gauging the community's reaction to them.

Whether that forms part of the decision on price, we'll only know when they announce them.
Could be. There are so many factors at play. They could probably sell the big-boy Red Devil at $850 for about a month regardless of supply, as in reality, against a £1k 5070 Ti that you can only buy via F5-mashing, it looks a good deal, certainly if other AIBs are around $750.

But when the dust settles in 6 months' time, it's pretty bad and does very little for the GPU market. Everything they've done up to this point has been pretty sound, so you would hope they price sensibly and not just push out a product with a less terrible price than the competition for the short term.

$599 for the reference XT would cause a massive stir. Of course some retailers will probably still sell AIBs for $800, but it'll age like (ahem) fine wine when the market settles.
 