*** The AMD RDNA 4 Rumour Mill ***

Surely if that were true AMD would have more than 10% marketshare?
Depends on what people put value on, and it seems a lot of people put a lot of value on what I would call fluff (frame gen, RT in its current iteration, upscaling, AI in its current iteration). Now, I'm not saying those feature sets are fluff universally, only that I personally find them non-essential.
 
So instead of spending the same amount of money to get more performance, you choose to get the same performance but save £100?

Why am I finding that hard to believe? Could it be that, outside of the money-is-no-object type of people, I've never heard of someone deciding which GPU to buy based exclusively on performance, and that most people seem to set a budget rather than a minimum required FPS?

I got a little more performance by saving £90, yes.
 
So instead of spending the same amount of money to get more performance, you choose to get the same performance but save £100?

Why am I finding that hard to believe? Could it be that, outside of the money-is-no-object type of people, I've never heard of someone deciding which GPU to buy based exclusively on performance, and that most people seem to set a budget rather than a minimum required FPS?

You have now.

I buy based on my assessment of what's worth the price. I set an upper limit to how much I'll spend, not an amount that I decide I will spend. So sure, if I find a card that I consider to have enough performance and to be worth the price and it's £100 less than my upper limit I'll buy it. Then probably spend that £100 on something else, like a bunch of cheap games I'll probably leave in my libraries and forget I bought.
 
Surely if that were true AMD would have more than 10% marketshare?
They have more than 10%; in some markets it's 50%.
The 9070 XT brings affordable 4K gaming.

People may talk about a 5090, but they won't buy one, as it currently costs 5x more than a 9070 XT (if you can even buy one in the next 6 months), with no real difference in gameplay.
That's if people even think about things like that, as YouTuber reviewers are lazy and don't test such things.
FPS don't matter, as the gameplay stays the same.
 
In a month's time, we'll have been put out of our misery and will know whether this is the generation where GPUs were no longer worth buying, or the generation where AMD finally made their move to offer customers what they want: decent GPUs at decent prices.

5060/Ti rumours still suggest 8GB cards (with maybe more expensive 16GB variants) at $350-450. And I remember the days when £450 was the price of an 80-tier card...

Right now, outside of fanboys defending billion/trillion dollar companies, the general sentiment everywhere seems to be that the current GPU gen is almost as bad as the cryptoboom days, if not worse in many aspects. Even the reviewers/influencers/etc share the hope that AMD has an opportunity to save this generation with well priced products. I just hope they have enough stock if they do.

 
I got a little more performance by saving £90, yes.
So you don't set a performance goal and buy the cheapest thing that achieves it; you set a budget and try to get the most bang for your buck. If you had set a performance target, I would've expected you to say something along the lines of "I got X performance for Y price", not to frame it as getting more performance while saving £90, because then the key metric for you would be getting as close to your performance target as possible.
You have now.

I buy based on my assessment of what's worth the price. I set an upper limit to how much I'll spend, not an amount that I decide I will spend. So sure, if I find a card that I consider to have enough performance and to be worth the price and it's £100 less than my upper limit I'll buy it. Then probably spend that £100 on something else, like a bunch of cheap games I'll probably leave in my libraries and forget I bought.
Not sure what I've done now but whatever.

Setting an upper limit on how much you'll spend is incongruent with "not an amount that I decide I will spend". Like I said, most people seem to set a monetary budget (as in "this is the most I'm willing to spend") rather than an FPS budget (as in "this is the most FPS I'm willing to pay for").

Most people do not set a monetary budget and then buy something with less performance simply because it's cheaper; they buy the thing that has the most performance within what they have budgeted to spend.
 
In a month's time, we'll have been put out of our misery and will know whether this is the generation where GPUs were no longer worth buying, or the generation where AMD finally made their move to offer customers what they want: decent GPUs at decent prices.

5060/Ti rumours still suggest 8GB cards (with maybe more expensive 16GB variants) at $350-450. And I remember the days when £450 was the price of an 80-tier card...

Right now, outside of fanboys defending billion/trillion dollar companies, the general sentiment everywhere seems to be that the current GPU gen is almost as bad as the cryptoboom days, if not worse in many aspects. Even the reviewers/influencers/etc share the hope that AMD has an opportunity to save this generation with well priced products. I just hope they have enough stock if they do.



Thinking back to when... [screenshot]
 
Surely if that were true AMD would have more than 10% marketshare?

100%. The problem is, people are blinded when it comes to GPUs. They go tribal, sticking to one camp and refusing to see reason. It's best not to argue, and to let AMD fans stay in AMD threads and Nvidia fans stay in Nvidia threads.

The problems, frustration and arguing tend to happen when one tribe invades the other tribe's thread and verbal warfare breaks out.
 
I remember getting a B-grade 2080 because I thought any 80-class GPU over £500 was a joke. Then I got a 3080 FE for just under £680 and thought it was a bargain. Then I saw the MSRP of the 7900 XT and XTX and instead bought a used 4080 for £900, and felt like it was massively overpriced even at that used price.

It’s not inflation that’s causing these prices. It’s greed.
 
They have more than 10%; in some markets it's 50%.
The 9070 XT brings affordable 4K gaming.

People may talk about a 5090, but they won't buy one, as it currently costs 5x more than a 9070 XT (if you can even buy one in the next 6 months), with no real difference in gameplay.
That's if people even think about things like that, as YouTuber reviewers are lazy and don't test such things.
FPS don't matter, as the gameplay stays the same.
But we're talking about GPUs, aren't we? We're in the GPU section, in a thread about RDNA4, no?
Why do people keep replying to me talking about the wider AMD business when I'm clearly talking about the GPUs? Are people not picking up the context clues, e.g. the sub-forum and the thread title?
Also 4K? Really? With 16GB VRAM in 2025? It may be enough but I think we're getting to the point where it's borderline. Aren't there already examples of the 5080 hitting VRAM limits at 4K?

I remember getting a B-grade 2080 because I thought any 80-class GPU over £500 was a joke. Then I got a 3080 FE for just under £680 and thought it was a bargain. Then I saw the MSRP of the 7900 XT and XTX and instead bought a used 4080 for £900, and felt like it was massively overpriced even at that used price.

It’s not inflation that’s causing these prices. It’s greed.
Yeah, it's been going this way for a while, I feel. Lots of people just blame the current generation for bad prices, complain when people do pay those prices, and say they're not going to bother and will stick with their card from the previous generation. But the previous generation was a stepping stone that allowed the current generation to get where it is, and the one before that contributed to the prices of the previous generation, and so on.

100%. The problem is, people are blinded when it comes to GPUs. They go tribal, sticking to one camp and refusing to see reason. It's best not to argue, and to let AMD fans stay in AMD threads and Nvidia fans stay in Nvidia threads.

The problems, frustration and arguing tend to happen when one tribe invades the other tribe's thread and verbal warfare breaks out.
I like to think that among the enthusiast community brand loyalty isn't such a thing. Obviously there are a few and they're probably quite vocal, but many are open to being persuaded by the other side. I'm not saying these people sit on the fence without a preference, but, for example, just because I prefer Nvidia (from memory I would say that overall I've had fewer issues with Nvidia) doesn't mean I don't buy AMD (which has led to some regret, for example going crossfire FuryXs over SLI 980Tis).

As I've said, I'm open to a 9070 if it can hit a performance point and a price point that I'm happy with. That's why I'm in this thread; I realise it's a rumour thread, but I was hoping for some more solid info by now. That said, I'm not set on a 9070 either, and I'm still quite tempted by a 5080 (although, again, the 16GB VRAM is something I'm not happy with). I think this generation is the one I've been most torn about since the FuryX/980Ti decision.
 
Also 4K? Really? With 16GB VRAM in 2025? It may be enough but I think we're getting to the point where it's borderline. Aren't there already examples of the 5080 hitting VRAM limits at 4K?

I would go take a peek at the "nvidia-rtx-50-series-technical-general-discussion" thread; they simply lose their minds when you mention VRAM.

As I've said, I'm open to a 9070 if it can hit a performance point and a price point that I'm happy with. That's why I'm in this thread; I realise it's a rumour thread, but I was hoping for some more solid info by now. That said, I'm not set on a 9070 either, and I'm still quite tempted by a 5080 (although, again, the 16GB VRAM is something I'm not happy with). I think this generation is the one I've been most torn about since the FuryX/980Ti decision.

I am in a similar predicament but it is what it is.
 
I like to think that among the enthusiast community brand loyalty isn't such a thing. Obviously there are a few and they're probably quite vocal, but many are open to being persuaded by the other side. I'm not saying these people sit on the fence without a preference, but, for example, just because I prefer Nvidia (from memory I would say that overall I've had fewer issues with Nvidia) doesn't mean I don't buy AMD (which has led to some regret, for example going crossfire FuryXs over SLI 980Tis).

As I've said, I'm open to a 9070 if it can hit a performance point and a price point that I'm happy with. That's why I'm in this thread; I realise it's a rumour thread, but I was hoping for some more solid info by now. That said, I'm not set on a 9070 either, and I'm still quite tempted by a 5080 (although, again, the 16GB VRAM is something I'm not happy with). I think this generation is the one I've been most torn about since the FuryX/980Ti decision.
I think brand loyalty is much less of a thing for actual enthusiasts who have been involved in this hobby for a while, i.e. most of us here. I, and a lot of us, have had cards from both camps (and three, back when Matrox existed) every generation since graphics cards were a thing, because we know either side can offer attractive options. Moreover, we understand that competition is paramount to a healthy industry, and we're presently seeing the ramifications of its absence.

I really do hope AMD is cooking something with 90xx/RDNA4: "true gamer's cards" offering great performance at non-insane price points.

But go to Reddit and other "mainstream" forums, and it's an ocean of mouth-breathing brand loyalists as far as the eye can see -- basically children and arrested-development kid-ults who can't see past their own nose.
 
Mindshare is mostly nonsense, and condescending to the customer. People buy Nvidia because they innovate new technologies, add a ton of features, and then bring those technologies and features to games and applications. They add additional value, and many PC gamers have interests in areas outside gaming; otherwise we would just get consoles. You can argue we don't need those technologies if you want, but as those techniques mature that is becoming an increasingly difficult argument. Nvidia brought tessellation, CUDA, frame reprojection, NVENC, variable refresh rate, AI upscaling, ray tracing, frame generation, RTX HDR, Broadcast, Reflex, mega geometry and neural rendering. During that time AMD has just been reactive and didn't innovate; they have some of those features now, but only because Nvidia did it first, and a lot of the time those features were tacked onto existing hardware that wasn't planned or designed for them.

In the 5000 series alone, Nvidia has added the DLSS transformer model, mega geometry, neural rendering, MV-HEVC, and 4:2:2 chroma encode/decode. The transformer model is looking like a big reason to choose Nvidia unless FSR4 is extremely special, and the improvements to NVENC are massive for content creators and VR users.
 
The 16GB VRAM thing is from Daniel Owen's video where he sets the texture cache pool to Supreme (or whatever it's called) at 4K in Indiana Jones and the 5080 drops to 3 fps. The whole thing is vastly overblown.

But everyone is just parroting "16GB can't do 4K" without watching the video and ignoring the context.
 
In the 5000 series alone, Nvidia has added the DLSS transformer model, mega geometry, neural rendering, MV-HEVC, and 4:2:2 chroma encode/decode. The transformer model is looking like a big reason to choose Nvidia unless FSR4 is extremely special, and the improvements to NVENC are massive for content creators and VR users.

Yeah, I would use the newer DP standard and the additional encoders/decoders as key selling points (coming from an Ampere card that lacks these). The AI improvements for local LLMs and other apps would be a welcome benefit too. That said, a £600 9070 XT is still tempting at this stage.
 
Mindshare is mostly nonsense, and condescending to the customer. People buy Nvidia because they innovate new technologies, add a ton of features, and then bring those technologies and features to games and applications. They add additional value, and many PC gamers have interests in areas outside gaming; otherwise we would just get consoles. You can argue we don't need those technologies if you want, but as those techniques mature that is becoming an increasingly difficult argument. Nvidia brought tessellation, CUDA, frame reprojection, NVENC, variable refresh rate, AI upscaling, ray tracing, frame generation, RTX HDR, Broadcast, Reflex, mega geometry and neural rendering. During that time AMD has just been reactive and didn't innovate; they have some of those features now, but only because Nvidia did it first, and a lot of the time those features were tacked onto existing hardware that wasn't planned or designed for them.

In the 5000 series alone, Nvidia has added the DLSS transformer model, mega geometry, neural rendering, MV-HEVC, and 4:2:2 chroma encode/decode. The transformer model is looking like a big reason to choose Nvidia unless FSR4 is extremely special, and the improvements to NVENC are massive for content creators and VR users.

You say all that, but I'm more interested in the 9070/XT over the 5070/Ti if the rasterization value is better, the RT is playable and the hardware upscaling is good.

What's frame reprojection? Is that frame generation? I couldn't care less about that; it's even more worthless the lower down the GPU stack you go, because you need a certain base FPS.
 
Snip due to all being utter tripe

Lol at so much revisionist nonsense. ATI did tessellation, not Nvidia. AMD did multi-monitor gaming, and upscaling was a console tech long before Nvidia "invented it".
VRR was a laptop battery-saving tech before G-Sync was "invented".

Lots of the tech you attributed to Nvidia “inventing” was actually pioneered long before you think it was, by companies other than Nvidia.

I’m happy to give credit where it’s due but you are literally convincing yourself Nvidia “invented” technologies that were copies of existing technology.

Nvidia take someone else’s idea = innovation.
AMD take someone else’s idea = reactionary.
 
You say all that, but I'm more interested in the 9070/XT over the 5070/Ti if the rasterization value is better, the RT is playable and the hardware upscaling is good.

What's frame reprojection? Is that frame generation? I couldn't care less about that; it's even more worthless the lower down the GPU stack you go, because you need a certain base FPS.

I am surprised how many people don't understand that MFG is basically a gimmick to fool people into thinking that they can buy the 5070 and it will outperform the 4090 at a third of the price (if you're lucky).

I think we're going to have to get used to wishy-washy graphics in the future, because a lot of it is going to be upscaling and AI-powered MFG, with a few more software bells and whistles added to boost performance statistics on paper while making everything look a bit more ****. And with the copium in people, whether it's Nvidia or AMD fanboys, they'll fool themselves into thinking this is progress.
 