*** The AMD RDNA 4 Rumour Mill ***

Insanity is doing the same thing over and over again expecting a different outcome.

AMD have, in the past, tried to battle Nvidia on price, including selling at below cost, and ATI tried it before them. None of it ever worked. Why? Because this market only ever blames Nvidia's opponents for Nvidia's deeds. Nvidia have the perfect market position: they can do as they please, and when they do, AMD cop the fallout from it, further diminishing AMD's mindshare and bolstering Nvidia's.

This market is insane, and it's not going to take much more before even AMD recognise that as a hard truth.
 
With the amount of additional non-gaming utility Nvidia cards offer, and the way it all "just works", you are effectively getting a suite of products with team green's offering. If you're the average Windows user, AMD (for now at least) are only really offering you one function: gaming... It's a little bit like the difference between buying a PC and a console: even if the gaming experience were similar, I'd always expect to pay more for the PC because of everything else it can do.

I don't follow your logic here dude... you say "average Windows user" and then say "additional non-gaming utility".

What utility is this?

"Average windows user" uses MS Office, browses the web, listens to music, does some light photo editing, and maybe dabbles in some gaming. Where do CUDA and other differentiated features from NV fit into this?

I would consider myself an "above average" Windows user, and I don't have any use for it... when I game, I want raw raster with high IQ, high fps, and stability. With that criteria, the experience between the two at any given tier of card for the past several generations has been indistinguishable. The comparison to consoles is pretty comical :cry:

Does this sound crazy to anyone else?

Yes!

Whilst the Steam survey data is an interesting data point, it has been shown to have huge swings in its findings due to the random nature of the sampling.
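Just to illustrate how big those swings can be from sampling alone, here's a quick simulation (the 10% true share and the 2,000-machine sample size are made-up numbers, purely for illustration):

```python
import random

def survey_share(true_share, n, rng):
    """One simulated monthly survey: the sampled share of machines
    that have a given GPU, out of n randomly polled machines."""
    return sum(rng.random() < true_share for _ in range(n)) / n

rng = random.Random(1)
# Five "monthly" surveys of 2,000 machines drawn from a population where
# the true share is exactly 10%: the reported figure drifts month to month
# even though nothing in the population changed.
for month in range(1, 6):
    print(f"month {month}: {survey_share(0.10, 2000, rng):.1%}")
```

Run it a few times with different seeds and you'll see the reported share wobble around the true 10% purely because of who happened to be polled.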

TBH, I don't understand why Steam doesn't have it in their T&Cs that they can just auto-collect hardware data from users... it'd be an incredible data set.

You literally mention the "average" gamer... the "average" gamer doesn't care about 90% of this software. Outside of DLSS and FSR, they don't turn this stuff on unless it's on by default in a game.

I agree, all these bells and whistles only bother folk like us who are sat here debating. I would argue "average gamer" doesn't even really grasp things like DLSS/FSR...

"Average gamer" is the guy who boots up a new game and doesn't even bother entering the settings menu before playing... and Nvidia smashes it with the marketing, game sponsorship and headline press coverage, so they will always be top of mind for folks that just want to get on with having fun!

If you think Nvidia GPUs are too expensive, take it up with them. It's not AMD's job to get into a price war with Nvidia on your behalf; it's AMD's job to make as much money as they can from their products, just as it is Nvidia's...

Yes, 100%.
 
Bad size news (for me at least).

I did some quick "measuring" in PowerPoint of the Powercolor card images that TPU captured. They are bigguns:
  • Red Devil ~360mm (though I think 356mm has been confirmed?)
  • Hellhound ~340mm
  • Reaper ~310mm (this is the only one that would work for me)
[edit] OCD made me do the others I could find :p
  • AMD Reference (XT) ~291mm
  • XFX Merc ~340mm
  • XFX Swift ~306mm
  • Sapphire Pulse ~282mm
  • ASRock Challenger ~294mm
  • ASRock Steel Legend ~290mm*
  • ASRock Taichi ~326mm*
  • Asus TUF ~339mm* (edit: updated but might still be wrong - see Grim5's post below)
*skewed pic, lower confidence
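For anyone curious, the maths behind that kind of image "measuring" is just a ratio against a feature of known physical size in the same picture. A minimal sketch (the pixel counts and the 120mm reference below are invented for illustration, not taken from the TPU images):

```python
def estimate_length_mm(card_px, ref_px, ref_mm):
    """Estimate a card's length by scaling its pixel span against a
    feature of known physical size measured in the same image."""
    return card_px * ref_mm / ref_px

# Invented numbers: a card spanning 900 px beside a 300 px reference
# feature known to be 120 mm long.
print(round(estimate_length_mm(900, 300, 120)))  # 360
```

Perspective skew in the photo throws the ratio off, which is why the starred figures above deserve less confidence.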





 
My case has 470mm of GPU clearance :D

These GPUs are too small for me

That 390mm for the Asus TUF 9070 XT must surely be an error? The Asus TUF RTX 5090, a 575W GPU, is only 350mm.
 
Just to clear something up - my £600 figure is based on a mid-level AIB in the wild. I don't think the MSRP for the reference should be that much.
 
The argument about prices needs to be put in the context of AMD's own claims to be chasing additional market share this round, NOT based on what Nvidia are doing with their latest stealth shift-up-a-tier nonsense.

If this is indeed AMD’s intent, then they absolutely do need to do a Ryzen 2017 moment. Ignore the competition “tiering” and go for the mindshare with a major price saving. Pricing 15% below Nvidia for equivalent perf has not worked for AMD when it comes to market share.

Let’s be brutally honest here, these RDNA4 GPUs have been designed from the outset to replace the existing 7800 XT and 7700 XT price points. This means that they were designed and built to make a profit at those respective prices. So if AMD price above their own current tiers and prices, it would be pure greed… PERIOD.

If they price these according to their “perceived” performance vs Nvidia, then they can kiss goodbye to the last 10% market share they have.

A 9070 XT that competes with a 5070 Ti for about the same price as a 5070. That's going to win mindshare.

A 9070 that beats (not just competes with) a 5070 for £100 less. That's going to win mindshare.

So in both cases think ~30% better price/performance (from the Nvidia baseline). That's what AMD need to be doing, and let's be clear, they absolutely can do this, because yet again Nvidia have left the goal wide open.
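For what it's worth, that ~30% figure is just the ratio of the two price/performance numbers. A quick sketch with made-up prices (equal performance assumed; the $650 and $500 are chosen purely to illustrate, not leaked figures):

```python
def price_perf_gain(perf_a, price_a, perf_b, price_b):
    """Relative price/performance advantage of card A over card B."""
    return (perf_a / price_a) / (perf_b / price_b) - 1

# Made-up numbers: identical performance, $500 vs a $650 baseline.
gain = price_perf_gain(100, 500, 100, 650)
print(f"{gain:.0%}")  # 30%
```

The same function works for unequal performance too, e.g. a card that's 10% faster and 20% cheaper.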
 
I fear they're going to do what they always do (recently) and follow nV's lead: they will price a hair below the 5-series equivalents, and then, when they don't sell, throw up their arms in incredulity. There are various reasons they will take this approach, foremost being rule by committee and the fact that execs need to answer to a board who want to play it safe (and in turn decimate their already paltry market share). The other reason is they don't want to lower prices, and are quite happy to ride along as nV keeps lowering the price/improvement bar.

While it may turn out to be a decent chip, from a moving-the-needle perspective it will be dead in the water.
 

Agreed, but as I posted, AMD have already budgeted in a profit at the 7800 XT and 7700 XT price points, so nobody is expecting them to sell at a loss. But if they price higher than those respective prices, then wave goodbye to any remaining market share or mindshare.
 
For me, if we assume the 9070 sits between the 5070 and 5070 Ti, and the 9070 XT matches the 5070 Ti, then I think $500 for the 9070 and $600 for the 9070 XT is acceptable. I would certainly be happy with it being less than that, but at that stage the 9070 would be cheaper and faster than the 5070, the 9070 XT would be $150 cheaper for the same performance as the 5070 Ti, and both could be the clearly better option.
 

I would agree this would be acceptable, but I would caveat that FSR and RT need to be much improved, AND they (AMD) need to get both the 9070 XT and 9070 to market before the Nvidia 5070 and 5070 Ti cards are reviewed. AMD need these not just reviewed but reviewed WELL, and before Nvidia get to market. Get ahead of Nvidia for once, because reviewing against previous-gen Nvidia looks better than against new-gen Nvidia. When someone thinks “wait to see what Nvidia do”, then finds out they waited a month for a 5070 or 5070 Ti review only to discover they are no faster but a lot more expensive in comparison, that's how to win mindshare.

- Good prices
- Much improved upscaling
- Almost matching Nvidia’s still-current 2nd- and 3rd-tier performance at significantly lower prices.
- Launched before Nvidia for a change.
 
Thanks for at least trying to engage with some of my reasoning. I don't mean to be rude, but I think one or two people here have totally missed my point.

If I'm using Windows (as opposed to Linux), I'm sacrificing things like customisation, open-source code, security, privacy, stability, etc. I'm doing that because of Windows' compatibility with the software I like to use, its user-friendliness, and its out-of-the-box hardware support. I don't want to have to think too much about it; the thing just does what I need it to do. That doesn't mean I'm stupid, I just don't want to spend time tinkering and troubleshooting, as that isn't a profitable use of my time.

If I have an AMD card on Windows, it's an uphill battle getting it to run things like local LLMs, which more and more people are doing now and which have become easier to operate thanks to software like LM Studio. Linux, on the other hand, plays much nicer with AMD cards.
On Windows, I install my Nvidia card and the drivers/app, and I'm good to go: I've got the convenience of Windows and can use all of the bells and whistles on my Nvidia card. The AMD card on Windows is going to be a gaming-focused card for most people. If they want the extra utility and software perks, they will unfortunately have to pony up for the green card. And they do. So AMD must (at least for now) be cheap enough to account for the fact that, generally speaking, theirs is a single-purpose product as opposed to the multi-purpose Nvidia card.

The people that have misunderstood my argument are just proving my point by saying things like the above "when I game, I want raw raster with high IQ, high fps, and stability"...

Anyway. That's enough energy spent on that. Hopefully AMD give us a good price now and make things interesting for gamers this gen.
 
Don’t take this the wrong way, but the point is that what you described yourself as is not a typical Windows user. I work in IT Support and have done for over 30 years.

Typical home Windows users don’t even know how to change their desktop resolution or refresh rate, let alone 3D control panel gaming settings.

The vast majority of PC “gamers” are on an OEM desktop or laptop with whatever GPU the OEM could get for the cheapest budget and lowest power requirements. That’s why utter crud like the 4060 laptop GPU is among the biggest “sellers”. It just happened to be what was in the PC/laptop when the person bought it.

That’s a typical Windows user.
 
The point is that given Nvidia's "suite of products" beyond just gaming and their extra software tools, their products are more versatile.
Being someone who does not pay any attention to the bolted-on software extras, I would assume you are talking about streaming software?

The Ryzen app does loads of things, including instant replays. I struggle to see what else I would need, but I would not mind knowing what I have been missing out on, to inform my next purchase.
 
Honestly, I think the actual use for CUDA etc. is very niche. I use Windows for a lot of stuff, 24/7 really, between work (dev) and, personally, Photoshop, gaming, other dev work and a multitude of other things, and I can't think of one situation where having an AMD card has hindered me in the last five years. I don't doubt it has its uses, but it's small in the grand scheme of things.
 
I agree with your description of what the lowest-common-denominator Windows user is capable of. But there are many people who use their PCs for productivity (whether professionally or as a hobby) who still prefer Windows because they don't have the time or will to invest in Linux. Perhaps I should have been clearer in defining what I meant by the average Windows user.

In the final analysis, Windows users tend to buy AMD cards purely for the min/max value they bring to their gaming. If they want or need their GPU to do more than game, they are likely to buy an Nvidia card. That's why the green team can and does charge more, and also why they seem to care less and less about gamers. AMD have an opening here, and it would be a shame if they didn't take advantage of it for the people who only/mostly care about gaming and don't want to pay over the odds for features they are willing to forego. Of course mindshare and brand image are a factor, but those have been built up over time. This is why I still stand by my conclusion that AMD's cards must be significantly cheaper than their Nvidia equivalents.

I'm rooting for AMD here but my expectations are that they will blow it. What gives me hope is the messaging that has been coming from people like Frank Azor which sounds like they might have finally learned their lesson and will (hopefully) give gamers what they want at a reasonable price.
 
The number of people you see on the likes of Reddit who claim to need good gaming performance, but then also state they need CUDA without being able to say why, is pretty high. You also see loads of posts from people with high-end equipment who, it turns out, have been running a 60Hz profile on their 144/165Hz monitor for over a year, which is quite alarming. People pick up on things that one person may require (i.e. CUDA), so they believe they need it also. The better CUDA performance is also regurgitated in recommendations across several forums, but for the average user it is not required.
 
Yes @Octoplex7, but we are discussing typical Windows users here. You are not describing a typical Windows user (or even PC gamer), you are describing you.

I think this is going off topic though.
 