
*** The AMD RDNA 4 Rumour Mill ***

I don't care what the GPU looks like
Each to their own, obviously; hell, some folk don’t even want glass panel cases, while others want it ‘360’.
Mine is pimped out like an ’80s disco: vertically mounted watercooled GPU, fan LEDs that react to sound, the whole shebang - and yet the whole thing is hidden under my desk (for the time being), so I don’t even notice it when gaming. But I know she looks purdy and that makes me happy. :D
 
What I'm trying to say is that new-gen mid-range pricing should reflect last-gen end-of-life pricing to be truly competitive, bringing higher-tier performance to the mid range.

Otherwise, should we just ignore that the 7900 XT was discounted to £543 last year, and say that because it was £900 at release, its successor is decent at $550?

Sorry, I'm not fully explaining myself, but maybe @Joxeon can articulate it better!
It gets messy like that; you can compare again when the 9070 XT hits its lowest price during its cycle.
 
Each to their own, obviously; hell, some folk don’t even want glass panel cases, while others want it ‘360’.
Mine is pimped out like an ’80s disco: vertically mounted watercooled GPU, fan LEDs that react to sound, the whole shebang - and yet the whole thing is hidden under my desk (for the time being), so I don’t even notice it when gaming. But I know she looks purdy and that makes me happy. :D

The only RGB I want is on the keyboard, and even then I use fixed static red. I'm function over form.

Lighting around USB ports would be useful though.
 
The 5070 Ti is $750, and AMD needs to buy market and mind share. If the 9070 XT cuts into 5070 and 5070 Ti sales, Nvidia has more than enough room to cut prices; if Nvidia cuts the 5070 Ti by $100 in three months, who's going to buy AMD? It'll just be the same situation as last gen, where the AMD card is $50 cheaper and it does nothing for AMD's market share.



We need to wait for the benchmarks to say that - and see my reply to Humbug.
Not a single card is available for the equivalent of $750 in the UK at the moment, and it will likely be months before that is generally the case. More like £850 on OC at the moment, and practically nothing in stock.

So in the meantime, AMD have a chance to sell thousands of 9070 XT cards at £600-£650 (Gibbo confirmed they have 000’s in stock or due by Friday).

I’m building a new PC next week and will buy a 9070 XT at those prices. I might have bought a 5070 Ti at £750, but simply can’t. I would have bought a 5080, but that card was shockingly poor value.
Both the 9070 XT and the 5070 Ti are cheaper than the 3080 I bought for £800 due to supply constraints, and they give me a modest uplift to game better at 4K.
 
Each to their own, obviously; hell, some folk don’t even want glass panel cases, while others want it ‘360’.
Mine is pimped out like an ’80s disco: vertically mounted watercooled GPU, fan LEDs that react to sound, the whole shebang - and yet the whole thing is hidden under my desk (for the time being), so I don’t even notice it when gaming. But I know she looks purdy and that makes me happy. :D
Birds of a feather, it seems.
 
In reality then, how much more are the top AIB cards really worth compared to the entry-level ones? They don't seem that much faster looking at boost clocks, so does it just come down to noise and temps?
 
The 5070 Ti is $750, and AMD needs to buy market and mind share. If the 9070 XT cuts into 5070 and 5070 Ti sales, Nvidia has more than enough room to cut prices; if Nvidia cuts the 5070 Ti by $100 in three months, who's going to buy AMD? It'll just be the same situation as last gen, where the AMD card is $50 cheaper and it does nothing for AMD's market share.
For all we know, AMD could have a fudge figure built into existing costs, so they could also cut the price.
 
This is about on par with a 4080 and may OC to 5080 levels, as that's not exactly worlds faster than the 4080. I would say it and the 5070 Ti are in the 5080's performance tier. A 5080 at anything but MSRP is a terrible buy.
We won't know until reviews, but using TPU's relative performance charts and adding 36% to the GRE (the raster advantage HWU came up with from AMD's charts) puts the 5080 20% ahead, which to me is a tier up.

As for overclocking, we'll find out, but I can get +10% from my card, which is already 5% faster than the FE used in the charts, so that cancels it out.

As I say I'm not defending my purchase, it was way over what it should have cost, but I wanted the performance and got it when I could, and I'm loving it so far.

Plus if I decide I'd rather save some money, with the stock situation I could probably sell without a loss and grab a 9070 XT!
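To make the relative-performance arithmetic above explicit, here's a rough sketch - all figures are the post's own estimates (the HWU/TPU-derived percentages), not benchmark results:

```python
# Rough sketch of the relative-performance arithmetic from the post.
# All percentages are the post's own estimates, not measured data.

gre = 1.00                    # RX 7900 GRE as the baseline
rx_9070xt = gre * 1.36        # +36% raster (HWU's reading of AMD's charts)
rtx_5080 = rx_9070xt * 1.20   # TPU charts put the 5080 ~20% further ahead

print(f"9070 XT vs GRE: {rx_9070xt:.2f}x, 5080 vs GRE: {rtx_5080:.2f}x")

# The overclocking point: a card that's 5% above the FE, overclocked a
# further 10%, ends up roughly 15.5% above the stock FE used in the charts.
oc_uplift = 1.05 * 1.10
print(f"OC'd card vs stock FE: {oc_uplift:.3f}x")
```

Note the percentages multiply rather than add, so the +10% OC on a +5% card closes most, but not quite all, of a 20% gap.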
 
Both 9070XT and 5070Ti are cheaper than the 3080 I bought for £800 due to supply constraints and give me a modest uplift to game better at 4K.
Tbf, the 3080 was a top-tier card; the 3090 was only around 15% faster.

The 5090 leads the 5070 Ti and 9070 XT by around 70%, so relative to the halo card this pair sits more where a 3060 Ti did against a 3090 back then.
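As a rough sanity check on those ratios (using only the round numbers quoted in the posts, not measured data):

```python
# Tier-gap comparison using the posts' own round numbers (illustrative only).
gap_3090_vs_3080 = 1.15       # "the 3090 was only around 15% faster"
gap_5090_vs_9070xt = 1.70     # "the 5090 leads ... by around 70%"

# Relative position of each card against its generation's halo product:
pos_3080 = 1 / gap_3090_vs_3080      # 3080 at ~87% of a 3090
pos_9070xt = 1 / gap_5090_vs_9070xt  # 9070 XT at ~59% of a 5090
print(f"3080: {pos_3080:.0%} of a 3090; 9070 XT: {pos_9070xt:.0%} of a 5090")
```

On those figures the 3080 sat much closer to its halo card than the 9070 XT and 5070 Ti do to theirs, which is the point being made.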
 
Is this an Nvidia-esque statement? Sometimes I think you lean too much towards one brand, but it's hard to tell just from text. What people may not have picked up on this gen is the element of compression. I was drawn to this in today's AMD reveal, where the second presenter talking about the hardware mentioned 'enhanced memory compression'. Nvidia seem to have done something similar in their presentation, through neural textures or something like that. So perhaps that's some of the reasoning behind why 16GB seems to be "enough"?

Maybe you're right, maybe it's the compression thing. It just seems there's always an excuse when it's AMD. When AMD released the 8GB 290Xs, then 4GB (which the 980s had) wasn't enough. Then when the Fury X released with 4GB, suddenly HBM was magic and it made 4GB basically infinite memory. So 4GB wasn't an issue then.
As for preferring one brand over the other, I think it's fair to say I do lean towards Nvidia more. I give AMD plenty of chances, though, but each time I do they give me a kick to the nuts - and I'm not allowed to mention it. Now, I'll admit Nvidia cards aren't perfect either, but they don't seem to abuse me in the same way (they just seem to steal money for cigarettes or something).

I’ve got my PC watercooled for literally no reason other than aesthetics - and I’m nearly 50... make of that what you will :cry:
I know what you mean; I do like the look - well, not even the look, more the idea of my PC being watercooled. It does look nice, the temperatures are great, and it can also be quiet. Of course, I'm too lazy to go to the effort of watercooling my PC these days, but I keep telling myself I will again, one day...
 
Less VRAM than the 7900 XT, though.
How important that will be, we'll have to see.
I know last gen, when AMD had 20/24GB and Nvidia had 16GB, it was quite a big deal; now that AMD have 16GB, I'm guessing it won't be such an issue.

Haha. Standard. Upscaling and RT wasn't important either, those will start being important too now. Lol.
 
Maybe you're right, maybe it's the compression thing. It just seems there's always an excuse when it's AMD. When AMD released the 8GB 290Xs, then 4GB (which the 980s had) wasn't enough. Then when the Fury X released with 4GB, suddenly HBM was magic and it made 4GB basically infinite memory. So 4GB wasn't an issue then.
As for preferring one brand over the other, I think it's fair to say I do lean towards Nvidia more. I give AMD plenty of chances, though, but each time I do they give me a kick to the nuts - and I'm not allowed to mention it. Now, I'll admit Nvidia cards aren't perfect either, but they don't seem to abuse me in the same way (they just seem to steal money for cigarettes or something).

I can understand your angle. But, as you put it, why only 16GB on RDNA 4? I remember seeing this in the slides, and Nvidia seem to double down on it (in their presentation) due to historically not prioritising memory. If both popular enthusiast tiers now have it in common, you would think this might be why (well, that's how I processed the past couple of months, anyway).
 
Regarding the Sapphire card with the 16-pin connector,

Tom's Hardware has an interesting article on it: it seems Sapphire have installed some safeguards to stop the power cable from melting and ruining the card.

No doubt it will have to be RMA'd to get repaired, but at least it won't be a fire hazard.

 
Haha. Standard. Upscaling and RT wasn't important either, those will start being important too now. Lol.

Nvidia and its fans have made it a necessity; I have seen many, many times people use that as an argument for buying Nvidia.

Eventually you have to follow the crowd.
 
Nvidia and its fans have made it a necessity; I have seen many, many times people use that as an argument for buying Nvidia.

Eventually you have to follow the crowd.

It's valid - that's why. I know having crappy upscaling put me off AMD quite a bit in the past. One less thing now, if they've got it right.
 
It's valid - that's why. I know having crappy upscaling put me off AMD quite a bit in the past. One less thing now, if they've got it right.
I have had an RT card for 18 months. It's nice, but it's hardly essential.

It 100% is the future, but even now cards aren't capable of doing it properly without shortcuts.
 