• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

*** The AMD RDNA 4 Rumour Mill ***

Slightly off topic, I know, but the 9070 XT is tipped to have roughly the same RT performance as the 4070 Ti. What sort of performance bump was there from the 3080 to the 4070 Ti for ray tracing?
22%

[attached benchmark chart: tEz8Yni.png]
 
I don't understand. Why is it not 18%? I've been staring at the image like it is some sort of brain teaser lol.
:cry:

When someone asks how much faster A is than B, you need to calculate how much performance B needs to gain to reach A. The simplest way to do that is to divide A by B.

So if A is 100% and B is 82%, you divide 100 by 82 = 1.2195. The part after the decimal point is the gain needed to reach target A's 100%; rounded to the nearest hundredth that's 1.22, so it's 22%.
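As a quick sanity check, the arithmetic above can be sketched in a few throwaway lines of Python (the numbers are the ones from the example, not new data):

```python
# "How much faster is A than B?" asks: by what fraction must B grow to reach A?
a = 100.0  # card A's relative performance
b = 82.0   # card B's relative performance

speedup = a / b                       # ~1.2195
percent_faster = (speedup - 1) * 100  # ~21.95, i.e. ~22%

print(round(percent_faster))  # 22
```

The intuitive 18% is the answer to a different question: it's the percentage-point gap (100 - 82) measured against A's baseline of 100, not against B's baseline of 82.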

What is 100 as a percentage of 82?
 
Grain of salt with this one....

As of internal testing, AMD, December 2024.

9070 XT = 7% slower in raster than the 7900 XTX, between the 4070 Ti and 4070 Ti Super in RT, at 305 watts
(believes there was a Wukong benchmark where it was 60% faster in RT than the 7900 XTX)

9070 = 5% weaker than the 7900 XT in raster, equal to the 4070 in RT, at 240 watts


 
Definitely grain of salt... how are the AIB cards that big at 305 W with 3x 8-pin power? Even if an AIB card pulls 20% more power, that's just about within what the PCIe slot and 2x 8-pin can supply. The power and cooling don't add up for that rumour.

Call me fussy, but I also don't appreciate the recent trend of coolers bigger than 2 slots. It makes other PCIe slots inaccessible, and the temps aren't significantly better for it.

I shouldn't think that's a disqualifier; I have 2x 8-pin PCIe connectors, which with the PCIe slot is 375 watts, and it's a 250 watt GPU.

Modern GPUs don't really pull much, if any, power from the slot; they are more stable not relying on power from the motherboard. The 9070 XT is a 300+ watt GPU. Each 8-pin is 150 watts, so two would be 300 watts, and 2x 8-pin plus a 6-pin would be 375 watts, for a bit more headroom (they do come with a 15% adjustable power profile). 3x 8-pin makes sense for a GPU at 300 watts or a bit more.
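The power-budget arithmetic above can be laid out explicitly. This is a minimal sketch using the nominal PCIe limits the post cites (75 W from the slot, 150 W per 8-pin; the 75 W figure for a 6-pin is from the same spec); the function name is made up for illustration:

```python
# Nominal PCIe power limits in watts, per the PCIe specification.
SLOT_W = 75
PIN6_W = 75
PIN8_W = 150

def board_power_limit(pins8=0, pins6=0, use_slot=True):
    """Total nominal power (W) available to a graphics card."""
    return pins8 * PIN8_W + pins6 * PIN6_W + (SLOT_W if use_slot else 0)

print(board_power_limit(pins8=2))                         # 375: slot + 2x 8-pin
print(board_power_limit(pins8=2, pins6=1, use_slot=False))  # 375: connectors only
print(board_power_limit(pins8=3))                         # 525: slot + 3x 8-pin
```

So a 3x 8-pin card has up to 525 W nominal available, comfortably covering a 305 W TBP plus a 15% power-limit bump.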
 
Assuming similarity in (raw) gaming performance between the 5070 Ti and the 9070 XT, 600 quid still sounds like too much for the AMD card to me.
With the amount of additional non-gaming utility with Nvidia cards and the way it all "just works" you are effectively getting a suite of products with team green's offering. If you're the average Windows user, AMD (for now at least) are only really offering you one function: gaming.
Furthermore, that gaming function will always lack the sophistication of Nvidia's software suite (and their formidable R & D budget) and will always lag behind in this regard.
So therefore there needs to be a significant difference in price to reflect all of this. It's a little bit like the difference between buying a PC and a console: even if the gaming experience was similar, I'd always expect to pay more for the PC because of everything else it can do.
It's not just mindshare...

If the 5070 Ti is £750+, I can't see AMD pricing it much under £600.
 
My understanding is that the 7800 XT sold well when it hit the £450 (and sometimes even lower) mark. I think AMD will surely have seen the data on this, so I hope they will cut to the chase and just come out of the gate with that sweet-spot price. Given inflation, I think an RRP of £499 (assuming that is still significantly profitable) should be the aim.

It's not even listed on the Steam Hardware Survey.

How cheap do they have to be to sell anything like the worst selling SKU from Nvidia?
 
No problem.

Here's what I originally said:

Assuming similarity in (raw) gaming performance between the 5070 Ti and the 9070 XT, 600 quid still sounds like too much for the AMD card to me.
With the amount of additional non-gaming utility with Nvidia cards and the way it all "just works" you are effectively getting a suite of products with team green's offering. If you're the average Windows user, AMD (for now at least) are only really offering you one function: gaming.
Furthermore, that gaming function will always lack the sophistication of Nvidia's software suite (and their formidable R & D budget) and will always lag behind in this regard.
So therefore there needs to be a significant difference in price to reflect all of this. It's a little bit like the difference between buying a PC and a console: even if the gaming experience was similar, I'd always expect to pay more for the PC because of everything else it can do.
It's not just mindshare...


The point was not that I am personally overjoyed at the prospect of paying more for the Nvidia product. The point is that given Nvidia's "suite of products" beyond just gaming and their extra software tools, their products are more versatile. The core logic is that value comes from more than just raw performance. Features, software quality, and versatility contribute to a product's overall worth, allowing a company like Nvidia to command a premium price even if AMD offers similar gaming power.
Intel understand this, which is why they are targeting a segment of the market and coming in cheap. AMD should follow suit.

This is a value-neutral, unemotional analysis of the situation. Not the desire of my heart. I would like Nvidia to charge significantly less than they do! But (for now at least) they don't have to. So hopefully AMD will come in at the right price and lay a claim to that segment of the market. That will give them a solid foundation to build on for the next gen.

For what it's worth, I'm hoping to buy the AMD card if the price is right. If it isn't I'll likely go for a used 40 series.

If you think Nvidia command a premium for their software, that's a you problem: if you want that premium software, pay for it. $600, while a little too high for my taste, is still a 25% premium for that Nvidia software. If it's equal to a 5070 Ti, I'd like to see it at $550, just because that's the limit of what I'm willing to pay for any GPU; that's a 36% premium for Nvidia software.

If you think Nvidia GPUs are too expensive, take it up with them. It's not AMD's job to get into a price war with Nvidia on your behalf; it's AMD's job to make as much money as they can from their products, just as it is Nvidia's, and AMD are not an exception to that rule. If they can sell 1,000,000 units at $600 or 1,100,000 units at $500, then they are not making as much money doing the latter. That's the reality; the fantasy is assuming Nvidia have such a premium for software and it's AMD's job to empathetically provide the lube when you volunteer to bend over for Nvidia.
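The units-versus-price trade-off can be checked directly. Both scenarios here are the post's hypotheticals, not real sales figures:

```python
# Hypothetical scenarios: a $100 price cut that lifts sales by 10%.
revenue_at_600 = 1_000_000 * 600  # $600,000,000
revenue_at_500 = 1_100_000 * 500  # $550,000,000

print(revenue_at_600 - revenue_at_500)  # 50,000,000: the price cut loses $50M
```

And that's before margins: the $100 cut comes straight off per-unit profit, so the profit gap would be even larger than the revenue gap.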

They are a business, not a charity.

Direct your disapproval of this situation at those who caused it.
 
Insanity is doing the same thing over and over again expecting a different outcome.

AMD have, on and off, tried to battle Nvidia on price, including selling at below cost; ATI tried it before them. None of it ever worked. Why? Because this market only ever blames Nvidia's opponents for Nvidia's deeds. They have the perfect market position: they can do as they please, and when they do, AMD cop the fallout from it, further diminishing AMD's mindshare and bolstering Nvidia's.

This market is insane, and it's not going to take much more before even AMD recognise that as a hard truth.
 


Only Moore's Law is Dead, in the Cyberpunk bar chart to give one example, would make a graph that claims to represent one resolution/setting (4K RT Ultra) and one measure (avg FPS) and then create 5 wildly different bars for it... :rolleyes: Which is it, Tom? Are you going to pick one when the reviews come out and say you were right?
 
What do you mean 5 wildly different bars? It clearly shows you at the bottom of the first slide that the colors represent 5 different cards. The pink and the orange are the 9070XT and 9070. In CP2077 without RT, the 9070XT is faster than the 7900XT by around 7fps and only 4fps slower than the 7900XTX.

[attached slide 1: Ghi5PNRacAI76XM]

[attached slide 2: Ghi5QDJacAYO_l9]


The second slide shows games with RT enabled in which case the 9070XT is faster than the 7900XTX in the heavy RT games.


In which case the 9070 XT is 17% faster than the 7900 XTX in Cyberpunk 4K RT, which puts it slightly ahead of the 4070 Ti.

[attached benchmark chart: 2V9UYe6.png]
 
I'm interested in the encoder AMD have worked on, any info on this? Or is it just AV1 they made better?

It'd be nice if they've worked on H.264 and got it similar to NVENC.

Why not just use AV1?

This was recorded using AV1.

 
It's been a long time since I've upgraded the GPU, and I'm interested in a decently priced 9070, but I don't understand why these new AMD cards are predicted to be slower than the 7900 XT(X) cards. Why is this? I assume this is just raster performance, but still, why can't they make them faster?

With the 7900 line still being made and updated?

Trying to keep the cost and size down. The 4090 on the same process node is over 600 mm²; it's huge. AMD are trying to keep it in the 300 to 400 mm² range, which for just shy of 7900 XTX performance is impressive in itself.
 
Nvidia are overpriced, that doesn't mean AMD have to slowly make themselves bankrupt as a counter measure.

In Q3 last year AMD made $462 million revenue and, out of that, $12 million profit from gaming GPUs. (2.6%)

In the same quarter Nvidia made $3,300 million revenue and $2,462 million profit from gaming GPUs. (74.6%)
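The margin percentages can be recomputed from the revenue and profit figures quoted (all values in millions of USD, as stated in the post):

```python
# Q3 gaming-GPU figures quoted in the post, in millions of USD.
amd_revenue, amd_profit = 462, 12
nv_revenue, nv_profit = 3300, 2462

amd_margin = amd_profit / amd_revenue * 100
nv_margin = nv_profit / nv_revenue * 100

print(f"AMD:    {amd_margin:.1f}%")  # 2.6%
print(f"Nvidia: {nv_margin:.1f}%")   # 74.6%
```

Both quoted percentages check out, which is the point: AMD's gaming GPU margin is already razor-thin, so there is very little room left to cut prices from.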

If the 5070 Ti is $750 but the 9070 XT $600, "the problem is with AMD, not Nvidia"... "if... if... if they want to gain market share". Yeah, OK, so $500; now the 5070 Ti is 50% more expensive, and AMD go from 10% market share to 20% with zero margins... now what? What do AMD do now that they have R&D costs but zero profit from their sales and 20% market share? Do they keep going like this and not do R&D on newer/better products? What one might call the ATI model of fighting Nvidia, and it's why they went bust. Or do they start to push the price up again so they have some R&D budget?

I'm asking because these clever people on YouTube keep making it about AMD when it's Nvidia who charge too much, but they never explain how AMD forever slashing prices is going to help that. Because these people are really not very clever at all; they are in fact incredibly stupid.
 