
*** The AMD RDNA 4 Rumour Mill ***

I do think they would benefit hugely from launching first, assuming the price to performance is there. While they likely won't get a massive number of sales out of the gate, if every vaguely disappointed 5000-series review ends with "the 9070 XT offers better price to performance", they could see a huge bump in sales off the back of Nvidia's marketing.
 
Slightly off topic: I know the 9070 XT is tipped to have roughly the same RT performance as the 4070 Ti. What sort of performance bump was there from the 3080 to the 4070 Ti for ray tracing?
 
22%

 
I keep thinking it's just AI fighting itself.

Nobody can be that stupid, surely?
You have heard that the president my compatriots recently (once again) voted in is an unhinged reality TV presenter?

If you really want a laugh, go to PCMR on Reddit. It's like watching 10-year-old children argue about whose daddy has bigger muscles (with most not understanding the basic mechanics of how muscles function).
 
The trouble AMD have is that they are already on the back foot on GPUs. Given the same performance as competing cards, just to get equal market share they have to price below Nvidia, because of people's mindset. The educated could argue that Nvidia have an extended feature set, but I'm sure the general mid-range GPU-buying public aren't touching those features; it's just how people are.
It's no different from how it used to be with CPUs: no matter how competitive AMD were, "people" bought Intel.

The difference, though, is that with Ryzen AMD undercut the absolute crap out of Intel. They didn't perform anywhere close in the first couple of gens, but priced it so cheap that people switched; people realised AMD are not that bad, and look at the CPU market now.

The problem is AMD refuse to do that with GPUs (though that could be about to change). They only need one breakout generation to make people realise Nvidia are ripping them off, and one generation of users to realise their drivers are perfectly OK.

AMD had the chance with the 7000 series; they didn't take it.

We will see how it pans out next week and the week after.
 
They still need to clear their old stock, so are they holding off for that? Do they really not know the performance of Nvidia's 50 series, or do they know that their cards are a bit of a disappointment as well? Nvidia might be struggling to increase raw performance, so they focused on the extra bells and whistles to make up for it; if that's the case, then I'm sure AMD would be struggling, too.
Both AMD and Nvidia will have paid spies in enough places in the supply chain to know each other's approximate performance.

I would guess that AMD's drivers are undercooked in a few key titles, which would affect an overall performance score.
 
Aren't AMD cards meant to be a bit better for OC'ing compared to Nvidia, or is that something I've just made up in my head? Ha.
They used to have a lot more headroom (my memory only goes back to the tail end of Polaris; the Vega 56/64 on 14 nm had a lot of headroom, as did the Radeon VII on 7 nm). But as the generations roll on they've made better, more optimised and more gaming-specific designs, and there's less headroom as a result.

Even back in the Radeon VII days every card came out of the factory with its own voltage curve, so they did try to make use of binning for the consumer's benefit. However, the safety margin was large, I think partly to account for compute workloads being much more sensitive to voltage (what works for gaming might not work for all compute workloads; tl;dr games are poorly optimised and gave the GPU enough of a breather not to crash and burn). I wouldn't be surprised if the safety margin has also shrunk greatly with RDNA 1/2/3/4 as the focus moved to gaming.
 
You know how the ASRock card is meant to come with a different power connector: will there be an adapter in the box, do we think? It's just that my PSU isn't ATX 3.1, so at the moment I use 3x 8-pin jobbies. Not that I'm wanting to get ahead of myself here. :p
 
I would like to think so; most do if it's a unique connector. I'm toying with the idea of upgrading my 750 W Seasonic PSU. The coil whine is something else when the GPU is ramped up. I initially thought it was my Vega 56, but it was exactly the same when I changed to the 3080.

That makes sense, and it's a good explanation of why the headroom has shrunk, too.
 