
AMD RDNA™ 4 and RX 9000 Series Reveal - Live Chat [1pm UK time]

  • Thread starter: Hostile_18
Have fun with sub 10% market share then.
How would it really impact me? They've been at a low market share for ages and owners have survived.

I'm only really interested in playing games at a decent standard at a price that isn't too insane.

People talk like their price is what's stopping market share. It's not. It's Nvidia fanboys refusing to entertain any alternative. (I own two Nvidia GPUs, by the way.) Many seem to think DLSS is absolutely essential, and I get the impression some less-informed buyers think ray tracing is an RTX-exclusive function.

AMD have had better low-end cards than Nvidia for a long time, but seemingly still get outsold by the 60-series cards.

AMD will gain some market share now, but even at £400 it still wouldn't reverse the market-share picture in any significant way. It's going to take years of releases similar to this one.
 

It already impacts you. More market share = more profits for AMD = more R&D, better features (like DLSS 4, Ray Reconstruction, etc.), more software support (RTX Remix, like Half-Life 2 RTX), better dev relations and more support for devs to implement those features. Nvidia actively sends engineers to help devs; AMD doesn't, it just sends its documentation and that's it. Also, dissing weaker cards by overlooking features is silly. They count, a lot. It sounds like the usual Samsung/Android fanboys dissing iPhones because "specs suck". Those people will never see things differently because they're stuck in their own perspective. They know best; it's the others who are stupid and don't know stuff.

Basically more bang for the buck.

Sure, it will take more than one generation to improve market share, but it has to start somewhere. Problem is, it never does.
 
With the way the game industry is going, having a decent upscaler is almost essential, so FSR being way worse than DLSS, especially the new model, is a negative/sacrifice for buyers. VRAM arguments aside, my 3080 aged better than the 6800XT.

RT is probably less of a selling point for the average buyer and AMD are improving in this area. For your money, you can generally buy a higher end AMD card and brute force things if it really matters.

I’m glad AMD are finally releasing an AI upscaler. If it’s good then I’ll gladly jump ship to the 9080XT instead of the 5080.
 
The fact that your primary argument got dismantled and

It wasn’t dismantled, but - wrongly - dismissed. AMD needs bums on seats - to sell product - and to do that it needs to be significantly cheaper than Nvidia. Not modestly cheaper but significantly. Otherwise they will continue to be a niche seller.
 
DLSS 4 is worth the extra £150 over this if everything sold at MSRP
Won't know until next week if these cards are actually worth it or not...

As someone who runs a 4080 and a 7900 XT, I can tell you DLSS 4 is not worth the extra over FSR at 4K, and that's even with the older FSR. Yes, the delta is noticeable, but FSR is not terrible and more than usable for me. That delta is going to be much reduced with FSR4.

I understand it’s subjective but at what point does it become “how much”? You say £150 but you are basing that on MSRP. If a 5070Ti costs £950 and a 9070 XT with FSR4 and RT close enough costs £700, is that “still worth it”? What if you can get a 9070XT on launch day for £600 and the only 5070Ti in stock is £1000, is that “worth it”?

It’s all well and good someone buying £2000 plus GPUs and convincing himself he’s been shrewd and that DLSS 4 is worth it. Without even knowing how FSR4 behaves.
 

I think most folk would readily pay an extra £100-200 for an equivalent Nvidia GPU over AMD due to the better features/support, even at the same level of performance. We can see from their behaviour that Nvidia are fully aware of this reality, unfortunately :(

That’s the issue AMD really needs to address somehow.

Going to be an interesting week with the release of both the new AMD cards and the RTX 5070. Hopefully some stock is actually available though. I'm growing fed up with these paper launches where you can't actually buy the product!
 
And as someone who went from a 6800XT to a 4090, I disagree. Even at 4K, the gulf between FSR and DLSS is significant and obvious. FSR3 scaling to 4K is a smeary blur, while DLSS, even in its version 3 incarnation, is clear. I can easily and immediately tell the difference.
Perhaps it will change with FSR4, and I hope it does, because AMD needs a winner and nvidia needs competition or we all lose out.
 

Hence why AMD have improved RT and FSR to reduce any perceived deficit. All indications are they have closed the RT and upscaling gap significantly with RDNA4. If they are offering competitive GPUs at 30% - 40% lower street prices, people will take notice.

RT ultra in CP2077 at 4K

5070Ti - 30 FPS
7900 XTX - 20 FPS
9070 XT - 25 FPS
4080 - 27 FPS

The 5070Ti has gone from 50% faster than the 7900 XTX, to 20% faster than a 9070 XT in extreme RT. That is high end vs mid tier and a massive improvement. For the majority of games that RT deficit will be less than 10%.
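The relative deltas quoted above follow directly from the listed frame rates; a quick sanity-check sketch using only the numbers in this post:

```python
# FPS figures as listed above (RT ultra, CP2077, 4K).
fps = {"5070Ti": 30, "7900 XTX": 20, "9070 XT": 25, "4080": 27}

def pct_faster(a: str, b: str) -> float:
    """How much faster card `a` is than card `b`, in percent."""
    return (fps[a] / fps[b] - 1.0) * 100.0

print(round(pct_faster("5070Ti", "7900 XTX")))  # 50 (% faster, last gen)
print(round(pct_faster("5070Ti", "9070 XT")))   # 20 (% faster, this gen)
```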
 

Quality vs quality settings? On my 4080 you need to pixel-peep at 4K to see differences, and it's game dependent. In fact, in some games the FSR implementation was arguably better than DLSS (on balance): you were trading some shimmering for reduced motion blur, etc.

Yes, DLSS is better overall, but FSR is not some pile of crap that has zero utility or uses for a lot of people.

Though this is really about the point at which "features" become moot.
 
Now, what I'm about to say is anecdotal, so there's lots of room for error. When I was testing a B580 right after selling my 6950XT, I gave XeSS a try and thought, holy moly, does it look way better than FSR. Then I gave FSR a try, and the confusing part was that while it looked worse than XeSS (while giving better performance), the overall quality of FSR looked better than I remembered on my 6950XT, as in sharper/cleaner/overall better. This makes no sense to me, and I'd be the first to admit it may just be my memory, but I've been wondering ever since whether different architectures give slightly different image-quality results with certain upscalers, e.g. RDNA vs Arc. Could be interesting to investigate, at least to me.
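One way to actually investigate this would be to capture the same frame on each GPU and compare the screenshots with a simple objective metric like PSNR, rather than relying on memory. A minimal sketch; the toy arrays stand in for real screenshots, which you would load with an image library such as Pillow:

```python
import numpy as np

def psnr(img_a: np.ndarray, img_b: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two same-sized 8-bit images (dB).
    Higher means closer; identical images give infinity."""
    mse = np.mean((img_a.astype(np.float64) - img_b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# In practice img_a/img_b would be same-frame captures from each card;
# here, two toy arrays differing by one pixel:
a = np.zeros((4, 4), dtype=np.uint8)
b = a.copy()
b[0, 0] = 10
print(psnr(a, a))  # inf (identical images)
print(round(psnr(a, b), 1))
```

Comparing PSNR (or SSIM) of matched captures across RDNA and Arc would at least show whether the difference is real or just memory playing tricks.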
 

I look at it in a pretty simplistic way: NVIDIA have way more money and scope to hire the very best engineers and provide them with a higher level of support.

I think it's a stretch to expect AMD to somehow flip the script on GPUs the way they did against Intel, because AMD aren't making a halo card to compete with the 5090. By that alone they can never be seen as making the "best-performing GPUs". The most they can achieve is best bang for buck, and that alone doesn't make you a leader in the GPU space.

I don't understand the bit about dissing feature sets. I'm not dissing feature sets. I'm just saying that there are people who will buy purely for DLSS and will not consider AMD's offering regardless, much like your Apple/Android argument. There are people on each side who wouldn't dream of swapping, regardless.
 

Now you have me remembering my FSR testing in the 7900 XT and I was convinced it looked worse when I went back to the 4080 in the same rig. I put it down to some confirmation bias and moved on :D
 
I would argue AMD already has plenty of money for R&D; they don't need to sell gaming GPUs to have it, as gaming is currently only 7% of their revenue. They sell mostly enterprise, but also loads of console chips, and that's where most of the R&D happens, which also benefits gaming (the cooperation with Sony is big, apparently; all the current RT improvements and FSR4 are results of it). I dare say that without consoles, AMD wouldn't even bother with gaming GPUs anymore. But because consoles are a thing and require loads of work to make progress, their Radeon department will benefit a lot from that alone over the coming years, which should be good for everyone.
 
Any concrete specs on the video encoding/decoding?

Currently looking at an upgrade from my 4070, as I need 10-bit 4:2:2 hardware encode and decode. The Nvidia 50 series has it, but I'd happily go AMD for less if the video performance is up to scratch.
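One practical way to check whether a given card's decoder actually handles a 10-bit 4:2:2 clip is a null-output decode run through ffmpeg: if the hardware path can't decode it, ffmpeg exits non-zero. A rough sketch; the sample filename is a placeholder, and the hwaccel backend depends on your setup (cuda for Nvidia, vaapi on Linux or d3d11va on Windows for AMD):

```python
import shutil
import subprocess

def probe_cmd(sample: str, hwaccel: str) -> list[str]:
    """ffmpeg command that decodes `sample` with the given hw backend
    and discards the output; exit code 0 means the decode succeeded."""
    return ["ffmpeg", "-v", "error", "-hwaccel", hwaccel,
            "-i", sample, "-f", "null", "-"]

def can_hw_decode(sample: str, hwaccel: str = "cuda") -> bool:
    """Run the probe; requires ffmpeg on PATH and a real sample clip."""
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg not found on PATH")
    result = subprocess.run(probe_cmd(sample, hwaccel), capture_output=True)
    return result.returncode == 0

# e.g. can_hw_decode("sample_10bit_422.mp4", hwaccel="vaapi") on Linux/AMD
```

Note this only proves the decode path works; encode support would need a separate test against the card's encoder.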
 