
Do AMD provide any benefit to the retail GPU segment?

I find threads like this funny.

AMD vs Nvidia, AMD vs Intel. People expect the same level of competition from a company fighting on two fronts, both CPU and GPU.

Until recently Intel hadn't dipped a toe into the GPU market in any major way, so why not ask the same question of them vs Nvidia?

The more competition the better, IMO, but don't expect companies with other major focuses to compete with the top end of Nvidia (not an area I've ever bothered with anyway). If it wasn't profitable they wouldn't do it.
 

I'm coming around to thinking this way.

As others have pointed out, AMD are a CPU business that also makes GPUs; they are perhaps not that interested in fighting Nvidia for market share. AMD's rival is Intel, not Nvidia.
 
AMD and Intel have a lot of history. AMD see Intel as a mortal threat, and their strategy is to cut Intel down to size as a way to neutralise that threat.

For AMD, everything is about that.

In 2020 AMD's revenue was $16 billion with 45% margins.
In 2020 Intel's revenue was $78 billion with 60% margins.

In 2022 AMD's revenue was $26 billion with 50% margins.
In 2022 Intel's revenue was $62 billion with 34% margins.

In Q1 2022 Intel's revenue was $18 billion; in Q1 2023 it's expected to be $11 billion. Intel will likely land between $40 and $50 billion by the end of 2023, while AMD should stay roughly flat at around $26 to $27 billion, with margins much higher than Intel's. They have Intel on the ropes and there is no mercy here; this is almost revenge. Very cold revenge.
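Taking those figures at face value, the implied gross profit is just revenue times margin. A quick sketch; the numbers are the post's own claims, not audited filings:

```python
# Implied gross profit from the revenue and margin figures quoted above.
# These are the post's claimed numbers, not audited filings.
figures = {
    ("AMD", 2020): (16, 0.45),    # (revenue in $B, gross margin)
    ("Intel", 2020): (78, 0.60),
    ("AMD", 2022): (26, 0.50),
    ("Intel", 2022): (62, 0.34),
}

for (company, year), (revenue, margin) in figures.items():
    print(f"{company} {year}: ~${revenue * margin:.1f}B gross profit on ${revenue}B revenue")
```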
 
If AMD threw in the towel with their GPU division, then PC gaming is dead, because nobody will be able to afford to put a PC together. Component prices have never been higher: entry-level motherboards cost what a mid-to-high-end board did just a few years ago, and GPUs are disgracefully priced. If someone had told me ten years ago that a mid-range GPU would cost £600+, I would have laughed at them, but that's exactly what has happened.

If AMD threw in the towel, Nvidia, who already charge ridiculous prices for their GPUs, would basically be able to set prices at whatever they want. Intel is a very, very distant third with their Arc series of cards, so Nvidia has nothing to fear from them. I imagine many will have switched from PC gaming to a console already because of component prices, and as long as a plug-and-play console costs around £500 compared to a basic PC at £1,000-1,200 or a high-end one at £2,000+, even more will end up jumping ship.
 
That's what people were saying about CPUs as well: "Intel stagnation, blah blah blah." Now that we have competition, mainstream CPUs went from €330 up to €800. Two companies do not competition make.
 
Perhaps that is the point: AMD can't fight Nvidia in the dGPU space, but they still win if people move to consoles.

AMD's console SoC business is ramping up: handhelds, the Tesla with its PS5-type chip in it, even phones.
Fight at the top end? There's no need. Like you say, they win with consoles too.

I'm sure I can't be the only person who will buy a GPU that's 'good enough' for me but won't dish out the sort of cash that could buy a PS5 and a 4K TV.
 
It's all about perceptions.

Nvidia are perceived as the "premium" product and therefore command the premium price.

AMD are always seen as the "better bang for buck" alternative.

And both are priced accordingly, outside of any mining boom.

The irony of it all is that if Nvidia wanted to, they could very easily out-price AMD completely and destroy them. So the question is: why don't they?
 
That's what people were saying about CPUs as well: "Intel stagnation, blah blah blah." Now that we have competition, mainstream CPUs went from €330 up to €800. Two companies do not competition make.
CPU prices are around the same as they have been for ages, maybe even a little cheaper. Both Intel and AMD used to charge £1K+ for the highest-end desktop chips; the 7950X is ~£550 with a game bundled now. The 3D versions are a bit steep, but adding the extra cache chip costs money. The motherboards and Nvidia GPUs are insane, but CPUs, not so much. Intel had ~2-4% improvement per generation and stuck with quad cores for ages; if it was not for AMD, we would still be getting 2-4% quad cores, **** Intel for that ****! Competition forced Intel to pull its finger out, and now we have MUCH better options.
 
That's what people were saying about CPUs as well: "Intel stagnation, blah blah blah." Now that we have competition, mainstream CPUs went from €330 up to €800. Two companies do not competition make.

That's not entirely correct though; competition isn't all about price, it's about a number of aspects. 'Intel stagnation' was a very real thing. Without Ryzen it's hard to say how dire Intel's lineup might look today. It was AMD that started ramping up core counts more aggressively on mainstream desktop parts. Yes, Intel had done it on their enthusiast platforms like X58, but you sure had to pay a pretty price for it, and they were happy to keep giving everyone else four cores right up to the 7700K!

CPUs are actually pretty good price-wise these days, IMO. You've got top-end parts like the 13900K available for under £600. I remember when products like the P4 EE or QX9770 were £1,000+.
 
Well, I know people don't care about the actual numbers and just want to sh**** on Intel, but the reality is that Intel, during their stagnation era (2011-2017), gave us on average the same MT performance increase per year that AMD gave us between 2017 and 2022 (R7 1700 to R5 7600X). People can argue all they want about Intel's stagnation, but the numbers prove them wrong. Watch how I get quoted with flawed comparisons now, because people don't like what the numbers show.
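One way to make a "per-year increase" claim like this precise is the annualized (geometric-mean) growth rate between two benchmark scores. A minimal sketch; the scores below are placeholders, not real benchmark data:

```python
# Annualized (compound) growth rate implied by two benchmark scores.
def annualized_growth(start_score: float, end_score: float, years: float) -> float:
    """Compound annual growth rate between start_score and end_score."""
    return (end_score / start_score) ** (1 / years) - 1

# Illustrative placeholder only: a part whose MT score triples over 6 years
rate = annualized_growth(100, 300, 6)
print(f"{rate:.1%} per year")  # ~20.1% per year
```

Comparing eras this way (same metric, same per-year basis) is the only fair version of the argument either side can make.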
 
So you can't tell the difference in a blind test. As expected.
DLSS issues are visible in motion, not on static images. If you can't see that in games, it's a clear bias. I suggest you check Call of Duty (second to last), first mission in the forest, and look up a bit. Pretty with DLSS? No issues in a screenshot, but then start moving and enjoy whole-screen flickering with artifacts on tree branches caused by DLSS sharpening. There are many more instances in many games of image instability, horrible sharpening artefacts and horrible ghosting.

Check the latest video by Hardware Unboxed, where they showed quite a few. They did say it's better than FSR2 in almost all cases, but it's still a net image-quality loss even at 4K (quality setting). Even the newest DLSS 3 isn't free of these issues. Having tested many games with it on my 4090, I decided never to use it again unless there's no other option. Frame generation I find very useful for CPU-limited areas, but standard DLSS is just bad for image quality, period.
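For context on what "4K, quality setting" means internally: DLSS renders at a fraction of the output resolution and upscales. Using the commonly cited per-axis scale factors (not taken from any official SDK documentation), the internal resolution works out as:

```python
# Internal render resolution per DLSS mode, using commonly cited
# per-axis scale factors (assumed values, not from official SDK docs).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Approximate internal render resolution before upscaling."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So even "Quality" at 4K is reconstructing from roughly 1440p worth of pixels, which is why motion artefacts show up where stills look fine.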
 


I find DLSS looks better than native in most games at 4K, so I usually enable it regardless of performance, even if I was already getting 100fps at 4K. And that's fine; everyone is entitled to their opinion and can play games however they want.
 
Sure, I can upload videos as well for whoever wants to take the blind test. I've done this 10 times and no one has managed to figure out which one is upscaled and which one isn't; it's a waste of time at this point.
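For what it's worth, a blind A/B test like this is easy to keep honest with a small script that shuffles which clip is shown as "A" and which as "B" per trial, revealing the answer key only after the guess. A sketch with hypothetical file names:

```python
import random

# Shuffle which clip is shown as "A" and which as "B" for each trial,
# keeping an answer key to check guesses afterwards. File names are
# hypothetical placeholders, not real clips.
def blind_trials(native_clip: str, upscaled_clip: str, trials: int = 10):
    """Return one (shown_as_a, shown_as_b, answer_key) tuple per trial."""
    result = []
    for _ in range(trials):
        order = [native_clip, upscaled_clip]
        random.shuffle(order)  # the viewer only ever sees labels "A" and "B"
        result.append((order[0], order[1], {"A": order[0], "B": order[1]}))
    return result

for shown_a, shown_b, key in blind_trials("native.mp4", "dlss_quality.mp4", trials=3):
    print("Which of clip A / clip B is upscaled?")  # reveal `key` after the guess
```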
 
I find DLSS looks better than native in most games at 4K, so I usually enable it regardless of performance, even if I was already getting 100fps at 4K. And that's fine; everyone is entitled to their opinion and can play games however they want.
Exactly that. Especially native + TAA looks blurry as heck. I always have it on even though I don't need the performance, just for the image-quality increase.
 
I found that it's only better if the game has horrible native AA, and only for that reason. There's a noticeable texture-quality loss, sharpening artefacts (fixable in some cases by disabling the sharpening in DLSS and using the driver's instead), various motion artefacts (shimmering of objects and ghosting), and it often softens the whole image as well. Thin lines look better (again, the AA), but that's about it; that seems to be enough to make many people happy, but objectively speaking it doesn't make up for all the other issues. HU described it well and showed plenty of examples where DLSS (and even more so FSR2) at 4K quality just doesn't look great.
 
Sure, I can upload videos as well for whoever wants to take the blind test. I've done this 10 times and no one has managed to figure out which one is upscaled and which one isn't; it's a waste of time at this point.
I have friends who see no difference at all between AA on and off in games, or between ultra and medium settings. That doesn't prove anything about quality loss; it only proves that most people do not care either way.
 