
RDNA2 Refresh for 2022 6x50XT cards

The trouble is that price/performance in the GPU market has been stagnating for the last five years or so, especially at the lower end, where the same money tends to get you a tier lower every generation, and that card isn't much faster than the tier above it from the previous generation. Meanwhile, at the high end prices keep ballooning and dragging everything else up, so you're effectively forced to spend more or you don't end up with much of an upgrade over what you had before.

The issue is not only the top going up, it's also the class of dGPU you are getting as you go down the range. It's odd that cars can have luxury models without those affecting the mainstream ones, yet with electronics this game gets played. It's also the reason why consoles are even more relevant than they should be. The average PC gamer doesn't have the GPU and CPU hardware many on tech forums do; the GTX 1060 and GTX 1050 Ti are still common dGPUs even now. What this means, unlike in the era of Crysis, is that entry-level and mainstream gaming systems increasingly don't have enough processing power to enable big jumps, so more focus has gone onto consoles instead. Both Nvidia and AMD are to blame for this, as they have no real desire to compete too hard; it's easier just to mildly poke each other. It's like the smartphone market a while back, when Apple and Samsung did the same until Chinese OEMs came along and gave them competition (especially in the mainstream pricing tiers), forcing them to wake from their slumber.
 
Yeah, the car and phone markets have plenty of competition, yet the GPU market is a duopoly where neither company really wants to battle the other on price and performance. They're happy to sit back and watch the $$$ roll in while offering as little performance as they can get away with, since it means people have to upgrade again sooner.
 

Which is why I am starting to moderate the games I play now. Luckily in the PC market we have thousands of games from the last 20 years we can play, and the reality is that it's our "need" to run AAA games with fancy graphics at a certain FPS that forces us to upgrade more often than not. So in my view, I am increasingly going to upgrade only when it makes financial sense to do so, and if I need to wait a bit longer to run a game, then so be it.
 
It also gives time for the display situation to settle down. It's bad enough with dGPU prices being so high now, but a good display can cost just as much. When you look at the TV side, they have similar features and larger screens yet cost less than some monitors.
 

With TVs you have economies of scale though: many times more TVs are produced and sold than fancy monitors, so manufacturers can optimise production and the cost of tooling added to each unit is much smaller. Good monitors will always be much more expensive for the size because of it.
 
AMD on the other hand want a bigger share of our custom, because mining is not a long term thing,

More like AMD cards are poor at mining compared to Nvidia, so they make the excuse that mining is not a long-term thing. Either way, Nvidia are making billions from mining, be it short term or long term, and the cards are best for gaming too, so it's a win-win for the card that can game and mine.
 

Utter rubbish I'm afraid. For mining it goes 3090 > 3080 > 6800/6800 XT/6900 > 3070 <> 3060 Ti. In fact, for output per watt the RX 6600 is king (30 MH/s at 55 W).
 

Utterly clueless. :cry:
 
Not really. Most are using the 5700 as they do 51 MH/s for 120 W, but more recently it's the RX 6600. In 2020, when 3060s could be bought by the pallet load, they were used a lot, but now it's a lot of AMD cards.
 
The A2000 and 6600s are best for efficiency. Unless you're going for minimal physical space, where the beefy cards could muscle in, but basically non-miners are gonna hate, and hate with inaccuracy.
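For anyone who wants to sanity-check the efficiency claim, here is a minimal sketch using only the figures quoted in this thread; the 30 MH/s at 55 W and 51 MH/s at 120 W numbers are forum anecdotes, not measured benchmarks.

```python
# Hashrate-per-watt comparison using the rough figures quoted above.
# These are anecdotal numbers from the thread, not measured benchmarks.
cards = {
    "RX 6600": {"mhs": 30.0, "watts": 55.0},   # "30 mhash at 55w"
    "RX 5700": {"mhs": 51.0, "watts": 120.0},  # "51 mhash for 120w"
}

for name, c in sorted(cards.items(), key=lambda kv: kv[1]["mhs"] / kv[1]["watts"], reverse=True):
    eff = c["mhs"] / c["watts"]
    print(f"{name}: {c['mhs']:.0f} MH/s / {c['watts']:.0f} W = {eff:.2f} MH/s per watt")

# Prints roughly:
#   RX 6600: 30 MH/s / 55 W = 0.55 MH/s per watt
#   RX 5700: 51 MH/s / 120 W = 0.42 MH/s per watt
```

On those numbers the RX 6600 comes out roughly 30% ahead per watt, which is where the "output per watt, the RX 6600 is king" claim comes from.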
 
I cannot imagine AMD are happy with under 20% of the GPU market share, and the best way to gain market share is by price. Even if Wall Street might not like low margins, anything except the lowest-end GPUs should still have way better margins than consoles.

So I would expect that as soon as AMD sort out some more capacity, they will once again have to compete on price. RDNA2 compares very well with Ampere aside from upscaling and RT, and since AMD seem to have started pouring a lot more into R&D, I expect RDNA3 and onwards to be even more competitive. But all that effort just to sit at 16-18% market share? That would be strange.
 
Marketing is what they need to focus on. They need to shake off the budget tag.
 

Wall Street only cares about margins, as that is what really drives share price speculation. Yet they have learnt nothing over the last 50 years from Japan, South Korea, Taiwan and China coming in from the bottom and displacing a lot of our companies. All of them are obsessed with Apple, forgetting that Apple has ceded a large percentage of the smartphone and PC market to lower-margin competitors.


Most gamers don't have deep pockets. The reality is that the most common GPUs on Steam are the GTX 1060, GTX 1050 Ti, GTX 1650 and so on.

Both AMD and Nvidia are going to slowly cause the deterioration of the PC gaming market if they forget entry-level and mainstream gamers.

Over the last few years, despite the gaming market massively increasing in revenue, smartphones and consoles have been taking an increasing percentage of that revenue. So all the articles about PC gaming revenue ignore that the whole gaming market is larger and that other, more budget-friendly platforms are slowly gaining share.
 

AMD needs better rasterisation performance than Nvidia, which it's rumoured they will have in the next round.
RT performance needs to be at least equal.
Power consumption needs to be lower, which again it looks like it will be.
They need a version of FSR that's as good as DLSS.
They need to be cheaper.

Only if all those things are met might AMD eat into Nvidia's market share.

But you will still get some idiots. Hardware Unboxed reviewed the RX 6600 as being as fast as and cheaper than the RTX 3060, but still concluded the GPU "sucked" by claiming it was only as fast as the 5600 XT, when their own slides show it's marginally faster than an RX 5700.
This also explains why they recommended the RTX 3050 for up to $450 while completely ignoring the RX 6600, which was in stock at the time for $499. Right now the RTX 3050 starts at $600, IMO because most reviewers focused on comparing the RX 6500 to the 3050 and in that way gave it a glowing review. Yeah... well done, you did it again.
 
The way AMD's RT works, if they increase rasterisation performance, the RT performance should increase nicely. The bigger problem here is the lack of performance increase in the entry-level and mainstream markets, and I mean just in rasterised performance.

The fact that dGPUs such as the GTX 1060/RX 480 are still in the rough ballpark of what is available under £300 (after six years) is really not great IMHO.

I think this is very similar to what we saw in the phone market when Apple and Samsung mostly competed with themselves and we had increasingly subpar entry-level and mainstream offerings. Then we had the Chinese companies who made very good entry-level and mainstream phones, causing Apple and Samsung to have to respond.
 

If the RT-to-raster ratio doesn't change, then the 7900 XT should get 2.2x better RT performance than the 6900 XT.
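To spell out the arithmetic, here is a minimal sketch; the 2.2x raster uplift is the rumoured figure being assumed above, not a confirmed spec, and a constant RT-to-raster ratio is the assumption being made.

```python
# Sketch of the scaling argument: if the RT-to-raster ratio stays constant
# between generations, RT performance simply scales with raster performance.
# All figures are relative (6900 XT = 1.0); the 2.2x raster uplift is the
# rumoured RDNA3 number used above, not a confirmed spec.
raster_6900xt, rt_6900xt = 1.0, 1.0   # normalised baseline
raster_uplift = 2.2                   # rumoured 7900 XT vs 6900 XT raster gain

rt_to_raster = rt_6900xt / raster_6900xt            # assumed unchanged for RDNA3
rt_7900xt = raster_6900xt * raster_uplift * rt_to_raster

print(f"Implied RT uplift over the 6900 XT: {rt_7900xt / rt_6900xt:.1f}x")  # 2.2x
```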

But that is irrelevant: if AMD's RT performance is anything more than 10% slower than Nvidia's competing card's, it will be used as a reason to crap on it.

Basically, the way this has to work is: don't give them a reason, any reason, because most of them just copy and paste the script Nvidia writes for them.
 

AMD needs to get out there and get more games better optimised for the way it does RT. The sad reality is that, just like with tessellation, Nvidia will make sure games run best on its uarch, which is counter-productive for how AMD does it IMHO. Consoles will help, no doubt, but it's going to take some time!
 

If AMD keep going in the direction they are with their technology, Nvidia will find it increasingly difficult to keep up, just as Intel is. Yes, I know ADL is better, but the CPU core is 3x the size of the Zen 3 core with 3x the power consumption for 15% better IPC; when you look at it that way, which you should, ADL is pathetic in comparison.

Nvidia's RDNA3-competing GPU will be the size of one of AMD's TRX40-socket heat spreaders, use a ridiculous amount of power, a lot more than RDNA3, and still be significantly slower.
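Put as back-of-the-envelope arithmetic, a minimal sketch; the 3x area, 3x power and 15% IPC figures are the estimates claimed above, not measured die sizes or power numbers.

```python
# Perf-per-area and perf-per-watt using the ratios claimed above
# (3x core area, 3x power, 15% higher IPC). These are the post's own
# estimates, not measured figures.
area_ratio  = 3.0    # Alder Lake core area vs Zen 3 core (claimed)
power_ratio = 3.0    # Alder Lake core power vs Zen 3 core (claimed)
ipc_ratio   = 1.15   # 15% higher IPC (claimed)

print(f"ADL perf per unit area vs Zen 3: {ipc_ratio / area_ratio:.2f}x")   # ~0.38x
print(f"ADL perf per watt vs Zen 3:      {ipc_ratio / power_ratio:.2f}x")  # ~0.38x
```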
 

But they do need to use their newfound riches to get better software optimisation for their designs. It's what Intel and later Nvidia did... and why even Nvidia dGPUs that were not technically superior could still win the day. It's also why CUDA got such a foothold in certain other markets.
 