
AMD RDNA3 unveiling event

Absolutely right. Nvidia are trying to tell us expensive dGPUs are here to stay and it isn't their fault. Yet AMD show that a competitive dGPU can be made for much less than Nvidia are trying to convince us is the new norm.

If the 7800XT matches the 4080 16GB in raster for ~$500 less, then no amount of "but ray tracing" will make it worth that premium. In essence both GPUs will require upscaling to become playable with RT. So that ~$500 extra doesn't really get you anything tangible.

At 4K I can already get 40+ FPS in Nvidia-sponsored RT games on my 3080 FE. And yet people still hold up CP2077 as some sort of RT nirvana, despite it being a two-year-old game most people will have stopped playing long ago.

If people want better-priced AMD dGPUs, then they need Nvidia to respond and drop their prices. Would I prefer the RX7950XT to be £799 and the RX7900XT to be £599? Sure! But how is that going to happen, when AMD is already beating Nvidia in price/performance without even trying? Nvidia needs to drop prices.

But if they are trying to sell an RTX3050 for RX6600/RX6700-level money, then they are not even responding to RDNA2 price drops, despite significantly better AMD dGPUs being available for the same price. The RX6600 is around 30% faster than an RTX3050 in rasterised games, and they trade blows in playable RT games too.

Also, wait for Atomic Heart to be the next "reason" why Nvidia cards are worth the premium.
 

That's my point exactly. Nvidia are telling us the $1200 4080 16GB is the new norm. AMD called their bluff with the $1000 7900XTX that is a lot faster in raster but slower in RT. I am not saying the 7900XTX price is good, just not anywhere near as bad as the 4080 16GB. I genuinely cannot remember anyone (AMD or NVidia fans) being impressed by the 4080 12GB or even 16GB price/performance.

The ball is in Nvidia's court, and the fact there are no price cut announcements to take the wind out of AMD's sails is quite telling. Nvidia never miss a trick when it comes to marketing that makes AMD look like amateurs. Instead AMD have been getting positive coverage and feedback from most tech sites because their top-tier dGPU is actually quite competitive in raster, yet $600 cheaper.

Nvidia have never let that **** happen before and it is totally uncharacteristic for them to let AMD have this much positive press.
 

They have missed massively on revenues and they have lost over 20 percentage points of margin, which is now quite close to AMD's.

It's quite clear they are selling their dGPUs for less money to Dell etc., because I have seen some decent prebuilt deals. They are just taking the mickey with the PC enthusiast market, using us to prop up their margins.

The thing is, even if they unlaunch the RTX4080 16GB at £1300 and put it down to "only" £900, it looks very reactionary.
 
I agree, but weren't there some benchmarks that showed the 4080 16GB a lot closer to the 4090 than the earlier slides did?

Not sure where the 4080 Ti is going to sit between the 4080 16GB and the 4090 at the moment, as prices are insane for the 4080 16GB.

The 4070 16GB has 47% fewer shaders than the 4090; if people are posting benchmarks close to the 4090, those results are down to a CPU bottleneck. The 4070 16GB isn't going to be much faster than a 3090Ti.
 
I fail to see how getting gouged by all vendors is good for consumers.

I'd say things are the worst they've ever been for people, even with Intel having "viable" products now.

Which is slower than a 6600XT and more expensive...

A 6650XT, an even faster card, is £300; the A770 is £450, £150 more for a slower card with actual problems.

It's not been a good start for Intel; they mean to start where Nvidia are...

We need to start appreciating AMD more because the other two are away with the fairies!

But "they are expensive to make"... that's not AMD's problem. AMD spent the R&D to make them efficient.
 
.. I genuinely cannot remember anyone (AMD or NVidia fans) being impressed by the 4080 12GB or even 16GB price/performance ..

I agree. I am normally an XX80 buyer and I am absolutely horrified by the price increases this time around. It's £1200+ for NVIDIA and £1000+ for AMD. The performance of the 4080 is utterly miserable compared to the price increase. They really have gone outside my comfort levels now. Instead of leaping out to buy one, I will be waiting, and for the very first time seriously looking at an AMD card. In the meantime I am going to "downgrade" my monitor so I don't have to keep feeding the NVIDIA thieves!
 

It's not even an ##80 class card, it's a 4070. :)

You would be buying a 4070 that's just been renamed the 4080.
 

Why would I appreciate any vendor that's pulling my pants down?
They're all crap
 
I don't understand why 4K is suddenly an issue for some people (not you). The 3090 Ti was a 4K card last gen, yet the 7900 XTX, which should perform better, is suddenly only a 1440p card.

My eyes are going to be abused when I plug in a 7900 XTX; I'm playing 4K on a 1080 Ti right now.

oh yeah, feel like you need to be a lawyer to say anything positive or negative right now about any card. I'll try it as a programmer

If (screen res = 1440p)
{
amazing performance, well done
}
else if (screen res = 4k)
{
great performance, well done. But turn down some settings for RT games
}
else if (screen res = Varjo Aero)
{
oh dear precious, you really brought that on yourself
}
 
That card has been cancelled, hasn't it?

No, that was the 4060 12GB.

GTX 970 = 780Ti
GTX 1070 = 980Ti
RTX 2070 = 1080Ti
RTX 3070 = 2080Ti

The 4080 16GB = 3090Ti

It has 47% fewer shaders than the 4090; that is a similar gap to the one between the x70 card and the high-end card in every generation I quoted above.

It's a 4070!
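To put rough numbers on that shader-gap argument, here is a minimal sketch. The CUDA core counts used (16384 for the RTX 4090, 9728 for the RTX 4080 16GB, 7680 for the unlaunched 4080 12GB) are the commonly published figures and are assumptions on my part, not numbers taken from the thread.

# Rough check of the "cut down relative to the flagship" claim above.
# Core counts are commonly published figures (assumed, not from the thread).
flagship_cores = 16384  # RTX 4090

cards = {
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB (unlaunched)": 7680,
}

for name, cores in cards.items():
    deficit = 1 - cores / flagship_cores
    print(f"{name}: {deficit:.1%} fewer shaders than the RTX 4090")
# RTX 4080 16GB: ~40.6% fewer; RTX 4080 12GB: ~53.1% fewer

On those figures the 16GB card sits roughly 40% below the flagship rather than exactly 47%; the precise percentage depends on which spec you count, but either way the gap is in the territory the x70 cards listed above occupied relative to their generation's top chip.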
 
A 2080 = 1080Ti = 2070 Super

I thought they had learned from the 2000 series; the 4000 series is much worse.
 

Yeah, they increased the shader count from 2304 to 2560 for the 2070 Super because it wasn't selling that well.

That brought it more in line with the 2080, which also got a Super refresh; they are all roughly around the 1080Ti.

[attached comparison chart]

It's just ridiculous that Nvidia upped the price for what is a ##70-class card from $500 to $1200 and re-branded it the 4080.

It's ####'ing outrageous.

It's where we are now with Nvidia, and Intel have come straight in with inflated prices... the 6650XT on that chart is £150 cheaper than the A770.
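As a quick sanity check of the numbers in that exchange: the 2070 and 2070 Super shader counts (2304 and 2560) are from the post above, the 2080's 2944 is a commonly listed figure I am assuming here, and the price jump is the $500-to-$1200 complaint.

# Back-of-the-envelope check of the shader bump and the price jump discussed above.
rtx_2070, rtx_2070_super, rtx_2080 = 2304, 2560, 2944  # 2944 is an assumed spec figure

super_uplift = rtx_2070_super / rtx_2070 - 1   # ~11.1% more shaders than the 2070
gap_to_2080 = 1 - rtx_2070_super / rtx_2080    # still ~13.0% short of the 2080
print(f"2070 -> 2070 Super uplift: {super_uplift:.1%}, gap to 2080: {gap_to_2080:.1%}")

price_jump = 1200 / 500 - 1                    # the $500 -> $1200 complaint
print(f"$500 -> $1200 is a {price_jump:.0%} increase")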
 
It really doesn't matter at this stage. If after all this (their recent moves) Nvidia still maintain or grow that 80% market share, then no amount of harping on about it is going to make a blind bit of difference.
 

That pseudocode a few posts back needs some nesting, indentation or an ending clause :p
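For anyone who wants that joke to actually run, a minimal tidy-up might look like the sketch below; it is just one way to write it (Python chosen for brevity), not anything posted in the thread, and the verdict strings are the original poster's.

# A runnable take on the pseudocode above, with nesting, indentation
# and an ending clause added.
def verdict(screen_res: str) -> str:
    if screen_res == "1440p":
        return "amazing performance, well done"
    elif screen_res == "4k":
        return "great performance, well done. But turn down some settings for RT games"
    elif screen_res == "Varjo Aero":
        return "oh dear precious, you really brought that on yourself"
    else:
        return "no verdict for that setup yet"

print(verdict("4k"))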
 
It really doesn't matter at this stage. If after all this (recent past moves) nvidia still maintain or grow that 80% then no harping on about it is going to make a blind bit of difference.

This is a business decision from AMD; they are not doing it out of charity. They expect to see market share growth from being significantly cheaper, and if that doesn't happen they will think again about those decisions, because selling 1 million units at $1000 is less money than selling 800,000 units at $1400, and their margins would jump a huge chunk to boot.
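That revenue comparison is easy to check; here is a minimal sketch using the post's own hypothetical unit and price figures, with an assumed $600 per-card cost added purely to illustrate the margin point.

# Sanity check of the hypothetical above: 1,000,000 units at $1000
# versus 800,000 units at $1400.
revenue_cheap = 1_000_000 * 1_000    # $1.00bn
revenue_dear = 800_000 * 1_400       # $1.12bn
print(revenue_cheap < revenue_dear)  # True: fewer units at the higher price earns more

# Margin illustration with an assumed $600 cost per card (hypothetical figure).
cost_per_card = 600
margin_cheap = (1_000 - cost_per_card) / 1_000   # 40%
margin_dear = (1_400 - cost_per_card) / 1_400    # ~57%
print(f"{margin_cheap:.0%} vs {margin_dear:.0%}")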
 
The portion of the market that is even somewhat discerning is 10% at most; the rest is sold on marketing. That's why all the hubbub on forums about this or that feature, ray tracing performance, $/fps etc. is largely meaningless. Most of the market buys on brand recognition, or just buys pre-builts and laptops, so it matters even less. That's ultimately why Nvidia has kept such a dominant position over AMD: AMD has had historically incompetent marketing teams on their best days. At the same time the supply constraints for them are real too, so that doesn't help.

If Radeon were its own company it would probably do much more to compete aggressively, but as it stands they can afford to languish while they keep building up the integration between Ryzen and Radeon across platforms (including HPC, consoles, etc.). And truth be told, for Radeon to really make a dent it is their work-related performance and features that need the most attention; without that they are forever cursed to remain #2, and even then they still risk being dethroned by Intel, as pathetic as Intel are right now. You can't count them out just yet.
 
You need people to abstain. Hopefully anyone genuinely needing to upgrade from something below a 3080, or needing a prebuild, just opts for the RDNA3 stuff; that is when Nvidia will sit up and take note.
 