AMD RDNA3 unveiling event

Both the 3090 and 3090 Ti were halo cards meant to eke out every last bit of performance over the 3080, just like the Titans. They were only meant for buyers who wanted the best, or for those who use that high VRAM for professional purposes. They were meant for a user base that would swap the 3090 for the Ti months down the line without batting an eye. In this context, how was the pricing any worse than the Titans'?

The 3080 was the practical gaming flagship last generation, around 10-15% slower than the 3090, and the 3080 Ti was also a good bit cheaper than the 3090 whilst being just 5% slower.

The x90 tier is meant for those who just want the absolute best, with efficiency and pricing thrown out of the window.

It's virtually certain that Nvidia will come out with a 4080 Ti near the 7900XTX price point and performance, which many here say is value for money.

They might, but they won't like it: the margins on that 4080Ti will be very much lower than AMD's on the 7900XTX.

The last thing Nvidia wanted right now, with their manufacturing costs so high, was a price war with AMD, who are making 4090-class cards that are half the size, with cheaper VRAM, much more efficient, and on a cheaper TSMC node.

DLSS and RT were meant to kill AMD's competitiveness, know your place AMD, this is so annoying....
 
Who else has just lost their margins lead to AMD trying to compete with them? hhmmmm???

Hello Mr Biden can i haz a $52 Billion bailout plox??? I'm skint. AMD been bullying me...
 
DLSS and RT were meant to kill AMD's competitiveness, know your place AMD, this is so annoying....

They can't force AMD out with RT.

For it to matter, seriously matter, RT needs to be everywhere, cheaply accessible and game breaking if not working.

Instead it's niche, horribly expensive and typically optional. Which means the games are designed to be played without it. Which should absolutely be the case for a graphics toggle. But the point is RT pressure is trivial.

By the time it's more widely adopted AMD will have chugged along gradually improving its RT and chances are it's going to be just fine when RT becomes a meaningful factor.
 
Both the 3090 and 3090 Ti were halo cards meant to eke out every last bit of performance over the 3080, just like the Titans. They were only meant for buyers who wanted the best, or for those who use that high VRAM for professional purposes. They were meant for a user base that would swap the 3090 for the Ti months down the line without batting an eye. In this context, how was the pricing any worse than the Titans'?

The 3080 was the practical gaming flagship last generation, around 10-15% slower than the 3090, and the 3080 Ti was also a good bit cheaper than the 3090 whilst being just 5% slower.

The x90 tier is meant for those who just want the absolute best, with efficiency and pricing thrown out of the window.

It's virtually certain that Nvidia will come out with a 4080 Ti near the 7900XTX price point and performance, which many here say is value for money.

The 4080Ti would cannibalise the already slow sales of the 4090. The only reason the 3080Ti existed is so Nvidia could get more money by diverting chips from the 3080. They were selling every GPU they could make at excessive margins just over a year ago, due to chip shortages and scalpers/miners. Nvidia will not release a 4080Ti any time soon, and when they do it won't be "only" $1,000. They still haven't announced any price cuts on the 4080 16GB, and when they do it will look bad; it will not be a case of "ooh, well done Nvidia".

AMD have the 7900XTX and 7900XT already overpriced IMHO, yet purely because of the 4080 joke prices they get to look like good value. I have even had to remind myself that the 7900XT should have been the 7800XT, when I think "if the 7900XT is less than £1k I might just get one".
 
They can't force AMD out with RT.

For it to matter, seriously matter, RT needs to be everywhere, cheaply accessible and game breaking if not working.

Instead it's niche, horribly expensive and typically optional. Which means the games are designed to be played without it. Which should absolutely be the case for a graphics toggle. But the point is RT pressure is trivial.

By the time it's more widely adopted AMD will have chugged along gradually improving its RT and chances are it's going to be just fine when RT becomes a meaningful factor.

In the here and now, they can get away with not matching Nvidia on RT, especially since, going by forums like this, even most enthusiasts apparently don't care for it. The problem is the future and the longevity of GPUs, for when RT does get used more heavily and, most importantly, isn't an optional setting. Pretty much every game coming out these days has it now, be that a basic implementation or a heavy one; it's clear at this stage where the future of gaming is going. Of course, that's not an issue for people who upgrade every 1-2 years, only for those who want to hold onto their GPUs for longer, in the same way people say they want more VRAM so their GPU lasts longer.

Also, it's not expensive to buy a GPU with RT support. It's only "expensive" if you want to "max" it out in games like Cyberpunk 2077 at high resolutions and refuse to use upscaling tech.

The other thing to consider here is how much Nvidia are investing in RT (not just from a hardware POV but also software and all their partnerships) while AMD invest very little in it. How will things look in the future? Will it be like the past, where AMD lagged behind in raster and DX11 for several years because they weren't proactive and only focused on short-term results? Of course, AMD could decide to really focus on RT in future releases once titles start to up the RT effects or move to RT being standard rather than an optional setting. But IMO they will need to start looking into it more from next gen onwards, and treat it the way they treated low-level APIs such as Mantle (now Vulkan) and DX12, where they arguably led compared to Nvidia. As it stands, IMO Nvidia have a significant head start with RT in every area possible, and in the long run this could pay off for them.
 
They can't force AMD out with RT.

For it to matter, seriously matter, RT needs to be everywhere, cheaply accessible and game breaking if not working.

Instead it's niche, horribly expensive and typically optional. Which means the games are designed to be played without it. Which should absolutely be the case for a graphics toggle. But the point is RT pressure is trivial.

By the time it's more widely adopted AMD will have chugged along gradually improving its RT and chances are it's going to be just fine when RT becomes a meaningful factor.

Unlike PhysX, RT does matter; maybe not so much right now, but in the long run.

The problem is Nvidia had an idea, something that from their perspective would give them an edge, which it does. However, they didn't think AMD would have their own version of it so quickly. To some extent that is right: Nvidia's is very much better.

But compared with AMD that comes at a cost. The RT cores in Nvidia GPUs, along with the Tensor cores and the 96MB of L2, are what make Nvidia's third-generation RT so fast; the cost is the amount of die space that takes up, very much more than AMD's version, which is far more compact and efficient. That's AMD's MO: their entire thinking is compact, efficient, low-cost modular design, and they are so damned good at engineering solutions, always have been. AMD designed integrated multi-core circuits that are compact and efficient while Intel tried gluing multiple CPUs together; whose design won out, and whose design are Intel using now? Again, AMD designed compact integrated circuits for 64-bit logic while Intel tried building a whole different architecture that ended up massive, with very high power consumption, at half the performance of AMD64. Guess whose design won out, and whose design Intel are using now.

Look at Sapphire Rapids, look at the size of it, now look at Bergamo with twice as many cores...

The Navi 31 logic die is half the size of AD102, and the latter is also on a better node. AMD will keep growing their RT performance without making it a giant, power-hogging beast.
 
They can't force AMD out with RT.
Nvidia doesn't want to kill off its competitor and draw the government's ire as a monopoly when instead it can let AMD stay on life support with 15-20% market-share which predominantly results from the lower margin bottom of the market anyway. They're ecstatic with the current arrangement.
 
Nvidia doesn't want to kill off its competitor and draw the government's ire as a monopoly when instead it can let AMD stay on life support with 15-20% market-share which predominantly results from the lower margin bottom of the market anyway. They're ecstatic with the current arrangement.

Don't kid yourself: Intel was going to revoke AMD's x86 licence. If AMD hadn't seen that coming, invented 64-bit logic and then tacked it onto x86, creating x86-64, Intel would have become the sole x86 provider a long time ago. Intel never believed AMD had any right to make x86-based CPUs, and they still don't.

We used to have about 10 different GPU makers. Nvidia want to be the sole survivor, and they work every minute of every day to try and make that happen.
 
The 6900XT was faster than the 3090 at 1080p and 1440p in raster and was trading blows with the 3090 at 4K, and it was significantly cheaper, yet the 3090 sold significantly more. We can discuss all day whether RT matters, but when you are pricing your product at a grand, even the smallest things matter. It's an enthusiast price tier, and RT is important for that reason.
 
Nvidia doesn't want to kill off its competitor and draw the government's ire as a monopoly when instead it can let AMD stay on life support with 15-20% market-share which predominantly results from the lower margin bottom of the market anyway. They're ecstatic with the current arrangement.

Why are you talking of life support and being ecstatic?

AMD isn't a company living off a 20% PC gaming GPU share. Nvidia has nothing but its GPUs.
 
Why are you talking of life support and being ecstatic?

AMD isn't a company living off a 20% PC gaming GPU share. Nvidia has nothing but its GPUs.

They couldn't survive off that alone; no one could. With an imbalance of revenue like that, the smaller player can't fund R&D to keep pace with the larger one; there's nowhere near enough income from a 20% share. It's why ATI was going bust. AMD have Ryzen revenue to lean on.
 
The 6900XT was faster than the 3090 at 1080p and 1440p in raster and was trading blows with the 3090 at 4K, and it was significantly cheaper, yet the 3090 sold significantly more. We can discuss all day whether RT matters, but when you are pricing your product at a grand, even the smallest things matter. It's an enthusiast price tier, and RT is important for that reason.

Mining was also a thing, and the 30 series was so much better at it. Even if I could have bought the 6800XT at MSRP I would still have chosen the 3080 FE; it paid for itself with mining, plus more.
 
The 6900XT was faster than the 3090 at 1080p and 1440p in raster and was trading blows with the 3090 at 4K, and it was significantly cheaper, yet the 3090 sold significantly more. We can discuss all day whether RT matters, but when you are pricing your product at a grand, even the smallest things matter. It's an enthusiast price tier, and RT is important for that reason.
The 3090 sold more because it had double the hash rate, which meant that after a few months of mining they worked out at a similar price.
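The break-even claim above is just arithmetic; a tiny sketch with entirely made-up prices and daily earnings (the real figures swung constantly with crypto prices) shows how a higher hash rate erodes a higher sticker price:

```python
# Hypothetical numbers only: a pricier card that earns twice as much per day
# closes, then reverses, the up-front price gap after a few months of mining.
def effective_price(purchase_price, daily_earnings, months):
    """Purchase price minus mining income earned over the period."""
    return purchase_price - daily_earnings * 30 * months

# Assumed: $1,500 card earning $10/day vs $1,000 card earning $5/day.
gap_at_start = 1500 - 1000
gap_after_4_months = effective_price(1500, 10, 4) - effective_price(1000, 5, 4)
print(gap_at_start, gap_after_4_months)  # gap shrinks from 500 to -100
```

Under these assumed figures the more expensive card is actually cheaper in net terms after four months, which is the sense in which the two "worked out at a similar price".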
 
Mining was also a thing, and the 30 series was so much better at it. Even if I could have bought the 6800XT at MSRP I would still have chosen the 3080 FE; it paid for itself with mining, plus more.
You can check the Steam charts as well. Ampere as a whole crushed RDNA2. IIRC, the 3090 alone has almost half the share of the entire RDNA2 lineup.
 
You can check the Steam charts as well. Ampere as a whole crushed RDNA2. IIRC, the 3090 alone has almost half the share of the entire RDNA2 lineup.

Yes, I know, but it doesn't happen overnight. The question is whether they can keep going. Looking years ahead, it will be interesting to see what both have in the lower tiers, where most of the market is.
 
The 3090 sold more because it had double the hash rate, which meant that after a few months of mining they worked out at a similar price.
As per the October Steam survey, the RTX 3080 Ti, which was an LHR GPU, had more share than the 6900XT, 6800XT and 6700XT combined, despite costing more than all three.
 
As per the October Steam survey, the RTX 3080 Ti, which was an LHR GPU, had more share than the 6900XT, 6800XT and 6700XT combined, despite costing more than all three.
You are saying what people already know. More people buy Nvidia because of mindshare, and because the overwhelming majority of prebuilts and laptops (with dedicated graphics) come with Nvidia graphics cards.

I know people who will only buy Nvidia but have absolutely no idea what they are talking about, it's just what someone told them once so they stick to it.
 
Yes, I know, but it doesn't happen overnight. The question is whether they can keep going. Looking years ahead, it will be interesting to see what both have in the lower tiers, where most of the market is.
They can only keep doing this as long as they have a unique selling point for their cards. Raster performance within 5-10% of each other might as well be an error margin, unless we are talking about games right on the cusp of 60 FPS, which in my experience with the 3080 Ti was only the case in RT games. When performance is so similar, the differentiators are price and features. When you enter halo card territory, the price becomes kind of irrelevant, so it's a war of features. At the mid-range, price is the differentiator, but here you have an Nvidia card which can somewhat run RT games with 1-2 effects turned on and has DLSS in addition to FSR, while with the AMD card RT will be inferior and you won't get DLSS.

AMD needs a unique selling point for their cards if they want to gain marketshare. Just tailgating Nvidia in rasterisation while charging lower prices won't cut it because now the market thinks Nvidia is the more premium option with more features.
 
You are saying what people already know. More people buy Nvidia because of mindshare, and because the overwhelming majority of prebuilts and laptops (with dedicated graphics) come with Nvidia graphics cards.

I know people who will only buy Nvidia but have absolutely no idea what they are talking about, it's just what someone told them once so they stick to it.
I think you are doing Nvidia customers a disservice by saying they are uninformed. The 3080 Ti cost well over £1,200 through its lifecycle. People buying a card like that, or a prebuilt with it (likely to have an astronomical price), will be doing their due diligence and know what the alternatives are. I can agree with you if we are talking about something £500 or lower, but a customer spending a grand on something knows what they are doing.
 
You seem to think RT is such a big difference; IMO it isn't in the mid and lower tiers. Look at the poll on this forum: pricing matters in the lower tiers. I'm interested in what we will get in the £600-£700 range.
 