Not much excitement for RDNA3, compared to the RTX 4000 series

AMD still makes most of their money from selling CPUs. Logically, it's probably better to focus most of their investment there.

It's not really in their interests to out-produce each other, unless they can scale production at low cost. They seem to have 'just enough' capacity at the moment. I would guess that scaling up chip production is always a big financial risk.
Thing is, while some of RDNA2's R&D was paid for by MS and Sony, that's not the case for RDNA3. Besides, if a new generation costs tens or hundreds of millions to design and tape out, then selling low quantities just because the CPU division can sell the wafers for more... well, why bother being in the GPU business at all?

The margin obsession ignores the huge fixed costs of designing a new generation of GPUs; they really need volume. Plus, their margins aren't Apple-crazy, so surviving on under 20% of the market is very dangerous, even if developers can't totally ignore them thanks to the consoles.
 
AMD don't give a rat's ass about the PC GPU market.


They do give off that impression. They just seem happy to let NVIDIA announce how good they are, then sweep in very late to try and compete in the mid-range.

Hopefully they have a quantum leap in ray tracing performance this year to match NVIDIA whilst bettering them in rasterisation.
 
Poor Volta?

As mentioned, AMD have been happy selling consoles as fast as they can make them. What few RDNA2 cards they made also sold out, at higher-than-normal prices.

I don't think RDNA3 will be a return to the days when they couldn't compete at the high end, but I can't see them beating the 4090. If they can compete in the $500-$1000 range, I think that's a good outcome.

Also, whilst people pretend they don't care, RT does matter, and FSR needs to improve to DLSS 2 levels. I am happy they ditched FSR1, which was ridiculous and just a sharpening filter.
 
They do give off that impression. They just seem happy to let NVIDIA announce how good they are, then sweep in very late to try and compete in the mid-range.

Hopefully they have a quantum leap in ray tracing performance this year to match NVIDIA whilst bettering them in rasterisation.

A lot of people don't care about RT, and AMD know that it doesn't matter what they do; people will still just buy NV.
 
100% agree. Raja, in my book, sabotaged the Radeon division at AMD while he worked there, and when he moved to Intel's dGPU division I knew the same was going to happen. As we're seeing, it's the same BS again from him, and the worst part is that he's now hiding and not even commenting on the released Intel dGPUs, which in reality are a failure for Intel and a huge waste of money so far. They're selling at a loss even at MSRP because they can't compete with AMD and NVIDIA on price-to-performance and reliable drivers. Worse still, you need a system that supports ReBAR, and DX11-and-under games basically perform a lot worse because the drivers use a software layer to translate DX11-and-under calls to DX12, which adds huge overheads so far (rough toy sketch of that below).

Meh, Intel lost the plot hiring Raja to run their dGPU department. The only reason I can see for them doing it is that he stole AMD IP to make these new Intel dGPUs and speed up Intel's dGPU efforts. But something smells very fishy about why they hired him, as he had already been shown to be a failure at AMD by then, and for many years before that with his performance lies about the new-gen cards being introduced at the time. The worst mistake Intel could have made, really, in my book, was hiring him.
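On the DX11-to-DX12 translation point above: here is a toy Python sketch, purely illustrative and nothing like real driver code, of why a per-call translation shim costs performance. Every legacy call pays for state mapping before the real work happens, and at thousands of draw calls per frame that adds up.

```python
import timeit

# Toy model only: these are stand-ins, not real D3D calls. The point is that
# a translation shim does extra per-call work before the real call is issued.

def native_call():
    """A call the driver services directly."""
    pass

DX11_STATE = {"topology": "trianglelist", "blend": "opaque", "raster": "cull_back"}

def translated_call():
    """The same call routed through a DX11-on-DX12 style shim: implicit
    DX11 state has to be mapped to explicit DX12-style state first."""
    mapped = {key: value for key, value in DX11_STATE.items()}  # per-call mapping cost
    native_call()

n = 1_000_000
print(f"native:     {timeit.timeit(native_call, number=n):.2f}s")
print(f"translated: {timeit.timeit(translated_call, number=n):.2f}s")  # several times slower
```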
Sabotage would be deliberate, which suggests he's actually competent; he isn't.

ARC is suspiciously like Vega. In fact, it's not just like it; it has exactly the same problems and behaves exactly the same way in specific games. It is Vega Mk2. Maybe that is why Intel brought him in, or maybe AMD sold Intel (lol) the IP, which (lol) they no longer wanted.

In any case, as an RTX 2070 Super owner, I find this hilarious.

Here it is, on sale for £450. I paid £480 for my MSI 2070 Super Gaming X.

A770 comes in at 8% slower than an RTX 2080.
The RTX 2070 Super is 7% slower than the RTX 2080.

So is this, and £100 cheaper.

Intel are not serious about competing in this space.
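Putting the two percentages above on the same baseline, a quick back-of-the-envelope check (taking the RTX 2080 as 100):

```python
# Normalise both cards against an RTX 2080 = 100 baseline.
rtx_2080 = 100.0
a770 = rtx_2080 * (1 - 0.08)            # "8% slower than a 2080" -> 92.0
rtx_2070_super = rtx_2080 * (1 - 0.07)  # "7% slower than a 2080" -> 93.0

# The two cards land within about 1% of each other.
print(f"A770 relative to 2070 Super: {a770 / rtx_2070_super:.1%}")  # ~98.9%
```

So on those numbers, a 2019 mid-range card and Intel's 2022 flagship are within a rounding error of each other.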

 
A lot of people don't care about RT, and AMD know that it doesn't matter what they do; people will still just buy NV.

The people that don't care about RT aren't dropping $1k on a card this generation.

RT has stopped being a special setting. RT GI and RT reflections will just be part of high and ultra settings.
 
The people that don't care about RT aren't dropping $1k on a card this generation.

RT has stopped being a special setting. RT GI and RT reflections will just be part of high and ultra settings.

Not sure how you can know that about every individual. Plenty of people bought expensive AMD cards last gen too.

The truth remains that NV have huge mindshare, especially outside of communities like this.
 
The people that don't care about RT aren't dropping $1k on a card this generation.

RT has stopped being a special setting. RT GI and RT reflections will just be part of high and ultra settings.
I don't care about RT and I'll pay over $1k to get my VR driving sims to run smoothly in night races.
 
Not sure how you can know that about every individual. Plenty of people bought expensive AMD cards last gen too.

The truth remains that NV have huge mindshare, especially outside of communities like this.

I mean, if my statement isn't true for 100% of people, does that make a difference? Maybe AMD can sell to the five people who will choose a worse-performing RT card (all else being equal) when we're talking about $1k graphics cards.

I would not pick a 6800 XT over a 3080 at their MSRPs. 6900 XT vs 3080 Ti? Easy choice.

The only reason I have an RX 6800 right now is that it was way cheaper on MM.

I will be looking for a 4K60/120 graphics card in the next six months. Not being able to run games like Spider-Man Remastered or Dying Light 2 with RT will matter.
 
VR, the one thing that is even rarer than RT.

Nvidia tried to push 8K gaming, but 8K might as well be 16K or 32K, as it's getting to the point where we can't see the difference unless the screen is two stories high and you sit 2 ft from it.

With VR, you are effectively holding a magnifying glass up to the screen, so higher resolutions offer a more noticeable improvement.
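To put rough numbers on that, here is a back-of-the-envelope sketch; the screen size, viewing distance, and headset figures are illustrative assumptions, not measurements:

```python
import math

def pixels_per_degree(h_pixels, width_cm, distance_cm):
    """Approximate angular pixel density for a flat screen viewed head-on."""
    fov_deg = 2 * math.degrees(math.atan((width_cm / 2) / distance_cm))
    return h_pixels / fov_deg

# A 32" 16:9 4K monitor (~71 cm wide) at a typical ~60 cm desk distance.
print(f"4K monitor: {pixels_per_degree(3840, 71, 60):.0f} ppd")  # ~63 ppd

# A VR panel spreads far fewer pixels over a much wider field of view,
# e.g. roughly 1900 horizontal pixels per eye across ~100 degrees.
print(f"VR headset: {1900 / 100:.0f} ppd")                       # ~19 ppd
```

Roughly a third of the angular resolution, which is why the same nominal pixel count looks so much coarser inside a headset.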
 
Nvidia tried to push 8K gaming, but 8K might as well be 16K or 32K, as it's getting to the point where we can't see the difference unless the screen is two stories high and you sit 2 ft from it.

With VR, you are effectively holding a magnifying glass up to the screen, so higher resolutions offer a more noticeable improvement.
Native 4K with AA on a 32” monitor still looks jaggy and shimmery to me unfortunately. I say unfortunately because I would love for the monitor I already own to have the highest perceivable resolution. Not quite there yet.
 
I still think an RTX 3060 Ti is enough, especially for 1080p.

It's a shame these cards are still being sold at £400+

Tbh, if buying right now, definitely get the FE, as these are still being sold on the website at MSRP. Shame there is still the one-per-household rule! Your best bet is to get someone else to buy one for you if someone in your household has already bought one; circumventing the rule otherwise now seems to be impossible.

RT is purely a nice-to-have.

Or the equally decent RX 6700 XT:
You can get 3070 Tis for £400 now, and 3080s for around £450 (Members Market).
 
Native 4K with AA on a 32” monitor still looks jaggy and shimmery to me unfortunately. I say unfortunately because I would love for the monitor I already own to have the highest perceivable resolution. Not quite there yet.

I'm happy with a quality 1080p monitor while sitting at my desk, but 1080p in VR (Rift CV1) makes everything look like a game of Minecraft.

The difference is more drastic when you wear the monitor on your face.
 
Poor Volta?

I'm still baffled as to why people bring that up years later. Volta never became a GeForce GPU because, according to leather jacket man, it was too expensive to put into that market at the time; hence "poor Volta", as it was widely expected to be the next GeForce GPU.
 
I'm still baffled as to why people bring that up years later. Volta never became a GeForce GPU because, according to leather jacket man, it was too expensive to put into that market at the time; hence "poor Volta", as it was widely expected to be the next GeForce GPU.

What are you talking about? Turing was the consumer card, and it wiped the floor with AMD. It is a perfect example of AMD hype that failed.

We don't need revisionist history.

RDNA2 was the first time in a long while that anything AMD did lived up to anything like the hype. Hopefully RDNA3 is a continuation of that, rather than of the half-decade before it.
 