
RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
I thought Dave only talked **** on the intel threads, turns out he's branching out!
Imo this generation makes no sense for you as a 3080 owner, except if you want just way more performance, which really means 4090 as everything else from Nvidia will be overpriced for the specs, and RDNA 3 barely caught up to what you already have if we add RTX. I'd say even more so because I think they will do a refresh after a year, and the price/perf will be way better.

I'm not defending anyone, just talking about the situation at hand. You're right that lower down the stack the situation is brutal though, Nvidia happily squeezing blood from a stone there, so AMD has more of a shot, but it's months until we see lower-end cards, plus the used market is very appealing, so there's not much reason to wait for them.

Yeah, but we're talking about the already existing RT modes first; even there RDNA 3 is way behind. Also I can't agree on Overdrive, I think it's great to see games get way higher fidelity modes, and since I love the game I'm definitely grateful to see it get upgrades. Hell, why can't AMD do that in other titles that might favour them?! Instead they pay for nothing. Can't see why that's better.

We literally have official numbers from AMD themselves.

Better visuals = more expensive to render. Same as it is with raster, or do you think the non-RT equivalents like PCSS, VXAO, planar reflections, etc. are not resource hogs? Like it or not that's why people are upgrading to high-end GPUs, for prettier graphics. RT is transformative & is now widespread. This ain't 2018 anymore.

[Image: 12dkTmQ.gif]

No one's asking for that. They can't seem to offer 4090 performance for any amount of money atm, that's the sad part.

4090? Why would I want that? My monitor is 1440p 240Hz. The 4080 16GB looked like a nice jump, but yeah, screw that pricing.

I mostly only play CoD, and I'm not sure if it's AMD sponsored, but it always seems to run better on AMD, so I will be looking with interest at the 7800XT.
 
It amazes me people are wanting to pay 4090 prices. The words "****** stupid" come to mind.

AMD charges the same RRP as their previous-generation high-end dGPU, whilst offering 50% to 70% extra performance and 50% more VRAM, whilst Nvidia charges even more money.

But then people blame AMD for the high price of the Nvidia dGPUs, which "forces" them to "have" to spend $1300 on an RTX 3070 replacement or nearly $1700 on an RTX 4090. Also, apparently it must be the fault of AMD that Nvidia is trying to sell the RTX 3060 replacement for $900. Getting an RX 7900XT for that price is worse!

However, they don't seem to ask: why didn't Nvidia just price their RTX 4000 series dGPUs better, forcing AMD to charge less? Oh wait, they don't, because Nvidia would rather sell you an RTX 3080 Ti for under $1000.

It's the same logic as with those eternally locked into iPhones: there's always some reason for dunking on Android phones.
 
Last edited:
What AMD have done, imho, is crush the 4080, so the top-end seekers are still in the Nvidia camp, but commercially and on volume AMD have potentially set the new standard in the high end.

Tbh, what I think is more interesting is where AMD go from here. In theory they could scale the chiplet approach into something much more potent as and when they want... maybe that is RDNA 4, but the lid has been opened.
 
Some actual numbers; now do the math vs the 4090.

[Image: WoDskPY.png]

From what I can gather from some 4090 videos on YouTube, the 4090's average framerates for these games are:

COD: MW2 = 145 fps
GoW = 130 fps
RDR2 = 103 fps
AC: Valhalla = 128 fps
RE: Village = 175 fps
Doom Eternal = 190 fps

Looks like the 7900XTX is probably going to land between the 4080 and the 4090. It is a bit cheaper than the 4080, but I still would not pay £1000 for any GPU, no matter who makes it.
 
Last edited:
Not likely with recessions imminent and a downturn in the PC market as a whole.





I'm hoping the AIBs have a 450W BIOS to really show what these new GPUs are capable of.

It will be a limited-supply card made to take the title; ATI/AMD have done it before. It will come with a silly price for sure, but it's not aimed at the general gamer, more at a collector.
 
RavenXXX2 said:
Holy ****. Everyone in the comments has been conditioned to these new higher prices, to the point that they think a $1000 GPU is "a great price". Nvidia really ****** you guys, huh? You have such Stockholm syndrome from the great 2020-2021 GPU shortage that you can't tell what's a good price anymore. AMD and especially Nvidia are laughing all the way to the bank after seeing you guys gobbling up all their overpriced GPUs.

Not that I'm happy about these prices... but comments like this really bother me as they show a significant failure in any comprehension of economics.

To preface... I'm annoyed about the de-valuation and over-inflation of tiers that we see from Nvidia... but that's only because they have no competition in the market.

But when you look at actual pricing changes over the last 20 years & account for exchange rates and inflation... the prices aren't all that crazy.

Radeon X800XT was released 2004... It was $499. I seem to remember getting one for about £350.

Exchange rate was something like 1.9 at the time, so even with VAT, I paid a bit over the odds because stock was in short supply.

Now the exchange rate is 1.1... so even at base pricing with VAT, we are looking at around £550, just with the exchange rate.

Then we look at inflation... that comes to around £900.

Then there are various tech-specific supply chain restrictions, limits, expenses & other things that have happened all over the place since then... so, if we stay AMD/ATi-specific... we are looking at a grand total of £100 price increase... in 18 years... 100 quid in EIGHTEEN YEARS.

Ooooooh... massive price increase... The fact that your wages haven't tracked with exchange rates and inflation is NOT the problem of these companies; it's the problem of your governments and beyond. Get your head on straight and look at the wider world.
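The arithmetic in the post above can be sketched in a few lines. The exchange rate and VAT figures are from the post itself; the cumulative inflation multiplier of ~1.65 is an assumption on my part (roughly UK CPI from 2004 to 2022), chosen to match the post's "around £900" figure.

```python
# Rough sketch of the pricing argument above: take the 2004 USD RRP,
# convert to GBP at today's exchange rate, add VAT, then adjust for inflation.
USD_RRP_2004 = 499.0            # Radeon X800XT launch price (from the post)
GBP_PER_USD_NOW = 1 / 1.1       # exchange rate of ~1.1 USD per GBP (from the post)
VAT = 1.20                      # UK VAT at 20%
INFLATION_2004_TO_2022 = 1.65   # ASSUMED cumulative UK inflation multiplier

gbp_today = USD_RRP_2004 * GBP_PER_USD_NOW * VAT
gbp_adjusted = gbp_today * INFLATION_2004_TO_2022

print(f"At today's exchange rate incl. VAT: ~£{gbp_today:.0f}")
print(f"Inflation-adjusted:                 ~£{gbp_adjusted:.0f}")
```

Run as written, this lands at roughly £544 and £898, in line with the "around £550" and "around £900" figures quoted above.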
 
I know it's a tech demo, and this technically hurts the RTX 4090 going by the unusually low fps, but timestamped here is what a future engine does when ray tracing is set to off, i.e. visual quality nosedives to the point of being un-look-at-able:


600 fps of pure-raster Fortnite over DP 2.1 offers no protection against this potential future issue, where raster becomes obsolete and FPS is locked to whatever ray-tracing power the GPU can deliver.

Outside a few tech demo type games sponsored by Nvidia,or PC exclusives,most AAA games are developed with consoles in mind too. These will have lower RT performance than an RX7900XTX,so until consoles become powerful enough,games will still mostly use the hybrid approach. It also needs entry level and mainstream dGPUs to do RT well enough and they certainly don't IMHO.
 
The core die of Navi 31 is 30% smaller than AD103, on a slightly worse version of TSMC 5nm. Nvidia is at the reticle limit of TSMC 5nm and had a massive jump from Samsung 8nm to TSMC 4N. The next jump won't be as big. So AMD is competing with a smaller core die plus a few tiny memory dies made on 6nm. This is pretty much a Zen/Zen 2 moment.

How is Nvidia going to compete with these large monolithic dies, unless they move onto chiplets soon?

AMD is giving 50% to 70% performance jumps over its last flagship at $899~$999. Nvidia was trying to sell an RTX 4080 12GB for the same price mere weeks ago.

Nvidia is charging $1300 for that "small die" and $1650 for its "large die" because it made too many Ampere dGPUs and wants to preserve margins. The yields on AD103 are probably worse than on the Navi 31 core die.

As usual, more people defending Nvidia's prices, because it must be the "fault" of AMD. It was the same with Zen/Zen+, because Intel had the edge in single-core performance and gaming. But we saw where the modular Zen cores would lead AMD.
So, what I am hearing is that it was AMD's active choice to under-perform / under-achieve.

If that's the case, then we are back to the idea that Lisa is purposefully leading AMD's GPU division to failure & acting to not exceed the performance of her uncle's company... if it could be proven, that would be quite a significant antitrust/anti-competitive class action lawsuit.
 
Yeah, but we're talking about the already existing RT modes first; even there RDNA 3 is way behind. Also I can't agree on Overdrive, I think it's great to see games get way higher fidelity modes, and since I love the game I'm definitely grateful to see it get upgrades. Hell, why can't AMD do that in other titles that might favour them?! Instead they pay for nothing. Can't see why that's better.
I agreed with the rest of your comment: instead of reducing the gap, it is in fact worse now in RT. But we need to understand some of the tricks they are using too; there will be games where there will always be a gap. I agree that AMD can do the same thing, and I don't understand why the hell they pay to have some crappy RT shadows added in games that very few people are playing. (I guess they are doing it to fool 2-3 people into thinking RDNA can do ray tracing. :))
They needed to do the same thing: pick 2-3 big titles and run some custom code inside those games. But to be honest, when Nvidia already has over 80% marketshare, that train has already left; very few good developers will risk their sales to help AMD use dirty tricks.
As for you liking CP 2077 and being glad it gets a heavier RT mode, that is nice, but it will not run very well on your old card either, so you will need to pay to use the new mode. And you are free to do it; there are probably people buying a card to play Quake RTX or Portal or Godfall, why not?
I just don't like these marketing stunts; I don't think they should be considered perf metrics for a card.
 
If they weren't trading blows then why is Nvidia putting out these chunky performers so early when they could just do what Intel did for years with minimal improvements?

I mean they aren't doing it for charity so something must have spooked them.
Because they're so far ahead, it's better to release something to bring in new orders than nothing.

That's exactly why they were able to migrate tiers again this generation...

x60 -> x70
x70 -> x80
x80 -> x90
 
Outside a few tech demo type games sponsored by Nvidia,or PC exclusives,most AAA games are developed with consoles in mind too. These will have lower RT performance than an RX7900XTX,so until consoles become powerful enough,games will still mostly use the hybrid approach. It also needs entry level and mainstream dGPUs to do RT well enough and they certainly don't IMHO.
Yep, nobody is going to develop a (real, not tech demo) game that'll look completely awful unless you have a 4090 equivalent card in RT for many years yet. For the foreseeable future everything will have a traditional lighting mode that still looks decent.
 
Last edited:
Better visuals = more expensive to render. Same as it is with raster, or do you think the non-RT equivalents like PCSS, VXAO, planar reflections, etc. are not resource hogs? Like it or not that's why people are upgrading to high-end GPUs, for prettier graphics. RT is transformative & is now widespread. This ain't 2018 anymore.

[Image: 12dkTmQ.gif]
I wouldn't say 130-odd games in the space of 4 years is widespread, and I'm hard pressed to notice the difference between RT and non-RT games without side-by-side comparisons like the one in the image you posted; I tend to notice high fps more than fancy lighting.

Don't get me wrong, RT is lovely and all, but that hit to fps just doesn't seem worth it for the odd wow moment. Anyway, this isn't really the thread for RT discussions. :)
 
Last edited:
Like Cyberpunk 2077, the Nvidia favourite go-to! Calls DP 2.1 a gimmick and then goes on about RT performance :D
I didn't see them put up CP 2077 RT performance data. Just raster, and only compared with the 6950XT.

I suspect they put that there cos CP 2077 is the benchmark for GPU perf atm, and also maybe they have done some optimisation in the driver for that game, yielding a good deal of improvement over RDNA 2.

Anyway, let's see what happens when the benches are out.
 
Just caught up with the event. Seems decent to me. Sure, the RT performance improvement is disappointing, but it seems like raster will be quite good. The price in dollars seems relatively good. Shame it will translate into £1100 here though.


What I want to know is, what’s the consensus here? Is AMD giving these cards away? :cry:


Nail on the head mate. Same every launch. They just want to laugh and pay £1600 like it's no problem :)

Says the guy that paid £1400 for a 3090. Gift that keeps on giving you are mate :p:cry::D
 
Remember when you could get a graphics card for 650 quid? Them were the days.
That was over a decade ago, plus real-world inflation, and in the £600-700 days it was probably £1 = $2+.

vote tories for more expensive gpus and a destroyed economy, clearly labour are the peeps to vote for if you want cheap gpus

What I want to know is, what’s the consensus here? Is AMD giving these cards away? :cry:

Seems like they are giving them away to maintain marketshare; bet they will have tiny margins and no expensive coolers, so they'll probably run 80°C+.
 
Last edited:
So what have we learned so far?

With AMD you need to compromise on RT.

With Ada cards below the 4090 you need to compromise on raster.

If you don't want to compromise on either RT or raster, then you need to compromise your wallet.
 
Initially I was disappointed with this launch, but after some thinking about what I really want from a GPU, it's exactly what AMD seem to be offering with RDNA 3.

1. Good raster performance at 1440p.
2. PRICE.
3. Don't care about RT, and don't care about DLSS or FSR either. Don't care about encoding etc., so all those kinds of features are worthless to me.

Judging by the price and some rough guesswork about raster, I'm going to be able to buy a decently priced mid-range card (7700XT or 7800XT) sometime next year, or NV's alternative if they're priced right.

:)
 