RDNA 3 rumours Q3/4 2022

Let's not forget that the only card Nvidia had in the 7900 XT/7900 XTX price range before it was unlaunched had worse RT and raster than a 3090 Ti, while the $1,200 4080 16GB really isn't that much better: +10% raster and +15% RT over a 3090 Ti.

[Attached screenshots: Screenshot-356.png, Screenshot-357.png]
 
If you can afford a grand for a GPU, you can go the extra 500 notes for the vastly superior product. No one cares about the 4080.
About 10% faster, or maybe even 5% if you run the 4090 at a sensible power level, which most people will. You and I have a different understanding of the word "vastly". Also, the 7900 XTX will be about £600-£650 cheaper, not £500.
 
Meh, looking into this further, it looks like it's Nvidia forever for me until productivity performance improves on AMD cards. An interesting thread on Reddit popped up, so I did some searching, and it turns out even a 3060 Ti beats the 6900 XT in productivity apps that use GPU acceleration (I use DaVinci Resolve and Adobe apps). It will be interesting to see what the 7000 series shows, as they have the new media encoder hardware, but as far as the 6900 series goes, they're at the bottom of all the graphs vs Nvidia for GPU acceleration in apps.

I don't think much will change, however, since AMD made no mention of productivity in their stream. It was all gaming focused.
 
If you can afford a grand for a GPU, you can go the extra 500 notes for the vastly superior product. No one cares about the 4080.
That old fallacy: most of the people buying will be bunging it on credit. They're already overstretched. The people I know with proper money aren't spending it to sit in their mother's basement ;)
 
@DeliciousStorage Nobody knows how much they’re going to retail for here yet!
$999 = £885.20
£885.20 x 1.2 (20% VAT) = £1,062.24

The reference model will likely be somewhere between £1,049 and £1,099 unless the exchange rate substantially changes before release, with AIB models probably an extra £100 on top. Of course we can't know for sure, but it's better than the £500 estimate pulled out of thin air.
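As a quick sanity check, here is a minimal sketch of that conversion in Python. The exchange rate is simply the one implied by the figures above, and the rounding to a £1,049-£1,099 retail window plus the roughly £100 AIB premium are the poster's assumptions, not official pricing:

```python
# Sketch of the USD MSRP -> estimated UK retail price arithmetic quoted above.
# Exchange rate and VAT are the values used in the post, not live data.

usd_msrp = 999.00
usd_to_gbp = 885.20 / 999.00        # implied rate of ~0.886 from the post
vat_rate = 0.20                     # UK VAT

pre_vat = usd_msrp * usd_to_gbp     # ~£885.20
inc_vat = pre_vat * (1 + vat_rate)  # ~£1,062.24

print(f"pre-VAT:  £{pre_vat:,.2f}")
print(f"inc. VAT: £{inc_vat:,.2f}")

# Rounding to typical retail price points gives the £1,049-£1,099 estimate,
# with AIB models assumed to add roughly £100 on top.
```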
 
DOA for non fanbois

People will buy it because it's Nvidia, so it will still sell; and if AMD are out of stock all the time, people will end up getting it anyway.

But in that recent LTT video they are suggesting it could trade blows with the 4090 in non-RT performance.

If that ends up being true, you would have to be a moron to buy a 4080 at its current price.
The 4080 was already a moron's card and didn't need AMD to confirm it.
 
I think AMD uses a logic unit that doubles as either a ray intersection block or a TMU in one clock cycle. That's been the case with RDNA 2, and leaked numbers suggest they have continued in the same direction with RDNA 3.
The identity is roughly 1 TMU or 4 ray-box intersections per clock, so AMD will generally report one RT core for every four TMUs, i.e. no dedicated RT hardware for RDNA 3.

Also, Nvidia has moved away from deep BVH structures to flatter BVH structures for data organisation, at least that's what I could interpret from the white paper, plus new acceleration structures for RT workloads. The gap will only widen as new games start implementing Nvidia's methods.
I have not seen any information that would suggest that at all. You have 320 TMUs and 80 ray accelerators, but they do not seem to be connected work-wise, and 4:1 is just because of how things are organised inside a CU. Especially since the job a TMU is doing is widely different from RT acceleration, it wouldn't make any sense at all if it could do both jobs. And you seem to be forgetting that each RT unit can do 4 ray-box intersections per clock but also 1 triangle intersection per clock; are you suggesting 4 TMUs calculate 1 triangle intersection? Too bad there's no actual whitepaper for RDNA 2, as that would make it easier to understand. Perhaps there will be for RDNA 3; time will tell.
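For anyone unfamiliar with the terminology being thrown around here, a "ray-box intersection" is the slab test a ray accelerator runs against an axis-aligned bounding box while walking the BVH. The sketch below is a purely illustrative software version in Python, not AMD's or Nvidia's actual hardware logic; the example boxes and ray are made up:

```python
# Illustrative slab test: does a ray hit an axis-aligned bounding box (AABB)?
# This is the per-node operation the "4 ray-box intersections per clock"
# figure refers to; it is a reference sketch, not vendor code.

def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """Return True if the ray (origin, per-axis reciprocal direction) hits the box."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# A flat (wide) BVH node with 4 children lets hardware run 4 of these tests
# for one ray in a single step, matching the 4-per-clock figure quoted above.
ray_origin = (0.0, 0.0, 0.0)
inv_dir = (1.0, 1.0, 1.0)   # direction (1, 1, 1) -> reciprocal per axis
children = [((1, 1, 1), (2, 2, 2)), ((5, -1, 0), (6, 1, 1)),
            ((-3, -3, -3), (-1, -1, -1)), ((0, 4, 0), (1, 5, 1))]
print([ray_aabb_intersect(ray_origin, inv_dir, lo, hi) for lo, hi in children])
# -> [True, False, False, False]
```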
 
No, I am pretty sure that is how the architecture worked for RDNA 2; the logic unit could actually switch between RT and texture mapping on a change of clock. Maybe they spiced it up in RDNA 3 with something else, but the ratio pretty much nails how the broader approach hasn't changed.

here it is:
 
Snipped the post to save space


I see the leakers just had to cash out a few days before the official announcement, and the one you highlighted seems to be cashing out after, lol, but there's no point in trying to hold them accountable as the PC community seems to forget five minutes later and forgive them. Remember AdoredTV with his wild predictions for Zen 2 and the multiple meltdowns that followed; it got so bad that a lot of people wanted anything to do with him banned from the AMD subreddit.

Looks like MLID got it right though. There are some who do get it right. And you are misinformed; that person is no leaker. He has no clue about anything of real insight.
 
While ray tracing does improve graphics, I'm not convinced I'll notice it much unless I'm standing still looking at it instead of actually going for the game's objective.
 
The majority of people are not going to pay over £1,000 for a GPU. Let's see when the 4060 Ti costs £600 whether people will lap it up or go for something even further down the stack. I certainly won't be getting any card over £800 and will not buy a low-tier card like the 4060 series for more than £350. I'll wait it out for a used top-tier card when the time is right. AMD and Nvidia can suck balls if they are hell-bent on setting everything at mining-market prices.
 
The majority of people are not going to pay over £1,000 for a GPU. Let's see when the 4060 Ti costs £600 whether people will lap it up or go for something even further down the stack. I certainly won't be getting any card over £800 and will not buy a low-tier card like the 4060 series for more than £350. I'll wait it out for a used top-tier card when the time is right. AMD and Nvidia can suck balls if they are hell-bent on setting everything at mining-market prices.

I was expecting the 4080 16GB to jump to £750-£800; I would have bought it. Why did they get so greedy?

That's what I was expecting after buying a 3080 for £650.

AMD wouldn't even have been in the discussion.
 
Jesus, that's just pathetic. Get a grip, AMD!

These numbers don't translate directly to RT gaming performance, thankfully, but yes, the gulf is getting bigger between Nvidia and AMD. AMD would have needed a 100% raw RT performance gain just to match the 3090 Ti, but by their own claims delivered 50% to 70%.

Leakers said they'd be able to match Nvidia this time, which would have required a 200% performance gain, but only about a quarter of what was required was delivered.
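To put those percentages side by side, here is a rough back-of-the-envelope calculation in Python. The 1.00 baseline and the 2x/3x targets simply restate the "100% to match a 3090 Ti" and "200% to match a 4090" figures above; none of it is measured benchmark data:

```python
# Rough illustration of the uplift arithmetic above; placeholder numbers only.

rdna2_rt = 1.00             # RDNA 2 raw RT throughput taken as the reference
needed_vs_3090ti = 2.00     # "100% gain" means 2x the RDNA 2 figure
needed_vs_4090 = 3.00       # "200% gain" means 3x the RDNA 2 figure

for claimed_gain in (0.50, 0.70):          # AMD's own claimed 50-70% uplift
    rdna3_rt = rdna2_rt * (1 + claimed_gain)
    print(f"claimed +{claimed_gain:.0%}: {rdna3_rt:.2f}x RDNA 2 RT "
          f"(needs {needed_vs_3090ti:.2f}x to tie a 3090 Ti, "
          f"{needed_vs_4090:.2f}x to tie a 4090)")

# A 50% gain is 50/200 of the 200% gain the leaks implied, which is where
# the "only about a quarter of what was required" figure comes from.
print(f"delivered/required = {0.50 / 2.00:.2f}")   # 0.25
```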
 
The majority of people are not going to pay over £1,000 for a GPU. Let's see when the 4060 Ti costs £600 whether people will lap it up or go for something even further down the stack. I certainly won't be getting any card over £800 and will not buy a low-tier card like the 4060 series for more than £350. I'll wait it out for a used top-tier card when the time is right. AMD and Nvidia can suck balls if they are hell-bent on setting everything at mining-market prices.

This. My £650 3080 is the most expensive card I've ever bought. At this stage I'm more likely to downgrade to a 1080p monitor in the future than buy a new card once mine is struggling. I would have already switched to consoles if they would just give us native mouse and keyboard on all games.

I feel like this is an ever-growing number of people. If one of the consoles would allow it, they could hoover up everyone willing to spend £300-£600 to game.
 