RDNA 3 rumours Q3/4 2022

The 6900XT has 5,120 Shaders.

If the 7600XT has 4,096 Shaders and is as fast as or faster than that ^^^^ "because IPC", then just how fast is Navi 31, with its supposed 15,360 Shaders?

Have we gone from 2X to 2.5X to 3X the performance of a 6900XT now?
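To put rough numbers on it, here's the back-of-the-envelope arithmetic that rumour implies, assuming (wrongly, which is the point) that performance scales linearly with shader count:

```python
# Back-of-the-envelope shader scaling, purely illustrative.
rx6900xt = 5120   # shaders, RDNA 2 flagship
rx7600xt = 4096   # rumoured
navi31   = 15360  # rumoured top RDNA 3 part

# If 4,096 next-gen shaders really match 5,120 RDNA 2 shaders,
# the implied per-shader ("IPC") uplift is:
ipc_uplift = rx6900xt / rx7600xt   # 1.25x

naive  = navi31 / rx6900xt         # 3.0x a 6900XT
absurd = naive * ipc_uplift        # 3.75x if that uplift holds up the stack
print(f"{naive:.2f}x naive, {absurd:.2f}x with the implied IPC uplift")
```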

Pull it back a bit :)
 
A 128-bit bus part reaching 6900XT performance; it won't surpass it, but it'll be in the same ballpark.
Maybe a reality check for you then.
But I suspect you're the expert here, right?


Something for you to watch..



5700XT vs 6700XT (RDNA 1 vs RDNA 2) IPC. This may show you the truth about RDNA 2 and give you an idea of what RDNA 3 will be. Remember, core clock speeds next gen will not be going up much, even on 5nm. Basically, what you will see from this video is that they really didn't do much apart from raising the clock speeds to get slightly more performance.


Also, if you expect a 6600XT to beat a 6900XT next gen in the form of a 7600XT, then expect to pay 6900XT prices for the lowest-tier cards. Remember, the 6900XT was meant to be $999 MSRP; in reality it's selling for $1,500-$2,000 real-world. So you want a 7600XT at those prices? Prices next gen will make this gen look cheap. People are not yet aware of what is coming price-wise next time.

I know you will say they can't do that and it will kill the PC gaming market, but go check the sales over the last two years at these high prices and you will see the PC and gaming markets have never done better. That proves people are willing to pay, and the same goes for very expensive monitors, the $2.5k Samsung G9 Neo for example, and other high-end gaming gear. Before you say it's the miners doing it: miners don't buy gaming monitors, and gaming gear is still selling at amazing rates even with this gen's high GPU and CPU prices.

Anyway, we will see next gen, with all the new changes coming to some platforms thanks to PCIe 5 and DDR5 RAM. Wait for the motherboard and DDR5 prices next gen ;) A huge shock is coming for some.
 
I still feel MCM cards are a myth. The data centre, maybe, for certain tasks. Not for the likes of mere mortals.
 
I find this sort of behaviour quite predatory.

AMD came in for a lot of criticism over FreeSync's narrow frame-rate range, quite rightly so, but they put a lot of work, and no doubt money in the form of R&D, into expanding that frame-rate range and adding LFC tech and HDR.
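For anyone unfamiliar with LFC (Low Framerate Compensation): in rough terms, when the game's frame rate drops below the panel's minimum VRR refresh, frames are repeated so the effective refresh stays inside the supported window. A minimal sketch of the idea (the function name and numbers are illustrative, not any vendor's API):

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate an LFC-style scheme would target: repeat each frame
    an integer number of times until the panel refresh lands inside the
    VRR window (assumes vrr_max is at least 2x vrr_min, as LFC requires)."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # plain VRR: one refresh per frame
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

# A 48-144 Hz panel showing a 30 fps game: frames are doubled and the
# panel runs at 60 Hz instead of falling out of the VRR range.
print(lfc_refresh(30, 48, 144))  # 60
```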

Screen vendors don't want to add $150 to the cost of a screen, and given that FreeSync in its second generation is actually quite good, they started to adopt it over G-Sync modules in droves.

Nvidia adopting AMD's standard is great, but they then also started promoting their own products off the back of AMD's work and R&D spend, deliberately pushing AMD branding out of the market while actually using their tech.

This is why I sometimes argue that AMD, when they have the better solution, should just do what Nvidia always does. But if everyone did that, which in this case is exactly how it would play out, then where would we be as consumers?

It's not AMD's Standard.

As for your mate's OLED TV: some OLEDs didn't support FreeSync at launch but did support G-Sync, and they didn't get FreeSync until a firmware update. That's why the box had G-Sync and not FreeSync. Any TV that has both has both plastered all over the front.
 
I've had an LG CX from release, which worked fine on my VII at the time. Ironically it states G-Sync on the box too, but there's no mention of G-Sync in the TV's menus; it states AMD FreeSync Premium when you want to enable VRR. Even after several firmware updates it still states FreeSync Premium.
 
On my BX you can disable FreeSync and VRR (G-Sync) will still work, so those are two separate functions.
VRR is tied to Instant Game Response.
 
On the CX at least, the FS Pro option enables LFC.

The point I was making is that it worked out of the gate on the LG CX.
 
There are small differences in that 5700XT vs 6700XT video; at 1080P sometimes the 5700XT is a little faster, and other times the 6700XT is a little faster.

At 1440P and 4K the 6700XT is almost always a few frames faster.

They both have the same 2,560 Shaders and (in this test) are clock-locked at 1.8GHz.
The difference is the 5700XT has a 256-bit bus while the 6700XT has only a 192-bit bus, but the 6700XT also has a 96MB L3-style cache.
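The raw bandwidth gap behind that bus-width point works out as follows (a quick sketch; 14Gbps and 16Gbps are the two cards' stock GDDR6 speeds):

```python
def gddr6_bandwidth_gbs(bus_bits: int, gbit_per_pin: float) -> float:
    """Raw GDDR6 bandwidth in GB/s: bus width in bits times the per-pin
    data rate in Gbit/s, divided by 8 bits per byte."""
    return bus_bits * gbit_per_pin / 8

print(gddr6_bandwidth_gbs(256, 14))  # 5700XT: 448 GB/s
print(gddr6_bandwidth_gbs(192, 16))  # 6700XT: 384 GB/s, plus 96MB cache
```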

But the biggest difference is the clock speeds: despite both being on the same 7nm node, the 5700XT runs at around 2.2GHz at 225W, whereas the 6700XT runs at around 2.8GHz at 230W.

The 5700XT sits around 2070 Super performance while the 6700XT sits around 2080 Ti performance, at the same power consumption. With those latter points taken into account, they are not the same architecture; they are in fact very different.
 
Greymon55 says the RDNA 3 lineup still includes MCM GPUs.

I'm not sure. I don't think AMD will do it, but if it turns out to be true then it could be a problem for games.

https://mobile.twitter.com/greymon55/status/1457931961796743168?ref_src=twsrc^tfw|twcamp^tweetembed|twterm^1457931961796743168|twgr^|twcon^s1_&ref_url=https://wccftech.com/amd-nvidia-next-gen-flagship-gpus-detailed-rdna-3-radeon-rx-7900-xt-with-15360-cores-ada-lovelace-geforce-rtx-4090-with-18432-cores/


The problem I see: look at the MI200, AMD's new MCM GPU using a brand-new 3rd-gen interconnect. It's so good that it's two GPUs, as in it shows up as two GPUs in Windows. So are we back to CrossFire now?
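You can see that split on Linux too; each GCD of an MI200-class part enumerates as its own device node, exactly as two discrete cards would (a quick Linux-only sketch):

```python
import glob

# Every GPU the kernel sees gets its own DRM node; an MI200 with two
# GCDs lists two entries here, just like two discrete cards would.
print(glob.glob("/sys/class/drm/card[0-9]"))
```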
 
Obviously not, as CrossFire is dead; almost nothing supports it or SLI any more.
 
Perhaps they'll use the MCM design for the VRAM memory controllers and Infinity Cache.

Always look for as much detail as possible in leaks. The most detailed leak I've seen so far suggests the high end will use TSMC's 5nm EUV process, with the mid-range utilising 6nm EUV for the GPU die itself. Link here:


There are even some helpful release estimates, with the mid-range chip releasing in early Q4 2022 and the high end releasing in the first half of 2023.

If you look back at AMD's roadmap, all that was committed to was an advanced node (EUV), with no mention of multiple GPU dies:

This is sufficiently non-specific that it allows them to use 5nm and 6nm for different GPU dies.

P.S. I felt this topic deserved its own thread, as this is really quite different from the rumours I've seen in other posts.
 
It looks like AMD's performance target for RDNA 3's mid-range will be the 6800 XT, or possibly the 6900 XT, I would guess with much-improved ray-tracing performance. It sounds like it could be a decent upgrade for the majority of PC gamers.

The performance we end up with could be somewhat dependent on how high the boost clock is for each GPU, something that is probably still being tuned by AMD's engineers. With RDNA 2 compute units, ramping up the clock rate drastically increases power consumption (and therefore cooling requirements).
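As a rough rule of thumb, dynamic power scales with frequency times voltage squared, and chasing higher clocks usually requires more voltage, so power climbs much faster than performance. A toy illustration with made-up voltage and frequency points:

```python
def dynamic_power(p_base: float, f_base: float, v_base: float,
                  f: float, v: float) -> float:
    """Classic CMOS approximation: P scales with f * V^2."""
    return p_base * (f / f_base) * (v / v_base) ** 2

# Hypothetical: a 230 W card pushed from 2.2 GHz to 2.8 GHz,
# needing 10% more voltage to hold the higher clock.
print(dynamic_power(230, 2.2, 1.0, 2.8, 1.1))  # ~354 W
```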
 
Tbh, the leaks are just PR and speculation fudged into something that generates news, discussion and stories.

MCM, GCD, MCD, whatever you want to call it, isn't that important. What's important is that you look at GPUs as a black box for graphics acceleration and stop fixating on technology branding.

How fast is the black box at given quality settings?
How much is the black box to buy at retail?
When can I actually buy it for real?

What goes inside the black box is interesting up to a point, but who really cares how it's constructed?
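If you do treat the card as a black box, the only arithmetic that really matters reduces to something like frames per dollar (the cards and numbers below are hypothetical, purely for illustration):

```python
# (avg fps at your settings, real street price in $) - made-up figures.
cards = {"black box A": (120, 999), "black box B": (95, 649)}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price * 100:.1f} fps per $100")
```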

As far as we can sensibly predict, Nvidia is looking to build a single monolithic die as usual on a new node, while throwing as much power as they dare at it to get the best clock speeds. Knowing Nvidia, they will cut the die down in several ways for segmentation and future products; it is also possible we will see an MCM solution from them before the usual two-year refresh cycle.

AMD, on the other hand, look to be first to offer an MCM solution, which in plain speaking means multiple silicon dies bonded together somehow on a substrate into what is physically one package, one or more dies per product for the sake of argument. They also have the option of adding 3D stacking on top to add more dies, be that compute or cache, wherever they have the tech. In theory this is a more efficient manufacturing process (see the yield sketch below), basically meaning they can throw more transistors at the problem to do more work without the failure rates that large monolithic designs suffer. However, all of that means squat if AMD can't get it to work well, or if they just pocket the profits versus Nvidia's monolithic solution.
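That manufacturing argument can be made concrete with the standard simplified Poisson yield model; the defect density below is an illustrative assumption, not a foundry figure:

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Simplified Poisson yield model: Y = exp(-A * D)."""
    return math.exp(-area_mm2 * defects_per_mm2)

D = 0.001  # illustrative defect density, defects per mm^2

# One 600 mm^2 monolithic die vs 200 mm^2 chiplets: a single defect
# scraps 600 mm^2 of silicon in the first case but only 200 mm^2 in
# the second, since chiplets are binned individually.
print(f"monolithic yield:  {die_yield(600, D):.1%}")  # ~54.9%
print(f"per-chiplet yield: {die_yield(200, D):.1%}")  # ~81.9%
```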

Neither company is your friend at this point; both have, to varying degrees, shown they are willing to do whatever it takes to exploit the market to its full potential.

On top of all that you have Greymon55 and kopite7kimi just pushing out noise as PR for either team, and to throw tech sites off the truth. To borrow kopite7kimi's phrase from the other day: I am disappointed with both solutions right now.

The only thing we can be reasonably confident of at this point, imho, is that Nvidia is concerned about AMD disruption, because all the noise coming out points to that. It is therefore logical to predict that AMD could potentially do something this generation... but will they? I don't know. I hope they do, but their recent track record says they won't.
 
MCM itself is nothing new; if you look at Voodoo cards, etc., they used MCM aspects in their implementation. Where it becomes interesting is where you can use MCM designs to expand capabilities beyond what can be done by simply shoving two discrete cores on a board, with all the limitations of SLI/CF technology.
 