
RDNA 3 rumours Q3/4 2022

The rumour: 3750MHz max boost clock.

3750MHz is the maximum clock limit set in the GPU vBIOS, yes, so it's not possible to get any card above 3750MHz unless AMD releases a new vBIOS in the future - maybe a 4GHz BIOS is reserved for a future 7950XT.
 
AMD just made significant price cuts to RDNA2 cards, including taking $300 off the 6900XT. So instead of doing what Nvidia did, which was placing RTX4000 in tiers above RTX3000, AMD is cutting prices because RDNA3 will take over RDNA2's pricing - essentially confirming that yes, RDNA3 GPUs will be cheaper to buy than RTX4000.
 
AMD were using the wafers for Zen 3 and Milan. They didn't have enough to go around.

This time they have both N5-class and N7-class parts and many more wafers, so they have much more capacity to supply the market.



Well, from what NV showed, the 4080 12GB is ~= 3080Ti, which is where the 6950XT sits in performance. So we have 7,680 shaders @ unknown clock speeds being about equal to 5,120 shaders @ 2.3GHz or thereabouts.

Full N32 will be 7,680 shaders, and I expect the cut version in the 7700XT will use at least 5,120 (that is cutting 10 WGPs and 1 whole shader engine) but will be running at >3GHz. So unless there is a large drop in IPC going to RDNA3 (I expect some drop, because each shader is smaller, but I expect it to be more than made up for by the increased clock speed), the 7700XT should have a good 20% or so more performance than the 6950XT, which would beat the 4080 12GB in raster quite comfortably. RT I don't really know; it's hard to get an actual estimate at the moment given NV's obfuscation of native RT performance.
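As a very rough sanity check on that 20% figure, here's the napkin maths: relative throughput as shaders x clock x an assumed IPC factor, ignoring memory bandwidth, caches and everything else that matters. The 3GHz clock and the 0.95 IPC factor are my assumptions, not specs.

```python
# Napkin maths: relative raster throughput ~ shaders x clock x relative IPC.
# The clock and IPC factor below are assumptions, not confirmed specs.

def relative_throughput(shaders: int, clock_ghz: float, ipc: float = 1.0) -> float:
    return shaders * clock_ghz * ipc

rx6950xt = relative_throughput(5120, 2.3)             # RDNA2 baseline
rx7700xt = relative_throughput(5120, 3.0, ipc=0.95)   # assumed cut N32: >3GHz, ~5% IPC drop

print(f"7700XT vs 6950XT: {rx7700xt / rx6950xt - 1:+.0%}")  # ~+24%
```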

Until we get reviews, the only sort of apples-to-apples RT numbers we have for RTX4000 come from Nvidia's published ratings for its RT cores.

The 4080 16GB has 76 RT cores, each rated at 1.5 TFLOPs = 114 TFLOPs of RT performance.

The 4090 has 128 RT cores, each rated at 1.5 TFLOPs = 192 TFLOPs of RT performance.

The RTX3090Ti has 84 RT cores, each rated at 0.92 TFLOPs = ~77 TFLOPs of RT performance.

You'll notice that each RT core in RTX4000 shows a large improvement in TFLOP output, and much of this is because RTX4000 GPU clock speeds are about 50% to 60% higher than RTX3000's.
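For what it's worth, the totals above are just core count multiplied by Nvidia's rated per-core figure - a trivial check:

```python
# Total RT throughput = RT core count x Nvidia's rated TFLOPs per core.
cards = {
    "RTX 4080 16GB": (76, 1.50),
    "RTX 4090":      (128, 1.50),
    "RTX 3090 Ti":   (84, 0.92),
}
for name, (cores, per_core) in cards.items():
    print(f"{name}: {cores * per_core:.0f} TFLOPs RT")

# Per-core gen-on-gen gain: 1.50 / 0.92 ~= 1.63x, consistent with
# clocks that are roughly 50-60% higher on RTX4000.
```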

These numbers aren't perfect. The problem is that RT effects don't make up 100% of the time spent processing a new frame, so you can double, triple or quadruple RT performance while only getting a small framerate improvement if RT effects make up only a small portion of the frametime.
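A minimal sketch of that point, Amdahl's-law style - the frametime split and the 4x speedup below are made-up numbers purely for illustration:

```python
# Speeding up only the RT portion of a frame gives diminishing returns
# when RT is a small slice of the frametime. All numbers are invented.

def fps_after_rt_speedup(frametime_ms: float, rt_fraction: float, rt_speedup: float) -> float:
    rt_ms = frametime_ms * rt_fraction
    other_ms = frametime_ms - rt_ms
    return 1000.0 / (other_ms + rt_ms / rt_speedup)

base_fps = 1000.0 / 16.7  # ~60fps baseline
for fraction in (0.2, 0.5, 0.8):
    fps = fps_after_rt_speedup(16.7, fraction, rt_speedup=4.0)
    print(f"RT = {fraction:.0%} of frametime: {base_fps:.0f} -> {fps:.0f} fps with 4x faster RT")
```

With RT at only 20% of the frametime, a 4x RT speedup lifts ~60fps to just ~70fps; at 80% of the frametime it lifts it to ~150fps.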

What you can say from this is that the new 4080 and 4090 have greatly more powerful RT output, but games are not made up of 100% RT effects, so the actual improvement in gaming framerate will not be known until review day.
 
Let's say the 7950XT is 50% faster than the 3090Ti in raster and RT - how much should it cost?

Depends what AMD's goal is.

If they want to cannibalize RTX4080 sales, then $999.
If they are happy to slot in between Nvidia's cards, then $1299.

My feeling is AMD would choose the $1299 option; that's where Lisa has been pushing the company over the last couple of years.
 
RDNA3 rumour on bilibili: https://m.bilibili.com/space/4139209?spm_id_from=444.42.0.0

They're saying AMD's full unveiling of RDNA3 will take place on 3 November, two models will be announced, and stock arrives in early December. For performance, they say neither card will compete with the 4090 in raster or RT.

Take it with salt, especially as these are generalised statements that don't reflect what we know of the 4090: it's stupidly slow at 1080p (heavily CPU-limited at that resolution), so there is almost no chance RDNA3 is slower than a 4090 at 1080p.
 
AMD did that with Zen4: announced a month before launch (Zen4 was announced on 29 August and launched on 27 September).
 
Hardware Unboxed commented on that thread that 4K 240Hz is possible on DP 1.4a.

Yes, Samsung already has a 4K 240Hz monitor; it doesn't need DP 2.0.

DP2.0 is only needed for 8K 120Hz and 4K 360Hz/480Hz, and last time I checked there are no 4K 480Hz or 8K 120Hz monitors.
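The link-bandwidth maths backs this up. A rough sketch, assuming 10-bit RGB, ignoring blanking overhead and assuming roughly 3:1 DSC compression - round numbers, not exact link budgets:

```python
# Uncompressed video bandwidth vs DisplayPort link capacity.
# Effective link rates: DP 1.4a HBR3 ~= 25.92 Gbps, DP 2.0 UHBR20 ~= 77.37 Gbps.
# Blanking overhead is ignored and DSC is assumed at ~3:1, so these are
# ballpark figures only.

DP14A_GBPS = 25.92
DP20_GBPS = 77.37
DSC_RATIO = 3.0
BPP = 30  # 10-bit RGB

modes = {
    "4K 240Hz": (3840, 2160, 240),
    "4K 360Hz": (3840, 2160, 360),
    "8K 120Hz": (7680, 4320, 120),
}
for name, (w, h, hz) in modes.items():
    raw = w * h * hz * BPP / 1e9   # Gbps, uncompressed
    dsc = raw / DSC_RATIO          # Gbps, with ~3:1 DSC
    fits_14a = "yes" if dsc <= DP14A_GBPS else "no"
    fits_20 = "yes" if dsc <= DP20_GBPS else "no"
    print(f"{name}: {raw:.0f} Gbps raw, {dsc:.0f} Gbps with DSC "
          f"-> DP1.4a: {fits_14a}, DP2.0: {fits_20}")
```

So 4K 240Hz squeezes onto DP 1.4a with DSC (~20 Gbps vs a ~26 Gbps link), while 8K 120Hz and 4K 360Hz genuinely need DP 2.0-class bandwidth even compressed.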
 
Depends how these things are calculated I suppose.

For example, Nvidia claimed a +100% increase in perf/watt for RTX4000, but based on reviews it looks more like 60-70% - unless you power-limit the GPU to 200W, because at 200W it does show up as 100% faster.
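To see how the chosen power point swings the headline figure, here's the mechanics with invented fps numbers (not measurements):

```python
# Perf/watt = fps / watts, so the gen-on-gen gain depends entirely on
# which power limit you compare at. The fps values are made up.

def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

baseline = perf_per_watt(100, 450)  # last-gen card at stock power

for fps, watts in ((165, 450), (90, 200)):  # new card: stock vs 200W-limited
    gain = perf_per_watt(fps, watts) / baseline - 1
    print(f"new card @ {watts}W: {gain:+.0%} perf/watt vs last gen")
```

With these illustrative numbers the same card shows roughly +65% perf/watt at stock power but about +100% when capped at 200W, which is exactly the kind of gap between claim and review.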
 
Stands to reason though, given the flagship RTX card uses a die that's about 20% bigger than the next-gen Radeon card (T.B.C.).

Not necessarily. RDNA cards have tended to have smaller dies anyway and are still competitive with RTX3000 outside of ray tracing, so just because an RTX4090 has a 20% larger die doesn't mean it will be faster. In fact, looking at the core layout diagram, the RT and tensor cores take up a significant portion of space on Nvidia's dies; if they weren't there, die sizes between AMD and Nvidia would be similar.

 
PS5 and Series X really are cracking bits of kit. Primarily PS5 here, but as I have Game Pass on PC I couldn't resist a Series X as well. In fact, A Plague Tale just released on Game Pass today :)

Yes and no.

They are good value for money in this market even after their price increases, but they are also already getting outdated in performance.

A Plague Tale runs at 1080p-1440p 30fps on a PS5 and Series X, and it's not even running at the equivalent of PC graphics settings; on PS5 in particular the framerate drops to 25fps in some parts. A PC with the latest and greatest GPU can run it at higher quality settings and 4K at 60fps to 100fps.

As I said though, the consoles are still good value for money, mainly because the entry level of the hardware market is non-existent - no one makes entry-level GPUs anymore. But when these consoles were released they were very comparable to a high-end PC GPU, and today the high-end PC GPUs are 4 to 5 times more powerful.
 
Again, "if" this is true, Nvidia know something we don't.

Nvidia will not release a Titan-branded GPU unless they are certain it can beat AMD's best by at least 10%.

Rumour is: No Titan Ada.


Nvidia doesn't need a Titan based on these rumours.

The rumour says 2x raster performance over the 6900XT - the RTX4090 is already 2x over the 6900XT, so the rumour would make the 7900XT and RTX4090 roughly on par with each other.

Then the rumour says 2.5x RT performance over the 6900XT - the RTX4090 is already 4x over the 6900XT, so the rumour would place the 7900XT well below the RTX4090 in RT.
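Putting the rumoured multipliers next to the 4090's, with the 6900XT as a 1.0x baseline (taking every figure at face value):

```python
# Relative performance vs a 6900XT baseline of 1.0x, using the rumoured
# 7900XT multipliers and the 4090 figures quoted above.
raster = {"7900XT (rumour)": 2.0, "RTX 4090": 2.0}
rt     = {"7900XT (rumour)": 2.5, "RTX 4090": 4.0}

print(f"raster: 7900XT / 4090 = {raster['7900XT (rumour)'] / raster['RTX 4090']:.1%}")  # 100.0% - parity
print(f"RT:     7900XT / 4090 = {rt['7900XT (rumour)'] / rt['RTX 4090']:.1%}")          # 62.5% - well behind
```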

When you add these rumours together, you can see Nvidia can and will keep the 4090's price high, and they have no need for a Titan as they will claim another mindshare win over AMD.

As for your TPU benchmarks: they are still using a garbage test system that heavily bottlenecks the 4090, which is why their results are so low compared to other reviewers who tested with the latest parts instead of 2-to-3-year-old parts.
 