RDNA 3 rumours Q3/4 2022

AMD just made significant price cuts to RDNA 2 cards, including taking $300 off the 6900 XT. So instead of doing what Nvidia did, which was placing RTX 4000 in tiers above RTX 3000, AMD is cutting prices because RDNA 3 will take over RDNA 2's pricing, essentially confirming that RDNA 3 GPUs will be cheaper to buy than RTX 4000.
 
Cheaper, yes, but by how much we don't know. I doubt it will be the same as RDNA 2 was; they can go higher and still be below Nvidia's prices.

Also, will they let you buy from the AMD store in the UK this time? No doubt AIB cards will be priced higher.
 
The point is it would be using GPU cycles to render a frame it's not displaying. Displaying the frame costs nothing; rendering it costs GPU cycles, the same as if it were putting that image on your screen. It's rendering a higher-res image to gather data from, not to display on the screen.

So does Jensen's explanation of what's going on here make any sense to you? It doesn't to me.

The way I understand it, the GPU renders frames #1 and #3 at a low resolution and then upscales them using DLSS Super Resolution. Next, DLSS Frame Generation AI analyses frames #1 and #3 and creates frame #2 using data from both. Then the process starts again: frame #5 is rendered at a low resolution and upscaled, and frame #4 is then created by the AI using data from frames #3 and #5.

It's similar to what the Oculus Rift does when a GPU can't handle 90 FPS: it switches to 45 FPS mode and uses AI to add an approximate frame in between to bump it back up to 90 FPS. Except Nvidia is making out like it's their own clever idea.
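As a rough illustration of that ordering (a minimal sketch with made-up function names, not Nvidia's actual DLSS 3 API), the loop below renders and upscales the odd-numbered frames and interpolates each even-numbered frame from the two rendered frames either side of it:

```python
# Hypothetical sketch of the frame ordering described above.
# None of these functions are real DLSS APIs; they just stand in for
# "render at low res", "spatial upscale" and "AI frame interpolation".

def render_low_res(n):
    # The game engine renders frame n at reduced resolution.
    return f"raw#{n}"

def upscale(frame):
    # DLSS Super Resolution style spatial upscaling of a rendered frame.
    return f"upscaled({frame})"

def interpolate(prev_frame, next_frame):
    # Frame generation: build an in-between frame from its two neighbours.
    return f"generated({prev_frame}, {next_frame})"

def display_order(rendered_count):
    """Yield frames in the order they would be shown: #1, #2, #3, #4, #5..."""
    previous = None
    for i in range(rendered_count):
        current = upscale(render_low_res(2 * i + 1))   # frames #1, #3, #5...
        if previous is not None:
            # Frame #2 can only be built once frame #3 exists.
            yield interpolate(previous, current)        # frames #2, #4...
        yield current
        previous = current

for frame in display_order(3):
    print(frame)
```

Note that the generated frame #2 can only be shown after frame #3 has already been rendered, which is where the latency concern in the next reply comes from.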
 
Sounds like a good candidate for increased input latency, unless there's something about the tech I don't understand, which is very likely.
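For a rough sense of scale (assumed figures, not measurements): if the GPU natively renders 45 FPS and every other displayed frame is generated, the generated frame can't be shown until the next rendered frame is finished, so presentation trails by roughly one native frame-time:

```python
# Back-of-envelope estimate using assumed figures, not measured data.
native_fps = 45                            # assumed native render rate
native_frame_time_ms = 1000 / native_fps   # ~22.2 ms per rendered frame

# The generated frame waits for the *next* rendered frame, so displayed
# frames lag roughly one native frame-time behind, before counting the
# cost of the interpolation itself.
extra_latency_ms = native_frame_time_ms

print(f"Native frame time:   {native_frame_time_ms:.1f} ms")
print(f"Approx. added delay: {extra_latency_ms:.1f} ms")
```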
 
Hopefully this new AMD will smash nvidia
I think this moment is what AMD have been waiting for. AMD would have a hard time changing people's mindset about Nvidia, so now that Nvidia has done the hard part for them, or at least opened the door a fair bit, it seems like AMD is ready to capitalize on it. With the current leadership at the top and the new mindset of the renewed AMD since Ryzen 1, I'm not so fearful that AMD will mess it up. Of course AMD will still have to get close performance-wise to actually gain anything from this, otherwise they too will fail. Time will tell, though; I'm much more excited for AMD's release than I was watching Nvidia's.
 

I posted this in the 4k series thread and it's a sad fact:

It wouldn't matter if AMD brought out a card that bent the 4090 over and pumped it raw at half the price; the "but muh drivers" argument is still floating around after all these years, which hurts sales since so many stupid ******** believe it.
 
AMD dropped prices on the 6000 series to compete just below their Nvidia counterparts. This looks like a good sign that they might do the same for the 7000 series.


But we will have to wait and see if that pans out on release/reveal.

ROFL, they dropped prices because they didn't have a choice. I'm dreading the possibility that they could again match Nvidia's price for raster performance at launch, e.g. a £929 7800 XT.
 
A 7800 XT at $799 that beats the 4080 12GB, and a 7900 XT at $1199 that gets near the 4090. That would be both a fairly small increase on the 6800 XT/6900 XT and still leave good profits for AMD :)
 
The rumour was the 7700 XT gives you 6900 XT performance (possibly better RT) but in the £500 price bracket. Let's see if they do this, though. Bear in mind it looks like it will be well into 2023 before Navi 32 follows.

The rumour is N33 ~= 6900 XT at 1080p and perhaps 1440p. N33 is going to be an 8GB, 128-bit card, so it will be the 7600 XT.

The 7700 XT is very likely to be a cut-down N32 with just 3 MCDs.
 
The rumour is N33 ~= 6900XT in 1080p and perhaps 1440p. N33 is going to be an 8GB 128bit card so will be the 7600XT.

7700 XT is very likely to be a cut down N32 with just 3 MCDs.
The architecture would have to have made some crazy gains for that to be possible.

But let's assume it is close to being true; I could understand why Nvidia would feel the need to release an 800 W monster of a card.
 
The rumour is N33 ~= 6900 XT at 1080p and perhaps 1440p. N33 is going to be an 8GB, 128-bit card, so it will be the 7600 XT.

The 7700 XT is very likely to be a cut-down N32 with just 3 MCDs.

No, the rumours I saw were referring to the x700 tier. In RDNA 2 the 6700 was Navi 22, and Navi 23 is the x600 tier, so I think your tiers are mixed up. Much more likely the 7700 XT will be on par with a 6900 XT, with less VRAM but improved ray tracing.
 