
RDNA 3 rumours Q3/4 2022


Do We Actually Need PS5 Pro / 'Xbox Series Next' Enhanced Consoles This Generation?



Yes we do, we always need faster hardware, always.

But I can understand why they may not launch pro machines. Pro machines will cost more and they will be hard to sell when the global recession hits at the end of this year.

And we can already see other companies changing their designs and pricing because of the weakening global financial situation. The recently shown-off AM5 X670E motherboards paint a very clear picture: with AM4 and Z690 we saw motherboards with RRPs close to $1000, but with AM5 and X670E the manufacturers have toned down their designs, removed expensive gimmicks and cut costs, so those boards are now around $500 instead of $800/900/1000. This is not a random occurrence; we saw it from all the manufacturers. They are all cutting down their designs and lowering prices, because you won't be able to sell an $800 motherboard in a recession, just like Sony and MS won't be able to sell an $800 console in a recession.
 
I reject the rule of thumb claim that AAA games with good graphics have crap gameplay and games with older graphics have better gameplay and design.

Yes, there are some games with good graphics that are bad, but that's not new; there have always been games like that every year.

And I have a very recent example that disproves the claim:

I just finished spending 300 hours playing the highly addictive Elden Ring, a AAA masterpiece with excellent gameplay and graphics. And I just started playing Diablo Immortal, a game with old graphics and absolutely garbage gameplay and design that have earned it a 0.8 out of 10 aggregate user score.
 
Isn't this what HBM is? It's more or less dead in non-datacentre applications. Chiplets let you move the parts that don't need a cutting-edge process, like I/O, onto a cheaper one. This increases the yield of the expensive, large, performance-defining part by making it smaller overall.
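As a rough sketch of the yield point (the defect density and die sizes below are made-up numbers purely for illustration, not AMD's figures), a simple Poisson defect model shows why shrinking the performance-defining die helps:

```python
# Poisson die-yield model: yield ~ exp(-D0 * A)
# D0 = defect density (defects per cm^2), A = die area (cm^2).
# Defect density and die sizes are illustrative assumptions only.
import math

def die_yield(area_mm2: float, d0_per_cm2: float = 0.1) -> float:
    """Fraction of dies expected to come out defect-free."""
    return math.exp(-d0_per_cm2 * area_mm2 / 100.0)

print(f"600 mm^2 monolithic die:  {die_yield(600):.0%} yield")  # ~55%
print(f"300 mm^2 compute chiplet: {die_yield(300):.0%} yield")  # ~74%
```

Moving cache and I/O onto cheaper dies means the expensive leading-edge silicon gets the higher yield.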

That's got me too. AMD says it's using MCM chiplets, but when you look at the diagram it's memory chips sitting on the die.

But guess what, that's exactly what HBM is, so by AMD's own definition AMD was already making chiplet MCM GPUs years ago with the Fury.
 
AMD says that because Nvidia is pushing higher and higher power limits, it has to follow or get left behind. In a world where it's becoming harder to extract more performance, yet the demand for performance keeps growing, something has to give, and that's power. AMD says it has better efficiency than Nvidia, but the extra efficiency of RDNA3 is not enough to avoid increasing power draw if it wants to keep up with or even beat Nvidia's performance; the difference is that Nvidia has to push power harder than AMD does.




My take on deciphering what AMD says here: RDNA3 GPUs will consume more power than RDNA2 GPUs, but they will consume less than RTX4000 GPUs. No actual wattages were given so we have a pretty big range to guess with - for example the RTX4090 is rumoured to now be 450w, so that would mean the 7900xt can be anywhere between 310w and 440w
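Putting those rumoured bounds into numbers (a back-of-envelope sketch; none of these wattages are confirmed and the 300w RDNA2 baseline is the reference 6900 XT):

```python
# Power window implied by "more than RDNA2, less than RTX 4000", using rumoured
# and reference figures only - nothing here is a confirmed spec.
rdna2_top_w = 300       # reference 6900 XT board power
rtx4090_rumour_w = 450  # rumoured RTX 4090 board power

lower = rdna2_top_w + 10        # just above the top RDNA2 card
upper = rtx4090_rumour_w - 10   # just below the rumoured 4090
print(f"implied 7900 XT power window: {lower}-{upper} W")  # 310-440 W
```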
 
Yeah, I mean pre-Zen and pre-RDNA, AMD was brute-forcing its CPUs/GPUs to keep up with the competition, and failing.

It seems that Nvidia is now pushing its silicon harder, and Intel is doing the same with its CPUs, but they are NOT failing. They just want those 10 FPS that put them at the top of the bar graphs.
So AMD will have to do the same: Zen 4 will go to 170 watts, up from Zen 3's 105 watts, and RDNA3 will go up from 300 watts, to who knows what, but there is talk of 450 to 600 watts for Nvidia's RTX 4090. So I wouldn't be surprised if, going forward, any high-end GPU is at least 400 watts, because no one wants to risk being left behind by the competition.

Isn't that great when all our energy costs are sky-rocketing? It's going to cost 50p an hour to run these things.
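For anyone wanting to sanity-check that, here's a quick cost-per-hour sketch (assuming an illustrative UK tariff of about 34p per kWh, roughly where late-2022 capped rates sat; your rate will differ):

```python
# Electricity cost per hour of gaming at an assumed tariff of 34p per kWh.
def cost_per_hour_pence(watts: float, pence_per_kwh: float = 34.0) -> float:
    return watts / 1000.0 * pence_per_kwh

for gpu_watts in (300, 450, 600):
    print(f"{gpu_watts} W GPU alone: ~{cost_per_hour_pence(gpu_watts):.0f}p per hour")
# 300 W -> ~10p, 450 W -> ~15p, 600 W -> ~20p; add the CPU, monitor and the
# rest of the system and a heavy gaming hour climbs well beyond that.
```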


Good thing consumers have choices: AMD will still sell 65w CPUs and 200w GPUs, so you're more than welcome to buy those rather than the 170w CPU and 450w GPU.
 
AMD's Senior Vice President of Engineering at Radeon Technologies Group, David Wang, has confirmed some new details about RX7000 RDNA3 GPUs:


* Still using a hybrid core design that can do Rasterisation and Ray Tracing, with no separate fixed-function unit (FFU).

* Ray Tracing performance is improved over RDNA2

* Faster clock speeds than RDNA2

* RDNA3 supports AV1 codecs

* RDNA3 GPUs have DisplayPort 2.0 ports
 
Will you be buying AMD though Grim? :D

No, why would I? What a random question; just because I'm posting news about a product doesn't mean I'm planning to buy said product.

I'm sure I've told you guys before, but none of the screens I use for gaming in my house support Freesync, so there is no chance I'll buy an AMD GPU until I've replaced all my monitors and my TV.
 
You've priced up the materials but forgotten about labour and operating costs. AMD spends $2.85 billion per year on R&D. I don't know how many years it took to design RDNA, but as an example, Raja Koduri started work on Arc in 2017, so up to 5 years. That's many billions of dollars in design costs which need to be recovered through GPU sales before they even turn a profit. AMD's head office in Santa Clara, California has 18,924 employees, so that's probably close to a billion dollars per year just to keep the office running.

It's been estimated that Intel has already spent $3.5 billion on its desktop Arc GPUs, even though it has basically sold nothing yet.

They would need to sell at least 10 million GPUs at $350 a pop just to recover the existing sunk costs.
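That 10 million figure is just the estimated spend divided by the selling price, so if anything it's optimistic (a rough sketch using the numbers above; it ignores what it costs to actually build each card):

```python
# Break-even unit count implied by the estimated sunk cost and an assumed
# average selling price. Ignores per-unit manufacturing cost and retail margin.
sunk_cost_usd = 3.5e9   # estimated Arc spend so far
asp_usd = 350.0         # assumed average selling price per GPU

units = sunk_cost_usd / asp_usd
print(f"{units:,.0f} GPUs at ${asp_usd:.0f} each to recoup ${sunk_cost_usd / 1e9:.1f}B")
# -> 10,000,000 GPUs, and that treats every dollar of revenue as recovered cost.
```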

He also forgot about transistors; you can't just say that one 200mm² piece of silicon is worth the same as another 200mm² piece of silicon. Not only can they go into totally different products in different markets, they have different development costs, and even if you remove all those factors, the number of transistors on each die can vary, so yields can vary and therefore the cost to build does vary.

I'd think that much of the current pricing is market related, even though there are many factors at play; Zen 3 CPUs are pricey because they can be - they are still selling even at inflated prices, so why lower them - whereas AMD has a harder time moving GPUs, so those need a lower price.
 
If they get +50% performance per watt (as they told their investors), they should do well (unless they massively cut TDPs).

For VR, I prefer the frame timing of my 6800XT over my 3080Ti, but the 3080Ti has just enough grunt to run my sims at night with lowered settings and the 6800XT can't quite do it.

The 6800XT basically "feels" better in VR up to the point where it can't hold 90fps, then it's pretty bad. (I think Nvidia's reprojection works better so the 3080Ti remains playable when it starts to struggle)

Another 50% (for either card) should give me the grunt to run at night. If RDNA3 maintains the same solid frame pacing, and the price is good, AMD could get another sale from me.


RDNA 3 is looking efficient. It only has 2x 8-pin power connectors, indicating a TDP between 300 and 350w, probably closer to 350w given that the AMD triple-fan cooler has grown (the heatsink is larger).

It's interesting that AMD has reworked the memory configuration: they've lowered the L3 game cache size compared to what RDNA2 has, probably because they are getting extra bandwidth from faster VRAM modules.
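Here's the reasoning behind that 300-350w estimate, plus what AMD's +50% performance-per-watt investor claim would imply, as a quick sketch (the connector limits are per the PCIe spec; the 300w RDNA2 baseline and the 350w guess are assumptions):

```python
# Board power ceiling from the connectors (PCIe spec limits), plus a rough
# performance projection from AMD's claimed +50% performance per watt.
PCIE_SLOT_W = 75     # max draw through the PCIe slot
EIGHT_PIN_W = 150    # max draw per 8-pin connector

ceiling_w = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(f"2x 8-pin board power ceiling: {ceiling_w} W")  # 375 W

rdna2_w, rdna3_w = 300, 350   # assumed board powers (reference 6900 XT, and the guess above)
perf_per_watt_gain = 1.5      # AMD's +50% investor claim
uplift = perf_per_watt_gain * rdna3_w / rdna2_w
print(f"implied uplift over the 300 W RDNA2 card: {uplift:.2f}x")  # 1.75x
```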
 
According to the latest MLID video, AMD does not think it can compete with the 4090, but RDNA3 will be cheaper. Previously AMD thought it might even beat the 4090, or whatever Nvidia released, based on how much performance it estimated Nvidia would achieve, but Lovelace and the RTX 4090 have ended up faster than AMD anticipated.
 
Yep, there is no way the 3090 Ti beats the 6900 XT or 6950 XT in AC Valhalla; it's well established that that game heavily favours AMD GPUs. I'm surprised Nvidia included it in their benchmarks since they knew it would make them look bad.

From what I can tell, only the 16GB RTX 4080 matches the 6950 XT's performance in that game, which is not good for Nvidia, but that's still something like 20 or 30% faster than the 3090 Ti.
 