RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
What wunkley said worked. He is right, non-native res in Windows (on some monitors) won't activate HDR.

I assume your 3440x1440 is the native res of your monitor. That's not what we're talking about.

What is the reason for not using native res in the OS? Sounds weird.
 
The cache on these is huge. Looking at the diagrams for Navi 31 and 32, the total size of the cache chiplets looks nearly as big as the actual GPU core.
 
Smart move from AMD. Users are already angry at the high upgrade costs for Zen4 CPUs, so keeping older power connectors on RDNA3 means no one needs to run around looking for new cables or a new PSU.

And of course letting others be the first adopters is smart too. AMD did that with DLSS and RT, letting Nvidia grind its teeth on them so that when AMD came in it had a more mature offering. AMD is only really going in on RT with RDNA3, and they will have a good start. They're letting Nvidia grind it out again with its buggy DLSS3 frame generation stuff; wait a while and come out next year, maybe, with their own version that's more mature.
 
Perhaps it's nonsense, but could there possibly be price fixing going on?

Yes but not the illegal type.

What's illegal is when you talk to your competitor and directly set a price.

Legal "price fixing" happens when an industry has little competition and both companies, without ever talking to each other, realise that it's in their long-term financial interest not to have prices that differ much from each other (sometimes called tacit collusion). This is how petrol companies work too.
 
Can someone explain why people care about ray tracing? In games I'm too busy playing to notice fancy reflections. Are people so sucked into the Nvidia Kool-Aid that they need RT for no apparent reason?

Because the graphics look better

It's like asking why do people care about using high graphics settings instead of low. Or why people will buy a PC that's cost 3 times the price of a PlayStation.
 
This is dubious. When a game has *properly implemented* baked-in lighting, it looks great. When devs use ray tracing as an excuse to phone-in the baked-in lighting, then yes, ray tracing looks better than crap-tier baked-in lighting.

Control is an example of such a title. Ray tracing looks nice in Control, but when you turn off ray tracing, it looks like a 6-year-old coded the lighting on a "bring your child to work" day.

This is just whataboutism. Some games still look great on low settings and not far from high, some look 20 years older on low than high.
 

If correct we're in for a treat :cool:
The last part is the most exciting "news" in GPUs for a long time.


Tldr?

In other news today, an investment company says both AMD and Nvidia should be worried because Intel is going to overtake TSMC in having the highest-transistor-density node by the end of next year.
 
Like in 2016, 2017, 2018, 2019, 2020... Intel's return to node leadership is always just around the next corner.

Some people are deluded, like Jon Peddie telling MLID that Intel will sell Arc en masse just because it's Intel. MLID himself postured that Arc would be a huge success, because it's Intel, and if you don't believe that you're stupid, because it's Intel.

And then there's me, stupid, perhaps, but with a firm grip on reality....

Their claim is, funnily enough, based on TSMC's forecasts, not Intel's: TSMC has delayed its production plans for nodes more advanced than what it can produce today.
 
The latest MLID video is out with new info.

It confirms that RDNA3 is launching as an efficiency- and cost-focused architecture, not max performance. Pricing will definitely be lower than Nvidia's.

* The 7700XT is not faster than the 6950XT.

* The reference 7900XTX is 350W TDP with 2x8-pin, but AIB models are 450W TDP with 3x8-pin, and many will use the same 3/4-slot coolers found on the RTX4090.

* The 7900XTX does not match the RTX4090 in rasterisation or ray tracing. However, with the extra 100W TDP on AIB models, they might see some really nice performance gains from overclocking, so it's up in the air as to how fast it can get.

* The 7900XT and XTX will be in good supply; AMD is confident in its decision to front-load supply to premium models after seeing how well the RTX4090 has sold.
 
I'm not convinced they do. They've already cranked up the power and have a die which is 608mm². I believe the max die size for 5nm (the reticle limit) is around 850mm².

And given Nvidia has to cram everything onto one die, they can't just fill that extra space with more shaders, as they'll need to add all the other components of a chip.

AMD's chiplet design means their main chip is 350mm² and it is filled with shaders.

What? The 4090 already uses the full-size die; the die wouldn't be any bigger than the 608mm² it already is.

The only difference is that some of the cores on the 4090 are DISABLED. To be exact, roughly 12% of the cores on the RTX4090 are disabled and would be turned on for an RTX4090 Ti.
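As a rough sanity check of that percentage, assuming the commonly reported AD102 configuration (144 SMs on the full die, of which 128 are enabled on the RTX 4090):

```python
# Sanity check of the "~12% of cores disabled" claim, assuming the
# commonly reported AD102 figures: 144 SMs on the full die, with
# 128 SMs enabled on the RTX 4090.
full_die_sms = 144
rtx4090_sms = 128

disabled_fraction = (full_die_sms - rtx4090_sms) / full_die_sms
print(f"{disabled_fraction:.1%}")  # ~11.1%, close to the ~12% quoted
```

So a hypothetical full-die 4090 Ti would gain those 16 SMs back without the die getting any larger.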
 
AMD has done very well this time at hiding performance. They have pulled an Nvidia by not providing AIBs with working drivers until tomorrow, so AIB employees can't run benchmarks and leak performance numbers.
 