RDNA 3 rumours Q3/4 2022

Status: Not open for further replies.
Nope, they cited research papers.
They also talked about Intel's work in ray tracing, which predates even Nvidia's, which was surprising. Maybe they did it for Pixar or other big Hollywood studios, but they were talking about real-time stuff - maybe they were trying to do it using CPUs. I couldn't get more details.
 
AMD is working closely with Microsoft and Sony, so realistically they are developing tech to meet the needs of those large companies, who own tons of gaming studios. If you listen carefully to them, the RDNA2 emphasis was on maximum performance per mm², and their RT solution is area efficient. It's been quite clear for a very long time that AMD GPU uarchs are closely aligned with the consoles and rely on Microsoft and Sony essentially funding them, so AMD can concentrate more on CPU R&D. It's also likely that companies such as Microsoft and Sony have been doing their own R&D into the software side of things - consoles debuted advanced image reconstruction methods (with support in hardware) long before we saw that in PC games.

But ultimately, with Nvidia trying to rebrand the RTX4060/RTX4060TI as an RTX4080 12GB, and the RTX4070/RTX4070TI as a £1000+ RTX4080, the reality is most PC gamers are going to be on dGPUs that are weak at RT. Most mainstream PC gamers are price sensitive, and as Nvidia pushes the pricing of smaller and weaker dies upwards, the performance jumps under £500 will become more and more muted. Most of the RT-capable dGPUs on Steam are under £500, and the RTX3050/RTX2060/RTX3060/RTX3060TI have far more share than all the other RT-capable dGPUs combined. And if you look at non-RT-capable dGPUs, their share is even bigger. With a global recession coming, how many mainstream gamers will be able to afford to pay more and more, especially with interest rates rising? Nvidia/AMD need to be offering more value and deeper discounts to shift their unsold stock, but are so concerned about margins despite their huge revenue misses. In the past, when we had games with massive jumps in image quality, they usually arrived alongside excellent entry-level and mainstream dGPUs. When Crysis came out we had dGPUs such as the 8800GT.

Even my RTX3060TI's RT performance is barely sufficient in Cyberpunk 2077. It is going to be terrible in games such as Atomic Heart. Ultimately, again, it will be consoles that are going to be an important factor in all of this.
 
So that counters the theory whereby we auto-assume Nvidia think up and work on everything decades before everyone else, then? As pointed out above (along with the other companies working on 3D techniques in the industry before Nvidia), the entry portion of the dGPU stack will not champion the feature if it has to reduce quality and rely on DLSS.

Where I would accept Nvidia are miles ahead would be in AI and corporate industries like robotics, which most of the time are not gaming related.
 
So that counters the theory whereby we auto-assume Nvidia think up and work on everything decades before everyone else, then? As pointed out above (along with the other companies working on 3D techniques in the industry before Nvidia), the entry portion of the dGPU stack will not champion the feature if it has to reduce quality and rely on DLSS.

Where I would accept Nvidia are miles ahead would be in AI and corporate industries like robotics, which most of the time are not gaming related.
We need an Elon Musk to bring some Nvidia-stomping competition to the market to really even things out.

Tbh I'm really surprised he hasn't ventured into this, with his background, and gone into AI and dGPU technology.
 
There's a divergence in strategy: Nvidia is betting big on DLSS as the next frontier of growth while AMD is still taking the traditional approach. My guess is that next gen AMD will have the rug pulled from beneath them.
Also, I was looking at a few YouTube videos, and it seems Nvidia started working on these new technologies (speculative rendering and RT) a decade before AMD.

Not exactly. AMD have their own version of DLSS; their approach to how it works on the silicon is different, and AMD use less die space to make it happen. You could argue Nvidia's DLSS is better and you would be right, but FSR 2 is about 90% as good as DLSS 2 - you really have to analyse still captures to see the difference.

Much to Nvidia's frustration, I'm sure, AMD were able to get a form of RT and DLSS up and running within a year of Nvidia launching it on their GPUs. While I don't rate DLSS any better than FSR, RT for sure is better on Nvidia GPUs for now, because AMD's first iteration is very immature in comparison, but I think AMD's second generation will be a significantly bigger jump than Nvidia's third generation, and AMD will close the gap by quite a bit.
 
Radeon 6900XT prices in the US have dropped by $100 in the last week or so. Looks like AMD is desperate to get rid of the stock as soon as possible.

Might be #WishfulThinking, but I have hopes.

They've since dropped another ~$50. Stock levels are very low everywhere. I hope AMD have learned from Nvidia's 4080 12GB mess and won't attempt to pull off something similar.
 
Not exactly. AMD have their own version of DLSS; their approach to how it works on the silicon is different, and AMD use less die space to make it happen. You could argue Nvidia's DLSS is better and you would be right, but FSR 2 is about 90% as good as DLSS 2 - you really have to analyse still captures to see the difference.

Much to Nvidia's frustration, I'm sure, AMD were able to get a form of RT and DLSS up and running within a year of Nvidia launching it on their GPUs. While I don't rate DLSS any better than FSR, RT for sure is better on Nvidia GPUs for now, because AMD's first iteration is very immature in comparison, but I think AMD's second generation will be a significantly bigger jump than Nvidia's third generation, and AMD will close the gap by quite a bit.
No, I was commenting on the strategic direction: Nvidia considers DLSS and RT core offerings while AMD treats them as add-ons. That's pretty plainly apparent with Ada, and this divergence will be reflected in future products.
 
I must have watched an old documentary that followed Jensen around. I just recalled he got in his car and they talked about AI, and I thought Nvidia was well in with Tesla at the time. I didn't know they had moved on so fast (Intel first and now AMD for the infotainment deck; not sure what runs the driverless side of things), but this doesn't surprise me.
 
No, I was commenting on the strategic direction: Nvidia considers DLSS and RT core offerings while AMD treats them as add-ons. That's pretty plainly apparent with Ada, and this divergence will be reflected in future products.

From my perspective you missed out the massive shift in direction on compute and AI. Others often commented that Nvidia copied the Vega era, where gaming was the beneficiary of a business-like spec. If anything, AMD went back to focusing on the gaming element and it trumped the compute/business focus, so they effectively swapped places - just look at the GPU mining scene, where Nvidia were preferred instead of the previous era of Vega/RDNA1.
 
From my perspective you missed out the massive shift in direction on compute and AI. Others often commented that Nvidia copied the Vega era, where gaming was the beneficiary of a business-like spec.
I don't think so; AMD never had the software stack - it's always been an afterthought for AMD.
The thing is, ALUs are general-purpose logic cores - most of the time they just multiply two numbers and add the result to a third in FP32 precision (a fused multiply-add).
You could have used that anywhere. It's just one of those things where someone realises he can run but has never trained for the upcoming marathon and still tries to wing it.
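
A minimal sketch of that operation - the FP32 fused multiply-add, d = a*b + c - written here in plain C purely for illustration (the values and the fmaf call are my own example, not from the post; GPU shader ALUs issue this pattern as a single FMA instruction per lane):

#include <math.h>
#include <stdio.h>

int main(void)
{
    float a = 1.5f, b = 2.0f, c = 0.25f;
    /* d = a*b + c with a single rounding step; a GPU shader ALU lane
       executes this as one FMA instruction per clock. */
    float d = fmaf(a, b, c);
    printf("fmaf(%.2f, %.2f, %.2f) = %.2f\n", a, b, c, d); /* prints 3.25 */
    return 0;
}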
 
No, I was commenting on the strategic direction: Nvidia considers DLSS and RT core offerings while AMD treats them as add-ons. That's pretty plainly apparent with Ada, and this divergence will be reflected in future products.

That's true, yes, you're right, it's a difference in marketing strategy. I think, for Nvidia, going forward they want to make it less about raw performance and instead make it about features, like more glossy-looking games - as if you were selling an iPhone and not just a tool for the job, more about the infotainment system in the car than the car's engine.
 
I don't think so; AMD never had the software stack - it's always been an afterthought for AMD.
The thing is, ALUs are general-purpose logic cores - most of the time they just multiply two numbers and add the result to a third in FP32 precision (a fused multiply-add).
You could have used that anywhere. It's just one of those things where someone realises he can run but has never trained for the upcoming marathon and still tries to wing it.

AMD's software development is improving greatly. This has been a problem and AMD knew it; they have been working on changing that for some years now and are getting good at it.
They just paid $40 billion for the largest producer of FPGAs. Someone not interested in software stacks doesn't buy a company like that, let alone for that kind of money.
 
That's true, yes, you're right, it's a difference in marketing strategy. I think, for Nvidia, going forward they want to make it less about raw performance and instead make it about features, like more glossy-looking games - as if you were selling an iPhone and not just a tool for the job, more about the infotainment system in the car than the car's engine.
I believe DLSS-style upscaling is going to be the only path towards rendering efficiency as we start targeting higher-than-4K resolutions. There are various approaches being tried at the moment, and I'm willing to bet on this technique becoming the de facto rendering method for future games.
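
To put rough numbers on that efficiency argument, here is an illustrative back-of-the-envelope sketch in C (the 1440p-internal-to-4K-output figures are my own assumption, not from the post; shading cost scales roughly with the number of pixels actually rendered):

#include <stdio.h>

int main(void)
{
    const long native_4k    = 3840L * 2160L; /* ~8.3 million pixels */
    const long internal_qhd = 2560L * 1440L; /* ~3.7 million pixels */

    printf("4K native:      %ld pixels\n", native_4k);
    printf("1440p internal: %ld pixels\n", internal_qhd);
    printf("Pixels shaded vs native: %.0f%%\n",
           100.0 * (double)internal_qhd / (double)native_4k); /* ~44% */
    return 0;
}

Roughly the same output resolution, but well under half the shading work - which is the pitch behind DLSS/FSR-style temporal upscaling.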

You don't think so?

Look up ROCm and you will see they have had software and interest since 2016.
It's nowhere close to CUDA though; the CUDA feature set dwarfs ROCm's. And AMD cards don't do matrix FMA in hardware.

AMD's software development is improving greatly. This has been a problem and AMD knew it; they have been working on changing that for some years now and are getting good at it.
They just paid $40 billion for the largest producer of FPGAs. Someone not interested in software stacks doesn't buy a company like that, let alone for that kind of money.
The point I was making is that Nvidia is on top of the innovation game and AMD has been a follower - eventually AMD will get there, but by then Nvidia will have moved on to something else. There's a big difference in company DNA. Nvidia also employs many subject-matter experts who have nothing to do with hardware but rather focus on the end user and on industry problems that could be called deep-in-domain.
 
You said they never had the software stack. Clearly they did. And they have been able to do similar things for years.
What I meant was that their software stack was an afterthought. They started with nothing, then saw Nvidia doing something and cobbled something together because Nvidia was doing it, and it is still being shunned by the wider industry. They didn't go in with the kind of intent that Nvidia did with CUDA. It's just like the guy who starts training a day before the marathon - get the context.
 
I believe DLSS-style upscaling is going to be the only path towards rendering efficiency as we start targeting higher-than-4K resolutions. There are various approaches being tried at the moment, and I'm willing to bet on this technique becoming the de facto rendering method for future games.

Yes, AMD think quite differently; they don't think Moore's Law is dead.

Intel, Nvidia and AMD all saw the slowdown in Moore's Law, but they had very different approaches to it, which is awesome - well, for two of them...

Starting with Intel: as far as they were concerned they owned the entire x86 market, sitting pretty, and didn't really care about the slowdown in Moore's Law, choosing not to think much about it - because what are you going to do? Buy an AMD CPU?

For Nvidia, yeah, it's about turning the GPU from what was the engine to drive your game into something that's about features, including some that (I don't intend this in a derogatory way) cheat to get the frame rates up.

AMD have also thought about it; they think they can cheat Moore's Law through advanced packaging technologies - what Intel disparagingly referred to as "glue", which, believe it or not, IS actually a technical term for what AMD are doing.
For those who still think Intel were somehow right: Intel still cannot get anywhere near AMD's now obsolete 64-core Zen 3 chip in the datacentre, while AMD are now making 128-core Zen 4 chips - that is how far ahead AMD are of Intel.

AMD see the same strategies applying to GPUs, and don't count them out of pulling it off. I've been around a long time and I have watched AMD out-innovate Intel over and over and over... your Intel CPU would not work on your desktop without AMD's IP in it.

While at the same time, AMD are also looking to keep up with Nvidia's solutions.
 