AMD RDNA 2 potential longevity concerns?

I would think that an engine would have to be developed solely with RT in mind before we see anything impressive.

And since cross-platform support (i.e. consoles) is the #1 criterion when designing a new engine from the ground up, I doubt much progress with RT everywhere will be made this generation of consoles.

At the end of the day, neither Sony nor Microsoft was willing to pour enough money into RT in their consoles. Okay, it was AMD's first iteration of RT, but we can already see from Sony that 308mm² was about the maximum die size they were willing to buy. At the clocks Sony ended up wanting, a larger chip running at a slower speed would have yielded far better (someone at AMD made a big mistake there by approving the higher clocks, IMO). Dedicating enough die space purely to RT (unlike the hybrid approach AMD went for) might have added 50mm² (a figure picked out of the air!), but neither console player was willing to pay for that. So instead, they went with the minimum of RT. Next time they'll have to cough up more.
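To put rough numbers on that die-size trade-off, here's a back-of-the-envelope sketch using the classic dies-per-wafer approximation and a simple Poisson yield model. The defect density and the +50mm² die are assumptions picked purely for illustration, not anything from AMD or Sony:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Classic dies-per-wafer approximation (ignores scribe lines / edge exclusion)."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2) / die_area_mm2 - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Simple Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * defect_density_per_cm2)

D0 = 0.1  # defects per cm^2 -- assumed, purely illustrative
for area in (308, 358):  # PS5-sized die vs a hypothetical die with ~50mm² more RT hardware
    candidates = dies_per_wafer(area)
    good = candidates * poisson_yield(area, D0)
    print(f"{area} mm²: ~{candidates:.0f} candidate dies, ~{good:.0f} good dies per wafer")
```

With those made-up numbers the bigger die loses roughly 20% of the good dies per wafer, which is the sort of cost a console maker has to be willing to eat.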

Thing is, as far as dedicated silicon space goes, I would rather the console manufacturers took a leaf out of phone SoCs' book and dedicated space to neural nets. I think a robust, well-designed AI API framework which utilised neural-net hardware for NPC AI would be a far better use of silicon than more eye candy. (RT purists will disagree!)
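For what it's worth, the kind of thing I mean is NPC decision-making treated as a small inference workload rather than hand-written behaviour trees. A toy sketch below; the class names and the tiny policy network are entirely made up, just to show the shape such an API could take:

```python
# Hypothetical sketch of an NPC "AI API" that could be backed by dedicated NN hardware.
import numpy as np

class NpcPolicy:
    """Maps a small observation vector (distances, health, alert level...) to an action."""
    ACTIONS = ["patrol", "investigate", "take_cover", "attack", "flee"]

    def __init__(self, obs_dim: int = 8, hidden: int = 16, seed: int = 0):
        rng = np.random.default_rng(seed)
        # In a real engine these weights would be trained offline and the inference
        # would run on the dedicated neural-net block, not the CPU.
        self.w1 = rng.standard_normal((obs_dim, hidden)) * 0.1
        self.w2 = rng.standard_normal((hidden, len(self.ACTIONS))) * 0.1

    def decide(self, observation: np.ndarray) -> str:
        h = np.tanh(observation @ self.w1)   # one tiny hidden layer
        scores = h @ self.w2                 # per-action scores
        return self.ACTIONS[int(np.argmax(scores))]

guard = NpcPolicy()
obs = np.array([0.2, 0.9, 0.0, 1.0, 0.5, 0.3, 0.0, 0.7])  # made-up game-state features
print(guard.decide(obs))
```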

Of course, eye candy is probably easier to implement than good NPC AI
 
I would think that an engine would have to be developed solely with RT in mind before we see anything impressive.

And since cross-platform support (i.e. consoles) is the #1 criterion when designing a new engine from the ground up, I doubt much progress with RT everywhere will be made this generation of consoles.

Like you said, having RT considered at the engine level will mean a lot, and in turn will allow for better performance and therefore work better on consoles.

RDNA2 will still take the biggest hit regardless, but you'll get a good middle ground at a more acceptable performance cost. Right now we have games like FC6 with "RT Lite" and DL2 with balls-to-the-wall RT.
 
Most people are still on non-RT dGPUs. Anyone on an RDNA2 dGPU or an Ampere-based dGPU is in a minority of PC owners.
The most represented RT-capable dGPU in the top 10 is the RTX2060. There are more GTX1060 owners than RTX2060 and RTX2070 Super owners combined. There are more RTX2060/RTX2070/RTX2070 Super owners than RTX3060/RTX3060TI/RTX3070/RTX3080 owners combined, and the RTX3080 is way down the list. There are nearly 26 million PS5/Xbox sales so far. That is a huge number of people still on first-generation or lower-level RT-capable systems (if the system can do RT at all).

Most of the Turing range is barely better than RDNA2, even in Cyberpunk 2077, which favours Nvidia dGPUs. Only the RTX2080TI holds its own against Ampere.

That pretty much means a huge chunk of people owning RT-capable systems will be having issues with RT performance, not just RDNA2 dGPU and console owners.

Agree. I said it many moons ago: the 2060 didn't have enough grunt to brag about RT, yet in threads where people were asking what card to get, people kept mentioning "it has RT as well" like it was a major bonus. I can't see it on the second chart you provided, but I assume it sucked anyway.

So in summary, despite what the nVidia fans would have you believe, you can still enjoy a game without RT. It's eye candy.

Well yes, I thought this was a standard thought process but I guess the marketing would have you believe differently! :)

RDNA 2's longevity should be the last thing on a 10GB user's mind when Ampere gets replaced, ffs. :p

Oof.

enchantedwilow.gif
 
I think we need a doubling of RT performance in the RTX3060TI tier so you can run the effects at 1080p without DLSS.
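Rough arithmetic on why "double the RT performance" roughly lines up with dropping DLSS at 1080p: DLSS Quality is commonly quoted as rendering at about 67% of the output resolution per axis (roughly 720p internally for a 1080p output), so native tracing has to cover about 2.25x the pixels. A quick sketch, with those commonly quoted scale factors assumed rather than measured:

```python
# How many more pixels native 1080p has to trace compared with DLSS's typical
# internal render resolutions (per-axis scale factors are the commonly quoted
# values -- assumed here, not measured).
native_w, native_h = 1920, 1080
modes = {"DLSS Quality": 1.5, "DLSS Balanced": 1.72, "DLSS Performance": 2.0}

for name, scale in modes.items():
    w, h = round(native_w / scale), round(native_h / scale)
    extra = (native_w * native_h) / (w * h)
    print(f"{name:16s}: renders ~{w}x{h}, so native 1080p traces ~{extra:.1f}x more pixels")
```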
 
I think we need a doubling of RT performance in the RTX3060TI tier so you can run the effects at 1080p without DLSS.

DLSS makes images look better anyway; I run DLSS on principle in many games because I find it looks better. Even if RTX 4000 has triple the RT performance, I'll still use DLSS where I can because it looks better.
 
DLSS makes images look better anyway; I run DLSS on principle in many games because I find it looks better. Even if RTX 4000 has triple the RT performance, I'll still use DLSS where I can because it looks better.
If that's the case, get some Vaseline and smear that on your monitor for added effect :P
 
@TNA with his measly 8GB may disagree there after playing DL2.... ;) :p :D :cry:
I may disagree with @TNA though, I can test it on my 3070 and see if I can break the 70's 8GB again... ;) :p :D :cry:
I don't think anyone who paid £649 for a 3080 10GB will be too concerned.
Correct, great card, I'm not concerned in the slightest that my 80 FE might run out of VRAM. :)

Point still stands though, after the 40 series launches, it's Andrex time for Ampere.
 
@TNA with his measly 8GB may disagree there after playing DL2.... ;) :p :D :cry:
Haha. I am surprised myself, I did not expect to be able to play it with RT on. But it just…works! :cry:

If that's the case, get some Vaseline and smear that on your monitor for added effect :p

@shankly1985 would disagree. He is using FSR when playing Dying Light 2 and says it looks good, but in my experience testing both FSR and DLSS on the performance preset, DLSS looked much better to me. No Vaseline.


I may disagree with @TNA though, I can test it on my 3070 and see if I can break the 70's 8GB again... ;) :p :D :cry:
I never claimed 8GB is not breakable though ;)

I said I surprisingly managed to get Dying Light 2 working with RT on. What I did find was that the card ran out of grunt before there was an issue with VRAM in this game. That's why I had to use optimised RT settings to keep the fps high enough to be playable.
 
*snip*
@shankly1985 would disagree. He is using FSR when playing Dying Light 2 and says it looks good, but in my experience testing both FSR and DLSS on the performance preset, DLSS looked much better to me. No Vaseline.
*snip*

I was just poking Grim5 a bit for funzies. That said, FSR looks like *** IMHO if your selected res is 1440p, like my monitor's native res. Not saying it's entirely FSR's fault, because 1440p native looks a bit bad as well. However, running 4K through VSR (since my monitor is 1440p) and then FSR looks pretty good :P. Seems a bit backwards, but what do I know, I'm just happy with the end result and am having a blast with the game.
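A rough sketch of why the "backwards" route works out, assuming FSR 1.0 Quality mode's commonly quoted ~1.5x per-axis scale factor: targeting 4K via VSR means FSR renders at roughly the panel's native 1440p before the upscale/downsample round trip, whereas targeting 1440p directly drops the internal render to about 960p:

```python
# Compare FSR's internal render resolution for two targets on a 1440p monitor.
# The 1.5x per-axis scale for FSR Quality is the commonly quoted value (assumed).
def fsr_internal(target_w: int, target_h: int, scale: float = 1.5):
    return round(target_w / scale), round(target_h / scale)

# Option A: FSR Quality with the monitor's native 1440p as the target.
a = fsr_internal(2560, 1440)

# Option B: VSR exposes a 4K target; FSR Quality upscales to 4K,
# then VSR downsamples that 4K image back to the 1440p panel.
b = fsr_internal(3840, 2160)

print(f"FSR Quality -> 1440p target: renders ~{a[0]}x{a[1]}")                 # ~1707x960
print(f"FSR Quality -> 4K target (VSR back to 1440p): renders ~{b[0]}x{b[1]}")  # ~2560x1440
```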
 
I was just poking Grim5 a bit for funzies. That said, FSR looks like *** IMHO if your selected res is 1440p, like my monitor's native res. Not saying it's entirely FSR's fault, because 1440p native looks a bit bad as well. However, running 4K through VSR (since my monitor is 1440p) and then FSR looks pretty good :p. Seems a bit backwards, but what do I know, I'm just happy with the end result and am having a blast with the game.
Haha. It is indeed fun to poke Grim :D
 
From the Grim YouTube comparison above, I thought the 6900 image was better in that first area, though the DLSS fps is obviously the playable speed people want.

From the feedback given above, it sounds more like the devs didn't bother too much with catering for other settings, rather than one GPU's features being poorer than another's.
 