AMD RDNA 2 potential longevity concerns?

Caporegime | Joined: 4 Jun 2009 | Posts: 31,859
Given how people go on about VRAM being the main thing for longevity, I thought it would be interesting to see what others think about RDNA 2 and its potential limitations, and whether there is cause for concern about its future, much like how some are concerned about Nvidia and its "lack" of VRAM.

Main things that come to my mind:

- FSR

It isn't being added to all the games that really need it (obviously RSR will help with this, but as we have seen with forcibly injecting FSR, and with NIS on Nvidia's side, driver-level injection isn't ideal and isn't always guaranteed to give good results; it could improve with time though).

- ray tracing performance

There's no denying RDNA 2 ray tracing is beyond awful, and we have seen first hand how even a 3070 can match/best a 6900 XT in quite a few RT titles as of right now. Given that RT is being added to the vast majority of games, it is quite the limitation to have imo, not to mention if any more ray-tracing-only titles like Ubisoft's upcoming Avatar get announced...

Even in AMD-sponsored games, ray tracing still hits RDNA 2 hard:

https://www.reddit.com/r/Amd/comments/oudbtc/rx6800xt_vs_rtx3080_realtime_ray_tracing/

What is going to happen when/if RDNA 3 comes out and matches or even exceeds Nvidia in RT performance? Surely we should expect the ray tracing effects to no longer be held back as much as they currently are?

Will RDNA 2 owners upgrade to RDNA 3 for the better ray tracing performance to avoid this particular limitation?
 
I wouldn't buy any card with less than 12GB these days; my 2070 Super runs out of VRAM before it runs out of muscle in Icarus, so it's already happening.

Let's keep this to AMD and their limitations; we already have plenty of Nvidia/VRAM-based threads.
 
Of course, what I didn't mention in my response in the RDNA2 Refresh for 2022 6x50XT cards thread was that I mainly play modded games.

And modded games have plenty of restrictions and tend to be texture heavy. That's because if a studio were spec'ing a game to higher hardware requirements, they would balance the model detail, texture quality and shaders. Modders don't have access to the original models, so they just up the texture quality and maybe apply a few too many ENB effects.

However, since this is a given, those who like to run heavily modded games should consider this from the readme for the Skyrim SE Wabbajack modlist, Aldrnari:
http://www.wabbajack.org/#/modlists/info?machineURL=aldrnari

PC Specifications
Aldrnari is meant to use every single inch of my computer, and here are my specs:
- i7-7700K
- Zotac 1080 Ti (11GB of VRAM)
- 32GB of 3200MHz DDR4 RAM

Full PC Part Picker setup is here. I would recommend at least 8GB+ of VRAM for 1080p, and for 1440p you will need a minimum of 10GB of VRAM, although more is highly recommended. There are tweaks at the bottom of this readme for 6GB of VRAM users, but they void support. I do not have stable 60FPS at 1440p everywhere with my setup because, frankly, I do not care about framerate if combat is fluid and I can take sexy screenshots. The average is 60FPS with dips into the 50s in very few areas at 1080p because my screen is 3840x1080 (no I don't play in ultrawide, that's pointless and no support will be offered for that either).

For those who only want to play current games as they are released, VRAM becomes less of an issue, as no developer is going to alienate 8GB and 10GB card owners.
 
On FSR/RSR:

These technologies make up for the fact that this generation's efforts at maintaining playable 4K aren't entirely successful. A lot of games, even with adequate hardware, can't reach good detail at 4K@60 for all sorts of reasons, especially ray tracing. I think once 4K performance reaches this benchmark for most home users, these upscaling options won't be used as much. When that will happen, though, I don't know; it might not be with RDNA3 if the cards cost the earth.
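To put some numbers on what that upscaling actually buys you: FSR 1.0's quality modes render at a fixed fraction of the output resolution and spatially upscale to the target. A minimal sketch in Python, using the per-axis scale factors from AMD's public FSR 1.0 documentation (treat the exact factors and mode names as assumptions if they've changed in later releases):

```python
# Rough sketch: internal render resolution for FSR 1.0 quality modes.
# Per-axis scale factors are taken from AMD's public FSR 1.0 docs;
# treat them as assumptions if later releases have changed them.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the resolution the game actually renders at before upscaling."""
    scale = FSR_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

if __name__ == "__main__":
    for mode in FSR_SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at {w}x{h}")
    # e.g. Quality mode renders at 2560x1440 and upscales to 3840x2160,
    # which is where the "playable 4K" headroom comes from.
```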

On ray-tracing:

In a decade we'll be wondering how we enjoyed games without it, basically. I doubt RDNA3 will get us anywhere near that state, but if AMD don't catch up they will lose what little market share they currently have. And that would be a shame.
The question is whether anyone cares at this point, as cards are so expensive that the ones that'll sell the most won't really be capable of showing off ray-tracing on Jonny Smith's fancy new OLED TV. Price has to come down, otherwise it'll be the main determining factor for what's popular in the market.

I doubt RDNA2 users will upgrade if they have a 6800X or better. Not initially, anyway. It's all about price for the next year or two.
 
It will be interesting

I wouldn't worry about ray tracing because you can just turn it off

but then it's no longer a compute-heavy GPU
 
The amount of VRAM on RDNA2-based cards is the last thing you need to worry about. The one thing AMD has done right with the 6000-series launch is ensuring each SKU has ample VRAM for its targeted resolution. @Nexus18 already mentions the areas that will compel users to upgrade in the future (namely ray tracing), and that will be the measure of the cards' longevity. FSR is an interesting piece of technology; I wonder how many people will start using it as the norm in 3-4 years' time, if new GPUs still cost a ridiculous amount of money, just to get some extra life out of their hardware?
 
I won't upgrade my 6800 XT; I think I have one, possibly two, games that have ray tracing, and the card flies in the games I do play. I'll probably sit out a generation and see what happens.
 
Personally, I upgraded my whole machine after the 3xxx series release and went through three Nvidia cards before the 6900 XT, but this was the card I always wanted to complete my AMD setup (I had a 5700 XT before, and before that I'd been Intel and Nvidia for 10+ years, right back to the ATI X1800 XT, apart from the odd AMD dabble), and I don't see myself wanting to update it when the next gen comes around. Cost and supply will probably put it out of my reach too.

Ray tracing I can take or leave; in Far Cry 6, yes, the rain and puddles looked nice, but then again it's just a visual enhancement, it doesn't affect the actual game.
FSR I used for RE Village, but that was only because I had ray tracing on and all settings maxed. If I don't use RT, I don't use or need FSR in the games I play.
I can't see it making a massive difference in the next few years. The landscape of updating your PC has completely changed.
 
Interesting variety of viewpoints so far.

Of course, what I didn't mention in my response in the RDNA2 Refresh for 2022 6x50XT cards thread was that I mainly play modded games.

And modded games have plenty of restrictions and tend to be texture heavy. That's because if a studio were spec'ing a game to higher hardware requirements, they would balance the model detail, texture quality and shaders. Modders don't have access to the original models, so they just up the texture quality and maybe apply a few too many ENB effects.

However, since this is a given, those who like to run heavily modded games should consider this from the readme for the Skyrim SE Wabbajack modlist, Aldrnari:
http://www.wabbajack.org/#/modlists/info?machineURL=aldrnari

For those who only want to play current games as they are released, VRAM becomes less of an issue, as no developer is going to alienate 8GB and 10GB card owners.

Yeah, your post and a few other people's posts kind of sparked my interest in how we "define" and gauge longevity when it comes to this area.

That is the main cause of VRAM limitations in my experience so far; I had issues in Cyberpunk because I loaded in a ton of modded 4K-8K texture packs and hit FPS drops and frame latency issues, so I had to remove two of the packs, which resolved it. Can't say I am a big modder these days generally though, as too much faff is required :p The only other game I modded somewhat was Fallout 4.
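To put rough numbers on why stacking 4K-8K texture packs eats VRAM so quickly, here's a back-of-the-envelope sketch. The bytes-per-texel figures for RGBA8 and the BC block-compressed formats are the standard ones, but the pack sizes and the assumption that everything is resident at once are purely illustrative, not taken from any real mod:

```python
# Back-of-the-envelope VRAM estimate for high-res texture packs.
# Bytes-per-texel are the standard figures for these formats; the example
# texture counts in the comments are made-up illustrative numbers.
BYTES_PER_TEXEL = {
    "RGBA8 (uncompressed)": 4.0,
    "BC7 (compressed)": 1.0,
    "BC1 (compressed)": 0.5,
}

def texture_mib(side: int, bytes_per_texel: float, mipmaps: bool = True) -> float:
    """Approximate size of one square texture in MiB (a full mip chain adds ~1/3)."""
    base = side * side * bytes_per_texel
    if mipmaps:
        base *= 4 / 3
    return base / (1024 ** 2)

if __name__ == "__main__":
    for fmt, bpt in BYTES_PER_TEXEL.items():
        four_k = texture_mib(4096, bpt)
        eight_k = texture_mib(8192, bpt)
        print(f"{fmt}: one 4K texture ~{four_k:.0f} MiB, one 8K texture ~{eight_k:.0f} MiB")
    # Even with BC7, a hundred 8K textures resident at once is roughly 8 GiB,
    # which is how a couple of packs can push past an 8-11 GB card.
```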

On FSR/RSR:

These technologies make up for the fact that this generation's efforts at maintaining playable 4K aren't entirely successful. A lot of games, even with adequate hardware, can't reach good detail at 4K@60 for all sorts of reasons, especially ray tracing. I think once 4K performance reaches this benchmark for most home users, these upscaling options won't be used as much. When that will happen, though, I don't know; it might not be with RDNA3 if the cards cost the earth.

On ray-tracing:

In a decade we'll be wondering how we enjoyed games without it, basically. I doubt RDNA3 will get us anywhere near that state, but if AMD don't catch up they will lose what little market share they currently have. And that would be a shame.
The question is whether anyone cares at this point, as cards are so expensive that the ones that'll sell the most won't really be capable of showing off ray-tracing on Jonny Smith's fancy new OLED TV. Price has to come down, otherwise it'll be the main determining factor for what's popular in the market.

I doubt RDNA2 users will upgrade if they have a 6800X or better. Not initially, anyway. It's all about price for the next year or two.

Personally I think FSR, DLSS etc. are here to stay and are probably some of the best things to come to gaming in a while, especially for people with weaker/older cards. Sadly, a lot of developers are just adding them in to avoid having to optimise their games though :(

I think consoles will have a big say in the future of ray tracing and how demanding it will be, as surprisingly console games have been adding ray tracing where possible too (probably because the main draw for developers is that it saves them time in their workflow), so if the rumours of the console refresh using RDNA 3 are true, I would say we'll see that being the real catalyst for ray tracing uptake (which will of course depend on whether RDNA 3 RT performance improves...)

It will be interesting

I wouldn't worry about ray tracing because you can just turn it off

but then it's no longer a compute-heavy GPU

That's kind of the point of this thread... :p In the same way settings "potentially" have to be sacrificed because of Nvidia's VRAM limitations, the same is happening with RDNA 2, but because of its lack of RT grunt and/or FSR uptake in the games that need it; both take different paths but the outcome is the same: reducing or turning off settings.

@Nexus18 already mentions the areas that will compel users to upgrade in the future (namely ray tracing), and that will be the measure of the cards' longevity. FSR is an interesting piece of technology; I wonder how many people will start using it as the norm in 3-4 years' time, if new GPUs still cost a ridiculous amount of money, just to get some extra life out of their hardware?

Pretty much what I am trying to gauge from this thread. The way some people go on, you would think the only reason people upgrade is for more VRAM and nothing else :p

Quite a few Turing owners upgraded to Ampere for the better ray tracing performance it offered.
 
Nvidia are rolling out their own "agnostic" version of FSR; open source is the way forward, as closed source usually comes at a price. Think FreeSync vs G-Sync. AMD are coming with the 6950 XT, with faster VRAM but likely on 6nm; if the core can be boosted in the same way as the 6500 XT apparently is, 1GHz of extra speed should make up for any perceived problems with DXR. Raw horsepower at it again.

The problem, though, is one of cost; TSMC already charge $20k for a 7nm wafer (Ian Cutress dropped that one in an interview on his tech channel 10 days ago), and costs are going up even more. If a wafer itself costs $30k or even $40k at 5nm, it's the retail consumer who pays the most per item, as we don't have thousand-item contracts.
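To illustrate how wafer price feeds through to per-chip cost, here's a rough dies-per-wafer sketch. The die size, defect density and simple Poisson yield model are illustrative assumptions, not figures for any actual AMD part or TSMC contract:

```python
import math

# Rough sketch of how wafer price feeds into per-die cost.
# Die size, defect density and the yield model are illustrative assumptions,
# not real figures for any AMD/TSMC product or contract.
WAFER_DIAMETER_MM = 300.0

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic gross dies-per-wafer approximation (ignores scribe lines)."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float,
                      defect_density_per_cm2: float = 0.1) -> float:
    """Wafer cost spread over yielded dies, using a simple Poisson yield model."""
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return wafer_cost / good_dies

if __name__ == "__main__":
    for wafer_cost in (20_000, 30_000, 40_000):
        # ~520 mm² is just "a big GPU die" for illustration.
        cost = cost_per_good_die(wafer_cost, die_area_mm2=520.0)
        print(f"${wafer_cost:,} wafer, ~520 mm2 die: ~${cost:.0f} per good die")
    # Roughly: moving from a $20k to a $40k wafer doubles the silicon cost per
    # chip before packaging, memory, board and margins are added on top.
```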
 
For me, ray tracing is not a concern; I have no interest in it, but then I mostly play top-down strategy or first-person shooters.

As for longevity, I tend to change my cards every 3-4 years, so my RX 6700 XT should be good for most of that at 1440p, and it does have 12GB of VRAM.
 
Will RDNA 2 owners upgrade to RDNA 3 for the better ray tracing performance to avoid this particular limitation?
Of course they will, just like the 3080 owners said they would upgrade to the next gen. :)
I think consoles will have a big say in the future of ray tracing and how demanding it will be, as surprisingly console games have been adding ray tracing where possible too (probably because the main draw for developers is that it saves them time in their workflow), so if the rumours of the console refresh using RDNA 3 are true, I would say we'll see that being the real catalyst for ray tracing uptake (which will of course depend on whether RDNA 3 RT performance improves...)

Adding RT does not save any time at the moment; it is extra work. It may save a lot of time on RT-only games, but I think any game developed primarily for consoles in the next 5 years, and most of the games developed for PC, will not be RT-only. There may be a handful of RT-only games sponsored by Nvidia (or AMD, or Intel), depending on who has an advantage in RT rendering at that point, released for the PC-only market.
I don't know if the console refresh will use a much better video chipset, but games released for consoles will have to run at least decently on the PS5 and Xbox Series X for the whole duration of this console generation. So even if (let's hope) we see heavier RT titles released for the "PS5 Pro", the game will have dumbed-down settings for the original PS5.
 
AAA games are designed around consoles, so considering how much faster the desktop cards are, longevity is of no concern due to RT performance. The reason RT is "held back" has exactly zero to do with AMD and everything to do with needing to re-tool for next-gen (which is a process and hasn't been finished yet) and of course having to cater to said consoles, so RT will be kept to a minimum outside of where it gives the greatest returns (usually reflections) for the cost. If anything, RDNA 2 will age better than Ampere simply because Nvidia is quick to discard its older products as soon as the new one's out the door, but at least AMD and devs are committed to RDNA 2(+) for this decade. Overall, even with Ampere you'll start dropping RT just because performance requirements will outpace the hardware (and driver) capability, so it will fall back to 99% raster anyway unless you want to drop below 1080p or below 60fps.
 
I won't upgrade my 6800 XT; I think I have one, possibly two, games that have ray tracing, and the card flies in the games I do play. I'll probably sit out a generation and see what happens.
As an owner of a 3070 Ti, I only own and actively play one game that has RT, and that is World of Warcraft.

Most games with RT just aren't my thing or don't look that good anyway. Cyberpunk looks old-gen, and even with RT it looks worse than games without RT.
 
As an owner of a 3070 Ti, I only own and actively play one game that has RT, and that is World of Warcraft.

Most games with RT just aren't my thing or don't look that good anyway. Cyberpunk looks old-gen, and even with RT it looks worse than games without RT.

Yeah, I tried Cyberpunk with RT and didn't think that much of it tbh. Sure, it looks better in some places but also worse in others. But then I played through half of it on my Radeon VII and the other half on my 6800 XT; I thought the storytelling and gunplay were pretty damn good and carried the game through. Graphics were good but not mind-blowing.
 
I think for any GPU generation you lose the raw power to drive high frame rates before VRAM becomes an issue.

If you're still using an RDNA2 GPU in 4 years' time, you'll really need to start considering graphics settings and resolution to keep frame rates at acceptable levels. FSR and DLSS help here.

Ray tracing is like any new GPU feature of the past: we buy a faster GPU to get better performance with these new technologies. PhysX, tessellation, AA and shadows have all at some point been a massive performance hog.
 
I see a lot of people saying "I'll just turn off RT", but it's something that will become mainstream. The long-term goal is for it to replace existing static/manual lighting tech as it's faster to implement, more consistent and more accurate.

Not saying it'll happen right now as it's still evolving and being optimised, which means that the final result will be less resource-heavy, but it's still something that you want on your card.

Also, if AMD release their DL FSR, you're out of luck.
 
Interesting variety of viewpoints so far.
That is the main cause of VRAM limitations in my experience so far; I had issues in Cyberpunk because I loaded in a ton of modded 4K-8K texture packs and hit FPS drops and frame latency issues, so I had to remove two of the packs, which resolved it. Can't say I am a big modder these days generally though, as too much faff is required :p The only other game I modded somewhat was Fallout 4.
Wabbajack now supports Cyberpunk, but nobody has made modlists yet. For Skyrim and Fallout 4, the Wabbajack modlists have made getting a stable modded game far easier: mainly download the list, configure a few things and you're good to go. You are stuck with what the modlist authors choose, but it does mean you don't need to know much about conflict resolution.

Anyone who's potentially interested in modded games and hasn't visited the Wabbajack Gallery really should.
http://www.wabbajack.org/#/modlists/gallery

But yes, if a studio were doing an upgrade of their existing title, they would probably do a far better job of balancing the hardware resource budget, rather than just upping the texture quality as most mod authors do. However, things are as they are, and that's why I value VRAM highly.
 