AMD RDNA 2 potential longevity concerns?

Given people go on about vram being the main thing for longevity.... I thought it would be interesting to see what others think about RDNA 2 and its potential limitations, and whether there is cause to be concerned about its future, much like how some are concerned about nvidia and its "lack" of vram.

Main things that come to my mind:

- FSR

Not being added to all the games which really need it (obviously RSR will fix this, but as we have seen with forcibly injecting FSR, as well as NIS on nvidia's side, it's not ideal and not always guaranteed to give good results; it could improve with time though)

- ray tracing performance

No denying RDNA 2 ray tracing is beyond awful, and we have seen first hand how even a 3070 can match/best a 6900xt in quite a few RT titles right now. Given RT is being added to the vast majority of games, it is quite the limitation to have imo, not to mention if any more ray tracing only titles like ubi's upcoming avatar get announced....

Even in amd sponsored games, ray tracing still hits rdna 2 hard:

https://www.reddit.com/r/Amd/comments/oudbtc/rx6800xt_vs_rtx3080_realtime_ray_tracing/

What is going to happen when/if RDNA 3 comes out and matches or even exceeds nvidia in RT performance? Surely we will expect the ray tracing effects to no longer be held back as much as they currently are?

Will RDNA 2 owners upgrade to RDNA 3 for the better ray tracing performance to avoid this particular limitation?
 
I wouldn't buy any card with less than 12GB these days, my 2070 Super runs out of VRAM before it runs out of muscle in Icarus, so it's already happening.

Let's keep this to amd and their limitations, we already have plenty of nvidia/vram based threads.
 
Interesting variety of viewpoints so far.

Of course what I didn't mention in my response in the RDNA2 Refresh for 2022 6x50XT cards thread, was that I mainly play modded games.

And modded games have plenty of restrictions and tend to be texture heavy. That's because if a studio was spec'ing a game to higher hardware requirements, they would balance the model detail, texture quality and shaders. Modders don't have access to the original models, so they just up the texture quality and maybe apply a few too many ENB effects.
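As a rough illustration of why "just up the texture quality" bites into VRAM so quickly, here's a back-of-the-envelope sketch (my own illustrative numbers, not taken from any particular game or mod):

```python
# Rough back-of-the-envelope numbers (purely illustrative): one uncompressed
# RGBA texture costs width * height * 4 bytes, plus roughly a third extra
# for its mipmap chain.

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    size = width * height * bytes_per_pixel
    if mipmaps:
        size *= 4 / 3
    return size / (1024 ** 2)

for res in (2048, 4096, 8192):
    print(f"{res}x{res}: ~{texture_mb(res, res):.0f} MB")
# 2048x2048: ~21 MB, 4096x4096: ~85 MB, 8192x8192: ~341 MB
# Block compression (BCn) cuts these sizes 4-8x, but a modlist swapping
# hundreds of textures up to 4K/8K still adds several GB on top of the base game.
```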

However, since this is a given, those who like to run heavily modded games should consider this from the readme for the Skyrim SE Wabbajack modlist, Aldrnari:
http://www.wabbajack.org/#/modlists/info?machineURL=aldrnari

For those who only want to play current games as they are released, VRAM becomes less of an issue, as no developer is going to alienate 8GB and 10GB card owners.

Yeah, your post and a few other people's posts kind of sparked my interest as to how we "define" and gauge longevity when it comes to this area.

That is the main cause of vram limitations in my experience so far. I had issues in cyberpunk because I loaded a ton of modded 4-8k texture packs and hit fps drops and frame latency issues, so had to remove 2 of the packs, which resolved it. Can't say I am a big modder these days generally though as too much faff required :p Only other game I modded somewhat was fallout 4.

On FSR/RSR:

These technologies make up for the fact that this generation's efforts at maintaining playable 4K aren't entirely successful. A lot of games, even with adequate hardware, can't reach good detail at 4K@60 for all sorts of reasons, especially ray-tracing. I think once 4K performance reaches this benchmark for most home users, these upscaling options won't be used as much. When that will happen, though, I don't know - it might not be RDNA3 if they cost the earth.

On ray-tracing:

In a decade we'll be wondering how we enjoyed games without it, basically. I doubt RDNA3 will get us anywhere near that state, but if AMD don't catch up they will lose what little market share they currently have. And that would be a shame.
The question is whether anyone cares at this point, as cards are so expensive that the ones that'll sell the most won't really be capable of showing off ray-tracing on Jonny Smith's fancy new OLED TV. Price has to come down, otherwise it'll be the main determining factor for what's popular in the market.

I doubt RDNA2 users will upgrade if they have a 6800X or better. Not initially, anyway. It's all about price for the next year or two.

Personally I think fsr, dlss etc. is here to stay and probably one of the best things to come to gaming in a while, especially for people with weaker/older cards. Sadly, a lot of developers are just adding them in so as to avoid having to optimise their games though :(

I think consoles will have a big say in the future of ray tracing and how demanding it will be, as surprisingly console games have been adding ray tracing where possible too (probably because the main thing for developers is that it saves them time in their workflow). So if the rumours of the console refreshes using rdna 3 are true, I would say that will be the real catalyst for ray tracing uptake (of course it will depend on whether rdna 3 RT perf. improves...)

It will be interesting

I wouldn't worry about ray tracing because you can just turn it off

but it's no longer a compute heavy gpu

That's kind of the point of this thread.... :p The same way settings are "potentially" having to be sacrificed because of nvidia's vram limitations, the same is happening with rdna 2, but because of its lack of RT grunt and/or FSR uptake in the games that need it, i.e. both take different paths but the outcome is the same = reducing/turning off settings.

@Nexus18 already mentions the areas that will compel users to upgrade in the future (namely ray tracing), and that will be the measure of the cards' longevity. FSR is an interesting piece of technology; I wonder how many people will start using it as the norm in 3/4 years' time, if new GPUs still cost a ridiculous amount of money, just to get some extra life out of their hardware?

Pretty much what I am trying to gauge from this thread. The way some people go on, you would think the only reason people upgrade is for more vram and nothing else :p

Quite a few turing owners upgraded to ampere for the better ray tracing perf. it offered.
 
Ray tracing is mainstream right now imo, you just have to look at all the games adding it.

Adding RT does not save any time atm, it is extra work. It may save a lot of time on RT-only games, but I think any game developed primarily for consoles in the next 5 years, and most of the games developed for PC, will not be RT only. There may be a handful of RT-only games sponsored by Nvidia (or AMD, Intel), depending on who has an advantage in RT rendering at that point, released for the PC-only market.
I don't know if the console refresh will use a much better video chipset, but the games released for consoles will have to run at least decently on the PS5 and XboxX for the whole duration of this console generation. So even if (let's hope) we see heavier RT titles released for the "PS5 Pro", the game will have dumbed-down settings for the original PS5.

True that, we'll always be held back somewhat because of current gen limitations, but I think the visual difference between current games' old-gen and RT methods shows the laziness/lack of effort being put into said old-gen ways. Compare the RT and old-gen versions of shadows, AO and reflections in the likes of deathloop, control, cyberpunk etc. and you can see the non-RT areas look considerably worse, even worse than games from before the RT era, i.e. games like batman arkham knight, alien isolation and RDR 2 were the peak for the old-gen methods of reflections, lighting, shadows and AO. I don't think we'll ever see that kind of quality with the old ways again; instead, it'll be half assed and anyone who wants the best visuals in those areas will have to use ray tracing.

AAA games are designed around consoles, so considering how much faster the desktop cards are, longevity is of no concern due to RT performance. The reason RT is "held back" has exactly 0 to do with AMD and everything to do with needing to re-tool for next-gen (which is a process and hasn't been finished yet) and ofc having to cater to said consoles, so RT will be kept to a minimum outside of where it gives the greatest returns (usually reflections) for the cost. If anything RDNA 2 will age better than Ampere, simply because Nvidia is quick to discard its older products as soon as the new one's out the door, whereas AMD & devs are committed to RDNA 2(+) for this decade. Overall, even with Ampere you'll start dropping off RT just because performance requirements will outpace the hardware (and driver) capability, so it will fall back to 99% raster anyway unless you want to drop below 1080p or below 60fps.

Whilst that is mostly true i.e. consoles come first, we have had plenty of games where ray tracing has been dialled up to cater for pc enthusiasts (which I am actually rather surprised by tbh)

Why is it that amd sponsored games considerably dial back RT effects/complexity, especially resolution for the likes of RT reflections?

Also, look at fc 6: ray tracing had to be deactivated for consoles. Hasn't it been confirmed that dying light 2 ray tracing won't be supported on consoles either?

I do agree though, at some point ampere owners are going to have to hold back on dialling up the RT effects and as per my comments in another thread, come the time for upgrading:

Upgrading for better ray tracing performance?
Upgrading for better rasterization performance?
Upgrading for more vram?

Which will be the main reason for people upgrading?
 
By the time that happens, the people on here with RDNA2 cards will have moved on years ago

my favorite youtube clip was the linus crew trying to figure out which game had ray tracing on or off

An appropriate video for the time, but ray tracing has come a long way since, and the main game in that clip was shadow of the tomb raider, regarded as having the worst ray tracing implementation, i.e. just shadows, and a very poor job at that.

I like this channel, it goes through all/most RT games and shows a decent comparison:

https://www.youtube.com/c/RayTracingRevolution/videos
 
I think some have missed the point of this thread given all the posts/threads surrounding "vram issues/longevity" and "amd finewine longevity" ;) :p

So I guess we can safely say that vram is a complete non issue now then as well? :D

But I somewhat digress... isn't this why we buy top end gpus and the best displays etc. i.e. to get the best performance AND visuals we can?

Since when has graphics even been the sole reason to PLAY a game? Like honestly how did we ever manage to reach this point in gaming if the graphics are the only reason to PLAY?

As per my point above, given that you have people spending up to £2-4k on a pc and £500/1k+ on displays etc., I would argue graphics and getting the best performance are pretty important....

I'm a sucker for eye candy now and for me RT is truly the next big advancement in visuals. Is it the be all and end all and the only reason to play games? Of course not, I happily played CP2077 on my vega 56, but getting to experience the game with RT just made it so much more immersive and appealing/enjoyable. Much like how I couldn't get into metro exodus when it first released, yet I enjoyed the enhanced version far more because of what RT did to improve the overall look and feel of the game.
 
To each their own.
I'm old enough to have played Wolfenstein 3D on a 286 and TBH visuals are a secondary concern, although in all honesty the switch away from CRTs kinda killed many Build engine shooters for me; they sadly give me motion sickness now.

I have no doubt RT is going to be a major improvement in mainstream gaming in a few years, however for that to happen it will have to be playable on hardware people can afford.
I'm glad many of you guys are happily splurging on top end hardware and helping pay the R&D for the masses, however I would also invite you to take a good look at median (not average) income in most countries and realise that only a tiny minority can afford high end PCs, and even fewer people are willing to let their wallets be raped by top end prices.

I honestly could afford to shell out 4-5k€ for a ludicrous PC, but that would take a decent chunk out of my savings and there would be a lot of explaining to do with my wife to justify it (and with her degree in philosophy, wish me luck out-arguing her!). I'm sure many of you guys can relate to that.

Don't disagree at all.

That's part of the point I made in my OP too: we are already seeing developers wanting to and trying to use RT where possible on "consoles", but they are limited because of RDNA 2. However, what will happen come the console refreshes, i.e. ps 5 pro etc., which may use RDNA 3 (and hopefully have at least as good RT perf. as ampere or better)? Consoles are where the true budget/casual/majority of gamers go, though of course it could be argued only a small minority will get the new consoles....

I'm not worried about longevity at all. I had a 3080 and sold it. The 6900XT runs circles around it in any game I play at 1440p 144hz, while staying cooler and without needing 400w. I don't really care about RT because it's an easy way to tank your performance for some eye candy that doesn't matter when you are playing fast paced games. In my opinion RT is not ready for this generation or probably the next. Until games can run natively at 4K 144hz WITH RT on, it is not a priority for me.

400w for a 3080? :confused: Mine hardly pushes past 275w :p
 
For better reflections on flat shiny surfaces? Come on, that's ridiculous. It's a bit, a very small bit, of eye candy and that's about it; 99% of what you see is rasterization and I can live quite happily with it turned off. Does it look nice? Sure, like a car that's had a wax and a polish, but that's about as deep as it goes (CP2077 & 3090 owner here)

There is so much more to ray tracing than just better reflections and shinier surfaces..... Also given that you have metal, glass/windows, puddles etc. all over night city in cp2077, it kind of does make a big difference having better "reflections".

This is a good channel, which shows the differences off well.

https://www.youtube.com/c/RayTracingRevolution/videos

DL 2 just uploaded there:


One of the most underrated features of RT reflections is that they look the same no matter what the camera angle is. With SSR, the slightest change in camera angle, despite standing in the same spot, and all of a sudden the reflection disappears, e.g. in the DL 2 shots below the only thing that has changed is the camera angle; with RT, the reflection would have stayed much the same. Personally I find this incredibly immersion breaking.

[Screenshots: the same spot in DL 2 from two slightly different camera angles; the SSR reflection is visible in one and gone in the other.]
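A toy sketch of why that happens (purely conceptual on my part, nothing to do with any real engine's code): SSR can only sample pixels that are in the current frame, whereas a traced ray queries the actual scene, so a small camera move can starve SSR of data while RT still finds the surface.

```python
# Toy 1D "scene" to show the difference: SSR samples the current frame only,
# ray tracing queries the full scene. Entirely illustrative, not engine code.
SCENE = {x: f"surface_{x}" for x in range(100)}        # the whole world

def on_screen(camera_x, width=10):
    # what the current camera position actually has in the frame buffer
    return {x: SCENE[x] for x in range(camera_x, camera_x + width)}

def ssr_reflection(reflected_x, camera_x):
    return on_screen(camera_x).get(reflected_x)        # None = reflection vanishes

def rt_reflection(reflected_x):
    return SCENE.get(reflected_x)                      # always hits real geometry

print(ssr_reflection(57, camera_x=50))  # 'surface_57' - on screen, SSR works
print(ssr_reflection(57, camera_x=60))  # None - tiny camera shift, reflection gone
print(rt_reflection(57))                # 'surface_57' either way
```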

But each to their own and all that. At least people have a choice if they want to use it or not.
 
At least with Cyberpunk 2077, I found reflections were the most noticeable effect (not helped by the non-rasterised water in the sea not being great). The other effects were noticeable only if compared side by side, and because my RTX3060TI had issues at qHD with RT, I switched off all the other RT effects and kept reflections on. However, since I have a qHD monitor, I had to use DLSS. The issue is that DLSS, like all similar techniques, then adds its own set of problems. By the same token, even with the RT effects switched off Cyberpunk 2077 looked fine IMHO, as I played most of it on my GTX1080 anyway.

Atomic Heart seems more like a game developed around using RT effects, though, so I will wait and see how that pans out first. The issue is that as long as most of the market is on non-RT dGPUs, and most of the RT-capable dGPUs/systems people own are a bit crap at RT, it is going to hold back how much RT is used. It's going to be tacked on as an Ultra quality option more than an integral part of 99% of games (unless the RT used is basic).

As long as we keep getting games with RT like cp2077, metro, control and dying light 2, it'll scratch my RT itch until the next big leap in RT. Atomic heart looks good, but the one I'm really looking forward to is avatar; by the time it comes out, we'll probably have the 50xx/80xx series cards.
 
You can hack the settings to enable FSR UQ and it looks amazing (at least when you don't move :D ). Which is probably why they didn't add the UQ setting to the game. :)
https://imgsli.com/OTQyOTI
I don't understand why some RT effects are heavier here than in ME, for example. If we compare the RTGI in ME with the one in this game, this one is very poor and yet it hurts the FPS more. Another weird thing is why RDNA2 works better in DX11 than DX12. Usually it works like crap in DX11, and yet here the DX12 perf is worse. The question is: is it worse than crap, or is this one of the few games where RDNA2 doesn't have problems with DX11?

Good thing i don't like the game, for me it looks like playing AC Unity in 1st person. :)

I think this has more RT effects than metro? That, and the metro EE version was made/optimised purely for RT. Then of course there are differences in the engines being used.
 
From the Grim youtube comparison above, I thought the 6900 image was better in that first area. The fps with DLSS is obviously the playable speed people want.

From the feedback given above, it sounds more like the devs didn't bother too much catering for the other settings, rather than one GPU's features being poorer than the other's.

As I mentioned in the other thread, there is no doubt techland didn't put as much time/effort into the rasterization/non-RT settings, but even if they had, you would still be very hard pressed to get results as good as what we are seeing currently, not to mention, how long would it take them to get good results?

4a shared their workflow for this with DF, which is a good watch:

https://www.youtube.com/watch?v=fiO38cqBlWU&t=315s

This, but then again, many did and still do argue that metro exodus (the non-enhanced version) has some of the best shadowing, lighting etc., yet it still pales in comparison to the enhanced version. There is only so much that can be done with rasterization methods, and what we have seen with RT is still just the tip of the iceberg.

The only game where I would say lighting, reflections, shadows etc. could almost be considered on par with RT implementations such as those found in metro ee, cp 2077 and dl 2 is rdr 2, but again, we are essentially comparing the very best that rasterization can offer to what is still the early days of RT.

Another game where shadows and AO look considerably better with RT than rasterization is deathloop, although no doubt we'll get the same reason again of "developers didn't spend time making non-rt settings look good", which is a perfectly valid point to make, but as per the youtube video above, there is a good reason for that. Of course, we could also say that nvidia are paying good money to have developers gimp their games :p ;) But I doubt this, given the games still have to run and look good for the much bigger consumer base/market, i.e. consoles.

I think this is going to become more common going forward, as developers are seeing just how much quicker and easier RT is, and how much less effort it requires, compared to rasterization, hence why they are trying to use RT where possible on consoles.
 
That's the answer I was hoping to hear.:)

Which reinforces my point!

Since I'm actually running Nv and not AMD, I've got a larger concern that when the 40 series lands (and you still might not be able to get one) they'll double up RTX perf > they'll pay for more effects-heavy games > it WILL cripple this gen's GPUs.

Adding that onto the vram-light 70/80's tight waistband, it's a bigger concern personally, as I've got my limit on how much cash I'm throwing at these two companies.

Nvidia Ampere Potential Longevity Concerns?

Should I start a thread?

Might as well, let me get the popcorn first :p ;) :D

I do agree big time on that highlighted point, but if it gives us a further leap in visuals, bring it on I say! As mentioned before, that is going to be the deciding factor for my next upgrade.
 
I have no doubt amd will improve their RT capabilities, question is will it be enough though? They have a considerable way to go to match ampere, let alone whatever 40xx series will bring us...
 
Just wondering, have you actually used FSR on AMD hardware, or are you basing your opinion on paid-for reviews?

I've used both and imo dlss is superior "overall" for these reasons:

- dlss replaces the default AA, i.e. usually TAA, which means better temporal stability, better motion clarity (from my testing), less shimmering and fewer jaggies, and it works much better across various resolutions, more so at 1440P and lower. Also, the various dlss modes are more usable overall than the fsr modes, i.e. at 4k you can use pretty much all of them except maybe ultra performance and still get very good IQ on the whole, and at 1440p you can use quality or balanced and still get good IQ
- FSR relies on the game's native AA and, being a spatial upscaler, it enhances any existing artifacts in the image, so if the default native AA has shimmering, jaggies, temporal stability issues, motion ghosting etc., these will be enhanced further. FSR is very good at 4k but only in the UQ and Q modes from my testing; FSR at 1440p is not great, I certainly wouldn't use anything less than the UQ preset (rough sketch of the spatial vs temporal difference below)
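Rough illustration of that spatial vs temporal difference (my own toy numbers, nothing to do with either vendor's actual algorithm): a spatial upscaler only ever sees the single frame it's given, so whatever shimmer is in that frame is carried (and typically sharpened) into the output, while a temporal approach can average several jittered frames first.

```python
# Toy comparison: upscaling a single noisy/shimmering frame vs accumulating
# several frames first. Illustrative only - not FSR's or DLSS's real maths.
import numpy as np

def upscale(frame, scale=2):                      # trivial nearest-neighbour upscale
    return np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)

rng = np.random.default_rng(0)
truth = np.ones((4, 4))                           # the "correct" image
frames = [truth + rng.normal(0, 0.2, truth.shape) for _ in range(8)]  # shimmering input
reference = upscale(truth)

spatial = upscale(frames[-1])                     # spatial: latest frame only
temporal = upscale(np.mean(frames, axis=0))       # temporal: history accumulated first

print(f"spatial error:  {np.abs(spatial - reference).mean():.3f}")
print(f"temporal error: {np.abs(temporal - reference).mean():.3f}")  # noticeably lower
```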

However, having said that, FSR in cp 2077 is very good, first game where I have been impressed with it and UQ at 1440p actually looks good.
 
Will 30 series owners upgrade, like they did from the 20 series, which was just terrible for RT? You had a £1099 card that was then made redundant by the £469 3070, if all you are interested in is DLSS/RT.

Isn't it the same thing you are asking, the re: the longevity?

I'd love to know what the situation would have been like in the consumer-focused gaming market had it not been for these shortages. I mean, you saw people dumping those 2080 TIs for under £500 on the MM for a while, almost in a panic about how much they had been screwed over, which funnily enough was due to the fact RDNA2 existed and for once AMD would be competitive with Nvidia again, regardless of the RT function.

Personally I'm not really a gamer anymore (unless you include strategy games, where graphics count for 1% of the game), but it seems like I am living back in the SNES/Mega Drive days when people would argue over which console had better graphics. In other words, pointless, stupid playground arguments, except this time they're being held by grown people, not children, I think. ;)

I think you kind of missed the point of this thread ;) :p

 
Didn't you just totally avoid answering the question though? The longevity of the 20 series, if only looking at RT, was abysmal at best, or do you somehow disagree?

I don't think you'll ever find me saying otherwise? I have often said RT was not ready at the time of turing's release, for more reasons than just not having the RT grunt in the first place, i.e. dlss being dog ****, pretty much no RT games, and the ones that did have it were very limited and poorly implemented, i.e. bf 5

As per all the articles/benchmarks etc. out there, ampere is superior to turing in every way for RT, and most importantly there is now a considerable number of RT titles, not just in quantity but also in terms of the RT effects being used.


Again, the whole point of this thread was to see how people viewed amd's potential longevity issues given all the "zOMG, not enough vram!!!!" threads making it out like nvidia users (more so 3080 10GB users) are suffering, when there is still no solid evidence to show this, whereas we have several RT titles showing issues for amd across all resolutions, even with the flagship 6900xt.
 
Again though, isn't this only true if you bought a card because you care about the very best graphical settings? Anyone who buys a 3080/6800XT or above is a higher-end user and more likely to change it sooner rather than later, possibly even more so given the average price of the cards over the last 12 months, when the majority would have been sold. So neither RT nor VRAM will ever be an issue for most high end users, only those that keep something for 5+ years.

The actually interesting question is about the higher volume parts which were traditionally ~£300 or below, your 1060 equivalents; that is what matters to the masses, not the few, whom I'll call the 1% (might be slightly higher). So if RT is important and the longevity of the card is based on that, then it needs to do it at the price point that has the biggest effect, not with the super mega enthusiast cards that people have been paying £1500-2000 for. When cards that can do RT well are sub £300, then longevity might be a consideration for those buyers.

I am sure if RDNA 3 is better for RT etc. and prices normalise somewhat, the longevity issue would only really hit much later when people are picking them up second/third hand, and they end up with a lower second hand price vs the Nvidia equivalent due to the lower RT performance.

Going back to my point about the 20 series, isn't the RDNA 2 issue just this repeated, but for AMD rather than Nvidia? So all the first gen users from either manufacturer get a rough deal at the high end, and an even worse one at the low end; RTX 2060, I am looking at you.

Yup, I agree there. That reasoning, as well as what I stated in this thread, was to highlight how flawed the vram/AMD fans' logic is. Not sure if you have read the whole thread or not, but you'll see all my posts are pointing to the fact that people will be upgrading for various reasons, but chances are it's not going to be for "more vram", despite what the "need more vram!!!!" threads would have you believe.

Assuming RDNA 3's main advantage is an RT performance improvement, i.e. similar to the jump from turing to ampere, it'll be interesting to see how the ones/amd fans that gave **** to turing owners for upgrading to ampere react to RDNA 3 ;) Not to mention the inevitable sudden change in their opinions of RT ;) :p
 

Wondering if FSR 2 will help more with RT perf. on AMD than FSR 1, as it seems FSR 1 doesn't help out much at all in heavy RT scenes, i.e. the 3080 hits 50/60 fps with the same max RT settings using dlss quality @ 2560x1440. Bang4buck noticed this in his comparisons too: dlss performs significantly better in RT scenes than FSR 1, whereas in rasterization the performance gain is similar.
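One possible way to read that (purely my assumption, not something those comparisons prove): if a chunk of the RT frame cost doesn't shrink with internal resolution, then rendering at a lower resolution buys less in an RT-heavy scene than in a mostly-raster one. As simple arithmetic:

```python
# Toy Amdahl-style arithmetic (illustrative numbers only): the benefit of
# upscaling shrinks as more of the frame time is resolution-independent.
def fps(scalable_ms, fixed_ms, res_scale=1.0):
    # res_scale = fraction of native pixels actually rendered before upscaling
    return 1000 / (scalable_ms * res_scale + fixed_ms)

print(fps(16, 1), "->", fps(16, 1, 0.5))   # mostly raster: ~59 -> ~111 fps
print(fps(10, 7), "->", fps(10, 7, 0.5))   # big fixed RT cost: ~59 -> ~83 fps
```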
 