
AMD RDNA 2 potential longevity concerns?

To each their own.
I'm old enough to have played Wolfenstein 3D on a 286, and TBH visuals are a secondary concern, although in all honesty the switch away from CRTs kind of killed many Build engine shooters for me; sadly, they now give me motion sickness.

I have no doubt RT is going to be a major improvement in mainstream gaming in a few years; however, for that to happen it will have to be playable on hardware people can afford.
I'm glad many of you guys are happily splurging on top-end hardware and helping pay the R&D for the masses, but I would also invite you to take a good look at median (not average) income in most countries and realise that only a tiny minority can afford high-end PCs, and even fewer people are willing to have their wallets gouged by top-end prices.

I honestly could afford to shell out €4-5k for a ludicrous PC, but that would take a decent chunk out of my savings, and there would be a lot of explaining to do with my wife to justify it (and with her degree in philosophy, wish me luck out-arguing her!). I'm sure many of you guys can relate to that.
 
I'm not worried about longevity at all. I had a 3080 and sold it. The 6900 XT runs circles around it in any game I play at 1440p/144Hz while staying cooler, and without needing 400W. I don't really care about RT because it's an easy way to tank your performance for some eye candy that doesn't matter when you are playing fast-paced games. In my opinion RT is not ready for this generation, or probably the next. Until games can run natively at 4K/144Hz WITH RT on, it is not a priority for me.
 

I'd agree with that; some people get a bit carried away on here, and I have myself at times. I know of a few who are spending the best part of a month's salary on a GPU alone, and around 50% of the population don't even have a grand in savings. There may be a slight skew towards people on an above-median wage on this forum, but it won't be far off.

I get it with this hobby, the upgrade itch is forever wanting to be scratched but when all is said and done, most people play a select few titles and are done with them in a matter of hours. Great visuals and audio do get me into single player titles, but if the gameplay is just repetitive nonsense I'll not hang around. It's why I mostly play multiplayer games as I find them more "random".
 
Indeed the upgrade itch is always there, although age tends to dull it a little.
TBH, having to scrape by with older hardware has its own joys, as it lets you experiment with ever more creative ways to stretch it.

Something that would horrify enthusiasts, for example, is my latest monitor purchase: a cheap 32" curved Samsung monitor. It's only 1080p, but it goes up to 75Hz and has FreeSync, which is an improvement over my 2012-vintage Asus 24" 1080p/60Hz monitor.

Now, I admit 1080p on a 32" can be suboptimal in certain scenarios, but playing around with the Radeon settings I discovered Virtual Super Resolution, which lets me render at 1440p and downscale back to 1080p. That effectively works as supersampling anti-aliasing and fixes the occasional jagged edges in games where my RX 590 can afford to run at that resolution.
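The downscale trick above (render above native, average back down to the panel resolution) can be sketched in a few lines. This is an illustrative toy only, with a simple box filter over 2x2 sample blocks; it is not what the driver actually does internally:

```python
import numpy as np

def supersample(render, scale):
    """Average scale x scale blocks of a high-res render down to the
    target resolution: each output pixel blends several samples, which
    softens jagged edges (the anti-aliasing effect described above)."""
    h, w = render.shape[0] // scale, render.shape[1] // scale
    blocks = render[:h * scale, :w * scale].reshape(h, scale, w, scale)
    return blocks.mean(axis=(1, 3))

# A hard diagonal edge rendered at 2x the target resolution.
hi_res = np.zeros((4, 4))
hi_res[np.triu_indices(4)] = 1.0  # white above/on the diagonal

low_res = supersample(hi_res, 2)
print(low_res)  # edge pixels land between 0 and 1 instead of a hard step
```

The pixels straddling the edge come out as intermediate grey values rather than pure black or white, which is exactly why the staircase artefacts smooth out.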

In my wife's case, she's got a similar setup but still has the vintage HD 7970, which suffers a lot more. Still, thanks to FSR she can enjoy Horizon Zero Dawn, although stuff like Detroit: Become Human doesn't run at all.
 
To each their own.
I'm old enough to have played Wolfenstein 3D on a 286, and TBH visuals are a secondary concern, although in all honesty the switch away from CRTs kind of killed many Build engine shooters for me; sadly, they now give me motion sickness.

I have no doubt RT is going to be a major improvement in mainstream gaming in a few years; however, for that to happen it will have to be playable on hardware people can afford.
I'm glad many of you guys are happily splurging on top-end hardware and helping pay the R&D for the masses, but I would also invite you to take a good look at median (not average) income in most countries and realise that only a tiny minority can afford high-end PCs, and even fewer people are willing to have their wallets gouged by top-end prices.

I honestly could afford to shell out €4-5k for a ludicrous PC, but that would take a decent chunk out of my savings, and there would be a lot of explaining to do with my wife to justify it (and with her degree in philosophy, wish me luck out-arguing her!). I'm sure many of you guys can relate to that.

Don't disagree at all.

That's part of the point I made in my OP too: we are already seeing developers wanting to use RT where possible on consoles, but they are limited by RDNA 2. However, what happens come the console refreshes, i.e. PS5 Pro etc., which may use RDNA 3 (and hopefully have at least as good RT performance as Ampere, or better)? Consoles are where the true budget/casual majority of gamers go, although of course it could be argued only a small minority will get the new consoles....

I'm not worried about longevity at all. I had a 3080 and sold it. The 6900 XT runs circles around it in any game I play at 1440p/144Hz while staying cooler, and without needing 400W. I don't really care about RT because it's an easy way to tank your performance for some eye candy that doesn't matter when you are playing fast-paced games. In my opinion RT is not ready for this generation, or probably the next. Until games can run natively at 4K/144Hz WITH RT on, it is not a priority for me.

400W for a 3080? :confused: Mine hardly pushes past 275W :p
 
I'm not worried about longevity at all. I had a 3080 and sold it. The 6900 XT runs circles around it in any game I play at 1440p/144Hz while staying cooler, and without needing 400W. I don't really care about RT because it's an easy way to tank your performance for some eye candy that doesn't matter when you are playing fast-paced games. In my opinion RT is not ready for this generation, or probably the next. Until games can run natively at 4K/144Hz WITH RT on, it is not a priority for me.

Something's wrong if your 3080 'needed' 400W, as many of us here are under 300W.

AMD's console GPUs manage to compete with Nvidia as long as you restrict yourself to legacy titles, engines or settings. In contrast, I've had a great time with RT maxed out on a 3080/3770K combo at 1440p. I'm sure if you must have 4K then simply switching DLSS from Quality to Performance will do it for you, so claiming, as some others also do, that RT is not ready is only correct if you have a GPU, such as a 6900 XT, that really isn't up to running modern titles :p

If only someone had pointed out AMD's issues just before launch ;)
 
Dying Light 2 is a pretty good use case where we're seeing DLSS and RT being worthwhile, which RDNA2 cards aren't able to manage.

FSR isn't viable because it looks pretty bad (based on the DF review), and even if we get an open DLSS, the RDNA2 cards still don't have the hardware to properly benefit from it.
 
AMD's console GPUs manage to compete with Nvidia as long as you restrict yourself to legacy titles, engines or settings. In contrast, I've had a great time with RT maxed out on a 3080/3770K combo at 1440p.

The main benefit that consoles have is their price point, but when you see a 2060 running circles around it, it's pretty disappointing.

For me, I'll just buy a PS5 for exclusives, but hard to justify it for anything that I could get on PC. Again though, £400+ for a console vs what you can get in PC parts; it's unfortunately the only option for a lot of people.
 
I think some have missed the point of this thread given all the posts/threads surrounding "vram issues/longevity" and "amd finewine longevity" ;) :p

So I guess we can safely say that vram is a complete non issue now then as well? :D

But I somewhat digress... isn't this why we buy top end gpus and the best displays etc. i.e. to get the best performance AND visuals we can?



As per my point above, given that you have people spending up to £2-4k on a PC and £500-1k+ on displays etc., I would argue graphics and getting the best performance are pretty important....

I'm a sucker for eye candy now, and for me RT is truly the next big advancement in visuals. Is it the be-all and the only reason to play games? Of course not; I happily played CP2077 on my Vega 56, but getting to experience the game with RT just made it so much more immersive and enjoyable, much like how I couldn't get into Metro Exodus when it first released, yet I enjoyed the Enhanced Edition far more because of what RT did to improve the overall look and feel of the game.

I 100% agree RT is the future. But my post was aimed at Grim5 saying people should hold off buying until they can experience RT to the max. Like, what!

That is millions of users sitting this out.

I take frame rate over graphics any day and will do even with next gen.

What I like to do anyway is, if I really liked a game, there's a good chance I'll go back to it again after a PC upgrade.

So I'll experience this on my poor 6800 XT, then I'll do it again on my poor RDNA 3 GPU :D
 
Something's wrong if your 3080 'needed' 400W, as many of us here are under 300W.

AMD's console GPUs manage to compete with Nvidia as long as you restrict yourself to legacy titles, engines or settings. In contrast, I've had a great time with RT maxed out on a 3080/3770K combo at 1440p. I'm sure if you must have 4K then simply switching DLSS from Quality to Performance will do it for you, so claiming, as some others also do, that RT is not ready is only correct if you have a GPU, such as a 6900 XT, that really isn't up to running modern titles :p

If only someone had pointed out AMD's issues just before launch ;)
But if you make those changes you are losing image quality. If I'm going to play at 4K, I want my image to look 4K, not a much lower resolution just to keep the FPS from tanking. And no, RT is not ready, unless you play at 60 FPS, at which point I'd rather play without RT. I had a FTW3 for 7 months and sold it because I was disappointed with the performance.
 
Very strong points @CAT-THE-FIFTH
So in summary, despite what the Nvidia fans will have you believe, you can still enjoy a game without RT. It's eye candy.

Well, graphics are the icing on the cake, but the cake still has to be good. It's why Cyberpunk 2077, despite its prettiness, fell well short of what it should have been! :(

But I was mostly pointing out that unless you have an RTX 3080 or above, RT performance is still a bit of an issue for everyone else unless you use upscaling. I personally think an upper-mainstream/lower-enthusiast dGPU like my RTX 3060 Ti will need a 2x improvement in RT performance for it to be more usable without having to resort to upscaling.

The main benefit that consoles have is their price point, but when you see a 2060 running circles around it, it's pretty disappointing.

For me, I'll just buy a PS5 for exclusives, but hard to justify it for anything that I could get on PC. Again though, £400+ for a console vs what you can get in PC parts; it's unfortunately the only option for a lot of people.

In theory it shouldn't be, just looking at desktop RDNA 2 dGPUs such as the RX 6700 XT, which appear to be around RTX 2060 Super level in RT performance. Most likely, since we are in the inter-generational period, studios are still getting to grips with RDNA 2, and as time progresses I expect better things out of the consoles.

IIRC, DF did a test with the original PS4, and a PC with a Core i3 and a GTX 750 was as good or better at launch, but over time it started to drop far behind.
 
I'm a sucker for eye candy now, and for me RT is truly the next big advancement in visuals. Is it the be-all and the only reason to play games? Of course not; I happily played CP2077 on my Vega 56, but getting to experience the game with RT just made it so much more immersive and enjoyable, much like how I couldn't get into Metro Exodus when it first released, yet I enjoyed the Enhanced Edition far more because of what RT did to improve the overall look and feel of the game.

For better reflections on flat shiny surfaces? Come on, that's ridiculous. It's a bit, a very small bit, of eye candy and that's about it; 99% of what you see is rasterisation. I can live quite happily with it turned off. Does it look nice? Sure, like a car that's had a wax and a polish, but that's about as deep as it goes. (CP2077 & 3090 owner here.)
 
For better reflections on flat shiny surfaces? Come on, that's ridiculous. It's a bit, a very small bit, of eye candy and that's about it; 99% of what you see is rasterisation. I can live quite happily with it turned off. Does it look nice? Sure, like a car that's had a wax and a polish, but that's about as deep as it goes. (CP2077 & 3090 owner here.)

There is so much more to ray tracing than just better reflections and shinier surfaces..... Also, given that you have metal, glass/windows, puddles etc. all over Night City in CP2077, it kind of does make a big difference having better "reflections".

This is a good channel, which shows the differences off well.

https://www.youtube.com/c/RayTracingRevolution/videos

DL 2 just uploaded there:


One of the most underrated features of RT reflections is that they look the same no matter what the camera angle is. With SSR, the slightest change in camera angle, even while standing in the same spot, and all of a sudden you see the reflection disappear. E.g. in DL 2, the only thing that has changed is the camera angle; with RT, the reflection would have stayed much the same. Personally I find this incredibly immersion-breaking.
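The disappearing-reflection behaviour falls straight out of how SSR works: it can only reuse pixels that are already in the rendered frame. A toy sketch of the lookup step (the ray-march and projection are omitted; coordinates are assumed to already be in normalised screen space, and the scene names are made up):

```python
def ssr_sample(screen, u, v):
    """Look up a reflection colour from the rendered frame.

    Screen-space reflections can only sample what is already on screen;
    if the reflected point projects outside the frame (u/v outside 0..1),
    there is simply no data, so the reflection fades or vanishes.
    """
    h, w = len(screen), len(screen[0])
    x, y = int(u * w), int(v * h)
    if 0 <= x < w and 0 <= y < h:
        return screen[y][x]
    return None  # off-screen: engines fall back to a cubemap or nothing

# 2x2 "frame": a neon sign occupies the top-left pixel.
frame = [["neon", "sky"],
         ["road", "road"]]

print(ssr_sample(frame, 0.1, 0.1))  # reflected point is on screen
print(ssr_sample(frame, 0.1, 1.2))  # camera tilted: reflected point now off screen
```

A ray-traced reflection, by contrast, queries the actual scene geometry, so the result does not depend on what happens to be inside the current frame.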

[screenshots: the same spot in DL 2 from two camera angles]

But each to their own and all that. At least people have a choice if they want to use it or not.
 
At least with Cyberpunk 2077 I found reflections were the most noticeable effect (not helped by the non-rasterised water in the sea not being great). The other effects were noticeable only if compared side by side, and because my RTX 3060 Ti had issues at QHD with RT, I switched off all the other RT effects and kept reflections on. However, since I have a QHD monitor I had to use DLSS, and the issue is that DLSS, like all similar techniques, then adds its own set of problems. On the same token, even with the RT effects switched off, Cyberpunk 2077 looked fine IMHO, as I played most of it on my GTX 1080 anyway.

Atomic Heart seems more like a game developed around using RT effects, though, so I will wait and see how that pans out first. The issue is that as long as most of the market is on non-RT dGPUs, and most of the RT-capable dGPUs/systems people own are a bit crap at RT, it is going to hold back how much RT is used. It's going to be tacked on as an Ultra quality option rather than an integral part of 99% of games (unless the RT used is basic).
 
At least with Cyberpunk 2077 I found reflections were the most noticeable effect (not helped by the non-rasterised water in the sea not being great). The other effects were noticeable only if compared side by side, and because my RTX 3060 Ti had issues at QHD with RT, I switched off all the other RT effects and kept reflections on. However, since I have a QHD monitor I had to use DLSS, and the issue is that DLSS, like all similar techniques, then adds its own set of problems. On the same token, even with the RT effects switched off, Cyberpunk 2077 looked fine IMHO, as I played most of it on my GTX 1080 anyway.

Atomic Heart seems more like a game developed around using RT effects, though, so I will wait and see how that pans out first. The issue is that as long as most of the market is on non-RT dGPUs, and most of the RT-capable dGPUs/systems people own are a bit crap at RT, it is going to hold back how much RT is used. It's going to be tacked on as an Ultra quality option rather than an integral part of 99% of games (unless the RT used is basic).

As long as we keep getting games with RT like CP2077, Metro, Control and Dying Light 2, it'll scratch my RT itch until the next big leap in RT. Atomic Heart looks good, but the one I'm really looking forward to is Avatar; by the time it comes out, we'll probably have the 50xx/80xx series cards.
 
As long as we keep getting games with RT like CP2077, Metro, Control and Dying Light 2, it'll scratch my RT itch until the next big leap in RT. Atomic Heart looks good, but the one I'm really looking forward to is Avatar; by the time it comes out, we'll probably have the 50xx/80xx series cards.

It seems to be developed with a degree of RT in mind at its base... but unless it's toned down, I expect it's going to push systems. Looks a bit of a mental game TBH!
 
You can make a hack in the settings for FSR UQ and it looks amazing (at least when you don't move :D ), which is probably why they didn't add the UQ setting in the game. :)
https://imgsli.com/OTQyOTI
I don't understand why some RT effects are heavier here than in ME, for example. If we compare the RTGI in ME with the one in this game, this one is much poorer and yet it hurts the FPS more. Another weird thing is why RDNA 2 works better on DX11 vs DX12; usually it works like crap on DX11, and yet here the DX12 performance is worse. The question is: is it worse than crap, or is this one of the few games where RDNA 2 doesn't have problems with DX11?

Good thing I don't like the game; for me it looks like playing AC Unity in first person. :)
 
You can make a hack in the settings for FSR UQ and it looks amazing (at least when you don't move :D ), which is probably why they didn't add the UQ setting in the game. :)
https://imgsli.com/OTQyOTI
I don't understand why some RT effects are heavier here than in ME, for example. If we compare the RTGI in ME with the one in this game, this one is much poorer and yet it hurts the FPS more. Another weird thing is why RDNA 2 works better on DX11 vs DX12; usually it works like crap on DX11, and yet here the DX12 performance is worse. The question is: is it worse than crap, or is this one of the few games where RDNA 2 doesn't have problems with DX11?

Good thing I don't like the game; for me it looks like playing AC Unity in first person. :)

I think this has more RT effects than Metro? That, and the Metro EE version was made/optimised purely for RT. Then of course there are differences in the engines being used.
 
You can make a hack in the settings for FSR UQ and it looks amazing (at least when you don't move :D ), which is probably why they didn't add the UQ setting in the game. :)
https://imgsli.com/OTQyOTI
I don't understand why some RT effects are heavier here than in ME, for example. If we compare the RTGI in ME with the one in this game, this one is much poorer and yet it hurts the FPS more. Another weird thing is why RDNA 2 works better on DX11 vs DX12; usually it works like crap on DX11, and yet here the DX12 performance is worse. The question is: is it worse than crap, or is this one of the few games where RDNA 2 doesn't have problems with DX11?

Good thing I don't like the game; for me it looks like playing AC Unity in first person. :)
Great find. Can you share a link on how to enable FSR UQ? It looks really good there.
 
I think this has more RT effects than Metro? That, and the Metro EE version was made/optimised purely for RT. Then of course there are differences in the engines being used.
I am talking only about RTGI; I think that is the main RT effect that can transform a game. In ME it does a lot (sometimes it takes a second for all the bounces to settle), but the cost in FPS is not that heavy. Here it is not as good, yet it seems to hit the FPS like a truck.
 