
AMD RDNA 2 potential longevity concerns?

I see a lot of people saying "I'll just turn off RT", but it's something that will become mainstream.

You will still be able to turn it off though. For goodness' sake, Fortnite has got ray tracing and it doesn't half tank the card when you turn it on, e.g. on Nvidia (my sons both have them).
It's like motion blur: I never have that on as I don't like it, and you can just turn it off. RT will be the same even if it becomes "mainstream", whatever that means :)
 
You will still be able to turn it off though. For goodness' sake, Fortnite has got ray tracing and it doesn't half tank the card when you turn it on, e.g. on Nvidia (my sons both have them).
It's like motion blur: I never have that on as I don't like it, and you can just turn it off. RT will be the same even if it becomes "mainstream", whatever that means :)

Mainstream means that it'll be the default in most games, at least until the game designer decides to stop supporting older cards. Like I said, it's a long-term thing but it will happen.

For now, you'll get both options, but if you're looking to keep the card for 4 years, lack of RT and ML performance won't help the longevity. Also need to look at it from the perspective of mid-tier users, who are the majority and are probably also the ones who aren't budgeting for a new card every 2 years.
 
If the next GPU is a massive leap forward, then the preceding architecture will have a reduced usable life if software moves forward at a similar pace. This is unlikely to happen even if the next GPU is 10x the current one, as developers target the mainstream to get maximum sales. One GPU that had a big enough difference to kill the competition was the Nvidia GeForce 256; that was a real monster at the time, but it still took time to make older hardware obsolete. The next was the ATI 9700 Pro, which again took a long time. Now it would take longer still, as progression is nowhere near what it was back then.
 
Ray tracing is mainstream right now imo; you just have to look at all the games adding it.

Adding RT does not save any time atm; it is extra work. It may save a lot of time in RT-only games, but I think any game developed primarily for consoles in the next 5 years, and most of the games developed for PC, will not be RT-only. There may be a handful of RT-only games sponsored by Nvidia (or AMD, or Intel), depending on who has an advantage in RT rendering at that point, released for the PC-only market.
I don't know if the console refresh will use a much better video chipset, but games released for consoles will have to run at least decently on the PS5 and Xbox Series X for the whole duration of this generation of consoles. So even if (let's hope) we see heavier RT titles released for a "PS5 Pro", those games will have dumbed-down settings for the original PS5.

True that, we'll always be held back somewhat by current-gen limitations, but I think the visual difference between current games' old-gen and RT methods shows the laziness/lack of effort being put into said old-gen ways. Compare the RT and old-gen methods for shadows, AO and reflections in games like Deathloop, Control and Cyberpunk: the non-RT versions look considerably worse, and even worse than before the RT era of games. Titles like Batman: Arkham Knight, Alien: Isolation and RDR 2 were the peak of the old-gen methods for reflections, lighting, shadows and AO. I don't think we'll ever see that kind of quality with the old ways again; instead it'll be half-arsed, and anyone who wants the best visuals in those areas will have to use ray tracing.

AAA games are designed around consoles, so considering how much faster the desktop cards are, longevity is of no concern where RT performance goes. The reason RT is "held back" has exactly zero to do with AMD and everything to do with needing to re-tool for next-gen (which is a process and hasn't finished yet) and of course having to cater to said consoles, so RT will be kept to a minimum outside of where it gives the greatest returns for the cost (usually reflections). If anything, RDNA 2 will age better than Ampere simply because Nvidia is quick to discard its older products as soon as the new one's out the door, whereas AMD and devs are committed to RDNA 2(+) for this decade. Overall, even with Ampere you'll start dropping RT just because performance requirements will outpace the hardware (and driver) capability, so it will fall back to 99% raster anyway unless you want to drop below 1080p or below 60 fps.

Whilst that is mostly true, i.e. consoles come first, we have had plenty of games where ray tracing has been dialled up to cater for PC enthusiasts (which actually rather surprises me, tbh).

Why is it that AMD-sponsored games considerably dial back RT effects/complexity, especially resolution for the likes of RT reflections?

Also, look at FC 6: ray tracing had to be deactivated for consoles. Hasn't it been confirmed that Dying Light 2's ray tracing won't be supported on consoles either?

I do agree though: at some point Ampere owners are going to have to hold back on dialling up the RT effects, and as per my comments in another thread, come the time for upgrading:

Upgrading for better ray tracing performance?
Upgrading for better rasterization performance?
Upgrading for more vram?

Which will be the main reason for people upgrading?
 
Mainstream means that it'll be the default in most games, at least until the game designer decides to stop supporting older cards. Like I said, it's a long-term thing but it will happen.

For now, you'll get both options, but if you're looking to keep the card for 4 years, lack of RT and ML performance won't help the longevity. Also need to look at it from the perspective of mid-tier users, who are the majority and are probably also the ones who aren't budgeting for a new card every 2 years.

By the time that happens, the people on here with RDNA 2 cards will have moved on years ago.

My favourite YouTube clip was the Linus crew trying to figure out which game had ray tracing on or off.
 
By the time that happens, the people on here with RDNA 2 cards will have moved on years ago.

My favourite YouTube clip was the Linus crew trying to figure out which game had ray tracing on or off.

Appropriate video for the time, but ray tracing has come a long way since. His main game on that basis was Shadow of the Tomb Raider, regarded as having the worst ray tracing implementation, i.e. just shadows, and a very poor job at that.

I like this channel; it goes through all/most RT games and shows decent comparisons:

https://www.youtube.com/c/RayTracingRevolution/videos
 
By the time that happens, the people on here with RDNA 2 cards will have moved on years ago.

Assuming you have some insider knowledge on game engine development.

My favourite YouTube clip was the Linus crew trying to figure out which game had ray tracing on or off.

Like I said, the tech is about improved development and game-engine performance, whereas now it's just a feature. Regardless, there are plenty of games where it's noticeable.
 
Whilst that is mostly true, i.e. consoles come first, we have had plenty of games where ray tracing has been dialled up to cater for PC enthusiasts (which actually rather surprises me, tbh).
Yes, and the heaviest RT games all had plenty of engineers sent over from Nvidia to make those things happen. Same as it ever was. They needed the marketing push, and because they had the money and talent to put to use, they did. But if you look at RT scaling in any of these games, it's clear that most of the time the RT penalty is much higher (even on NV GPUs) compared to the more robust implementations done for console-only titles using RT (Ratchet etc.). And certainly if you read the dev talks, it's clear that a lot of the time RT is just tacked on mid-development, or near the end, for the sake of the NV sponsorship, because they're footing the bill.

Why is it that AMD-sponsored games considerably dial back RT effects/complexity, especially resolution for the likes of RT reflections?
One part is coincidence, due to the titles sponsored (on PC). F.ex. Far Cry 6 is just pushing their old codebase hard, and the RT is tacked on as practice for future next-gen-only developments; if you look at Riftbreaker, though, it didn't hold back at all, because they did a lot of work on the engine first, so there they went harder with RT. In other words, it's easy to cherry-pick examples, but the overall picture doesn't support the hypothesis of AMD sabotaging RT. Most of it is them not having a big budget for sponsorships, which is why all the titles sponsored for the RX 6000 partner showcase were indies/AA, with the sole exception of FC6.
The other part is the obvious fact that AMD RT performance is lower than Nvidia's (and there's no DLSS to fall back on and pretend it's doing more). But even in titles where Nvidia lends a hand later, like Doom Eternal, you still see cutbacks in terms of RT resolution etc., just because it's an obvious "optimisation" step, and then the only way to change it is to get Cheat Engine and up it through console commands.

The sad thing for me is that people are crying about FC6's RT when in fact Ubisoft did something very smart: they strategically used RT to fix situations where screen-space solutions would artifact, and did so at a very low performance cost, where raster solutions would've been much more costly and not as good (PCSS+ et al for shadows; more rays and/or planar reflections for reflections). Not only that, but the game is already very heavy on the CPU, so a more hardcore RT implementation would've destroyed the framerates even worse, sort of like what happened with WD:L at launch. If anything, people should be thankful to AMD that they get any RT & FSR at all, because for sure you wouldn't have seen anything if they hadn't gotten involved.

Also, look at FC 6: ray tracing had to be deactivated for consoles. Hasn't it been confirmed that Dying Light 2's ray tracing won't be supported on consoles either?
They could've absolutely done more with FC6 on consoles, but they didn't care. No reason for them not to use FSR, for example, and for sure they could've fit RT reflections in the budget. As for DL2, it's a classic case of lipstick on a pig, also with heavy Nvidia involvement, as we have seen before: old-looking assets, texture work, model detail etc., but heavy with RT effects. Similar to what happened with Metro Exodus (which I loved, but it's still true).

The real next-gen graphics leap is stuff like the Matrix demo, which the consoles can just about do, but obviously not at 60 fps.

Imo the main upgrade will be for RT, at least at the high end, and for sure that's what I'll do. Got to enjoy Cyberpunk maxed at 1080p at least (and maybe Intel will do us a solid and buy XeSS support for it). :cool:
 
But if you look at RT scaling in any of these games, it's clear that most of the time the RT penalty is much higher (even on NV GPUs) compared to the more robust implementations done for console-only titles using RT (Ratchet etc.).

Robust is the wrong word, you either get overkill (NVIDIA) or minimalist (AMD) depending on who sponsored the game. While it looks great, it's very much a marketing feature - a more mainstream implementation will hit a good middle-ground and be better integrated with the game engine.
 
Robust is the wrong word, you either get overkill (NVIDIA) or minimalist (AMD) depending on who sponsored the game. While it looks great, it's very much a marketing feature - a more mainstream implementation will hit a good middle-ground and be better integrated with the game engine.
It doesn't really break down like that. There's a lot of variance, but it's not a matter of overkill or minimalist, because there are examples of the opposite for both. F.ex. with Nvidia, in COD you have excellent RT that also scales appropriately in terms of performance penalty, unlike what happened with BF V initially (and still somewhat the case); but then for AMD you have things like Riftbreaker, which you absolutely cannot call minimalist, though in general they didn't have a lot of titles like that (but as I've said before, they didn't sponsor many titles to begin with). Again, it's not so much about the sponsorship as about the project, what engine it's working with and how far in they are. Until cross-gen is over and everyone can upgrade their tools, this is what we'll see most of.
 
If I was going to buy a high-end graphics card, RDNA 2's limitations would definitely make me inclined to wait for RDNA 3; but for anything from the 6600 XT down, I doubt it'll ever matter, because the 6600, 6500 and RTX 2060 are going to be borderline for ray tracing and high settings in AAA games in the next few years (if they aren't already), so they don't have much longevity to be concerned with anyway. I doubt any of the current 'midrange' cards like the 6600 / 3050 will last as long as Polaris/Pascal have, unless the market stagnates the way it has since 2016.
 
Robust is the wrong word, you either get overkill (NVIDIA) or minimalist (AMD) depending on who sponsored the game. While it looks great, it's very much a marketing feature - a more mainstream implementation will hit a good middle-ground and be better integrated with the game engine.


Yep, "robust" is a funny way to say "downgrade" lol.

There is nothing robust about it; it's actually incredibly simple. Let me explain for Poneros:

Nvidia-sponsored RT game: RT image output resolution = 100% of engine input resolution

AMD-sponsored RT game: RT image output resolution = 25% of engine input resolution

And these numbers are not made up; they're what Nvidia and AMD tell developers to use in their development support documents.
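To make the claim above concrete (the 100%/25% figures are the poster's, not official numbers), here's a small sketch of what a fractional RT resolution means in practice: the fraction applies to pixel count, so a "25%" effect buffer is half the width and half the height, and traces a quarter of the rays.

```python
# Illustrative sketch of fractional RT effect resolution.
# The 100% / 25% figures come from the post above and are not
# verified vendor guidance; the arithmetic is the point here.

def rt_buffer(render_w: int, render_h: int, res_fraction: float):
    """Return (width, height, rays_per_frame) for an RT effect
    rendered at `res_fraction` of the engine's input resolution.
    The fraction applies to pixel count, so each axis is scaled
    by its square root."""
    axis_scale = res_fraction ** 0.5
    w = round(render_w * axis_scale)
    h = round(render_h * axis_scale)
    return w, h, w * h

full = rt_buffer(2560, 1440, 1.00)     # "100%" implementation
quarter = rt_buffer(2560, 1440, 0.25)  # "25%" implementation

print(full)     # (2560, 1440, 3686400)
print(quarter)  # (1280, 720, 921600)
```

So a quarter-resolution reflection pass at 1440p traces ~0.9M rays per frame instead of ~3.7M, which is why it's such a tempting "optimisation" step.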
 
As someone who bought an AMD card for the first time in about 10 years, I've got zero concerns about longevity.
Ray tracing still remains mostly a great way to destroy performance for a small visual difference. I'm not sure I would even use it if I had an Nvidia card. By the time it's good enough to interest me, I'll probably have upgraded.
As far as upscaling goes, FSR is widely supported already, console-style dynamic resolution is starting to come in (Halo), Intel's XeSS is on the way, and there are the games' native upscalers. So I've even fewer concerns there.
Truthfully, for gaming I could probably live with Nvidia's stingy VRAM allocations too. But I use the card for CG/content creation as well, so I prefer to have more VRAM for those things.
So far the 6800 has been a really good experience, and just as problem-free as the 1080 Ti it replaced.
 
FSR, ironically, might be the undoing of RDNA 2: in path-traced games like Quake 2 it allows a boost from ~47 fps to 85-100 fps at 1440p on a 3070, with barely any quality drop from native aside from some edges.

That increase in performance also alleviates some of the artefacts from path tracing, which are heavily tied to frame update rates.

Only a work in progress map with a lot of limitations due to the engine but representative (this is with FSR on):

[screenshot]

You need a top end AMD GPU to get close to that performance.
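The size of that FSR gain is roughly what you'd expect from the arithmetic alone. FSR 1.0's published per-axis scale factors are Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x and Performance 2.0x, and a path tracer's cost scales roughly with the number of pixels it actually traces. A quick sketch (the exact rounding of internal resolutions here is an approximation, not AMD's official table):

```python
# Approximate internal render resolutions for FSR 1.0's quality
# modes, using AMD's published per-axis scale factors.

FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_res(out_w: int, out_h: int, mode: str):
    """Approximate internal render resolution for an FSR 1.0 mode."""
    s = FSR_MODES[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_MODES:
    w, h = internal_res(2560, 1440, mode)
    frac = (w * h) / (2560 * 1440)
    print(f"{mode:14s} {w}x{h}  (~{frac:.0%} of native pixels)")
```

Quality mode traces only ~44% of native pixels at 1440p, which lines up with the roughly 2x frame-rate jump reported above for a ray-bound workload.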
 
Yeah, I mean, it kind of reinforces my thinking on the matter. It doesn't even look good, because so many of the other elements are sub-par.
There are SO MANY factors that make a game look great (Art Direction, Model Quality, Texturing Quality, Lighting Design, Animation Quality, Particle & Effects Quality, Layout, Post Processing, Tone Mapping etc.) that Ray Tracing effects on their own make a pretty small contribution, at an extremely high performance cost.

I've been kind of spoilt by using path-tracing render engines like Octane for 10 years (not real-time or game-suitable at all). There, every pixel on screen is the result of path-traced photons.
The current, extremely limited game hacks just don't compare. Don't get me wrong, it's great that this stuff is being developed, and it will absolutely get better and better. I just find the current level very take-it-or-leave-it in today's games.
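The gap between an offline engine like Octane and in-game "RT effects" is easy to see with a back-of-the-envelope ray budget. All the numbers below are illustrative assumptions (sample counts, bounce depths and frame times vary hugely by scene and engine); the point is the orders of magnitude.

```python
# Rough ray-budget comparison: offline path tracing vs real-time RT.
# All parameter values are illustrative assumptions, not measurements.

def rays_per_second(width, height, samples_per_pixel, bounces, fps):
    """Crude ray budget: one camera ray per sample, plus one ray
    per bounce along each path (shadow rays ignored)."""
    rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
    return rays_per_frame * fps

# Offline render: assume 1000 spp, 8 bounces, one frame per minute.
offline = rays_per_second(1920, 1080, 1000, 8, 1 / 60)

# Real-time: assume 1 spp, 2 bounces, 60 fps, denoising fills the gap.
realtime = rays_per_second(1920, 1080, 1, 2, 60)

print(f"offline:   {offline:.2e} rays/s")
print(f"real-time: {realtime:.2e} rays/s")

# Per *frame*, the offline render traces (1000 * 9) / (1 * 3) = 3000x
# more rays; real-time makes up the difference with aggressive
# denoising, screen-space fallbacks and low-res effect buffers.
print(f"per-frame sample gap: {(1000 * 9) / (1 * 3):.0f}x")
```

Under these assumptions the rays-per-second figures end up in the same ballpark; the difference is that the offline engine spends thousands of samples on each frame, while the game gets one or two and has to denoise the rest away.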
 