*** The AMD RDNA 4 Rumour Mill ***

I consider Nvidia's lack of VRAM a critical problem: if a card cannot run my games properly, it's flawed. I don't buy things that are inherently broken. AMD can improve software features; I cannot download more VRAM from Nvidia.

AMD were losing market share to Nvidia while making better and cheaper GPUs; they have a history that is more than 4 years old.
Is there anything fundamentally different to the situation where Nvidia users may have to turn the textures down a notch but enjoy the superior image quality of DLSS vis-à-vis FSR? You can't even fine-tune FSR on the AMD card. On Nvidia, you are not locked to the arbitrary 67% and 58% render scales like you are with FSR on AMD; you can change the percentage to your liking and get an overall superior image over FSR. I personally think a 67% render scale on DLSS does not look good, and it's only at 77% and above that DLSS really starts to shine, but on FSR you are locked to 67% with no way to change it, and it looks even worse than DLSS does at 67%.
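For concreteness, here's the arithmetic behind those percentages (a quick sketch; the 58%/67% figures are the fixed per-axis factors quoted above, not vendor specs):

```python
# Quick sketch of the maths behind per-axis render scales (figures as quoted
# above: fixed 58%/67% FSR-style steps vs an arbitrary percentage).

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

for scale in (0.58, 0.67, 0.77):
    w, h = internal_res(3840, 2160, scale)
    print(f"{scale:.0%} of 4K -> {w}x{h} internal "
          f"({w * h / (3840 * 2160):.0%} of the output pixels)")
```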
Also, i don't consider those features added value, especially on a mid range card, if i have to pay extra for them then i don't want them.
While you may not want those features, if the AMD and Nvidia card are in the same price bracket, the Nvidia card is perceived to be more premium by the average consumer because of those features. On high end cards, those features become mandatory. Hence, the 4090 on its own has managed to nearly outsell AMD's entire 7000 series lineup.
One more thing.

AFMF is now actually pretty damned good. And it's game agnostic; that trumps DLSS for me.
As for RT, the 4070 is only faster when you crank it up so high it's unplayable on both cards.
AFMF does not work properly with VRR and HDR, unlike DLSS. That makes it a non-starter at the outset. Also, frame gen adds a ton of latency and AMD does not have anything as good as Reflex bundled in; Anti-Lag continues to be a work in progress. This is exactly what I mean when I say all AMD does is look at Nvidia's shiny new thing and develop an inferior version of it months and months down the line, by which time the Nvidia technology has virtually cemented itself in the market and the AMD tech becomes an afterthought for developers.
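To put a rough number on the latency point: interpolation-based frame generation has to hold back the newest rendered frame so it can generate the in-between one, so you pay roughly one base-rate frame time of extra delay before any Reflex-style mitigation. A simplified back-of-envelope sketch (illustrative model, not measured data):

```python
# Back-of-envelope latency cost of interpolation-based frame generation.
# Simplified model: the generator buffers one real frame to interpolate
# between it and the previous one, adding ~one base frame time of delay.
# Real pipelines differ; this is illustrative only.

def added_latency_ms(base_fps: float) -> float:
    """Approximate extra delay from holding back one real frame."""
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"base {fps} fps -> ~{added_latency_ms(fps):.1f} ms extra latency "
          f"(displayed rate ~{2 * fps} fps)")
```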

Nvidia is constantly working to improve DLSS. Just last week they introduced DLSS 3.7 with Preset E, which produces a very sharp resolve and in my experience looks superior to native TAA in many ways. And the best part is you can use this DLSS version, along with the new preset, in ALL the games released to date. If you don't like Preset E, you can always tinker with Presets C, A or F and settle on whichever output you prefer. You cannot do any of this with FSR.
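In practice that upgrade path amounts to dropping a newer nvngx_dlss.dll into each game's install folder, which is what tools like DLSS Swapper (mentioned further down) automate. A minimal sketch of the manual route, with hypothetical example paths; it backs up the original DLL first:

```python
# Minimal sketch of the manual DLSS DLL swap that tools like DLSS Swapper
# automate. Paths are hypothetical examples; back up before overwriting.
import shutil
from pathlib import Path

NEW_DLL = Path(r"C:\Downloads\nvngx_dlss_3.7\nvngx_dlss.dll")              # hypothetical
GAME_DIRS = [Path(r"C:\Games\SomeGame"), Path(r"C:\Games\AnotherGame")]    # hypothetical

for game in GAME_DIRS:
    for old in game.rglob("nvngx_dlss.dll"):    # find the game's bundled DLL
        backup = old.with_suffix(".dll.bak")
        if not backup.exists():
            shutil.copy2(old, backup)           # keep the original around
        shutil.copy2(NEW_DLL, old)              # drop in the newer version
        print(f"swapped {old}")
```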

I do not see the same level of work going into FSR by AMD. It took ages after FSR 2.2 to get FSR 3, and months more to get 3.1. In that time Nvidia advanced DLSS by a couple of versions, and the worst part is you cannot upgrade FSR in previous games: older AMD-sponsored titles from 2023 still run on ancient FSR builds.

Ironic that an open-source solution is less customisable and upgradable than a closed-source one.
 
I highly doubt Nvidia updating their control panel has anything to do with AMD or with their customers voicing concern; otherwise they would have done it years ago, and/or lost far more market share to this "disadvantage". Their control panel was dated compared to not just AMD's but basically every other application on Windows, so it's more likely Nvidia are just updating to move with the times. Also, there are numerous reasons to redesign their app/drivers, the same reasons AMD did theirs: it makes things easier for the devs and gives them the potential to do more, and, as they have said themselves, a lot of the options found in the old control panel are no longer used/relevant. The main pros of the redesign for me are the added features and improved functionality, i.e. 120 fps ShadowPlay recording, RTX HDR with sliders to get exactly what you want, and, in the future, supposedly being able to swap DLSS versions/presets without needing to do it manually or use DLSS Tweaks or DLSS Swapper. It's these kinds of things that matter, not a fancy UI or loading 2 seconds faster, IMO.

Personally I would be happy to stick with the old panel, as it's extremely functional and, as said, outside of initial setup I never venture back into it. In terms of Nvidia-specific things, I would use Nvidia Profile Inspector to enable ReBAR for games where it's not activated by Nvidia, but that is it.

Also, can you expand on the UI costing performance? There is no performance difference on my end. Unless you are referring to RTX HDR? In which case it's to be expected considering it's a beta, and a performance hit is probably expected anyway given it is using AI/tensor cores to convert SDR into HDR.
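For anyone wondering what "converting SDR into HDR" even means mechanically: below is a generic inverse-tone-mapping sketch, emphatically not Nvidia's actual RTX HDR algorithm (which is an AI model running on the tensor cores), just an illustration of the SDR-to-nits expansion problem it solves:

```python
# Generic inverse tone mapping sketch: expand SDR (0-1) values into HDR nits.
# NOT Nvidia's RTX HDR method (that is a neural network on tensor cores);
# this just illustrates the kind of mapping such a feature performs.
import numpy as np

def sdr_to_hdr_nits(sdr: np.ndarray, peak_nits: float = 1000.0,
                    gamma: float = 2.4, sdr_white: float = 203.0) -> np.ndarray:
    """Linearise SDR, then stretch highlights toward the display's peak."""
    linear = np.power(np.clip(sdr, 0.0, 1.0), gamma)   # undo SDR gamma
    # Simple highlight boost: keep midtones near SDR reference white,
    # push the top end toward peak brightness.
    boost = 1.0 + (peak_nits / sdr_white - 1.0) * linear ** 2
    return sdr_white * linear * boost

frame = np.linspace(0.0, 1.0, 5)          # a few sample SDR values
print(np.round(sdr_to_hdr_nits(frame), 1))  # 0 maps to 0, 1.0 to ~1000 nits
```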

How long did/do AMD take to achieve stability and fix bugs with their own features and drivers? Months/years.
If you look at Nvidia's marketing, AMD does not even exist in their performance charts. They are irrelevant to them.

AMD not focusing on the high end with RDNA 4 is a dangerous gamble IMO. If Nvidia wants to take AMD out of the running entirely and have the whole market to themselves, all they have to do is price the 5080 like they did the 3080 and it's game over.
 
On the first part, I'm not sure you're correct: a DLSS render percentage rather than a preset slider? Or is that option in DLSS Tweaks? Personally I only change the preset to E. One thing many do overlook, though: the very "fact" that you can use the likes of DLSS Balanced, and in some cases even Performance mode, and still get better IQ than FSR 2 UQ/Q and native + TAA shows just how good DLSS is and how bad AMD's FSR is. Hopefully the next big FSR update from AMD will be the real deal, but alas, I'll wait to see how things look outside of the AMD-sponsored showcase for the next version...

Must admit, this is quite funny:

Ironic that an open-source solution is less customisable and upgradable than a closed-source one.

This is where, as I have said time and time again, AMD have done wonders by convincing people that "closed source bad, open source good". We have come a long way from the days of dodgy Nvidia solutions like PhysX and HairWorks. In my day-to-day work we use a lot of open source and closed source tools and often have to evaluate which is best for our needs; sometimes the open source option wins, sometimes the closed source tool does, and both have their pros and cons. The fact that Nvidia made an open source tool called Streamline (which Intel were on board with), AMD refused to join for "reasons", and people still diss that tech just shows the hypocrisy that goes on. That, and AMD have zero choice in how they go about their solutions: you can't be last to market by months/years with an inferior solution and make it closed source. Either way, AMD prefer open source methods because it works better for them, i.e. as Roy and the lot that came after him have said themselves, it allows them to hand things over to the community to contribute, which means they can be more hands-off rather than stuck supporting a product, freeing up their engineers and ultimately costing them less time and money. Can you imagine how many games FSR would be in if AMD made it closed source? I'd be amazed if it were more than 10... Essentially, Nvidia and AMD have different business models; neither approach is "bad/wrong", but IMO, and arguably as evidenced, one business model is working a lot better than the other.

To me, the things AMD need to nail to get back to the greatness they once had are:

- consistent, good upscaling
- better RT performance; it doesn't have to match/beat 4090 level, but it needs to be at the very least 4080 level next round (as shown, RDNA 3's RT is not even quite on par with the equivalent Ampere GPU, i.e. a 3+ year old GPU now...). The real problem with AMD's RT, overlooked by a lot of reviewers, is when you add multiple RT effects to the mix; this is where AMD completely crumble
- assuming we are talking about the £600+ price bracket, be priced at least £150 less than the equivalent Nvidia GPU if they can't achieve the above (based on my gaming needs/wants, those 2 things are used in 99% of the games I play now)

There are other things I would like to see now, e.g. a competitor to RTX HDR; Nvidia have somewhat made this my most must-have feature due to the **** show that HDR gaming can be, and it makes a huge difference to IQ for HDR gamers, along with their DLDSR feature. Having their anti-lag working well would be nice, to compete against Reflex, but it's not a deciding factor for me; however, if/when using FG it would be pretty important. Ray Reconstruction would be nice too, but given Nvidia only have it in 3 titles at the moment it's not a must-have, though that could change quickly...
 

I switched to AMD after 8 years with Nvidia.

I would only use upscaling tech as an absolute last resort. That's the problem with people trying to make a convincing argument for why I should still be buying Nvidia: upscaling tech is treated as a critical part of the equation, when for me it should not even exist. You're willing to pay more money for a lesser card and then make back the difference with DLSS.
To me that's asinine.

Just buy the better card, for less, and then not need it.
 

So you're able to nearly double your FPS, or more, without sacrificing graphical effects/IQ then?

What about everything else the Nvidia GPU does better? Or is that not relevant?

No one is saying you should buy an Nvidia GPU because it provides xyz features (after all, some might not need/want them), but AMD can't expect to charge in the same ballpark when they don't offer the same package in a number of ways... You are paying a similar price for a lesser package overall, unless you truly only care about more VRAM and nothing else.
 
Apparently upscaling tech allows the use of RT for “underpowered” GPUs like ours :cry:

Personally I'd much rather use the GPU's native grunt and accept that I can't run some features like RT etc.
 

So in effect you are getting lower IQ by having to sacrifice graphical effects, and you're happy with that? That's fine, but it's a flawed argument in many respects given the mounting evidence we now have on upscaling (well, for DLSS) vs native.

Also, upscaling isn't just there to let you turn on RT; it's a pathway to many benefits:

- the GPU runs quieter since it doesn't run as hot
- it lets you achieve higher fps regardless of RT, which in turn benefits motion clarity and input latency
- it lets you combine supersampling (e.g. DLDSR) with upscaling tech to get a far superior image at similar performance to native 1440p rendering, which is pretty beneficial on sub-4K displays since assets and LODs only load in at higher quality when a 4K or higher resolution is used (rough numbers sketched below)
- good upscaling such as DLSS is leaps better than native TAA (there are a few games where FSR beats native TAA too) unless you can and do turn off the in-game TAA (a very rare option), in which case you get to experience all the shimmering, aliasing, jaggies and so on
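Rough numbers for that supersampling-plus-upscaling combo, assuming DLDSR 2.25x on a 1440p display and DLSS Quality at 67% per axis (illustrative arithmetic only, not vendor figures):

```python
# Rough numbers for combining DLDSR 2.25x with DLSS Quality on a 1440p
# display -- illustrative arithmetic only.

display_w, display_h = 2560, 1440
dsr_factor = 2.25        # DLDSR multiplier on pixel count (1.5x per axis)
dlss_scale = 0.67        # DLSS Quality per-axis render scale

out_w = round(display_w * dsr_factor ** 0.5)   # 'virtual' output width
out_h = round(display_h * dsr_factor ** 0.5)
int_w = round(out_w * dlss_scale)              # actual internal render size
int_h = round(out_h * dlss_scale)

print(f"DLDSR output:  {out_w}x{out_h}")       # ~3840x2160
print(f"DLSS internal: {int_w}x{int_h}")       # ~2573x1447
print(f"internal pixels vs native 1440p: "
      f"{(int_w * int_h) / (display_w * display_h):.0%}")  # ~101%
```

In other words, the GPU shades roughly the same pixel count as native 1440p, while the assets/LOD and the final resolve behave as if you were running at 4K.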
 
7900 XTX vs 4080 Super: basically the same in raster, AMD weaker in RT, a bit cheaper; not that it actually matters.

7800 XT/7900 GRE vs 4070 Super: same story.

7700 XT vs 4060 Ti 16GB (I know the link is for the 8GB version, but it doesn't really matter in this case): AMD has a larger lead in raster but falls behind in RT; a different card is the one reviewed, but just look at how they rank in the benchmark. Priced similarly.

Not going lower than that as it's not worth it. So chances are you're still going to need DLSS/FSR sooner rather than later.
 
I am happy to use whatever you want to call it as long as the IQ and frames are the best I can get.

Exactly.

I always find it a weird argument when people say they want more FPS rather than fancy visuals like RT (especially the ones who say they can't see the difference), yet are adamant that upscaling/DLSS is the devil's work and massively flawed despite what all the comparisons show...
 

It's funny, because by turning down settings so you can stay at your "native" resolution, you'll end up with a worse presentation for the same frame rate (that's if you can even turn enough settings off/down to achieve the same frame rate; it's not always possible). But at least you're "pure". :p
 
Such a lovely day outside, can tell the lack of RT (Real and True) puddles is having a negative effect. Don't worry guys, it's the UK, the puddles will re-appear! It's quite clear some of you got lost on the way to the upscaling thread.
It's like saying DX12 doesn't matter, just DX11 does.
Upscaling is a feature of current and future GPUs. How good a card is at it matters.
 

Exactly. Features are all part of the package and therefore relevant to the thread.

What will be interesting to see is AMD's upcoming AI upscaling and whether it will be tied to RDNA 4 and RDNA 3 hardware; IMO it should be, if that means they can achieve DLSS quality or better.
 
We know RT is the future, as you and others have stated earlier in other threads.

I just can’t stomach such a hit to FPS without using upscaling. Especially as I’m aware FSR isn’t as good as DLSS; that, for me, is why I don’t like turning it on.

Btw, Cyberpunk is one of a select few games I own where I’ve turned on RT at ultra and found it doesn’t really change my view of the game and general gameplay, as I noticed I was too obsessed with knocking out bad guys to worry about the neon signs, the puddles, or the way light shines in the dark corridors of the game world’s many shady establishments! :cry:

I’m sure I’ll use an RT implementation like the one in Avatar, as it doesn’t seem to hit my FPS as much as Cyberpunk’s does.

It does look pretty in that game and I can see a difference.

AMD needs to come up with a better way of boosting RT other than leaning on upscaling.
 
I would hope RDNA 5 offers a massive leap, on par with the Nvidia offerings.
 

Don't get me involved in that Hot Potato! I saw all the extra pages this weekend and thought: By Jove! There must be some new RDNA 4 rumours that got everyone in a tizzy. Nope! It's Upscaling Wars: The Return, Part 27. Part 9 was the best; part 27 seems like a lazy rehash.
 

Time to buy an Nvidia GPU then :p :cry:

But yes, hopefully AMD improve their FSR so it gives their customers more choice in how they wish to play.

Well, the thing with Avatar is that you can't turn RT off, but it is very well optimised and an example of what can be achieved when RT is done from the ground up, although certain RT effects are heavily held back, like the resolution of the RT reflections and so on.

I'm sure AMD will come up with something, but there probably won't be any real uplift until the next-gen consoles are due.
 