RTX 3090 DLSS quality vs 7900 XT native vs wait

I'm coming around to the idea of a 7900 XT over a 3090. Largely because:

- Jitters about the used market for the 3090, and prices seem to be trending upward for 3090s since the VRAM apocalypse became apparent
- I have no use for CUDA as far as I know
- There are new games without DLSS (e.g. Dead Island 2), and in those scenarios the 7900 XT is easily head and shoulders above the 3090
- FSR may yet get better, and I heard somewhere that RDNA3 may have some as yet untapped potential for upscaling in its hardware, or maybe that's just for frame generation, or maybe I dreamt it
- Cyberpunk 2077 RT overdrive. Having seen how good that looks, making the old RT psycho settings look like a**, I'm thinking I should save my replay for another few years until cards can handle path tracing
On the other side of that coin:
- UE5 with its software RT. Hardware ray tracing is now reduced to 'increases accuracy at the expense of performance' or some such. What sort of incentive is that for turning on hardware ray tracing at all?
- Sounds like all the 7900 XT non-reference models run quiet, which I like
 
The 3090 is an exceptional GPU, especially at 650, and this is coming from someone who owns a 4090. The 4090 genuinely feels future-proofed in the sense that it only runs at 80% at most at 4K 120 Hz. But at 4K 60 Hz the 3090 will manage; just tweak a couple of settings and you're good.
 
By the way, warranties are not transferable. By all means get a 3090 if that's what you want. Just to scare you some more, Buildzoid has done some videos on the damage mining does to these cards. My advice: buy new, or get something else new. So endeth the lesson.
 
I believe I've narrowed down my next GPU choice to an RTX 3090 (will have to be used) vs a new 7900 XT.

As per my recent thread pointing out problems with FSR 2.1 in Cyberpunk, it seems I'm unable to tolerate that upscaling tech.

Display-wise I have a Freesync 3440x1440 @ 100 Hz and a 4K TV @ 60 Hz. I'm thinking DLSS 3 would be unusable on my high-res but low-refresh displays, and I'm becoming convinced that having a mega amount of VRAM will be best for avoiding a headache a few years down the line, especially at the resolutions I play at. It therefore seems the entire RTX 4000 series is lost on me, unless I'm willing to stretch to a 4090 (nope), or get a 4080 and tolerate the VRAM of a Radeon 6800 @ £1k+, which is another nope.

Or maybe I should just chill and replay some classics, because better alternatives will soon be coming at the price point these two cards currently occupy?
New vs used, and 2 years old at that? I wouldn't.
----------

Planned obsolescence. Nvidia give you just enough VRAM on day one so that by the time they release a new GPU you're forced to upgrade.

Don't do that, don't be a fool. :) Get the one that will last you.
 
I couldn't resist any longer and went for a Sapphire Pulse 7900 XT. It's a phenomenal jump in performance over the 5700 XT, as it should be :cool:
I hope it serves you well. I've been thinking about a 7900 XT myself, but I cannot justify the price coming from a 6700 XT. I paid 5000 DKK for it during all the craze, after selling a 3070 for 6000 DKK. I would have to fork out around 7000 DKK for a 7900 XT, and considering the performance uplift I just don't consider it enough for the money. A 6950 would make more sense at 5200 DKK for the Red Devil, and still... it's just not good enough value in my book considering what I already have. BTW, please leave bench results here for the rest of us to go over... I'm curious how it performs with that 5700X of yours.
 
BTW, please leave bench results here for the rest of us to go over... I'm curious how it performs with that 5700X of yours.
I think they were coming from a 5700 XT, which is exactly the switch I just made. I'm running a Zen 2 3900X and it's basically fine if you are running at higher resolution. Going from 1440p to 4K I get about 20-30% extra FPS on top of the resolution boost.
 
I hope it serves you well. I've been thinking about a 7900 XT myself, but I cannot justify the price coming from a 6700 XT. I paid 5000 DKK for it during all the craze, after selling a 3070 for 6000 DKK. I would have to fork out around 7000 DKK for a 7900 XT, and considering the performance uplift I just don't consider it enough for the money. A 6950 would make more sense at 5200 DKK for the Red Devil, and still... it's just not good enough value in my book considering what I already have. BTW, please leave bench results here for the rest of us to go over... I'm curious how it performs with that 5700X of yours.

I can see your dilemma regarding upgrading from a 6700 XT. At least it should have enough VRAM for now, so no cliff edge fall in performance due to memory yet.

I've just done the benchmark circuit on this forum for the games I have - feel free to check them out via my post history. Hopefully Kaapstad will update the scoreboards soon.

I only previously posted a couple of 5700 XT benchmarks here, when it was paired with my overclocked Ryzen 2700.

Comparing those results to the 7900 XT + Ryzen 5700X @ UWQHD, the difference is:

Shadow of the Tomb Raider:
5700XT + Ryzen 2700 = 37 FPS
7900XT + Ryzen 5700X = 100 FPS

Forza Horizon 4:
5700XT + Ryzen 2700 = 74.2 FPS
7900XT + Ryzen 5700X = 179.7 FPS

I'm pleased. The 7900 XT definitely feels more than double the power of the 5700 XT. TechPowerUp has the difference at 244%, which sounds about right :D
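
For anyone who wants to sanity-check that "more than double" claim from the figures quoted above, here's a rough sketch using only the before/after FPS numbers posted in this thread (note TechPowerUp's 244% figure is GPU-only relative performance, so a combined GPU + CPU upgrade won't line up with it exactly):

```python
# Quick uplift check using only the FPS figures quoted in the post above.
results = {
    "Shadow of the Tomb Raider": (37.0, 100.0),   # 5700 XT + Ryzen 2700 -> 7900 XT + Ryzen 5700X
    "Forza Horizon 4": (74.2, 179.7),
}

for game, (before, after) in results.items():
    ratio = after / before
    print(f"{game}: {ratio:.2f}x ({(ratio - 1) * 100:.0f}% faster)")

# Prints roughly:
#   Shadow of the Tomb Raider: 2.70x (170% faster)
#   Forza Horizon 4: 2.42x (142% faster)
```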
 
The used market is almost worse than the new market, and that is saying something. Given a choice between a used card and a new one with roughly equal performance (I'm doing the 3090 a huge favour here) and a small saving of £100, I wouldn't touch the used one with a ten-foot pole. No warranty, no idea what it has been used for. I would need to save at least 40% on the purchase to go with a used card with a questionable history and no warranty versus equal performance from a new current-gen card, and that is not happening in this market. I'd rather wait, to be honest.

@Nexus18 Could you clarify what it is about FSR and ultrawide that is so bad? Genuinely curious, as I use it at the moment when playing RDR2 on my 3440x1440 panel.

The average person probably won't notice the difference.

If you need to use DLSS or FSR, you're probably using it because your FPS is so bad or your settings so low that you have no choice; the game would look even worse without it, so it naturally looks better and good by comparison.

Most of the arguments just come down to people feeling the need to publicly display their gratification that they made the right purchasing choice.


It still confuses me to this day how people are so proud of spending thousands on a card for generated frames.

Both of them perform better or worse in different ways, so it's really personal preference as to what you think looks good anyway.
 


I'm not sure if HUB are on the reputable, trusted sources list any more, but when they post something shaming VRAM, they seem to be the leading experts in all things hardware/software at that time. Narrative/agenda and all that, you know :)

When it comes to FSR and DLSS, each to their own. Since I "paid" for the luxury of having access to a superior upscaling method for all these years, I'll use DLSS when present (thankfully the majority of the time, unless it is an AMD-sponsored title...), whereas with FSR, chances are I won't use it (as I did with FC 6, Riftbreaker, the RE games etc.) and I'll drop settings instead, due to how bad it can be in certain games, as shown in Star Wars Jedi: Survivor.
 
Just going from memory here, but wasn't the conclusion that more often than not native looked better?

Overall with his setup, yup, but if he had replaced the DLSS files with 2.5.1 or above, DLSS would have come out on top, i.e.:

- god of war
- spiderman
- rdr 2
- deathloop
- f1 22
- witcher 3

All of these used < 2.5.1, where none of the improvements, such as less shimmering in the foliage, are there.

Obviously not everyone will do this, although I imagine a good chunk of PC gamers would (given it is literally copying and pasting one file, or using DLSS Swapper to do it for you). Since I do, that means in my use case DLSS comes out on top overall.
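
For anyone wondering what the "copy and paste one file" step looks like in practice, here's a minimal sketch; the paths are placeholders (your game's install folder and wherever you downloaded the newer DLSS DLL), and it keeps a backup of the original so the swap is easy to undo, much like DLSS Swapper does for you:

```python
import shutil
from pathlib import Path

# Placeholder paths -- point these at your own game install and downloaded DLL.
game_dir = Path(r"C:\Games\SomeGame")            # folder containing the game's nvngx_dlss.dll
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # the newer DLSS DLL (e.g. 2.5.1 or later)

target = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

# Back up the original once, so the swap can be reverted.
if target.exists() and not backup.exists():
    shutil.copy2(target, backup)

# Drop the newer DLL into the game folder.
shutil.copy2(new_dll, target)
print(f"Replaced {target.name} in {game_dir}")
```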

EDIT:

Also, not sure about his testing, but for Spider-Man with the fur you need to disable all the post-processing effects, including DOF, to avoid these kinds of issues (Guardians of the Galaxy had the same issue because of DOF); they can harm both DLSS and FSR.
 
Just going from memory here, but wasn't the conclusion that more often than not native looked better?

If native does not look better than DLSS then the developers have not done their job properly. There is no reason that an upscaled image/footage should look better than the native one unless there are errors in the original image/footage that upscaling fixes.
 


I'm not sure if HUB are on the reputable, trusted sources list any more, but when they post something shaming VRAM, they seem to be the leading experts in all things hardware/software at that time. Narrative/agenda and all that, you know :)

When it comes to FSR and DLSS, each to their own. Since I "paid" for the luxury of having access to a superior upscaling method for all these years, I'll use DLSS when present (thankfully the majority of the time, unless it is an AMD-sponsored title...), whereas with FSR, chances are I won't use it (as I did with FC 6, Riftbreaker, the RE games etc.) and I'll drop settings instead, due to how bad it can be in certain games, as shown in Star Wars Jedi: Survivor.

Is it crap in the new Star Wars game too? Bloody hell. Thought maybe Resident Evil 4 was a one-off oversight, but it seems AMD just don't care enough. How can you sponsor a game and let the devs release a crappy version of FSR?

Something is not quite right about that company. I remember some years ago Scott Wasson from The Tech Report (RIP) joined them, and I thought he might be able to have some influence in sorting out things like this. I actually just looked, and it looks like he works for Intel now :cry:
 
If native does not look better than DLSS then the developers have not done their job properly. There is no reason that an upscaled image/footage should look better than the native one unless there are errors in the original image/footage that upscaling fixes.

When they say it looks better than native, they mean native once you enable something like TAA, which removes the jaggies. If the TAA implementation is subpar then yes, DLSS can look better.

Personally I have found DLSS does a very good job for the most part and unless you go pixel peeping you will be fine. What I like is I can either use it and gain fps or pick a higher resolution using DLDSR and end up with similar fps but much better image quality. At least on my monitor on the games I tried anyway.
 
Is it crap in the new Star Wars game too? Bloody hell. Thought maybe Resident Evil 4 was a one-off oversight, but it seems AMD just don't care enough. How can you sponsor a game and let the devs release a crappy version of FSR?

Something is not quite right about that company. I remember some years ago Scott Wasson from The Tech Report (RIP) joined them, and I thought he might be able to have some influence in sorting out things like this. I actually just looked, and it looks like he works for Intel now :cry:

Yup, watch Alex's video on DF, it's bad :(

Sadly it's just AMD's approach to solutions; it's their whole approach to everything they do, i.e. they want to be as hands-off as possible, throw it over the fence and let the "community" do the work for them. Roy basically said this himself in an interview from years back, and it's largely why they love open source: with closed source, you are solely responsible for how the tech grows and gets adopted. It's probably the best and only move AMD have, to be fair, since they don't have the same resources Nvidia have, and the two take obviously different paths, i.e. Nvidia are more about their ecosystem of features, so their features have to be top quality. It's a shame AMD don't at least have a more unified method for implementing FSR so they could achieve a consistently good implementation/experience. At least when I last read their GitHub thread on how FSR can be implemented, according to one of the devs it is entirely up to the developers how to implement and tune FSR to get the best from it.

When they say it looks better than native, they mean native once you enable something like TAA, which removes the jaggies. If the TAA implementation is subpar then yes, DLSS can look better.

Personally I have found DLSS does a very good job for the most part and unless you go pixel peeping you will be fine. What I like is I can either use it and gain fps or pick a higher resolution using DLDSR and end up with similar fps but much better image quality. At least on my monitor on the games I tried anyway.

Which sadly is the case for 99% of games, and has been for a long time. The AA methods that once used to be fantastic, i.e. SMAA and MSAA (although that was a performance hog), are long gone due to how games are now designed around motion vectors etc. If you use anything but TAA, games will have awful temporal stability issues, to the point where hair and foliage have really bad artefacting, as shown in my videos for RDR 2 and Spider-Man.
 
Speaking of AMD-sponsored games, good thread here.

Ruined by AMD™

Blame AMD for something Nvidia control. That level of intelligence among the general population is why Nvidia have a 90% share.

A far right of the bell curve in the comments.

DLSS is proprietary to Nvidia. They will not in a million years let AMD use that software implementation.
 
Blame AMD for something Nvidia control. That level of intelligence among the general population is why Nvidia have a 90% share.

A far right of the bell curve in the comments.

Unfortunately Nvidia's marketing brainwashing is working on the general PC public. They will ignore the fact that this game works reasonably well on a PS5, and then blame AMD for not allowing DLSS to try and fix what is a broken PC port. There is no reason this game should not work just as well on a high-end PC that has 4x the power of a PS5.

FSR is open source; frankly I hope it does to DLSS what FreeSync did to proprietary G-Sync and GPUOpen did to GameWorks.

In time it will. What will be funny is that in five years' time there will be 4090 owners using FSR 3.0 on new titles, because Nvidia will be limiting DLSS 4.0 to their newest cards and blocking it on the older ones.

I am just not a Star Wars fan, so this game holds no interest for me. I am certainly not going to pay £60 for a terribly optimised game when I can pay £10-15 in two years' time, when it is potentially playable. This is the strength of PC gaming: no need to play games on release day, they do not go away. The games I play are WH3 (I am counting WH 1 and 2 in the timeline) and Stellaris, which are effectively 7+ years old and have been supported and added to for the entire period.
 