AMD RDNA3 unveiling event

Wasting your time trying to educate him :cry:
Not at all, it's his opinion based on what he's seen. I'm upscaling (I think) from 1440p, and not always to 4k, so I'm not seeing a huge drop off but a fair performance benefit. If he's upscaling from a lower res to get to a same resolution output, he may well be seeing worse results. I see upscaling as a really useful tool based on what I can see, and am extrapolating my opinion, but he may have the opposite opinion for equally valid reasons. Education can work both ways.
 
Nexus, you keep posting reams of pointless tripe that misses the point. Like it or not, the vast majority of PC games being released do not need, or have, DLSS. I use it and it is a very welcome addition at 4K. I just accept it is not as popular or mandatory as you seem to think it is.
 
Ah, it's not as bad as all that, for me at least. In the right place, both FSR and DLSS can be game-changers as far as I'm concerned, especially lower down the GPU stack. If FSR 3 (or 2.3) arrives with RDNA 3, that will be very interesting.

Presume you're referring to AMD's fluid motion/frame generation? That's most likely not until the end of next year, but it will be interesting to see how it does, especially with latency. At least more and more games are now finally adding FSR 2+ going forward.

Yeah, but you need to have 12GB of RAM to be able to open the settings menu, else you're screwed!

Don't forget force enabling rebar too :cry:

Not at all, it's his opinion based on what he's seen. I'm upscaling (I think) from 1440p, and not always to 4k, so I'm not seeing a huge drop off but a fair performance benefit. If he's upscaling from a lower res to get to a same resolution output, he may well be seeing worse results. I see upscaling as a really useful tool based on what I can see, and am extrapolating my opinion, but he may have the opposite opinion for equally valid reasons. Education can work both ways.

Opinion is one thing, but the problem is when you go around stating a load of BS like "looks like Minecraft" and so on without backing up such statements, despite every reviewer (Gamers Nexus, HUB, TPU, DSOGaming, computerbase.de, PCGamesHardware, OC3D etc.) pointing out, with evidence, multiple areas and games where FSR/DLSS looks better than native... Sorry, I forgot, "Nvidia shills". Oh wait, they have also pointed out where FSR is better than native too, so it can't be that!!!! :cry: We then have people who say "oh, that's because it's using TAA (which sadly is 95+% of games), use proper AA". Done that too... have a look at SMAA in Spider-Man, have a look at MSAA in RDR 2, and they look even worse than TAA :cry:

The only time DLSS/FSR/XeSS will look worse than native is when:

- playing at <1080p output (which he isn't)
- the upscaling tech has been poorly implemented (especially at launch, when issues will show before a patch or driver update)
- using a more aggressive preset, such as Performance or Ultra Performance, more so if you're using these presets at lower resolutions
- post-processing effects like CA, DOF, motion blur and film grain are turned on; these can really harm upscaling IQ
- the game has done a fantastic job with anti-aliasing, be that a really good TAA, SMAA or MSAA implementation, which is few and far between
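For what it's worth, the checklist above boils down to a rough yes/no heuristic. Here is a toy Python sketch of it; the function name, parameters and the resolution thresholds are my own assumptions, not anything official or measured:

```python
# Toy encoding of the checklist above -- a rough heuristic,
# not a real image-quality model.
def upscaling_likely_worse_than_native(
    output_height: int,
    preset: str,
    poorly_implemented: bool,
    post_processing_on: bool,   # CA, DOF, motion blur, film grain
    excellent_native_aa: bool,  # rare: a really good TAA/SMAA/MSAA
) -> bool:
    # Performance/Ultra Performance hurt most below 4K output.
    aggressive_preset = preset in ("Performance", "Ultra Performance")
    return (
        output_height < 1080
        or poorly_implemented
        or (aggressive_preset and output_height < 2160)
        or post_processing_on
        or excellent_native_aa
    )

# e.g. 4K Quality, well implemented, post-processing off, typical TAA:
print(upscaling_likely_worse_than_native(2160, "Quality", False, False, False))
```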
 
Not at all, it's his opinion based on what he's seen. I'm upscaling (I think) from 1440p, and not always to 4k, so I'm not seeing a huge drop off but a fair performance benefit. If he's upscaling from a lower res to get to a same resolution output, he may well be seeing worse results. I see upscaling as a really useful tool based on what I can see, and am extrapolating my opinion, but he may have the opposite opinion for equally valid reasons. Education can work both ways.

Nah, pretty sure DLSS works better at higher resolutions than lower. For example, at 1440p the Quality setting upscales from a similar internal resolution to the Performance setting at 4K, from what I recall.

If you use the Quality setting at 4K you upscale from 1440p, which is obviously going to be better. Thought I would educate you a bit :p:D

Some just don't like DLSS full stop. But somehow FSR is good. Make of that what you will :cry:
 
Nexus, you keep posting reams of pointless tripe that misses the point. Like it or not, the vast majority of PC games being released do not need, or have, DLSS. I use it and it is a very welcome addition at 4K. I just accept it is not as popular or mandatory as you seem to think it is.

One way to ignore the points :cry:

You keep on believing this then:

If you're only willing to accept that people play at 1080p, and/or are stuck at 60Hz, and/or play old games, and/or play undemanding games

Also, since one of your examples was The Ascent, can you post what your fps is and what your display is? As per your post and mine:

Hell the last PC game I played a few weeks back is Alien Isolation from 2014 and before that the Ascent (an indie game). If all we played was AAA titles with RT at 165 FPS (because apparently that's all that counts), we would be waiting a very very long time between our gaming fixes.
The Ascent does not run well without DLSS... unless you are reducing/turning off RT, in which case, fine, but accept that you are vastly reducing the IQ by doing that.

[The Ascent benchmark screenshots]
As I said, if you're happy to turn off RT, great, but I'd rather have better visuals, and the only way to achieve that in The Ascent is with upscaling tech.
 
Presume you're referring to AMD's fluid motion/frame generation? That's most likely not until the end of next year, but it will be interesting to see how it does, especially with latency. At least more and more games are now finally adding FSR 2+ going forward.
I don't see how DLSS or FSR frame generation can avoid issues with latency, given the way the technology seems to work. It would need to be predictive of human action, which it can't be. Is there any word on whether the next gen of FSR is looking to be more or less aggressive with frame generation?
Nah, pretty sure DLSS works better at higher resolutions than lower. For example, at 1440p the Quality setting upscales from a similar internal resolution to the Performance setting at 4K, from what I recall.

If you use the Quality setting at 4K you upscale from 1440p, which is obviously going to be better. Thought I would educate you a bit :p:D
Teach me more! I thought that essentially FSR and DLSS upscaled from a lower resolution depending on the quality requested, i.e. taking 4K as an example target resolution, Performance upscaled from 720p, Balanced from 1080p and Quality from 1440p (numbers are absolute guesses).
 
Not at all, it's his opinion based on what he's seen. I'm upscaling (I think) from 1440p, and not always to 4k, so I'm not seeing a huge drop off but a fair performance benefit. If he's upscaling from a lower res to get to a same resolution output, he may well be seeing worse results. I see upscaling as a really useful tool based on what I can see, and am extrapolating my opinion, but he may have the opposite opinion for equally valid reasons. Education can work both ways.

I had a 1080p monitor briefly while I was waiting on my 4K arriving, and DLSS at that res was poor. I think most reviewers are of the consensus that it works best at higher resolutions, with 4K being the best.
 
I don't see how DLSS or FSR frame generation can avoid issues with latency, given the way the technology seems to work. It would need to be predictive of human action, which it can't be. Is there any word on whether the next gen of FSR is looking to be more or less aggressive with frame generation?

Teach me more! I thought that essentially FSR and DLSS upscaled from a lower resolution depending on the quality requested, i.e. taking 4K as an example target resolution, Performance upscaled from 720p, Balanced from 1080p and Quality from 1440p (numbers are absolute guesses).

Nvidia's way is tied to their optical flow accelerator hardware; no idea how much, or exactly why, it helps reduce latency though. I'm sure that with the right hardware and/or software optimisation, latency can at least be reduced to the point it wouldn't be noticeable. Due to the way the tech works, I don't think we'll ever see latency lower than with no FG, but there are probably a lot of ways around it, as talked about in this video (spoiler tag as it's to do with DLSS 3/FG):


One thing which many overlook with DLSS (and probably FSR and XeSS too, with whatever their equivalents are) is using DLDSR in combination with DLSS. It can really help for those who want more sharpness/clarity in gaming. Obviously it hits performance somewhat, but IIRC still less of a performance hit than running native 4K or higher...
 
Teach me more! I thought that essentially FSR and DLSS upscaled from a lower resolution depending on the quality requested, i.e. taking 4K as an example target resolution, Performance upscaled from 720p, Balanced from 1080p and Quality from 1440p (numbers are absolute guesses).
4K:

Quality: 2560x1440
Balanced: 2227x1253
Performance: 1920x1080
Ultra Performance: 1280x720
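Those figures fall out of the commonly cited DLSS render scale factors (Quality 2/3, Balanced ~0.58, Performance 1/2, Ultra Performance 1/3). A quick Python sketch, treating those factors as assumptions rather than an official spec, reproduces the list for a 3840x2160 output:

```python
# Deriving DLSS internal render resolutions from the output resolution.
# The scale factors below are the commonly cited ones -- assumptions
# here, not an official specification.
PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output, factor):
    """Return the (width, height) the GPU actually renders at."""
    w, h = output
    return round(w * factor), round(h * factor)

for name, factor in PRESETS.items():
    print(f"{name}: {internal_resolution((3840, 2160), factor)}")
```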
 
Snip. As I said, if you're happy to turn off RT, great, but I'd rather have better visuals, and the only way to achieve that in The Ascent is with upscaling tech.

The debate between you and me was never about what RT and DLSS bring to the table. You claimed the majority of new PC games in the past 2-3 years have DLSS; this is wrong... period.
 
4K:

Quality: 2560x1440
Balanced: 2227x1253
Performance: 1920x1080
Ultra Performance: 1280x720
Thanks - I think this is essentially what I was getting at in the first place. If @humbug is upscaling from a lower resolution, he will not be seeing the same results as I am, since the source information is not as good, which is potentially a reason why he is not as enthused about it. This may even be made worse by frame generation technology.

Are AMD showing any particular technology approaches with RDNA 3 like they did with RDNA 2? Rage mode, SAM etc?
 
The debate was never about what RT and DLSS bring to the table. You claimed the majority of new games have DLSS; this is wrong... period.

Well then, why did you bring The Ascent into a discussion about "not needing DLSS"? :confused:

Nothing you have posted contradicts what I said, most games do not need upscaling to get decent/good performance because most games are not AAA games released in the past 3 years. Those TPU charts are specifically skewed massively towards the minority of demanding games because it is a benchmark test. The majority are not exclusively playing demanding AAA games from the past 3 years.

That's a fact and pointless skewed benchmark tests do not change that.

Hell the last PC game I played a few weeks back is Alien Isolation from 2014 and before that the Ascent (an indie game). If all we played was AAA titles with RT at 165 FPS (because apparently that's all that counts), we would be waiting a very very long time between our gaming fixes.


You claimed the majority of new games have DLSS; this is wrong... period.

As for this bit, I don't really care for going through a huge list of games, nor is this the right thread for it, but aside from AMD-sponsored games and a couple of non-AMD-sponsored games, I would say about 90% of the games I've played over the past 2 years (which is a lot, as @TNA can attest to ;)) have had DLSS... it's not like upscaling tech is going anywhere either... but let's leave it there as this isn't the right thread for it.

Thanks - I think this is essentially what I was getting at in the first place. If @humbug is upscaling from a lower resolution, he will not be seeing the same results as I am, since the source information is not as good, which is potentially a reason why he is not as enthused about it. This may even be made worse by frame generation technology.

Are AMD showing any particular technology approaches with RDNA 3 like they did with RDNA 2? Rage mode, SAM etc?

He's upscaling from 2560x1440. Given he keeps going on about "ultra" when talking about DLSS and it looking like Minecraft, I'm starting to wonder if he is basing his "opinion" on the "Ultra Performance" mode :cry:

"Rage mode", whatever happened to that? :D They haven't shown anything new.
 
Thanks - I think this is essentially what I was getting at in the first place. If @humbug is upscaling from a lower resolution, he will not be seeing the same results as I am, since the source information is not as good, which is potentially a reason why he is not as enthused about it. This may even be made worse by frame generation technology.

Are AMD showing any particular technology approaches with RDNA 3 like they did with RDNA 2? Rage mode, SAM etc?

Ah, my bad, thought you were talking about our local axe man, as he is on 4K.

Personally I liked DLSS/FSR at 4K. It is good at 1440p, but obviously not as good. I think using it at 4K on the Quality setting is very good in particular, if done right. Even Performance was quite good.

Took a while for me to adjust to 1440p ultrawide. 4K is simply better. But damn, this QD-OLED Alienware monitor is just so damned good. Best monitor I have had by a long shot. No more backlight bleed crap, and game-changing HDR.

Hopefully we get a 27-32" 4K variant of it in the next couple of years and I will upgrade to that.

Back on topic though, getting my popcorn ready for tomorrow :D
 
Turning settings down a notch or two is still better than using DLSS. As you say, going from the highest setting to one or two down makes little if any difference to the graphics quality, but it can improve performance greatly - more so than running DLSS Ultra, because anything lower than that and everything is Minecraft.
Not at 4K.
 
Back on topic though, getting my popcorn ready for tomorrow :D

Got ten buckets worth here :D

But in all seriousness, as I said a bit further back, we already have a pretty good idea what to expect:

Thing is, we already know that the 7900 XTX is most likely going to beat a 4080 in raster, lose in RT (on par with a 3090) and, as per Gibbo's posts, cost from £999-1250 depending on reference or custom. The only thing which would be a shock tomorrow is if the 7900 XTX only just matches, or loses, compared to a 4080.

Also, there's the question of how much AIB custom cards will gain over reference...
 
Turning settings down a notch or two is still better than using DLSS. As you say, going from the highest setting to one or two down makes little if any difference to the graphics quality, but it can improve performance greatly - more so than running DLSS Ultra, because anything lower than that and everything is Minecraft.

At 4K, DLSS (and FSR) in their latest iterations are certainly not killing IQ as you imply. At lower resolutions there is a fair bit more blurring, but not at 4K.
 
Got ten buckets worth here :D

But in all seriousness, as I said a bit further back, we already have a pretty good idea what to expect:



Also, there's the question of how much AIB custom cards will gain over reference...
7900 XT trading blows with the 4080
7900 XTX consistently better than the 4080 in raster (10-20% faster) but slower in RT, with the 4090 being about 30% faster than the 7900 XTX
AIBs around 5-10% faster than reference

That's my current expectations.
 
Thanks - I think this is essentially what I was getting at in the first place. If @humbug is upscaling from a lower resolution, he will not be seeing the same results as I am, since the source information is not as good, which is potentially a reason why he is not as enthused about it. This may even be made worse by frame generation technology.

Are AMD showing any particular technology approaches with RDNA 3 like they did with RDNA 2? Rage mode, SAM etc?

Upscale from 1440p to 4K and back down to 1440p with DLSS?
 
7900 XT trading blows with the 4080
7900 XTX consistently better than the 4080 in raster (10-20% faster) but slower in RT, with the 4090 being about 30% faster than the 7900 XTX
AIBs around 5-10% faster than reference

That's my current expectations.

It will be interesting to see how they stack up. At 4K the 4090 is on average 25-35% faster than a 4080 (review dependent), so not much room for the 7900 XTX to slot in.
 