DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

No need to enable any sync in software if you have a G-Sync display tbh; probably even better if it's got an Ultimate module rather than just being G-Sync Compatible. Never had smoothness issues since going QD-OLED on the DW. So smooth, so slick.

I still keep V-Sync on in NVCP, as the likes of Blur Busters etc. all still recommend it.
 
I don't understand the obsession with the Ultimate module. Sure, it offers a lower range, but as someone who tested that back when I had a screaming-fast TN panel with the Ultimate module in it (and an Nvidia GPU), it's a freaking slideshow due to the sample-and-hold nature of modern monitors. I actually prefer going the other way and limiting the lower range (so LFC kicks in earlier), and while input lag will increase slightly, at least my eyeballs won't get cancer from watching a PowerPoint presentation. My current monitor has a lower limit of 48 that I've raised to 71, and thanks to that I can actually tolerate lower framerates (again due to LFC) should I be so unlucky as to see any. I guess it's very dependent on the person doing the watching, and my eyeballs may just be overly sensitive to certain things.
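The LFC behaviour described above can be sketched roughly: when fps drops below the panel's VRR floor, the display repeats each frame an integer number of times so the effective refresh lands back inside the supported window, and raising the floor just makes that kick in sooner. A minimal illustration (the ranges and the 71 Hz floor are example numbers, not from any particular panel):

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate a panel would run at under a simple LFC scheme:
    each frame is repeated until the result is inside the VRR window."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # normal VRR: refresh tracks fps
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier  # each frame shown `multiplier` times

# With a stock 48 Hz floor, 40 fps runs the panel at 80 Hz (frame doubling).
print(lfc_refresh(40, 48, 175))   # 80
# Raising the floor to 71 makes LFC engage earlier: 60 fps is doubled
# to 120 Hz instead of the panel refreshing at a flickery 60 Hz.
print(lfc_refresh(60, 71, 175))   # 120
```

Real firmware behaviour is more nuanced (hysteresis around the boundary, for instance), but this is the gist of why a raised floor can make low framerates more tolerable.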

Back in the day, the hardware module was a huge advantage not just from a range POV (agreed, you don't really want to be dropping below 50 fps regardless of Free/G-Sync anyway) but most importantly for having variable pixel overdrive/response, i.e. essentially maintained motion clarity when your fps is fluctuating. TFTCentral (arguably one of the best monitor reviewers) did a good article explaining the pros and cons:


There is a lot more to it than just the range differences.

I would argue that for LCD screens the G-Sync module holds more pros, and fewer for OLED since OLED pixel response is instantaneous; however, PCM2 has stated that even here the G-Sync module has its advantages:


I still have an affinity to the ‘DW’ due to the other benefits I mentioned on the first page of this thread – related to the G-SYNC module (lower VRR flickering, seamless operation throughout VRR range and less ‘micro stuttering’).
 
I remember PCM2, as I had a discussion with him a long time ago on this very topic where we certainly didn't agree (I've got nothing against him and have used his site plenty of times on other matters). Variable pixel overdrive, as you call it, is also possible through the use of adaptive sync/FreeSync. The problem here is not the tech itself but how the companies decide to implement it, or in the case of adaptive sync, not implement it most of the time. Though the number of adaptive sync displays that also make use of variable overdrive is increasing. I just don't see the value of the G-Sync Ultimate module, even considering variable overdrive. From what I've read and experienced, it's only a feature that really comes into its own under a certain refresh rate, and by then, in my case at least, LFC has long since kicked in. I guess it's a preference thing in the end.

Are you sure you're not confusing these 2 things:

- variable refresh rate, i.e. the range at which G/FreeSync operates, e.g. sync happens between 30-144 Hz/fps
- variable overdrive, where the pixel response overdrive is tuned to your current fps, i.e. "NVIDIA also talk about how their G-sync technology allows for “variable overdrive” where the overdrive is apparently tuned across the entire refresh rate range for optimal performance." So, for example, on a FreeSync monitor a pixel response overdrive setting of +1 will only look good with fps of over 100, but if you drop to say 50 fps, that +1 setting will exhibit inverse ghosting because the overdrive is set too high, and you would need to drop it to 0 or -1 to eliminate the inverse ghosting
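The second bullet boils down to a lookup problem: a hardware module retunes overdrive continuously across the refresh range, whereas a typical FreeSync panel applies whatever single level the user dialled in. A toy sketch of the mismatch (the thresholds and level values are made up for illustration, not from any real panel):

```python
def variable_overdrive(refresh_hz: float) -> int:
    """Pick an overdrive level suited to the current refresh rate,
    mimicking what a G-Sync module does continuously in hardware.
    Thresholds are illustrative only."""
    if refresh_hz >= 100:
        return 1    # fast transitions needed: aggressive overdrive
    if refresh_hz >= 80:
        return 0    # moderate
    return -1       # slow refresh: back off to avoid inverse ghosting

FIXED_SETTING = 1  # what a user dials in once on a fixed-overdrive panel

for fps in (144, 90, 50):
    ideal = variable_overdrive(fps)
    verdict = "ok" if FIXED_SETTING == ideal else "inverse ghosting likely"
    print(fps, ideal, verdict)
```

The point being: the fixed setting only matches the ideal at the top of the range, which is exactly the "+1 looks great at 100+ fps, ghosts at 50 fps" behaviour described above.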

My last iiyama FreeSync Premium screen was a perfect example of where the G-Sync module would have been perfect: when fps were over 100 and with a setting of +2, motion looked nice and clear, but as soon as fps went sub-80, noticeable inverse ghosting started, so I had to drop the setting to 0.

Either way, there is more to the G-Sync Ultimate module than just that; essentially it's not quite the "FreeSync does exactly what G-Sync Ultimate does" that some would have you believe. I would say the more important advantage of G-Sync Ultimate is better QC/certification, i.e. AMD's laxer QC allows monitor manufacturers to ship lesser monitors with things like worse HDR performance.
 
The modder PureDark has released his DLSS2+XeSS mod for Starfield for free on Nexus. His DLSS3 mod, however, will be exclusive to Patreon.


Look forward to the comparisons where DLSS and XeSS come out ahead of the official FSR integration; I'm sure W1zzard from TPU has the template ready to go :p :D

Anyone tried this yet? Thoughts? Have read FSR 2 has the same old issues of fizzling, shimmering etc. Going to try myself shortly.
 
Nope. It's kind of astounding that more people don't know about this, but I guess that is mainly due to how few adaptive sync panels there are with proper variable overdrive support. If I recall correctly, the Gigabyte M27Q X is supposed to sport variable overdrive. I also think you are missing the point of going the adaptive sync way as a company: it makes it possible to offer cheaper monitors with fewer features as well as premium ones.

Look at the Alienware DW and DWF. The DWF offers the same HDR implementation, better input lag and better console support, and is still cheaper. I've owned a fair few G-Sync panels (with module) in the early days of G-Sync, and some of them certainly lacked in the QC department, just like some of the adaptive sync displays I've owned. Now granted, due to the sheer number of FreeSync monitors on the market, you're more likely to end up with a poor one unless you do your research. But that is exactly why we have reviews. The stickers mean absolutely nothing.

However, it is telling that there are barely any new monitors left on the market sporting a G-Sync module. If the difference really was that big, there would be a ton of them, as there would be a reason to purchase them. In this case I think the market speaks for itself.

Well, I haven't seen any FreeSync monitor with variable overdrive, so that kind of ties in with the poor QC by AMD for FreeSync screens... Also, chances are that if you are paying for a really good FreeSync monitor (since they aren't skimping on hardware bits like the scalers), it will be pretty costly in its own right; maybe not as much as the G-Sync version, but still on the more expensive side. When the DW came out, it was cheaper than the majority of "top end" monitors; I got mine for like £650-700, as did most other people, thanks to vouchers. The DWF didn't launch for that much cheaper either; I think the difference was like £200, which at £800+ isn't a huge amount imo.

I don't disagree that Free/adaptive sync is fantastic, and I'd have no problem buying one if it offered an experience without compromises. The point is, the G-Sync module still does provide advantages, far more so back in the day but less so now. And the most important thing Nvidia/G-Sync had when it first launched: the market to itself for a good 2 years. Again, this is always the difference between Nvidia and AMD; if people are happy to wait years for something, great, but many aren't.

BTW, the DWF had been plagued for a long time with a poor HDR implementation; it is only in the last month or 2 that it got fixed, and even then the G-Sync version still provides a slightly better HDR experience (outside of side by side, you probably wouldn't notice though).
 
I hate to say it but what did I say.........

Look forward to the comparisons where DLSS and XeSS come out ahead of the official FSR integration; I'm sure W1zzard from TPU has the template ready to go :p :D

Anyone tried this yet? Thoughts? Have read FSR 2 has the same old issues of fizzling, shimmering etc. Going to try myself shortly.

:p
 
So is the conclusion that the DLSS a modder whipped up is better than the FSR 2 the devs released it with? If so, how sad...

Not surprising though, tbf, since this has been the case with every game where DLSS had to be modded in. I would have expected a better showcase of FSR 2 in this one though, tbh, given how big a title it is.
 
In an interview yesterday Scott Herkelman said he's frustrated that there is not a single FSR industry standard; he's not happy that XeSS and DLSS exist, giving developers 3 tools to implement.

The reality is that Nvidia created a free single tool (Streamline) that the developer implements and it gives the game FSR, DLSS and XeSS in one, but AMD refuses to participate in the program, so currently the tool only gives the game DLSS and XeSS.

Scott and AMD are now starting to behave like spoilt little brats and don't like that free market capitalism exists and would prefer a more communistic choice where gamers are all forced to use FSR and have no choices

Yup, sadly people don't see the problems with AMD's approach though. AMD act like they know what's best for consumers, even though it's clear their vision and the options they provide have to actually be as good or better if they want to be the only solution.

Maybe if amd led the way and didn't follow 2+ years later, we wouldn't be in such a situation....

Always the way though, victim mentality is a very serious issue.
 
There was a video from AMD a few years ago, I think on one of their campuses, and it showed a lot of the employees, a fair few wearing shirts sporting a hammer and sickle, Che Guevara, the usual idiot hipster BS which by rights should be outlawed the same as the swastika is, considering they too committed horrendous atrocities... but once I saw those, their behaviour in various avenues through the years made a lot more sense, i.e. our way or the highway.

It's rather embarrassing tbh when you really look into it, especially the video where Alex interviews the chief AMD engineer about the lack of support for Streamline. If you're going to be last to the market and with a subpar solution, you can't give off about others coming up with their own solutions lol. Whether you like the following things or not is another matter, but can you imagine how long we would be waiting for the likes of adaptive sync, upscalers, frame gen and ray tracing in gaming if Nvidia hadn't steamed ahead? The only reason AMD have such a hard-on for free/open source is for these reasons. As it is, they probably don't see the need to lead the way/innovate as they have the console market, and we all know the console market is where the money really is when it comes to gaming, so Nvidia can't exactly sit around twiddling their thumbs waiting for AMD to provide options. I really am not joking when I say "victim mentality" comes to mind.

People need to remember none of these companies are your friends, you simply buy what fits your needs/wants within your budget.
 
Back when Hairworks was a thing, AMD were barely in the market so that tech likely excluded 2% of gamers.

Even with FSR3, AMD are prefacing it with the fact that only their most recent cards will run it. FG could probably work on the 3000 series, but not as effectively as on the 4000 cards.

Also, given they made a big deal about activating FG from the driver control panel, don't be expecting many official implementations... Hopefully the driver route will be good. Consoles? Well, remind me how many games have FSR 2, lol...
 
That’s a separate thing to fsr3 for games that don’t officially support it. Don’t know how you come to the conclusion to not expect many official implementations when FSR is present in almost every release recently.

It only took them how many years/months to get FSR 2 into games? DLSS 2+ is still in more games than FSR 2. If you want to include FSR 1 then yeah, "FSR" wins out, but no one implements/uses FSR 1 now...

Have you read/watched the videos on FSR 2 and read their GitHub guide? FSR is entirely up to the developers and community to do with as they please. This has always been AMD's approach; it's nothing new from a software development POV, where devs want to throw their work over the fence and let the ongoing ownership and growth be someone else's problem/mission, which is perfectly valid if the company can't spare the resources to drive growth etc. And for consoles, FSR 2 is still not in more than 2-3 games... Why is that? After all, it's free and open source and AMD-powered...

This is where AMD always fail; it's nothing new: TrueAudio (or whatever their audio tech was called), same with TressFX and so on. That's all perfectly OK btw; the problem is having this "victim mentality" and then complaining that "it's not fair and not right that these other companies are doing similar things but closing it off to just themselves and their customer base". If AMD want to change this, well then come out first with a good solution rather than following the market leader years later and then crying foul. Despite what many would have you believe, it's not Nvidia's fault AMD are in the position they're in, and this is why they'll keep failing in the GPU space: they're too busy pointing fingers. They have shown they can turn things around, i.e. in the CPU space, so they can do it again; Nvidia have given them a wide open goal for the past year here and they still haven't acted on it.
 
Not quoting that wall, but you seem to have gone off on a complete tangent there. FSR 3 will be a tool available to developers. If developers don't use it, how exactly is that AMD's fault?

You want them to not follow the market leader, which they look to be doing regarding driver level frame generation.

Not exactly a wall but whatever.

Not a tangent, it's relevant to provide the insight as to "why it's amds fault".

I'll quote the main bit though:

FSR 2 is still not in more than 2-3 console games..... Why is that? After all, it's free and open source and amd powered......

I'd rather pay extra money to get "advertised" features in games sooner rather than later, especially if it is going to provide a huge boost to the experience. How long did Ampere/Turing have DLSS for over AMD GPUs? Maybe people are happy to wait months/years, which tbh does seem to be the motto for AMD gamers, which again is completely OK; what isn't is the crying foul and victim mentality. Less of this and more action and focusing on your own issues. It's a bit like their RDNA 3 reveal: instead of focusing on what made their GPUs stand out and look good based on their strengths, they had to resort to the petty attacks of "oh, you don't need a new case, won't burn the house down", then fast forward to release and they had overheating/fire risk issues lol...
 
If it’s not in a game, the developer didn’t want to implement it. Pretty simple really. Coming over as a massive Nvidia fanboy here. Nvidia clearly got ahead by developing dlss and AMD had to develop their own solution. That took time, so what? AMD are split between the cpu and gpu space while Nvidia are concentrating on gpu and AI.

Again, this is all going off on a total tangent from my original point that FSR2 is in almost every recent release, so there's no reason to think that FSR3 won't also be widely used like you previously stated.

And would you rather not have amd take the initiative to change that stance?

No, like I said:

People need to remember none of these companies are your friends, you simply buy what fits your needs/wants within your budget.

It's always "bad nvidia" etc. which is just finger pointing, nothing else. Again, look up victim mentality:

A victim mentality is where you often feel like a victim, even when the evidence says otherwise. Signs include frequently blaming others and having trouble accepting personal responsibility. We all have days when we feel like the world is against us.

AMD didn't like gsync - well then release adaptive/free sync first
AMD didn't like dlss - well then release fsr first
AMD didn't like hairworks - well then release tressfx first
AMD didn't like frame gen - well then release your frame gen first

See the trend here?

As for this bit:

That took time, so what?

Again see this bit:

Maybe people are happy to wait months/years, which tbh, does seem to be the motto for amd gamers, which again, is completely ok

I play games usually on launch week and get new hardware when/if I need an upgrade, so personally I want to get the best from that time onwards, not months/years later. If people can wait that long, great; I wish I had that patience, but life is too short to be waiting months/years.

AMD are split between the cpu and gpu space while Nvidia are concentrating on gpu and AI

You are missing the point, this point:

This is where AMD always fail; it's nothing new: TrueAudio (or whatever their audio tech was called), same with TressFX and so on. That's all perfectly OK btw; the problem is having this "victim mentality" and then complaining that "it's not fair and not right that these other companies are doing similar things but closing it off to just themselves and their customer base".



With regards to this point:

original point that fsr2 is in almost every recent release so there’s no reason to think that fsr3 won’t also be widely used like you previously stated

Maybe, but to me it sounded like their focus will be on the driver level, which again is completely OK if it works just as well as an official implementation would. They have had a very hard uphill battle to get FSR 2 into games; I doubt they want to face the same issues all over again with FSR 3/frame gen, i.e. being years behind.
 
The driver level bit was added on at the end of the fsr3 presentation. You seem to have taken that how you want to take it.

You are correct in that none of these companies are your friend. Bit strange for you to hold that opinion though when your posts are the most pro Nvidia on the whole forum.

I also don’t get your obsession with victim mentality. I can’t recall anyone from amd saying “it’s not fair”

Yes, it's an educated guess based on AMD's history and development approach; time will tell.

You only see it as "pro" because AMD's own brainwashing has led you into believing Nvidia are the bad guys. Take a neutral stance and you'll see there is no one to blame for AMD's issues but themselves, the same way Nvidia have no one to blame but themselves when their methods backfire, i.e. overdone tessellation in said sponsored games, which harmed their own customers.

Watch AMD's interviews and it's perfectly clear they don't like that Nvidia and Intel have their own versions (DLSS and XeSS), as both are better.

If AMD are serious about FSR and getting everyone to embrace it:

- then work with Microsoft and Sony to get every console game working with it
- make it so that customers can update FSR themselves, the same way RTX users can with DLSS, or maybe do it OTA through the drivers if possible
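The second bullet refers to the common practice of dropping a newer `nvngx_dlss.dll` into a game's install folder; FSR, being statically compiled into the game, has no equivalent. A rough sketch of that swap (the install path is a made-up example, real locations vary per game, and some titles verify their files; always keep the backup):

```python
# Sketch of the manual DLSS DLL swap mentioned above.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> None:
    """Replace a game's DLSS DLL with a newer one, keeping a backup."""
    target = Path(game_dir) / "nvngx_dlss.dll"
    if not target.exists():
        raise FileNotFoundError(f"no nvngx_dlss.dll found in {game_dir}")
    shutil.copy2(target, target.with_name(target.name + ".bak"))  # backup
    shutil.copy2(new_dll, target)                                  # replace

# Hypothetical usage; adjust paths to the actual game install:
# swap_dlss_dll(r"C:\Games\SomeGame", r"C:\Downloads\nvngx_dlss.dll")
```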

There are a lot of "solutions" to AMD's "problems"; it is only they who can do better.

Again, it's "AMD's way or the highway"; let the customer choose and decide what is best for themselves.

In an interview yesterday Scott Herkelman said he's frustrated that there is not a single FSR industry standard; he's not happy that XeSS and DLSS exist, giving developers 3 tools to implement.

The reality is that Nvidia created a free single tool (Streamline) that the developer implements and it gives the game FSR, DLSS and XeSS in one, but AMD refuses to participate in the program, so currently the tool only gives the game DLSS and XeSS.

Scott and AMD are now starting to behave like spoilt little brats and don't like that free market capitalism exists and would prefer a more communistic choice where gamers are all forced to use FSR and have no choices

Grim's post sums it up perfectly.

I also recommend watching this video:


Also, read the GitHub page's closed/open issues, where developers have brought up concerns around it.

Nah, I think if anything Nvidia and their ultra fanboys are the ones with the victim mentality here. Their technologies like DLSS are proprietary, and yet they want it forced into every game at the developers' cost. If Nvidia had made DLSS like XeSS, with a cross-platform fallback that allowed it to be used on any GPU, there wouldn't even be an argument that wasn't in favour of it ahead of FSR. If both AMD and Intel can make their upscalers work openly, what is Nvidia's reason for not doing so other than pure competitive advantage?

Streamline is not a solution, but rather like a strategic trojan horse for Nvidia to have some degree of control over rival technologies. FSR does not take exactly the same inputs as DLSS, and you can see that through github. With streamline Nvidia are saying "Here's an interface that limits your upscaler to only the inputs we decide to expose to you. Need other inputs from the engine? Too bad, you'll have to do our work yourself to update the interface." The end result? Developers implementing FSR through streamline may just have less data to work with than if they had implemented it directly into the engine, resulting in a possibly worse implementation than they otherwise could have had. Nvidia very cleverly portrayed AMD as being a "non-partner" for the interface, despite FSR being fully open source, meaning that Nvidia themselves could have implemented full support for it through streamline. Again, imagine Nvidia released DLSS with cross platform support, AMD and Intel likely wouldn't even bother to create their own upscaler unless they had image quality rivalling or surpassing DLSS. Nvidia would have had the result of DLSS being used and plastered everywhere, it says a lot about them that they'd rather limit it to only RTX GPUs and then create a whole interface for third party upscalers.

I've used way more Nvidia GPUs than I've ever used AMD/ATI, but I do not like the whole double standards thing that has been going on with Nvidia fans. A lot of it is pure spite and entitlement. If the DLSS situation was the other way round, i.e. AMD released a proprietary and vendor locked upscaler first, I would absolutely bet that we'd have seen long petitions from Nvidia fans to get AMD to make it cross platform and they would have branded AMD as being anti-competitive for not doing so.

From my understanding of Streamline, it was only the interface/API to allow all 3 to be implemented in one swoop, but yes, you are right in that FSR needs a lot more tuning from the devs to get the best from it, whereas DLSS is more of a t-shirt-size approach, i.e. pick whichever preset works/looks best.

Also, Streamline is open source, so if there were any dodgy tactics, they would be picked up on straight away by the community as well as AMD and Intel; wouldn't exactly be good PR for Nvidia now, would it...

In all honesty, I would rather AMD lock their features down (if it meant getting better quality and consistently good implementations), but they like the over-the-fence approach so they can be as hands-off as possible (Roy etc. have all more or less said this themselves, dressed up along the lines of "we want the whole community to take it, do as they please and run with it"), so this will never happen. Not to mention, when you are last to the market and with a subpar solution, you can't exactly take Nvidia's approach, especially not with 10% (?) of the market...

See, it's funny: I have owned far more AMD hardware than Intel and Nvidia combined (I was AMD from the 3850 right up to the Vega 56, and then CPU-wise the 2600X, 5600X and now 5800X3D, but nope, "Nvidia pro, Nvidia fanboy" :rolleyes:)

FSR3 FG will almost certainly pick up quickly since it works on consoles. If some modder can add DLSS3 FG to a game like Starfield, then there should be no reason why they can't add FSR3. Maybe modders will even find a way to add the driver level FSR3 FG feature so it works on every card. We will have to wait and see.

Wasn't the same said for FSR 1 and then FSR 2? Yet still only 2-3 console games have it?

It will be interesting to see the modder implementation though, as modder implementations have sometimes been better than the official integration, but other times awful in terms of ghosting, since the modders don't have access to motion vectors etc.

Also, read the fine print of AMD's FSR 3: to get the best from it (mainly surrounding lag), you will need an RDNA 3 GPU.
 
So, for the sake of using your argument, which you have stated multiple times on record, that you will pay extra for features: are you against AMD adding features that only work on their GPUs?

Granted, none of this matters to me, since all upscalers look bad and I always turn them off.


It seems though that AMD have a strong edge in Starfield when using anything from a 7900 XT and XTX up, so I can't see it needing upscaling even at 4K.

I have always said I'm happy to pay to get access to features "if" they are good/worthwhile :confused: I never saw the appeal of PhysX, hence why I never bought an Nvidia GPU to get access to it. Back in the day when I was AMD-GPU-only, I funnily enough bought into their hardware because, like many others, I believed that with AMD powering the consoles we would see the benefits on the PC side too; in some ways I was buying into the AMD ecosystem paying off in the long run, but nope, that never happened, and in fact it's Nvidia who often do better. Some still stand by that; well, still waiting, which like I said is imo the AMD motto now.

If AMD locking features to their own GPUs meant their solutions being far better and, most importantly, consistently good, then great, that's the way to do it. As it is, based on my own testing and as confirmed by every comparison so far, FSR is still not a patch on DLSS, and given how important this is now (game devs are relying on upscaling more than ever to "optimise" their games), this is a pretty big selling point; for me it's made AMD a no-go until they get their solutions on par with what the competition is offering. Thankfully my 3080 is still doing great, so I have no need to give either company money towards their extortionate GPUs "at the moment". If AMD had a good upscaler, though, I would be pretty tempted by the 7800XT(X) deals that were going about recently, but alas they don't.

Like I said, don't think so much about it buddy; buy what works best for your budget, whether that is AMD, Nvidia or even a console, and crack on. For me at this moment in time it just so happens to be an Nvidia-based GPU. A couple of years back, PS4 Pro exclusives were fantastic, i.e. The Last of Us, HZD, Spider-Man, God of War etc., hence why I bought one; now not so much, with them coming to PC and Sony not having the same good titles that interest me, hence why it got sold on as it had no appeal.

And nope, you're wrong there: at 4K you still need upscaling regardless, unless you like fps dropping below 60?



Although, based on my own testing, you will want to keep the FSR resolution as high as possible, which means dropping other settings unless you have access to DLSS, in order to avoid the usual issues with FSR 2. This is also confirmed by HUB:


Oh, and just for the record too, I paid less for my 3080 than I would have for a 6800 XT or even a 6800, thanks to AMD not providing an MSRP option in the UK like Nvidia did. So I didn't really pay anything extra at all, and if you want to compare MSRP to MSRP, an extra £50 has proved well worth it for having access to a superior upscaler for 3+ years now (not to mention the other things like ray tracing grunt and so on).
 
If you drop maybe a setting or 2, you will get 60+ FPS on the 7900 XT alone; that's at 4K ultra, and almost 60 FPS average on that GPU is still pretty solid.

Tell me again where upscaling is needed there?

Did you watch the Gamers Nexus video?


Even dropping to the high preset, you are not getting a consistent 60+ fps. Sure, the average may be just over 60 fps, but I'd rather have a "consistent" high fps, which is only achievable by using upscaling (and settings still also need lowering in this game). And what if you're on a high refresh rate display? Just drop settings to low instead of using upscaling?

 
So tweak the settings?

I saw the video, and guess what, performance on the AMD GPUs is fine.

I knew you were gonna make up that stance, what about the high refresh rate 4K settings?

Why do your goalposts keep changing? It's clear, looking at the charts, that the game can be run at 4K without any upscaling at all.

Tweak the settings.... Way to miss the whole point of upscaling.

Low settings, but at least is native, peasant! /s :))

Seems you were right there calan *facepalm*

Make up high refresh rate 4K displays? Lmao. 4K 144+ Hz gamers, sell your displays, drop back to 4K60 and just play on low :cry:

Keep working on the mental gymnastics, I'm away to explore the universe :)
 