
Poll: AMD Screws Gamers: Sponsorships Likely Block DLSS

Are AMD out of order if they are found to be blocking DLSS on Starfield?

  • Yes

  • No


Results are only viewable after voting.
It's not good for the industry to have to maintain and support both DLSS and FSR.

They should pick one or the other. Either let AMD and Intel support DLSS and abandon FSR, or abandon DLSS so that Nvidia can invest heavily in FSR.

What AMD have done may be scummy but surely Nvidia keeping DLSS closed is even more scummy.
Not when FSR is in consoles.
 
Cyberpunk is a bad example since that game was messed up on launch and FSR and DLSS were the least of its issues lol.
I suspect that the engine programmers spent far too much time on RT and DLSS rather than fixing actual problems. In a $250 million budget game, having so many resources dedicated to making an RT tech demo at Nvidia's behest always seemed like a very strange move. With that huge budget, any Nvidia sponsorship must have been relatively minor.
This is a waste of AMD's time anyway; FSR will never win adoption over DLSS, and nobody buys AMD for FSR.
FSR will probably never get anyone to buy AMD, but the only way I can see FSR losing the adoption war is if Nvidia really put the effort in to get DLSS working on every GPU, and especially on the default game targets: consoles.

However, Nvidia have left it far too late for that now. IMO DLSS will go the way of G-Sync: technically better but nobody cares anymore.

At the end of the day, aside from a very few outliers, all game development is console first and second, with PCs a distant third.
But do they (MS/Bethesda) really have so little regard for their art and their customers that they're willing to compromise the visual quality of their game for a few extra bucks? I guess so.
Surely, by definition, adding any support for upscalers is 100% compromising the visual quality of their games?
(That and the poor AA which is the default nowadays and makes DLSS sometimes look better than poor-AA native.)

If perf/cost hadn't stagnated so much over the last few years, then far fewer people would have upscalers of any kind in the first place.
 
Surely, by definition, adding any support for upscalers is 100% compromising the visual quality of their games?
No, DLSS is in the unique position that, when implemented right, the visuals can and do become better. All the big channels have covered this in video reviews already, so just look them up. Essentially at anything 1440p and above, DLSS can show more detail, better AA and better performance than the native image render.

On top of that, with Nvidia we have access to DLAA/DLDSR, which are superior to anything else.
 
Surely, by definition, adding any support for upscalers is 100% compromising the visual quality of their games?
(That and the poor AA which is the default nowadays and makes DLSS sometimes look better than poor-AA native.)
At least if a dev implements DLSS, they have the option of enabling DLAA, which is the best AA solution available to modern games.
 
I've not had a chance to watch this video, but HW Unboxed did a Q&A video last week with one of the questions being about this. They basically said they don't believe AMD or Nvidia force devs to block certain features in their sponsored titles, but they may ask them not to include them or try to persuade them; ultimately it's the dev's choice. Now, from what I gather from reading this thread, people are suggesting AMD are blocking the devs :confused:

If true it is a scummy move from AMD, but as said, this type of behaviour has been going on for years from all vendors with very few eyelids batted.

Also, why no flak at the Starfield devs? Surely they could have turned down the deal and implemented whichever tech they liked. It's not like the game isn't going to sell well regardless, quality issues and bugs included.
 
Yup, Nvidia have an open-source solution called Streamline that implements their solutions all in one go, i.e. Reflex, DLSS and FG. Intel were on board with it but not AMD, due to very iffy reasoning by their chief engineer (in the DF interview with Alex), even though it would have benefited not just consumers (being able to use what works best for their hardware) but also developers (do all three in one go as opposed to separately), and it would also have benefited AMD, since uptake would be larger and quicker for them.


Unsure of how well it works.
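For anyone wondering what "all in one go" means in practice, a rough sketch is below. The names are invented for illustration and are not the real Streamline API; the idea it shows is a single integration point where the engine feeds one set of inputs (colour, depth, motion vectors, jitter) and each enabled feature, whether upscaling, frame generation or latency reduction, hangs off that one hook.

```cpp
// Hypothetical sketch of a Streamline-style "one integration, many features" layer.
// None of these names are the real Streamline API; they only illustrate the idea
// that the engine supplies one set of inputs and the individual features hang off it.
#include <cstdint>
#include <vector>

struct FrameInputs {               // the data every feature consumes
    void*    colorBuffer;          // low-res colour render target
    void*    depthBuffer;          // depth buffer
    void*    motionVectors;        // per-pixel motion vectors
    float    jitterX, jitterY;     // sub-pixel camera jitter for this frame
    uint32_t renderWidth, renderHeight;
    uint32_t outputWidth, outputHeight;
};

enum class Feature { Upscaler, FrameGeneration, LatencyReducer };

class FeatureRuntime {
public:
    // Engine registers the features it wants once at startup.
    void enable(Feature f) { enabled_.push_back(f); }

    // One call per frame: every enabled feature reads the same inputs.
    void evaluate(const FrameInputs& in) {
        for (Feature f : enabled_) {
            dispatch(f, in);       // upscale, generate frames, reduce latency...
        }
    }

private:
    void dispatch(Feature, const FrameInputs&) { /* vendor implementation */ }
    std::vector<Feature> enabled_;
};

int main() {
    FeatureRuntime runtime;
    runtime.enable(Feature::Upscaler);
    runtime.enable(Feature::FrameGeneration);
    runtime.enable(Feature::LatencyReducer);

    FrameInputs frame{};           // filled in by the renderer each frame
    runtime.evaluate(frame);       // single hook in the render loop
}
```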
 
No, DLSS is in the unique position that, when implemented right, the visuals can and do become better. All the big channels have covered this in video reviews already, so just look them up. Essentially at anything 1440p and above, DLSS can show more detail, better AA and better performance than the native image render.
There was a reason I put in:
(That and the poor AA which is the default nowadays and makes DLSS sometimes look better than poor-AA native.)
This "better than native" nonsense is almost 100% because the default AA is so poor.
Better than native with decent AA, no AA, or one of those computationally expensive old-fashioned AA methods is unlikely. But I believe that render-ahead rendering is the technical reason why old-fashioned AA can no longer work with most modern engines.
 
There was a reason I put in:

This "better than native" nonsense is almost 100% because the default AA is so poor.
Better than native with decent AA, no AA, or one of those computationally expensive old-fashioned AA methods is unlikely. But I believe that render-ahead rendering is the technical reason why old-fashioned AA can no longer work with most modern engines.
Like I said, just watch any reviews that deep dive into comparing them all. Once Naughty Dog fixed the DLSS = no AA issue in the recent patches, the sharpness returned in The Last of Us, and DLSS offers the best IQ vs the others once again. That doesn't mean some reflections on water are perfect, but it's far better than FSR/XeSS in this game, and above all else the sharpness and AA quality are back to where they should be.

The same applies to Cyberpunk too, where DLSS Quality just renders everything to a better quality than the others, whilst looking near identical to native and pushing much higher framerates.

In Witcher 3 DLSS is once again much sharper with better image quality, and that was at DLSS Performance too on a 3080 at the time, so even better on more powerful cards with Quality mode. There are lots of other games too, some out of the box, some with DLSS injected in.

It would be quite hilarious if Starfield launches and still does better on Nvidia cards!
:cry:

I would bet solid money that this is exactly what does happen :p
 
In Witcher 3 DLSS is once again much sharper with better image quality, and that was at DLSS Performance too on a 3080 at the time, so even better on more powerful cards with Quality mode. There are lots of other games too, some out of the box, some with DLSS injected in.
Have to say that when I tried TW3 I did not think so, but at the time I was mostly doing a bunch of RT on/off comparisons. I might go back and have another look.
 
Be sure to use a later version of the DLL file for DLSS with Witcher, as it ships with an old version that didn't have the later revision to use NIS instead of the legacy sharpening etc.
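For anyone who prefers to script the swap, a minimal sketch is below. The game directory and the location of the newer DLL are just example paths (the real install path depends on your system and store), but nvngx_dlss.dll is the file being replaced; back it up first.

```cpp
// Minimal sketch: back up the game's bundled DLSS DLL and drop in a newer one.
// Paths are examples only; the actual install location varies per system/store.
#include <filesystem>
#include <iostream>
#include <string>
#include <system_error>

namespace fs = std::filesystem;

int main() {
    const fs::path gameDir  = R"(C:\Games\The Witcher 3\bin\x64_dx12)";  // example path
    const fs::path current  = gameDir / "nvngx_dlss.dll";
    const fs::path backup   = gameDir / "nvngx_dlss.dll.bak";
    const fs::path newerDll = R"(C:\Downloads\nvngx_dlss.dll)";          // newer DLSS DLL

    std::error_code ec;
    fs::copy_file(current, backup, fs::copy_options::skip_existing, ec);  // keep a backup
    if (!ec) {
        fs::copy_file(newerDll, current, fs::copy_options::overwrite_existing, ec);
    }
    std::cout << (ec ? "swap failed: " + ec.message() : std::string("DLSS DLL swapped")) << "\n";
}
```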

Edit: The bit many will no doubt miss from the HUB video:

[screenshot of Nvidia's statement from the HUB video]

AMD's response to a similar question from Gamers Nexus? "No comment"... So basically a yes? A previous statement, whilst to HUB AMD said "we don't know what to say yet":

[screenshot of AMD's statement]
 
This isn't an 'either/or' - FSR, DLSS and XeSS all use the same inputs from the engine and all of them have tools to aid implementation - all PC games should be supporting all of these technologies.

Once you've implemented FSR you're already 90%+ of the way there to integrate the others.

Then the remaining 10% does seem to become an either/or.
This just circles back to the old shocker of a company doing the bare minimum it needs to cover the most bases and then moving on.
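To make the "same inputs" point concrete, here's a minimal hypothetical sketch; the names are invented rather than any vendor's real SDK, but it shows the shape of it: once the renderer produces the shared inputs (colour, depth, motion vectors, jitter), the three upscalers are just swappable backends behind one interface.

```cpp
// Hypothetical sketch: one engine-side interface, three vendor backends.
// The names below are invented for illustration and are not any real SDK's API.
#include <memory>
#include <string>

struct UpscalerInputs {            // the inputs all three upscalers consume
    const void* color;             // low-res colour target
    const void* depth;             // depth buffer
    const void* motionVectors;     // per-pixel motion vectors
    float jitterX, jitterY;        // camera jitter for this frame
    int renderW, renderH, outputW, outputH;
};

class Upscaler {                   // common interface the renderer talks to
public:
    virtual ~Upscaler() = default;
    virtual std::string name() const = 0;
    virtual void upscale(const UpscalerInputs& in, void* output) = 0;
};

// Each backend would forward to the vendor SDK; stubbed out here.
class FsrBackend  : public Upscaler { public: std::string name() const override { return "FSR";  } void upscale(const UpscalerInputs&, void*) override {} };
class DlssBackend : public Upscaler { public: std::string name() const override { return "DLSS"; } void upscale(const UpscalerInputs&, void*) override {} };
class XessBackend : public Upscaler { public: std::string name() const override { return "XeSS"; } void upscale(const UpscalerInputs&, void*) override {} };

std::unique_ptr<Upscaler> makeUpscaler(const std::string& choice) {
    if (choice == "DLSS") return std::make_unique<DlssBackend>();
    if (choice == "XeSS") return std::make_unique<XessBackend>();
    return std::make_unique<FsrBackend>();          // default
}

int main() {
    auto upscaler = makeUpscaler("DLSS");           // user's menu selection
    UpscalerInputs frame{};                         // filled in by the renderer
    upscaler->upscale(frame, nullptr);              // same inputs, any backend
}
```

The expensive part is getting the engine to produce good motion vectors and jitter in the first place; swapping the backend behind that interface is the remaining bit, which is why the "either/or" feeling only kicks in at the end.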
 
Be sure to use a later version of the DLL file for DLSS with Witcher, as it ships with an old version that didn't have the later revision to use NIS instead of the legacy sharpening etc.

Edit: The bit many will no doubt miss from the HUB video:

[screenshot of Nvidia's statement from the HUB video]

AMD's response to a similar question from Gamers Nexus? "No comment"... So basically a yes? A previous statement, whilst to HUB AMD said "we don't know what to say yet":

[screenshot of AMD's statement]

I mean these are both just boilerplate PR guff.
It's just that Nvidia's PR dept had a better statement prepped and ready to go, while AMD's didn't.
 
Boilerplate in that one outright states they do not block the use of other upscalers and it's down to developers, whilst the other vendor outright avoids using any such words, leading the reader to conclude that the answer is yes, they do block it.

In favour of Nvidia: why would they block anyway? DLSS is better, and they make it easy to implement. Having the competing upscalers in the same games allows DLSS to be benchmarked against them too, so it's in their interests to make sure all upscalers feature as a result.
 
I mean these are both just boilerplate PR guff.
It's just that Nvidia's PR dept had a better statement prepped and ready to go, while AMD's didn't.

Just like when it was clear the iPhone 4 had problems, and Apple said people were holding it wrong. Or Bumpgate, where there was no problem, because they never talked about it!

Still wondering why Cyberpunk 2077 had no FSR for a year. Oh yes, that would be the devs' fault. Starfield having no DLSS at launch is not the devs' fault, but AMD conspiring to make sure Nvidia cards can't use upscaling with an open-source upscaler.

Might be true - who knows! But I wouldn't trust AMD PR or Nvidia PR, let alone Intel PR.

 