DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

Jedi Survivor is getting official DLSS support in its next patch, which should land today. Was not expecting that.


Main notes of importance for PC -

Funny timing how these AMD-sponsored games are all of a sudden getting DLSS; Gamers Nexus called it perfectly:

[attached screenshots]


I've played and completed the game and have no intention of going back now, though. Had to make do with FSR, but thankfully Nvidia's DLDSR made it tolerable.
 
Whatever, it just stinks that we have to rely on a modder to give consumers a choice in upscalers.

I said before, I hope AMD don't get to 'sponsor' too many more AAA titles.

Unless FSR3 turns out to be better than DLSS. Even so, I'd still like the choice.

I wouldn't worry. AMD have been caught red-handed and called out; looks like they have changed their tune, given the sudden U-turn on AMD-sponsored games now getting DLSS ;)
 
Really?

Good to hear. What games are those, then?

Some guy on Reddit is keeping a list up to date.

Star Wars Jedi: Survivor got DLSS added yesterday too :)
 
FSR, bloody dog****. Worst part of owning an AMD card, the lack of a decent hardware solution :(

If they had one then the XT/XTX would be a seriously long-term card.

Yup, it's beyond a joke now; they're basically three years behind.... :o Thing is, even if they did improve FSR now (which probably won't happen, as it's down to the devs to get the best from it), they're yet again behind in other areas, e.g. DLSS 3.5/Ray Reconstruction. So not only are they behind in RT performance, they're also considerably behind on IQ going forward. Not to mention we have no idea how FSR 3/FG will turn out, but if history is anything to go by.... no doubt they'll be behind there too. It wouldn't be a big deal if FG/upscaling didn't "need" to be used, but as shown, it's necessary now.

This video summed up the core issue with FSR vs DLSS: "you can use a lower preset/res with DLSS and get better IQ AND performance than if you were using a higher res/preset with FSR".


Considering it took them four and a half months to implement Super Resolution and Frame Generation, which are both plugins in Unreal (which JS is based on), I have very low expectations for a fix :D

And probably absolutely nothing to do with who sponsors them ;) Funny: AMD get called out, and all of a sudden a game that has been out for a good while gets DLSS, only it just so happens to be broken.... It's almost as if AMD said "here, quickly go and implement DLSS" but no one tested it :cry:

As it is, it seems some random modder can get it working better, as DLSS was modded into Jedi Survivor ages ago too:


Beauty of the mod route is being able to use whichever upscaler works best too, and not having to completely rely on devs to get somewhat good results:

 
I guarantee you it won't be seen as fine wine by a few people ;) :p

It definitely isn't a CPU bottleneck, but more a case of power usage not being right; my GPU is constantly pegged at 99% but power draw is low.




One problem I have with a lot of these benchmarks and comparisons is ReBAR. A lot of sites say they turn it on, but in reality it isn't on, as Nvidia have a whitelist for it that covers only a handful of games.... Some of the differences you see in AMD-favourable games diminish a good chunk when ReBAR is forced on via Nvidia Profile Inspector, e.g. the Dead Space remake.


And now Starfield:


I'm getting a nice little boost overall in Starfield. An Nvidia rep said they will be enabling ReBAR in their next driver update, but I think there is more to be done on both Nvidia's AND Bethesda's side regarding optimisation for Intel and Nvidia GPUs.
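For anyone who wants to sanity-check whether ReBAR is actually active on their card rather than taking a review site's word for it, a rough check is to compare the BAR1 aperture nvidia-smi reports against total VRAM: with Resizable BAR active, BAR1 typically spans (nearly) the whole framebuffer, versus the classic 256 MiB window without it. A minimal sketch, assuming nvidia-smi is on PATH (its output layout can vary by driver version). Note this only shows what the card exposes at the system level; Nvidia's per-game whitelist still decides whether a given title actually uses it, hence the Profile Inspector override:

```python
import re
import subprocess

# Dump memory info from nvidia-smi.
out = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

def first_total_mib(section: str) -> int:
    """Return the first 'Total : N MiB' figure after the named section header."""
    chunk = out.split(section, 1)[1]
    return int(re.search(r"Total\s+:\s+(\d+)\s+MiB", chunk).group(1))

vram = first_total_mib("FB Memory Usage")    # framebuffer (VRAM) size
bar1 = first_total_mib("BAR1 Memory Usage")  # CPU-visible BAR1 aperture

print(f"VRAM: {vram} MiB, BAR1: {bar1} MiB")
# With ReBAR the aperture covers roughly all of VRAM; without it, ~256 MiB.
print("Resizable BAR looks", "ENABLED" if bar1 >= vram * 0.9 else "DISABLED")
```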
 
I saw that posted on Reddit a week back, but it was only one or two guys so I didn't think much of it. But indeed, big oooooooooof. AMD cheating? They would never! :p ;)

More than likely just a bug though. For example, if you force 16x AF in NVCP, the shadows are bugged unless you clear the game shader cache.

Big oooff. It's like that controversy a few years back where Nvidia was apparently using drivers to lower image quality to get better benchmark scores in reviews.

Haha, yup, exactly. AMD fans will probably say this is a "feature" :p

I do remember Nvidia GPUs having issues with the first Doom remake, where particle and some lighting effects were missing, but generally this whole nonsense of one vendor having better visuals or colours is complete and utter BS and has been debunked many times now. OK, the monitor colours thing might be valid if you don't go in and change Nvidia's default settings, but who doesn't do that? :o People also tried using HZD to show Nvidia loading textures in as you move about, which was debunked too, as it also happened on AMD GPUs :cry:

Thing is, though, DLSS 3.5/Ray Reconstruction is going to make benchmark comparisons extremely hard now, as Ray Reconstruction will offer superior IQ whilst maintaining the same or better performance. AMD need to get something like this ASAP.
 
That is on planets where you can see stars. Interesting how it's Wccftech again - didn't they start the DLSS controversy too? Maybe MLID is correct, that Nvidia CBA now and would rather play the sympathy card.

Yet in New Atlantis during the daytime, with no stars etc., my mate's RX 6700 XT is still faster than my RTX 3060 Ti. You can look at the ground and it's still slower. You can go into The Well, which is an interior space, and it's still the same. Unless you think stars are rendering inside buildings.

We compared performance using the exact same settings, and I saw him play it too; we have the same CPU, same RAM settings, etc. We have a similar-speed SSD too.

Also, @keef247 actually played the game on the July Nvidia launch drivers with their RTX 4070. They updated to the latest "Game Ready" drivers and there was no change in performance for them.

So maybe Nvidia needs to fix its drivers too.



I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX 7900 XT goes from RTX 3090 level in RT overall to the level of an RTX 3070 Ti.

#Totally normal behaviour.
[charts: relative RT performance at 1920x1080; Cyberpunk 2077 RT at 1920x1080]

The lack of stars etc. was noticed by a couple of end users on Reddit, so it's not just some press site making this up.... So it's either an AMD and/or game issue, the same way Nvidia's lesser performance is either driver- and/or game-related.

As for ray tracing and Cyberpunk tanking AMD performance, that is 100% down to AMD's lack of investment in dedicated hardware acceleration for RT, and it's nothing new, as this is the case across every RT title (aside from one or two AMD-sponsored RT games), so it's not really surprising. Of course, if you want to play the "Nvidia bad" card, you'd be better off using the RTX Remix titles, where AMD GPUs completely crash to like 5 fps and show graphical artifacts.

I suppose AMD not having Ray Reconstruction like Nvidia, and thus having lesser IQ and worse performance, will also be Nvidia's fault too?
 
Which will have zero effect on performance in the interior spaces? So is that AMD's/Bethesda Game Studios' fault too?

You literally quoted what I said:



So when Nvidia does badly in a game it's due to sabotage, and nothing to do with Nvidia hardware, drivers and poor dev relations. When AMD has poor performance it's down to AMD hardware, drivers and poor dev relations, as you said.

Also, Digital Foundry said it's quite clear that many RTX Remix titles have no optimisation for AMD, so not to expect good performance:

That is their own video, and they said Starfield is no better than Portal RTX, where one company has an advantage in "IHV and driver optimisations" and it wasn't some conspiracy. Timestamp is at 23 minutes.

So basically "poor" performance in Starfield must be down to Nvidia hardware, drivers and dev relations. Thanks for clearing that up. Because reading this thread you would think that wasn't the case.

@keef247 on here confirmed that the Nvidia drivers from months ago and the newest ones have zero effect on game performance with an RTX 4070.

The last Nvidia driver was on the 22nd of August, so maybe you should be asking why Nvidia hasn't bothered launching drivers for three weeks now? Is that the fault of AMD and Bethesda Game Studios? I thought Nvidia, with their billions from AI, would have a super performance driver out by now. Oh well, apparently not yet.

Hopefully they will! I would rather stick with what I have! However, performance is still OK on my card, so it will do the job! :)

I don't think anyone has stated that AMD "deliberately" sabotaged performance? If you watch the video by Alex, it's more than likely down to Nvidia not getting access to the game until launch (maybe you could insinuate that AMD's sponsorship is what caused this?) and/or a game/driver issue that needs fixing. The only thing suggesting "potential" foul play regarding AMD and Starfield is the lack of DLSS.

I'm not quite sure what your point is either, tbh; the only thing I can ever grasp from these posts (which are spreading across multiple threads now) is the usual "Nvidia bad, AMD good". You went on a rant about how CP performance goes down the drain with RT, and well, yeah, why is that surprising? It's one of the most demanding games in terms of ray tracing; it's not some title where ray-traced reflections are only applied to puddles at a quarter resolution.... AMD actually does very well given how much is happening in CP with RT, and RDNA 3 is performing exactly how it should, i.e. the 7900 XTX on par with a 3080 Ti/3090.....

This was covered pretty well in one of DF's videos (IIRC it was for Dying Light 2): essentially, RDNA 3 and even RDNA 2 can cope quite well with light RT and/or when the raster side isn't demanding, which means the GPU can allocate/process the ray tracing better. The issue with AMD's GPUs and RT comes not just with "multiple" ray tracing features, but also when the ray tracing is used throughout a scene.

Again, if you want to stick to the "Nvidia bad" examples, keep to the RTX Remix ones, as it's clear there are some dodgy shenanigans happening there on Nvidia's part....
 
I would much prefer well-optimised PC games with vendor-specific features. As a 4080 owner (been mainly Nvidia since the 2080) I am happy to see DLSS implemented where possible. Bethesda saying they had no contact from Nvidia re optimisations seems plausible, but that does not mean they could not include at least a working implementation of DLSS.

What strikes me as hypocritical is the faux outrage that suddenly vendor optimisation is bad because it affects Nvidia.

AMD sponsor and optimise a game and it's all "boo, hiss, the shame of it".

Nvidia sponsor a game and it’s just seen as par for the course “cus majority”. If you want the best features buy Nvidia… right?

Was there similar outrage when BG3 released with only FSR1? If Starfield had DLSS1 would you all be OK as Bethesda could claim Nvidia owners are catered for?

I’m not arguing that Starfield is well optimised, or that DLSS should not be included from launch. I’m just asking for some perspective and lack of hypocrisy.

There is a big difference here.

In Nvidia-sponsored games, FSR and/or XeSS is included too, so Nvidia-sponsored titles are not locking non-RTX users out of upscaling tech. In AMD-sponsored games, yes, we can use FSR, but as shown time and time again it is just downright ****, more so if you're playing at a res below 4K and/or using a lower preset, where FSR falls apart entirely. So OK, yes, a tech has been included which everyone can use, but you are directly gimping/harming owners who "could" have a better experience but can't, because said companies have decided they know what is best, because "reasons"....

Everyone should be advocating for all the tech to be included in games, especially when, as shown by AMD's, Intel's and Nvidia's own guides, as well as by engines including said features natively, these upscalers are not exactly hard or time-consuming to implement. Well, for EA, it seems they can't do it well....

As for FSR 1 and 2, that is more AMD's fault, or rather the direction they have gone in, i.e. it's up to the community and developers to do as they please, because "open source!!!!!!". Go read the GitHub page for FSR 2, for example, where devs ask how end users can update FSR themselves the way they can with DLSS, and the answer is basically that you can't, because of how AMD have engineered FSR; it's up to the devs how they go about it, and updating FSR to a newer version has to be done on the game dev's end. Whilst DLSS and FSR achieve the same goal, they are very different in how they are implemented and operated: FSR 2 is completely different from FSR 1, whereas DLSS 2 is an update to DLSS 1.
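To make that difference concrete: DLSS ships as a single redistributable DLL (nvngx_dlss.dll) sitting next to the game's executable, so an end user can drop a newer version in themselves, whereas FSR 2 is compiled into the game's own binaries and only the dev can update it. A minimal sketch of the swap; the paths here are hypothetical, and always keep a backup of the original file:

```python
import shutil
from pathlib import Path

# Hypothetical paths -- adjust for your game install and the downloaded DLL.
game_dir = Path(r"C:\Games\SomeGame")            # game install folder
new_dll  = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS version

old_dll = game_dir / "nvngx_dlss.dll"
backup  = old_dll.with_name(old_dll.name + ".bak")

# Back up the shipped DLL once, then drop the newer one in its place.
if not backup.exists():
    shutil.copy2(old_dll, backup)
shutil.copy2(new_dll, old_dll)
print(f"Swapped {old_dll.name}; backup kept as {backup.name}")
```

That drop-in property is exactly what the mods exploit, and it's what FSR's compiled-in approach rules out; with FSR, the equivalent update has to come from the dev as a game patch.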
 
Lol, the mental gymnastics is truly amazing.

You are suggesting that AMD and Bethesda deliberately left out Nvidia features and optimisations, yet the reason BG3 only has FSR1 is AMD's fault. It couldn't possibly be that Nvidia insisted; nope, "we want AMD features to look gash".

I’m not saying either are true, just pointing out the double standards.

AMD optimised and Nvidia features missing or poorly implemented = AMD plot.

Nvidia optimised and AMD features missing or poorly implemented = AMD stupid.

Your posts are getting very boring now. When people's viewpoints don't fit with yours, it becomes a case of "mental gymnastics" etc. and the usual rubbish. How about addressing the actual points, rather than resorting to the usual rhetoric and making it out to be an Nvidia vs AMD thing? You have been doing this throughout the Starfield thread too.

So is it Nvidia's fault that "AMD SPONSORED" games are using FSR 1 and not 2+, then? E.g., as dicehunter mentioned, Assassin's Creed Valhalla, and there are plenty more AMD games stuck on FSR 1. Not to mention, what about the FSR 2.1 games? Is it Nvidia's fault that they aren't updated to 2.2?

Learn the difference between FSR 1 and FSR 2; your comparison to DLSS 1 and DLSS 2 is not relevant at all.

PS. Where did I say Bethesda/AMD left out Nvidia features and optimisations? I made a statement about the "rumours" and the "potential" chance that Nvidia may not have got early access to the game in order to get the best from it, the same way AMD owners could not use RT at Cyberpunk 2077's launch. Let's look at a more relevant example: Ratchet & Clank had RT on PS5, yet on the PC release only Nvidia owners could use RT; is that Nvidia's fault too, then? There is also the chance that Bethesda simply did not put in as much effort to optimise the game for Nvidia hardware.... Time will tell when/if Nvidia and/or game updates arrive; we have already seen that simply enabling ReBAR on the Nvidia side brings a good improvement, and Nvidia have stated they will be enabling it with the next driver update.

Again, it's not a big shock when companies do this. They maybe aren't going out of their way to completely ruin things, but there is no doubt some influence happens in order to highlight the sponsoring company's strengths.... You have said you prefer AMD; that's fine, but stop making them out to be the people's champion. Reality is they aren't, and you're just falling for the "good guy" image their PR/marketing people have portrayed.
 
So much for the AMD blocking conspiracy theories.

Just because they are adding it now doesn't mean there weren't any dodgy shenanigans happening before. If anything, given the significantly higher percentage of AMD-sponsored games now being announced with DLSS, as well as older games getting it added since being called out by Gamers Nexus and HUB, this says to me that they did have some say in it (devs have "supposedly" already confirmed they had to remove DLSS from other games due to sponsorship, according to John from DF, so whilst that isn't directly about Starfield, there's no reason that logic couldn't apply to other games, assuming it's true.....)

If it wasn't shady actions directly by AMD, then they are probably "encouraging" it to be added now, given how everyone has highlighted/called out the majority of AMD-sponsored titles missing it. It kind of goes against their whole business motto and is just downright bad PR for them, which they really don't need any more of.
 
Or it could be that the game was rushed to completion and they had to add FSR as part of contractual obligations and add DLSS after the fact, as they already had enough to cram into the game before release. The state the game released in certainly supports a scramble to the finishing line.


Not everything needs to be a 9/11-grade conspiracy theory, though in this subsection that's par for the course....

That could also be a reason, but again, all I'm saying is that just because AMD-sponsored games are all of a sudden getting DLSS now.... don't completely rule out some dodgy shenanigans just because AMD have this "white knight" image about them, especially when you look at the bigger picture and take all viewpoints on board, i.e.


Either way, hopefully this will be the end of it now and we'll see all games include ALL the upscalers going forward; maybe AMD will get on board with Streamline now too......
 
Normally I would agree, but John Linneman from Digital Foundry came out and said he talked with multiple people who worked on AMD-sponsored games that added DLSS prior to release and were then asked to pull the feature.... If it walks like a duck and FSRs like a duck....

Yup. Obviously take it with some salt too, but he is a far more credible source than the likes of MLID :o I'm 99% sure the main game he was referring to was Boundary, as that game was shown running DLSS and ray tracing before AMD swooped in and all the Nvidia tech got removed......
 
And it's not the first time we've heard this from developers

A couple of years ago, a developer on Assassin's Creed Valhalla stated in an interview that, due to their contract with AMD, they were not allowed to optimise the game for Nvidia GPUs and had to focus all their time on optimising for AMD; subsequently, AMD GPUs performed about 30% better relative to Nvidia than they did in other games.

Link to this?

The large performance gap was down to ReBAR/SAM, and when Nvidia fixed/improved their DX12 performance along with ReBAR, the gap closed significantly.
 
Bluedot55 ran both DLSS and third-party upscalers on an Nvidia RTX 4090 and measured Tensor core utilisation. Looking at average Tensor core usage, the figures under DLSS were extremely low: less than 1%.

Initial investigations suggested even the peak utilisation registered in the 4-9% range, implying that while the Tensor cores were being used, they probably weren't actually essential. However, increasing the polling rate revealed that peak utilisation is in fact in excess of 90%, but only for brief periods measured in microseconds.

When you think about it, that makes sense. The upscaling process has to be ultra quick if it's not to slow down the overall frame rate. It has to take a rendered frame, process it, do whatever calculations are required for the upscaling, and output the full upscaled frame before the 3D pipeline has had time to generate a new frame.

So, what you would expect to find is exactly what Bluedot55 observed. An incredibly brief but intense burst of activity inside the Tensor cores when DLSS upscaling is enabled.

Of course, Nvidia's GPUs have offered Tensor cores for three generations, and you have to go back to the GTX 10 series to find an Nvidia GPU that doesn't support DLSS at all. However, as Nvidia adds new features to the overall DLSS superset, such as Frame Generation, older hardware is being left behind.

What this investigation shows is that while it's tempting to doubt Nvidia's motives whenever it seemingly locks out older GPUs from a new feature, the reality may simply be that the new GPUs can do things the old ones can't. That's progress for you.
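The arithmetic behind those numbers checks out with a quick back-of-the-envelope sketch (illustrative figures, not measurements from the article): a near-flat-out burst of tensor work lasting a tenth of a millisecond inside a 10 ms frame averages out to well under 1%:

```python
# Illustrative duty-cycle arithmetic (assumed figures, not measured ones):
burst_ms  = 0.1   # assumed DLSS tensor-core burst per frame
frame_ms  = 10.0  # 10 ms frame time = 100 fps
peak_util = 0.90  # ~90% utilisation during the burst

# A brief, intense burst averaged over the whole frame is tiny.
avg_util = peak_util * (burst_ms / frame_ms)
print(f"Average tensor utilisation: {avg_util:.1%}")  # -> 0.9%
```

And for anyone wanting to poke at the sampling effect themselves, here's a rough poller using the NVML Python bindings. Caveat: NVML exposes no per-tensor-core counter, and its utilisation figure is itself averaged over the driver's internal sampling window, so this is only an analogy for the mean-vs-peak gap, not a reproduction of Bluedot55's methodology:

```python
# pip install nvidia-ml-py  (NVML Python bindings)
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Poll whole-GPU utilisation at roughly 1 kHz for five seconds.
samples = []
end = time.monotonic() + 5.0
while time.monotonic() < end:
    samples.append(pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu)
    time.sleep(0.001)

# The slower you sample, the more short bursts vanish into the mean.
print(f"{len(samples)} samples: mean {sum(samples) / len(samples):.1f}%, "
      f"peak {max(samples)}%")
pynvml.nvmlShutdown()
```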

 