
DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

That is on planets where you can see stars. Interesting how it's Wccftech again - didn't they start the DLSS controversy too? Maybe MLID is correct, that Nvidia CBA now and would rather play the sympathy card.

Yet in New Atlantis during the daytime, with no stars, etc., my mate's RX6700XT is still faster than my RTX3060TI. You can look at the ground and mine is still slower. You can go into The Well, which is an interior space, and it's still the same. Unless you think stars are rendering inside buildings.

We compared performance using the exact same settings, and I saw him play it too; we have the same CPU, same RAM settings, etc. We have a similar-speed SSD too.

Also, @keef247 actually played the game on the July Nvidia launch drivers with their RTX4070. They updated to the latest "Game Ready" drivers and there was no change in performance for them.

So maybe Nvidia needs to fix its drivers too.



I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX7900XT goes from RTX3090 level in RT to the level of an RTX3070TI.

#Totally normal behaviour.
[Attached charts: relative-performance-rt-1920-1080.png, rt-cyberpunk-2077-1920-1080.png]

The lack of stars etc. was noticed by a couple of end users on Reddit, so it's not just some press site making this up. So it's either an AMD and/or game issue, the same way Nvidia's lesser performance is either driver and/or game related.

As for ray tracing and Cyberpunk decreasing AMD performance, well, that is 100% down to AMD's lack of investment in dedicated hardware acceleration for RT, and it's nothing new, as this is the case across every RT title (aside from 1-2 AMD-sponsored RT games), so it's not really surprising. Of course, if you want to play the "Nvidia bad" card, you'd be better off using the RTX Remix titles, where AMD GPUs completely crash to like 5 fps and have graphical artifacts.

I suppose AMD not having Ray Reconstruction like Nvidia, and thus lesser IQ and worse performance, will also be Nvidia's fault too?
 


Which will have zero effect on performance in the interior spaces? So is that AMD's/Bethesda Game Studios' fault too?

You literally quoted what I said:

I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX7900XT goes from RTX3090 level in RT to the level of an RTX3070TI.

So when Nvidia does badly in a game it's due to sabotage and nothing to do with Nvidia hardware, drivers and poor dev relations. When AMD has poor performance, it's down to AMD hardware, drivers and poor dev relations, as you said.

Also, Digital Foundry said it's quite clear that many RTX Remix titles have no optimisation for AMD, so not to expect good performance:

That is their own video, and they said Starfield is no different from Portal RTX, where one company has an advantage in "IHV and driver optimisations" and it wasn't some conspiracy. The timestamp is at 23 minutes.

So basically "poor" performance in Starfield must be down to Nvidia hardware,drivers and dev relations. Thanks for clearing that up. Because reading this thread you would think it wasn't the case.

@keef247 on here confirmed that going from the Nvidia drivers from months ago to the newest one made zero difference to game performance with an RTX4070.

The last Nvidia driver was on the 22nd of August. So maybe you should be asking why Nvidia hasn't bothered launching drivers for three weeks now? Is that the fault of AMD and Bethesda Game Studios? I thought Nvidia, with their billions from AI, would have a super performance driver out by now. Oh well, apparently not yet.

Hopefully they will! I would rather stick with what I have! However, performance is still OK on my card, so it will do the job! :)
 
Last edited:
That is on planets where you can see stars. Interesting how it's Wccftech again - didn't they start the DLSS controversy too? This was a bug reported 4 days ago. Maybe MLID is correct, that Nvidia CBA now and would rather play the sympathy card.

Yet in New Atlantis during the daytime, with no stars, etc., my mate's RX6700XT is still faster than my RTX3060TI. You can look at the ground and his is still faster. You can go into The Well, which is an interior space, and it's still the same. Unless you think stars are rendering inside buildings. So it can't be a lack of stars then.

We compared performance using the exact same settings, and I saw him play it too; we have the same CPU, same RAM settings, etc. We have a similar-speed SSD too.

Nvidia quite clearly CBA with releasing drivers for this game - AMD has had several releases, Nvidia hasn't. The last release for my card was three weeks ago.



BTW @keef247 actually played the game on the July Nvidia launch drivers with their RTX4070. They updated to the latest "Game Ready" drivers and there was no change in performance for them:



So maybe Nvidia needs to fix its drivers too. But then OTOH they seem fine with their card, and even if my RTX3060TI could do better, I can just drop a few settings and it's fine enough to complete the game with.

I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX7900XT goes from RTX3090 level in RT to the level of an RTX3070TI.

I wonder if my RTX3060TI will be quicker in Cyberpunk 2.0?! :cry:

#Totally normal behaviour.
[Attached charts: relative-performance-rt-1920-1080.png, rt-cyberpunk-2077-1920-1080.png]
Yep, but the internet will tell you I'm lying... Even though they don't own my hardware/pray to the clickbait shill review gods!

Ironically I was running drivers from the 26th of July (when my card arrived - IIRC there was a driver released 8 days before I'd installed it, based on the release info). They've been so stable across the 30-40 games I've played/nearly completed since then that I didn't even remember to update them... I did last week and it made no difference, good or bad, on ANY game.

With the OG DLSS2 PureDark mod it's been even more impressive, as has the newer DLSS3/3.5/Frame Gen mod... I run natively at 1440p ultra with around 52-60 fps in the busiest areas, or 82-110 fps with the latest DLSS3.5/FG using 100% scaling, with no motion blur/no film grain/no variable resolution scaling.

TL;DR people, JUST TRY IT FOR YOURSELF and get a refund within the time limit on Steam if you dislike the game/don't get the performance you want. It's not a big deal to just leave a download on for a few hours/overnight, play it on a day off/weekend, and click refund if it doesn't float your boat.

I'd rather do that than waste my time believing nonsense clickbait shock factor 'reviews' online... Those people don't control my wallet, I DO...

OH AND PLOT TWIST... I myself AM an AMD guy... I have BOTH AMD and Nvidia GPUs in my 2 rigs... But I won't for a second not buy something because of some pathetic brand loyalty/tribalism. As with cars, if I have 2 engine swaps to consider, I'll go for the one that gives the performance vs displacement I want and fits best in the engine bay - as with anything in life. So NO fanboy Nvidia-ism going on here; if you want to taint me with a bias, I'd be more AMD if anything, as the only Intel CPUs I own are because that's what came in my Macs...
 
Last edited:
So basically you are admitting poor AMD performance in Cyberpunk 2077 is Nvidia's/CDPR's fault too? Oh, OK. But I thought it was because AMD drivers, hardware and dev relations were not very good!

But DF said on their video they didn't think so:

Without even being conspiratorial I think it is reasonable to say that there is probably not a level playing field with regard to driver maturity and ihv path optimisations at launch here. That would just make sense - I would also imagine the same for an AMD card in a title like Portal RTX - definitely not a level playing field in that game hence why I complained about AMD GPU performance on launch for that title.

So DF's Alex has now come off the AMD naughty list and gone onto the Nvidia one? :o

But I am still asking why the last Nvidia driver was out on the 22nd of August? Even Intel has released drivers quicker.

With a few tweaked settings it runs much better than I expected on my card. You also have a Ryzen 7 5700X, like my mate with the RX6700XT does.
 
Last edited:

YEP! And I purposely chose a 5700X over a 5800X3D or an AM5 build, despite having a 2-2.5k budget on the rig alone, because I wanted to build a performance-per-watt-per-degree-per-decibel silent, cool machine rather than sit next to a loud furnace that inevitably thermal throttles and/or requires water cooling, which I personally am not into. And yet it still runs awesome, go figure! I also just saw a 6950XT will only do 50s with FSR2 at 4K ultra; I BET you money I can get the same fps or more with DLSS3.5/FG at 4K ultra... But that person will say I'm lying despite me proving in the past I can match his fps/res/settings haha!

Since you missed my edit, I'll repeat this section for the shills...

"OH AND PLOT TWIST... I myself AM an AMD guy... I have BOTH AMD and Nvidia gpu's in my 2 rigs... But I won't for a second not buy something because of some pathetic brand loyalty/racism. As with cars, if I have 2 engine swaps to consider I'll go for the one that creates the performance vs displacement I want and fits the best in the engine bay, as with anything in life, so NO fanboy nvidia'ism going on here, if you want to taint me with a bias I'd be more AMD if anything, as the only Intel CPU's I own have been because that's what came in my macs..."

I personally cannot stand DF, nor that passive-aggressive little smart arse Alex; he's so up himself 'cause he did some camp modelling in Germany, laughable-grade stuff. Now look at him - bitches for a living on YouTube doing clickbait moaning, ha!
 
Last edited:
I actually don't care, I'm just posting various flavors of YES :D

This seems appropriate for the thread:

It's from YES.

I personally cannot stand DF, nor that passive-aggressive little smart arse Alex; he's so up himself 'cause he did some camp modelling in Germany, laughable-grade stuff. Now look at him - bitches for a living on YouTube doing clickbait moaning, ha!

LOL.
 
Last edited:
I don't think anyone has stated that AMD have "deliberately" sabotaged performance? If you watch the video by Alex, it's more than likely down to Nvidia not getting access to the game until launch (maybe you could insinuate that AMD's sponsorship caused this?) and/or a game/driver issue that needs to be fixed. The only thing regarding "potential" foul play with AMD and Starfield is the lack of DLSS.

I'm not quite sure what your point is either, tbh; the only thing I can ever seem to grasp from these posts (which are spreading across multiple threads now) is the usual "Nvidia bad, AMD good". You went on a rant about how Cyberpunk performance goes down the drain with RT, and well, yeah, why is that surprising? It's one of the hardest games in terms of ray tracing; it's not some title where ray tracing reflections are only being applied on puddles at like 1/4 resolution. AMD actually does very well given how much is happening in Cyberpunk with RT, and RDNA 3 is performing exactly how it should, i.e. the 7900XTX on par with a 3080 Ti/3090. This was covered pretty well in one of DF's videos - IIRC it was for Dying Light 2 - but essentially RDNA 3 and even RDNA 2 can cope quite well with light RT and/or when the raster side of things isn't demanding, which means the GPU can allocate/process ray tracing better. The issue with AMD's GPUs and RT is when it comes to not just "multiple" ray tracing features but also when ray tracing is used throughout a scene.

Again, if you want to stick to the "Nvidia bad" examples, keep to the RTX Remix ones, as it's clear that there are some dodgy shenanigans happening there by Nvidia...
 
I would much prefer well-optimised PC games with vendor-specific features. As a 4080 owner (been mainly Nvidia since the 2080) I am happy to see DLSS implemented where possible. Bethesda saying they had no contact from Nvidia re: optimisations seems plausible, but that does not mean they could not include at least a working implementation of DLSS.

What strikes me as hypocritical is the faux outrage that vendor optimisation is suddenly bad because it affects Nvidia.

AMD sponsor and optimise a game and it's all "boo, hiss, the shame of it".

Nvidia sponsor a game and it’s just seen as par for the course “cus majority”. If you want the best features buy Nvidia… right?

Was there similar outrage when BG3 released with only FSR1? If Starfield had DLSS1 would you all be OK as Bethesda could claim Nvidia owners are catered for?

I'm not arguing that Starfield is well optimised, or that DLSS should not be included from launch. I'm just asking for some perspective and a little less hypocrisy.
 
Last edited:
Yeah, it is mental really how people side with whatever suits their own narrative, isn't it, especially with the double standards... That's why I like that I've tried a modern mid-tier Nvidia GPU this time despite being an AMD guy. As with building cars/engine swaps etc., I'd go with the engine/box/diff that suits the situation/performance/space required and never be loyal to just one brand...

This whole brand tribalism thing is utter madness to me; it sounds like something some woke 13-year-old with aggressive hormones has invented so they 'have a right to be offended m9' over everything - and they probably don't even own a PC haha!
 
There is a big difference here.

In Nvidia-sponsored games, FSR and/or XeSS is included too, therefore Nvidia-sponsored titles are not locking out upscaling tech from non-RTX users. In AMD-sponsored games, yes, we can use FSR, but as shown time and time again, it is just downright ****, more so if you're playing at a res below 4K and/or using a lesser preset, where FSR falls apart entirely. So OK, yes, a tech has been included which everyone can use, but you are directly gimping/harming owners who "could" have a better experience but can't, because said companies have decided they know what is best because "reasons"...

Everyone should be advocating to have all the tech included in games, especially when, as shown by AMD's, Intel's and Nvidia's own guides, as well as engines including said features natively, these upscalers are not exactly hard/time-consuming to implement - well, for EA, it seems they can't do it well...

As for FSR 1 and 2, well, that is more AMD's fault, or rather the direction they go in, i.e. it's up to the community and developers to do as they please because "open source!!!!!!". Go read the GitHub page for FSR 2, for example, where you have devs asking how end users can update FSR themselves, like what can be done with DLSS, and the answer is basically you can't because of how AMD have engineered FSR: it's up to the devs how they want to go about it, and in order to update FSR to a newer version, it has to be done on the game dev's end. DLSS and FSR, whilst they achieve the same goal, are very different in how they are implemented/operated. FSR 2 is completely different to FSR 1, whereas DLSS 2 is an update to DLSS 1.
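To make that concrete, here's roughly what the "end users can update DLSS themselves" bit looks like in practice - a minimal sketch only, assuming the game ships its DLSS runtime as the usual nvngx_dlss.dll next to the executable (the folder paths below are made up for the example). In most games there's no equivalent file to swap for FSR 2, because it's compiled into the game's own code rather than shipped as a separate DLL:

```python
# Rough sketch of a manual DLSS runtime swap (the thing the GitHub question was about).
# Paths are hypothetical examples - point them at your own install/download.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical game install folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you grabbed yourself

old_dll = game_dir / "nvngx_dlss.dll"
if old_dll.exists():
    # keep a backup of the version the game shipped with, then drop in the newer one
    shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))
    shutil.copy2(new_dll, old_dll)
    print("DLSS runtime swapped - check the version in-game.")
else:
    print("No nvngx_dlss.dll found - this game may not expose DLSS as a swappable DLL.")
```

That's the whole trick for DLSS; FSR 2 only gets a version bump when the developers rebuild it into the game, which is exactly the point being made above.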
 
I've said it a few times but there's a fair few games that would benefit from being updated from FSR1 to FSR2, AC:Valhalla specifically. I know it'll never happen though.
 
Last edited:
Lol, the mental gymnastics is truly amazing.

You are suggesting that AMD and Bethesda deliberately left out Nvidia features and optimisations. Yet the reason BG3 only has FSR1 is AMD's fault. It couldn't possibly be that Nvidia insisted - nope, "we want AMD features to look gash".

I'm not saying either is true, just pointing out the double standards.

AMD optimised and Nvidia features missing or poorly implemented = AMD plot.

Nvidia optimised and AMD features missing or poorly implemented = AMD stupid.
 