Jedi Survivor is getting official DLSS support in its next patch, which should be out today. Was not expecting that.
STAR WARS Jedi: Survivor™ Patch Notes
See the latest updates to STAR WARS Jedi: Survivor for PC, PlayStation 5 and Xbox Series X/S. www.ea.com
Main notes of importance for PC -
Whatever, it just stinks that we have to rely on a modder to give consumers a choice in upscalers.
As I said before, I hope AMD don't get to 'sponsor' too many more AAA titles.
Unless FSR3 turns out to be better than DLSS - even so, I'd still like the choice.
Really?
Good to hear, what games are those then?
FSR, bloody dog****. The worst part of owning an AMD card is the lack of a decent hardware solution.
If they had one then the XT/XTX would be a seriously long-term card.
Considering it took them four and a half months to implement super resolution and frame generation, which are both plugins in Unreal (which Jedi Survivor is based on), I have very low expectations for a fix.
Big oof. It's like that controversy a few years back where Nvidia was apparently using drivers to lower image quality to get better benchmark scores in reviews.
That is on planets where you can see stars. Interesting how it's Wccftech again - didn't they start the DLSS controversy too? Maybe MLID is correct, that Nvidia CBA now and would rather play the sympathy card.
Yet in New Atlantis during the daytime, with no stars, etc., my mate's RX6700XT is still faster than my RTX3060TI. You can look at the ground and it's still slower. You can go into The Well, which is an interior space, and it's still the same. Unless you think stars are rendering inside buildings.
We compared performance using the exact same settings, and I watched him play it too. We have the same CPU, same RAM settings, etc., and a similar-speed SSD.
Also, @keef247 actually played the game on the July Nvidia launch drivers with their RTX4070. They updated to the latest "Game Ready" drivers and there was no change in performance for them.
So maybe Nvidia needs to fix its drivers too.
I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX7900XT drops from RTX3090 level down to the level of an RTX3070TI.
#Totally normal behaviour.
Which will have zero effect on performance in the interior spaces? So is that AMD's/Bethesda Game Studios' fault too?
You literally quoted what I said:
So when Nvidia does badly in a game it's due to sabotage and nothing to do with Nvidia hardware, drivers and poor dev relations. When AMD has poor performance it's down to AMD hardware, drivers and poor dev relations, as you said.
Also, Digital Foundry said it's quite clear that many RTX Remix titles have no optimisation for AMD, so not to expect good performance:
Starfield PC - Digital Foundry Tech Review - Best Settings, Xbox Series X Comparisons + More
Starfield on PC delivers the best way to play the game, assuming your hardware is capable enough - but it's clear that there's a lot of issues that Bethesda ...youtu.be
That is their own video, and they said Starfield is no better than Portal RTX, where one company has an advantage in "IHV and driver optimisations" and it wasn't some conspiracy. The timestamp is at 23 minutes.
So basically "poor" performance in Starfield must be down to Nvidia hardware, drivers and dev relations. Thanks for clearing that up. Because, reading this thread, you would think it wasn't the case.
@keef247 on here confirmed that the Nvidia drivers from months ago and the newest one make zero difference to game performance with an RTX4070.
The last Nvidia driver was on the 22nd of August. So maybe you should be asking why Nvidia hasn't bothered launching drivers for three weeks now? Is that the fault of AMD and Bethesda Game Studios? I thought Nvidia, with their billions from AI, would have a super performance driver out by now. Oh well, apparently not yet.
Hopefully they will! I would rather stick with what I have! However, performance is still OK on my card, so it will do the job!
I would much prefer well-optimised PC games with vendor-specific features. As a 4080 owner (been mainly Nvidia since the 2080) I am happy to see DLSS implemented where possible. Bethesda saying they had no contact from Nvidia re optimisations seems plausible, but that does not mean they could not include at least a working implementation of DLSS.
What strikes me as hypocritical is the faux outrage that suddenly vendor optimisation is bad because it affects Nvidia.
AMD sponsor and optimise a game and it's all boo, hiss, the shame of it.
Nvidia sponsor a game and it's just seen as par for the course "cus majority". If you want the best features, buy Nvidia… right?
Was there similar outrage when BG3 released with only FSR1? If Starfield had DLSS1 would you all be OK as Bethesda could claim Nvidia owners are catered for?
I’m not arguing that Starfield is well optimised, or that DLSS should not be included from launch. I’m just asking for some perspective and lack of hypocrisy.
Lol, the mental gymnastics is truly amazing.
You are suggesting that AMD and Bethesda deliberately left out Nvidia features and optimisations, yet the reason BG3 only has FSR1 is apparently AMD's fault. It couldn't possibly be that Nvidia insisted on it because they want AMD's features to look gash.
I’m not saying either are true, just pointing out the double standards.
AMD optimised and Nvidia features missing or poorly implemented = AMD plot.
Nvidia optimised and AMD features missing or poorly implemented = AMD stupid
So much for the AMD-blocking conspiracy theories.
Or it could have been that the game was being rushed to completion and they had to add FSR as part of contractual obligations and add DLSS after the fact, as they already had enough to cram into the game before release. The state the game released in certainly supports a scramble to the finishing line.
Not everything needs to be a 9/11 conspiracy theory, though in this sub-section that's par for the course...
Normally I would agree, but when John Linneman from Digital Foundry came out and said he talked with multiple people who worked on AMD-sponsored games that added DLSS prior to release and were then asked to pull the feature... if it walks like a duck and FSR's like a duck...
And it's not the first time we've heard this from developers
A couple of years ago, a developer on Assassin's Creed Valhalla stated in an interview that, due to their contract with AMD, they were not allowed to optimise the game for Nvidia GPUs and had to focus all their time on optimising for AMD. Subsequently, AMD GPUs performed about 30% better relative to Nvidia than they did in other games.
Bluedot55 ran both DLSS and third-party upscalers on an Nvidia RTX 4090 and measured Tensor core utilisation. Looking at average Tensor core usage, the figures under DLSS were extremely low - less than 1%.
Initial investigations suggested that even the peak utilisation registered in the 4-9% range, implying that while the Tensor cores were being used, they probably weren't actually essential. However, increasing the polling rate revealed that peak utilisation is in fact in excess of 90%, but only for brief periods measured in microseconds.
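Those numbers line up with how sampled utilisation counters behave. Below is a minimal Python sketch of that effect - not Bluedot55's actual tooling, and the frame time, burst length and polling windows are made-up illustrative values - showing why a coarse polling window reports a microsecond-scale burst as a few percent while a fine window catches it at near 100%:

```python
# Minimal sketch: how the polling window changes the "peak utilisation" you see.
# The Tensor cores are modelled as busy for one short burst per frame, and each
# sample reports the fraction of its polling window spent busy (roughly how GPU
# monitoring counters work). All timings below are assumed, not measured.

FRAME_TIME_US = 16_700   # one frame at ~60 fps, in microseconds
BURST_US = 500           # assumed: upscaling keeps the Tensor cores busy ~0.5 ms per frame
TOTAL_US = 1_000_000     # simulate one second

def busy(t_us: int) -> bool:
    """True if the Tensor cores are busy at time t (one burst at the start of each frame)."""
    return (t_us % FRAME_TIME_US) < BURST_US

def utilisation(poll_window_us: int) -> tuple[float, float]:
    """Return (average, peak) utilisation as reported with a given polling window."""
    samples = []
    for start in range(0, TOTAL_US, poll_window_us):
        busy_us = sum(1 for t in range(start, start + poll_window_us) if busy(t))
        samples.append(busy_us / poll_window_us)
    return sum(samples) / len(samples), max(samples)

for window_us in (100_000, 10_000, 100):   # 100 ms, 10 ms and 100 µs polling windows
    avg, peak = utilisation(window_us)
    print(f"window {window_us:>7} us -> average {avg:5.1%}, peak {peak:6.1%}")
```

The average sits at a few percent however you poll, but the reported peak only shoots up towards 100% once the polling window is shorter than the burst itself - the same qualitative pattern the higher polling rate exposed on the 4090.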
When you think about it, that makes sense. The upscaling process has to be ultra quick if it's not to slow down the overall frame rate. It has to take a rendered frame, process it, do whatever calculations are required for the upscaling, and output the full upscaled frame before the 3D pipeline has had time to generate a new frame.
So, what you would expect to find is exactly what Bluedot55 observed. An incredibly brief but intense burst of activity inside the Tensor cores when DLSS upscaling is enabled.
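To put rough numbers on that burst - these are illustrative figures, not measurements from the testing above - the arithmetic looks like this:

```python
# Back-of-the-envelope frame budget (assumed numbers, purely for illustration):
# at 60 fps a whole frame has ~16.7 ms, so an upscaling pass that needs ~0.5 ms
# of Tensor-core time leaves the cores idle for most of the frame, even though
# they run flat out while the pass is active.
fps = 60
frame_budget_ms = 1000 / fps             # ~16.7 ms available per frame
upscale_ms = 0.5                         # assumed Tensor-core time for the upscale pass
duty_cycle = upscale_ms / frame_budget_ms
print(f"frame budget {frame_budget_ms:.1f} ms, upscale burst {upscale_ms} ms, "
      f"Tensor cores busy {duty_cycle:.0%} of each frame")
```

The shorter the burst is relative to the frame, the smaller that duty cycle gets, which is how a pass that genuinely needs the Tensor cores can still show up as a tiny average utilisation figure.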
Of course, Nvidia's GPUs have offered Tensor cores for three generations, and you have to go back to the GTX 10 series to find an Nvidia GPU that doesn't support DLSS at all. However, as Nvidia adds new features to the overall DLSS superset, such as Frame Generation, older hardware is being left behind.
What this investigation shows is that, while it's tempting to doubt Nvidia's motives whenever it seemingly locks out older GPUs from a new feature, the reality may simply be that the new GPUs can do things old ones can't. That's progress for you.