Well, there's nothing stopping the media and game devs from going after Nvidia.
The answer for why Ampere or Turing doesn't get DLSS 3 is that the hardware present on those cards isn't fast enough to be worth it. (At least they didn't say "no comment".)
I guess you do need dedicated hardware to get significant performance while still keeping good enough IQ. If the same can be done in software, or just on regular hardware, AMD and Intel are welcome to prove otherwise.
Well, they won't, because they are probably scared Nvidia will cut them off. I would imagine an RTX 3090 Ti would have more Tensor core output than an RTX 4060, so surely it would work on the fastest Ampere-based dGPUs? If the RTX 3090 Ti lacked the Tensor hardware, like a GTX 1080 Ti did versus an RTX 2080, it might be understandable.
We all know the real reason for the DLSS lockout - Nvidia knew they were going to jack up pricing with the RTX 4000 series, so DLSS 3 is being used to justify the price increase. This is why there's such a massive push around DLSS - I don't think there was anywhere near this level of social media push for it even a year ago.
Maybe people need to be asking why they're so concerned about DLSS/FSR, instead of asking why BOTH Nvidia and AMD are releasing trash below £600. The RTX 4060 Ti, RTX 4060 and RX 7600 are joke-priced releases. The RTX 4070 looks OK until you realise you are paying £200 more for a 40% increase over an RTX 3060 Ti, when the latter was that much faster than an RTX 2060 Super for the same price. Or AMD rebadging the RX 7800 XT as the RX 7900 XT with a massive price hike, etc.
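A quick back-of-envelope sketch of that comparison (the launch prices are my assumptions from memory, not verified figures; the ~40% uplifts are the rough numbers above):

```python
# Rough perf-per-pound comparison across GPU generations.
# Prices are assumed launch figures from memory; the ~40%
# generational uplifts are the rough figures quoted above.
cards = {
    "RTX 2060 Super": {"price_gbp": 369, "perf": 1.00},        # baseline
    "RTX 3060 Ti":    {"price_gbp": 369, "perf": 1.40},        # ~40% faster, same price
    "RTX 4070":       {"price_gbp": 569, "perf": 1.40 * 1.40}, # ~40% faster, £200 more
}

for name, card in cards.items():
    value = card["perf"] / card["price_gbp"] * 1000  # relative perf per £1000
    print(f"{name}: {card['perf']:.2f}x perf at £{card['price_gbp']} "
          f"-> {value:.2f} perf/£1000")
```

On those assumed numbers, performance per pound actually goes slightly *backwards* from the RTX 3060 Ti to the RTX 4070, which is exactly the problem.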
Also, Epic Games has its own temporal upscaler called TSR. It works on all cards, so there was nothing stopping Nvidia using that as a fallback layer behind DLSS for older Nvidia cards and competitors' cards. The two companies get on well - surely Nvidia could just license it? I question why AMD didn't do the same.
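Conceptually, the fallback logic would be trivial. A minimal sketch (all names and fields here are hypothetical, not any real vendor API - the gist is just "pick the best upscaler the hardware supports, never offer nothing"):

```python
from dataclasses import dataclass

@dataclass
class Gpu:
    vendor: str
    has_tensor_cores: bool
    has_fast_optical_flow: bool  # the Ada-class hardware Nvidia cites for DLSS 3

def pick_upscaler(gpu: Gpu) -> str:
    """Pick the best upscaler the hardware supports, falling back to a
    vendor-agnostic one (TSR/FSR) instead of offering nothing at all."""
    if gpu.vendor == "nvidia" and gpu.has_fast_optical_flow:
        return "DLSS 3 (frame generation)"
    if gpu.vendor == "nvidia" and gpu.has_tensor_cores:
        return "DLSS 2 (super resolution)"
    return "TSR/FSR fallback"  # runs on plain shader hardware

print(pick_upscaler(Gpu("nvidia", True, False)))   # e.g. RTX 3090 Ti -> DLSS 2
print(pick_upscaler(Gpu("amd", False, False)))     # e.g. RX 7600 -> TSR/FSR
```

The last branch is the point: it runs on plain shader hardware, so a TSR/FSR-style fallback could cover everything right down to a GTX 1650.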
Nvidia still sells the mainstream GTX 1650 and GTX 1660 series even now, which are Turing-based and have to rely on FSR or XeSS. They are very popular cards!
History likes to repeat itself, though by the time this happens, hardly anyone remembers the previous time. And you're right, both about CP2077 and HL2 (FX cards were absolutely horrible in DX9 - not ATI's fault!).
The whole timing of this "outrage" is very suspect. As a big Fallout 4 player (I run one of the most up-to-date benchmarking threads for it), I can say that game not only had Nvidia-exclusive technologies but even to this day runs better on Nvidia hardware. I didn't see people complain about that.
And yet, with AAA games often costing over $200 million to develop, publishers will cut every single penny they can from production costs. Expect smaller and indie devs to implement such things; do not expect big AAA publishers to do anything unless they earn money from it in some way. Apparently they don't in this case, so they simply won't do it. Also, it's not just the implementation - you can turn on DLSS and the other upscalers with the UE plugin. But then you have to spend money and time testing it and ironing out any possible bugs (which games are already full of as it is). You possibly have to optimise some textures and/or other assets to make them look better with DLSS turned on - some might generate unexpected artefacts. Then you have to support it over the game's lifetime as well - people will contact support about it (for whatever stupid reason, it doesn't matter), and that costs money too.
In the end, every smallest thing you change or add to a game can cost far more money to fix, optimise and support later than you can imagine. It's not about just turning it on; it's about all the other related costs. Nothing in development is free. And big publishers are horrible penny pinchers, out to extract every last drop of profit they can.
This is also likely the reason why they almost never update DLSS in patches after they release the game (often they don't update FSR 2.x either) - even though in theory it's just a simple DLSS DLL swap. They have to test it properly first, fix bugs, optimise, prepare support for it, etc. It all costs a lot of money, and these big publishers simply will not spend it.
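For context on how simple the swap itself is: DLSS ships as nvngx_dlss.dll next to the game's files, and the community routinely drops newer builds in by hand. A minimal sketch of that (the paths here are hypothetical examples; the real folder varies per game and store):

```python
import shutil
from pathlib import Path

# Hypothetical example paths - the actual game folder varies per title.
game_dir = Path(r"C:\Games\SomeGame")
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # a newer DLSS build

old_dll = game_dir / "nvngx_dlss.dll"            # the DLL the game shipped with
backup = old_dll.with_name(old_dll.name + ".bak")

shutil.copy2(old_dll, backup)    # keep the original in case of regressions
shutil.copy2(new_dll, old_dll)   # drop the newer DLSS version in place
print(f"Swapped {old_dll.name}, backup saved as {backup.name}")
```

The swap is the trivial part - the testing and support burden described above is what publishers are actually avoiding.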
This is also Bethesda Game Studios. Forget RT, FSR and DLSS. Even without those, they have a history of releasing games with bugs, because they are actually not that big a studio for the type of games they produce. This is why they take years and years to make games. Fallout 5 might end up being released in the 2030s at this rate!
I would rather they release the game in a good state without RT, DLSS or FSR. These can always be added afterwards.
A lot of these internet commentators who think the game won't sell because of "not enough RT" or "no DLSS" would never have bought Fallout 4, Fallout 3, Fallout: New Vegas or even Skyrim. Those were not the best-looking games, and were full of bugs and other problems. Yet they sold in their tens of millions.
This sounds like the social media backlash against Hogwarts Legacy over JK Rowling, or the one over 8GB cards.
Starfield's success won't be based on graphics. It will rest on whether the gameplay, world building, characters and story are decent.
If people want pretty games full of modern graphics tech, CDPR is where they need to be looking - and even with a huge team, look at what happened with CP2077.
It had all the RT, all the DLSS, etc., and yet the core RPG elements, AI, etc. were all half finished. If it had launched with the core elements in a better state, it would have done better, because you can always add better technology to a game whose base elements are good. Doing it the other way around is much harder.