NVIDIA 4000 Series

Soldato
Joined
14 Aug 2009
Posts
2,932
Actual game developer from Nixxes weighs in on the missing DLSS debate, says FSR, DLSS and XeSS are all very easy to implement, so any excuses people make are weak

True.

At the same time, setting up a game to support multi-display configs can be trivial (modders often manage it very quickly), yet games fail to do it themselves...

Sometimes it's laziness, sometimes just business (contracts).
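
For context on why all three are considered easy to add: DLSS, FSR 2 and XeSS consume essentially the same per-frame engine inputs (colour, depth, motion vectors, sub-pixel jitter), so once one is integrated the others slot in behind the same abstraction. A minimal sketch of what such an abstraction could look like - all names here are hypothetical, not any engine's or SDK's real API:

```cpp
// Sketch of a vendor-neutral upscaler abstraction (hypothetical names;
// real integrations call into each vendor's SDK behind the adapter).
#include <cstdio>
#include <memory>

struct Texture2D;  // stand-in for an engine render target

// Per-frame inputs that DLSS, FSR 2 and XeSS all consume.
struct UpscaleInputs {
    Texture2D* color = nullptr;          // jittered low-res colour buffer
    Texture2D* depth = nullptr;          // matching depth buffer
    Texture2D* motionVectors = nullptr;  // per-pixel screen-space motion
    float jitterX = 0.f, jitterY = 0.f;  // sub-pixel camera jitter
    unsigned renderWidth = 0, renderHeight = 0;  // internal resolution
    unsigned outputWidth = 0, outputHeight = 0;  // display resolution
    bool resetHistory = false;           // true on camera cuts
};

// One interface; each vendor SDK gets a thin adapter behind it.
class ITemporalUpscaler {
public:
    virtual ~ITemporalUpscaler() = default;
    virtual const char* Name() const = 0;
    virtual void Evaluate(const UpscaleInputs& in, Texture2D* out) = 0;
};

// Stub adapter standing in for a real SDK call (DLSS/FSR/XeSS alike).
class StubUpscaler : public ITemporalUpscaler {
public:
    const char* Name() const override { return "stub"; }
    void Evaluate(const UpscaleInputs& in, Texture2D*) override {
        std::printf("upscale %ux%u -> %ux%u\n", in.renderWidth,
                    in.renderHeight, in.outputWidth, in.outputHeight);
    }
};

int main() {
    std::unique_ptr<ITemporalUpscaler> up = std::make_unique<StubUpscaler>();
    std::printf("using %s upscaler\n", up->Name());
    UpscaleInputs in;
    in.renderWidth = 1280;  in.renderHeight = 720;
    in.outputWidth = 2560;  in.outputHeight = 1440;
    up->Evaluate(in, nullptr);  // an engine would pass real GPU resources
}
```

Wiring the engine to produce those inputs once is the bulk of the work; each extra backend after that is a thin adapter.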
 
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
True.

At the same time, setting up a game to support multi-display configs can be trivial (modders often manage it very quickly), yet games fail to do it themselves...

Sometimes it's laziness, sometimes just business (contracts).
One then has to question why Cyberpunk 2077 took over a year to implement FSR. I didn't see all this moaning from the tech press about that. Most people in late 2020 didn't have Turing or Ampere cards, so they had to use in-game resolution scaling, including me on my GTX1080.

I didn't see any of the same media approaching Nvidia to ask why there was no FSR for so long.

The same media and PCMR had no issue with it, because Nvidia sponsored the title and the implication was that it was AMD's fault for not getting FSR in. The same media kept quiet when it was quite clear that if XeSS could be made to work in some form on non-Intel cards, DLSS could too. Why is DLSS3 not available on Ampere?

The biggest elephant in the room is why DLSS is locked to only some Nvidia cards when FSR and XeSS run on all of them. If Nvidia made a fallback layer for other cards, then devs would probably just use DLSS everywhere, even on consoles.

It's almost like Nvidia lost a big gaming title to AMD, so it's trying a media blitz to pressure Microsoft into putting in DLSS. I remember something similar happening 20 years ago with HL2, when Nvidia got beaten by ATI in that game.
There was even a backlash claiming ATI was trying to gimp DX9 performance on Nvidia FX cards due to the sponsorship.
 
Last edited:
Soldato
Joined
14 Aug 2009
Posts
2,932
One then has to question why Cyberpunk 2077 took over a year to implement FSR. I didn't see all this moaning from the tech press about that. Most people in late 2020 didn't have Turing or Ampere cards, so they had to use in-game resolution scaling, including me on my GTX1080.

I didn't see any of the same media approaching Nvidia to ask why there was no FSR for so long.

The same media and PCMR had no issue with it, because Nvidia sponsored the title and the implication was that it was AMD's fault for not getting FSR in. The same media kept quiet when it was quite clear that if XeSS could be made to work in some form on non-Intel cards, DLSS could too. Why is DLSS3 not available on Ampere?

The biggest elephant in the room is why DLSS is locked to only some Nvidia cards when FSR and XeSS run on all of them. If Nvidia made a fallback layer for other cards, then devs would probably just use DLSS everywhere, even on consoles.

It's almost like Nvidia lost a big gaming title to AMD, so it's trying a media blitz to pressure Microsoft into putting in DLSS. I remember something similar happening 20 years ago with HL2, when Nvidia got beaten by ATI in that game.
There was even a backlash claiming ATI was trying to gimp DX9 performance on Nvidia FX cards due to the sponsorship.
Well, there's nothing stopping the media and game devs from going after Nvidia.

The answer for why Ampere or Turing doesn't get DLSS3 is that the hardware on those cards isn't fast enough to make it worthwhile. At least they didn't say "no comment" :))

I guess you do need hardware to get significant performance while keeping good enough IQ. If the same can be done in software, or on regular hardware, AMD and Intel are welcome to prove otherwise.
 
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
Actual game developer from Nixxes weighs in on the missing DLSS debate, says FSR, DLSS and XeSS are all very easy to implement, so any excuses people make are weak

And yet, with AAA games often costing over $200 million to develop, they will cut every single penny they can from production costs. Expect smaller and indie devs to implement such things; do not expect big AAA publishers to do anything unless they earn money on it in some way. Apparently they don't in this case, so they simply won't do it. Also, it's not just implementation - you can turn on DLSS and the others with a UE plugin. But then you have to spend money and time testing it and ironing out any possible bugs (and games are already full of them as it is). You may need to optimise some textures and/or other assets to make them look better with DLSS turned on - some might generate unexpected artefacts. Then you have to support it over the game's lifetime as well - people will contact support about it (for whatever silly reason; it doesn't matter), and that costs money too.

In the end, every smallest thing you change or add to a game can cost far more to fix/optimise/support later than you can imagine. It's not just about turning it on; it's about all the other related costs. Nothing in development is free. And big publishers are horrible penny-pinchers, out to extract every last drop of profit they can.
This is also likely the reason why they almost never update DLSS in patches after the game is released (often they don't update FSR 2.x either) - even though in theory it's just a simple swap of the DLSS DLL. They have to test it well first, fix bugs, optimise, prepare support for it, etc. That all costs a lot of money, and these big publishers simply will not do it.
 
Last edited:
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
They said in the 80s we would have space lasers to shoot down nuclear missiles. I think we are still waiting for that particular Star Wars project to develop.
We've had that tech for ages. It was judged too expensive and not worth it - the Earth is huge, orbits are huge, and such laser satellites are expensive, big, and have relatively small coverage; they also constantly circle the Earth, so it takes far too long to get one into position to shoot down a missile in time. You'd need a lot of them (again, money, and not worth it), plus they're very easy to shoot down from the surface. Very much not comparable to AGI development (we don't have that tech yet, unlike said satellites). That said, such lasers are being developed quickly for AA defence on Earth (military ships, etc.).
 
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
One then has to question why Cyberpunk 2077 took over a year to implement FSR. I didn't see all this moaning from the tech press about that. Most people in late 2020 didn't have Turing or Ampere cards, so they had to use in-game resolution scaling, including me on my GTX1080.

I didn't see any of the same media approaching Nvidia to ask why there was no FSR for so long.

The same media and PCMR had no issue with it, because Nvidia sponsored the title and the implication was that it was AMD's fault for not getting FSR in. The same media kept quiet when it was quite clear that if XeSS could be made to work in some form on non-Intel cards, DLSS could too. Why is DLSS3 not available on Ampere?

The biggest elephant in the room is why DLSS is locked to only some Nvidia cards when FSR and XeSS run on all of them. If Nvidia made a fallback layer for other cards, then devs would probably just use DLSS everywhere, even on consoles.

It's almost like Nvidia lost a big gaming title to AMD, so it's trying a media blitz to pressure Microsoft into putting in DLSS. I remember something similar happening 20 years ago with HL2, when Nvidia got beaten by ATI in that game.
There was even a backlash claiming ATI was trying to gimp DX9 performance on Nvidia FX cards due to the sponsorship.
History likes to repeat itself, though by the time it does, hardly anyone remembers the previous time. And you're right, both about CP2077 and HL2 (FX cards were absolutely horrible in DX9 - not ATI's fault!).
 
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
Well, there's nothing stopping the media and game devs from going after Nvidia.

The answer for why Ampere or Turing doesn't get DLSS3 is that the hardware on those cards isn't fast enough to make it worthwhile. At least they didn't say "no comment" :))

I guess you do need hardware to get significant performance while keeping good enough IQ. If the same can be done in software, or on regular hardware, AMD and Intel are welcome to prove otherwise.

Well, they won't, because they are probably scared Nvidia will cut them off. I would imagine an RTX3090TI has more Tensor core throughput than an RTX4060, so surely it would work on the fastest Ampere-based dGPUs? If the RTX3090TI lacked the Tensor hardware, like a GTX1080TI did versus an RTX2080, it might be understandable.

We all know the real reason for the DLSS lockout - Nvidia knew they were going to jack up pricing with the RTX4000 series, so DLSS3 is being used to justify the price increase. Hence the massive push around DLSS - I don't think there was such a big social media push for it even a year ago.

Maybe people need to ask why they're so concerned about DLSS/FSR, instead of asking why BOTH Nvidia and AMD are releasing trash below £600. The RTX4060TI, RTX4060 and RX7600 are joke-priced releases. The RTX4070 looks OK until you realise you are paying £200 more for a 40% increase over an RTX3060TI, when the latter was that much faster than an RTX2060 Super for the same price. Or AMD rebadging the RX7800XT as the RX7900XT with a massive price hike, etc.

Also, Epic Games has its own temporal upscaler called TSR. If that works on all cards, there was nothing stopping Nvidia from using it as a fallback layer to enable DLSS on older Nvidia cards and competitor cards. They get on well - surely they could just license it? I question why AMD didn't do the same.

Nvidia still sells the mainstream GTX1650 and GTX1660 series even now, which are Turing-based and have to rely on FSR or XeSS. They are very popular cards!

History likes to repeat itself, though by the time it does, hardly anyone remembers the previous time. And you're right, both about CP2077 and HL2 (FX cards were absolutely horrible in DX9 - not ATI's fault!).
The whole timing of this "outrage" is very suspect. As a big Fallout 4 player, I can say that game not only had Nvidia-exclusive technologies but to this day runs better on Nvidia hardware (I run one of the most up-to-date benchmarking threads for it). I didn't see people complaining about that.

And yet, with AAA games often costing over $200 million to develop, they will cut every single penny they can from production costs. Expect smaller and indie devs to implement such things; do not expect big AAA publishers to do anything unless they earn money on it in some way. Apparently they don't in this case, so they simply won't do it. Also, it's not just implementation - you can turn on DLSS and the others with a UE plugin. But then you have to spend money and time testing it and ironing out any possible bugs (and games are already full of them as it is). You may need to optimise some textures and/or other assets to make them look better with DLSS turned on - some might generate unexpected artefacts. Then you have to support it over the game's lifetime as well - people will contact support about it (for whatever silly reason; it doesn't matter), and that costs money too.

In the end, every smallest thing you change or add to a game can cost far more to fix/optimise/support later than you can imagine. It's not just about turning it on; it's about all the other related costs. Nothing in development is free. And big publishers are horrible penny-pinchers, out to extract every last drop of profit they can.
This is also likely the reason why they almost never update DLSS in patches after the game is released (often they don't update FSR 2.x either) - even though in theory it's just a simple swap of the DLSS DLL. They have to test it well first, fix bugs, optimise, prepare support for it, etc. That all costs a lot of money, and these big publishers simply will not do it.

This is also Bethesda Game Studios. Forget RT, FSR and DLSS. Even without those, they have a history of releasing games with bugs, because the studio is actually not that big for the type of games it produces. This is why they take years and years to make games. Fallout 5 might end up being released in the 2030s at this rate! :(

I would rather they release the game in a good state without RT, DLSS or FSR. Those can always be added afterwards.

A lot of the internet commentators who think the game won't sell because of "not enough RT" or "no DLSS" would never have bought Fallout 4, Fallout 3, Fallout: New Vegas or even Skyrim. Those were not the best-looking games and were full of bugs and other problems, yet they sold in the tens of millions.

This sounds like the social media backlash against Hogwarts Legacy over JK Rowling or 8GB cards.

Starfield will not succeed or fail on graphics. It will be a success if the gameplay, world building, characters and story are decent.

If people want pretty games full of modern graphics tech, CDPR is where they need to be looking, and even with a huge number of people, see what happened with CP2077?

It had all the RT, all the DLSS, etc., and yet the core RPG elements, AI, etc. were all half finished. If it had launched with the core elements in a better state, it would have done better, because you can always add better technology to a game whose base elements are good. Doing it the other way around is much harder.
 
Last edited:
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
If people want pretty games full of modern graphics tech, CDPR is where they need to be looking, and even with a huge number of people, see what happened with CP2077?

It had all the RT, all the DLSS, etc., and yet the core RPG elements, AI, etc. were all half finished. If it had launched with the core elements in a better state, it would have done better, because you can always add better technology to a game whose base elements are good. Doing it the other way around is much harder.
Just by the way, I suggest people read what the VP of PR at CDPR said recently about the state of CP2077 on release - pure gaslighting of players and of reality, where he pretty much said the game was in a great state at release and people just hated it because it "became a cool thing not to like it". :) That is how all these big corporations treat their customers - gaslighting, lies, contempt, etc. And in the end it's always just about money - we have it, they want every single penny. The sad thing is that a lot of people have already forgotten how CP2077 looked on release, same as other bits of history. And then it will just repeat itself.
 
Last edited:
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
Just by the way, I suggest people read what the VP of PR at CDPR said recently about the state of CP2077 on release - pure gaslighting of players and of reality, where he pretty much said the game was in a great state at release and people just hated it because it "became a cool thing not to like it". :) That is how all these big corporations treat their customers - gaslighting, lies, contempt, etc. And in the end it's always just about money - we have it, they want every single penny. The sad thing is that a lot of people have already forgotten how CP2077 looked on release, same as other bits of history. And then it will just repeat itself.

The worst thing for me is the downgraded RPG aspects. Their demos beforehand looked amazing. Although at least with a Bethesda Game Studios game we know it's going to be buggy - a Bethesda Game Studios tradition! :rolleyes: :p
 
Last edited:
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
The worst thing for me is the downgraded RPG aspects. Their demos beforehand looked amazing. Although at least with a Bethesda Game Studios game we know it's going to be buggy - a Bethesda tradition! :rolleyes: :p
Agreed, though they delayed it and changed direction so many times mid-development that it's almost a miracle the game came out at all in the end. Bethesda is different - we know, and they know, how it will look and run on release. And then they'll fix it over time. One can wait, or jump in early and experience the bug-fest. Some people enjoy that; I'll wait. ;)
 
Last edited:
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
Agreed, though they delayed it and changed direction so many times mid-development that it's almost a miracle the game came out at all in the end.

I wish they had just concentrated on the RPG elements first. I hope Phantom Liberty improves on that, although apparently it might run worse now. Hopefully AMD CPUs get proper SMT optimisations in the game.
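
For anyone wondering what "SMT optimisations" refers to: at launch CP2077 famously under-used the second logical thread on many Ryzen cores, and the community workaround made the game's worker count follow logical rather than physical core counts. A rough sketch of the idea - hypothetical code, nothing like the game's actual scheduler:

```cpp
// Illustrative only: sizing a worker pool by logical (SMT) threads
// rather than physical cores. Hypothetical code, not CDPR's scheduler.
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    // hardware_concurrency() reports logical threads, so an SMT CPU
    // (e.g. an 8-core/16-thread Ryzen) returns 16, not 8.
    unsigned logical = std::thread::hardware_concurrency();
    // A scheduler that assumes one worker per *physical* core would
    // halve this and leave half the CPU idle on SMT parts.
    unsigned workers = logical > 1 ? logical - 1 : 1;  // keep one for the main thread

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back([i] { std::printf("worker %u running\n", i); });
    for (auto& t : pool) t.join();
}
```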
 
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
I wish they had just concentrated on the RPG elements first. I hope Phantom Liberty improves on that, although apparently it might run worse now. Hopefully AMD CPUs get proper SMT optimisations in the game.
I don't see how AMD could help them or pay for it, as it's an Nvidia-sponsored, Nvidia-locked game. Which means the devs won't be bothered to do it themselves, and likely not much will change in that regard. It swings back to the "missing DLSS" thing here - if big devs get no money or help from a vendor, they can hardly be bothered to do it themselves.
 
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
I don't see how AMD could help them or pay for it, as it's an Nvidia-sponsored, Nvidia-locked game. Which means the devs won't be bothered to do it themselves, and likely not much will change in that regard. It swings back to the "missing DLSS" thing here - if big devs get no money or help from a vendor, they can hardly be bothered to do it themselves.

You would think that with the consoles using Zen 2 CPUs it would make sense!
 
Associate
Joined
6 Nov 2005
Posts
2,442
My basket at OcUK:

Total: £287.99 (includes delivery: £0.00)

I've just seen that Palit have only put an x8 connector on this card. I've never noticed anyone doing this before; they usually fit a full x16 connector and only have half of it wired up. I know it doesn't make a difference, it just caught me by surprise. Has Palit, or anyone else, done this before?
 
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
My basket at OcUK:

Total: £287.99 (includes delivery: £0.00)

I've just seen that Palit have only put an x8 connector on this card. I've never noticed anyone doing this before; they usually fit a full x16 connector and only have half of it wired up. I know it doesn't make a difference, it just caught me by surprise. Has Palit, or anyone else, done this before?
The RTX4060 series has only a PCI-E 8X connector.
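
For anyone who wants to check the electrical width themselves, `nvidia-smi -q` reports the current and maximum link width, and NVML exposes the same values. A small sketch, assuming the NVIDIA driver and NVML development files are installed:

```cpp
// Query the PCIe link width the driver reports for GPU 0.
// Assumes NVML (nvml.h, libnvidia-ml) is available; build with -lnvidia-ml.
#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit_v2() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex_v2(0, &dev) == NVML_SUCCESS) {
        unsigned int cur = 0, max = 0;
        nvmlDeviceGetCurrPcieLinkWidth(dev, &cur);  // width negotiated right now
        nvmlDeviceGetMaxPcieLinkWidth(dev, &max);   // width the board supports
        std::printf("PCIe link width: x%u (max x%u)\n", cur, max);
    }
    nvmlShutdown();
    return 0;
}
```

On an RTX4060-series card this should report a maximum of x8 regardless of how the physical connector is cut.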
 
Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
What needs to happen is a 4080TI at £1200 or under and the 4080 at sub-£1k; £850 would be a nicer price to work from. They need to do this or the cards are just not gonna sell, especially in today's economic climate.
You know what these companies will do. Just slowly make sure the older cards get fewer optimisations in newer games, then spin it that you need DLSS3 or FSR3 to run them - which is blocked on the older cards - so people will be forced to upgrade to a poor-value card. Then blame consoles and do the same next generation.
 
Last edited:
Soldato
Joined
22 May 2010
Posts
12,370
Location
Minibotpc
You know what these companies will do. Just slowly make sure the older cards get fewer optimisations in newer games, then spin it that you need DLSS3 or FSR3 to run them, so people will be forced to upgrade to a poor-value card.

They'll only be kicking themselves in the teeth then; not everyone upgrades every gen, and if they take this route it will just mean fewer game sales when games can't run properly on older hardware.

Surely devs won't want this to happen, and will implement better support for hardware across the board to keep sales up.
 