NVIDIA 4000 Series

Soldato
Joined
9 Nov 2009
Posts
24,979
Location
Planet Earth
That would be pretty nice, wouldn't it? Can only hope, but September may be a little soon for FSR3?

Well, considering RDNA3 launched last year, they really need to get off their arses, because they need to at least have it from a marketing standpoint.
Do Nvidia GPUs still have a performance advantage over AMD for VR, do we know? I dabble with my original Rift occasionally and may upgrade at some point, but wondered if that means I may be forced to go Nvidia?

I think Nvidia are still better, but remember Starfield is AMD sponsored. The Creation Engine has historically been more Nvidia biased (Fallout 4), but if there is a scenario like with the newest COD games, then it might swing the other way.

Best thing is to wait and see how the benchmarks look - the game is on MS Game Pass too.
 
Associate
Joined
7 Jul 2016
Posts
163
I am no expert on VR and do not play with it, but my understanding is that Nvidia works OK whereas AMD drivers are pretty awful for VR purposes.

Check again when you are looking to upgrade, but it appears that VR is a very low priority for the AMD driver team.
Cheers, yes, this is what I'm hearing.
 
Associate
Joined
7 Jul 2016
Posts
163
Well, considering RDNA3 launched last year, they really need to get off their arses, because they need to at least have it from a marketing standpoint.


I think Nvidia are still better, but remember Starfield is AMD sponsored. The Creation Engine has historically been more Nvidia biased (Fallout 4), but if there is a scenario like with the newest COD games, then it might swing the other way.

Best thing is to wait and see how the benchmarks look - the game is on MS Game Pass too.
Yeah, definitely looks like a 'wait and see' scenario.

But then Amazon Prime Day is just around the corner... :D
 
Soldato
Joined
22 May 2010
Posts
12,370
Location
Minibotpc
Micron announced its GDDR7 memory modules will become available for use in graphics cards by the middle of 2024.

First-gen GDDR7 operates at 36 Gbps per pin. Here is the potential bandwidth of GPUs at these bus widths (a quick sanity check of the arithmetic is sketched below):

  • 128-bit @ 36 Gbps: 576 GB/s
  • 192-bit @ 36 Gbps: 864 GB/s
  • 256-bit @ 36 Gbps: 1152 GB/s
  • 320-bit @ 36 Gbps: 1440 GB/s
  • 384-bit @ 36 Gbps: 1728 GB/s

Might be a reason why the 5000 series is delayed until 2025 then, possibly? It would be a tight squeeze releasing the 5000 series only a few months after the GDDR7 chips become available.
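For anyone wanting to sanity-check those numbers: peak bandwidth is just bus width (bits) multiplied by the per-pin data rate (Gbps), divided by 8 to convert bits to bytes. A minimal Python sketch of that arithmetic, using the 36 Gbps rate and bus widths from the list above:

```python
# Peak bandwidth estimate for a GDDR7 memory bus:
# bus_width_bits * per_pin_rate_gbps / 8 -> GB/s
PER_PIN_RATE_GBPS = 36  # first-gen GDDR7 per-pin data rate

def bandwidth_gb_s(bus_width_bits: int, rate_gbps: float = PER_PIN_RATE_GBPS) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * rate_gbps / 8  # /8 converts gigabits to gigabytes

for width in (128, 192, 256, 320, 384):
    print(f"{width}-bit @ {PER_PIN_RATE_GBPS} Gbps: {bandwidth_gb_s(width):.0f} GB/s")
```

which prints 576, 864, 1152, 1440 and 1728 GB/s respectively.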
 
Soldato
Joined
6 Feb 2019
Posts
17,929
Gamers Nexus got more details on the missing DLSS support in AMD-sponsored titles, and also an interesting reply from AMD.

When asked to comment on the recent article, AMD gave the same evasive reply, beating about the bush with a long monologue and not answering whether it blocks DLSS and XeSS.

Then the Starfield news came out that the game is an AMD exclusive partnership, and Gamers Nexus went back and asked whether the contract between AMD and Bethesda says Bethesda may not implement support for other upscaling features like DLSS and XeSS.

This time, AMD didn't want to talk at all, no long essay on how not to answer a question. The answer was simply: no comment.

 
Associate
Joined
16 Aug 2017
Posts
1,155
Location
London
Gamers Nexus got more details on the missing DLSS support in AMD-sponsored titles, and also an interesting reply from AMD.

When asked to comment on the recent article, AMD gave the same evasive reply, beating about the bush with a long monologue and not answering whether it blocks DLSS and XeSS.

Then the Starfield news came out that the game is an AMD exclusive partnership, and Gamers Nexus went back and asked whether the contract between AMD and Bethesda says Bethesda may not implement support for other upscaling features like DLSS and XeSS.

This time, AMD didn't want to talk at all, no long essay on how not to answer a question. The answer was simply: no comment.
HU also sent them a similar question and described in their latest video how they see it. It's a rather simple situation: when Nvidia sponsors a game, the devs still want it to be well supported on as many platforms as possible (to sell more), so they add at least FSR to it, but neither Nvidia nor AMD help with implementing it properly, so it's not always optimal.

When AMD helps implement technologies in a game (sponsoring it), they also have no say in what else the devs will implement, but as FSR works on everything and Nvidia won't assist with games that AMD is sponsoring, the devs simply can't be bothered to add and then support other tech like DLSS or XeSS - it all costs money.

In short, it's on the devs what they implement and neither vendor blocks anything, but cost cutting seems to be the main limitation behind what we see in the game in the end. Penny-pinching devs are the real culprit here.
 
Soldato
Joined
16 Sep 2018
Posts
12,726
Didn't they also say that that was what they suspect was happening? I'm not saying they're wrong, as their logic and reasoning seem without fault to me. Are devs going to spend time and money implementing DLSS/XeSS or squashing bugs? The answer seems pretty straightforward, more so when you take into account cross-platform titles.

I said it before, but costs are the biggest barrier Nvidia face when it comes to the adoption of DLSS and RT (maybe less so RT): if the hardware to run it and the development costs to implement it are too high, it's going to die a slow and painful death. Not because it's technologically bad, but because people simply see it as not being worth the extra cost. History is littered with the corpses of great technology that simply proved too expensive for the majority of people - G-Sync and Optane are two recent examples that spring to mind.

If Nvidia truly wanted DLSS & RT to become the standard, they'd be trying to get the hardware that can run it into as many hands as possible, even making a loss on the hardware, so a developer has little choice but to implement those features because 80-90% of people are using hardware that supports it.
 
Soldato
Joined
6 Feb 2019
Posts
17,929
Didn't they also say that that was what they suspect was happening? I'm not saying they're wrong, as their logic and reasoning seem without fault to me. Are devs going to spend time and money implementing DLSS/XeSS or squashing bugs? The answer seems pretty straightforward, more so when you take into account cross-platform titles.

I said it before, but costs are the biggest barrier Nvidia face when it comes to the adoption of DLSS and RT (maybe less so RT): if the hardware to run it and the development costs to implement it are too high, it's going to die a slow and painful death. Not because it's technologically bad, but because people simply see it as not being worth the extra cost. History is littered with the corpses of great technology that simply proved too expensive for the majority of people - G-Sync and Optane are two recent examples that spring to mind.

If Nvidia truly wanted DLSS & RT to become the standard, they'd be trying to get the hardware that can run it into as many hands as possible, even making a loss on the hardware, so a developer has little choice but to implement those features because 80-90% of people are using hardware that supports it.

Going by the Steam hardware survey, over 30% of PCs have GPUs that support DLSS, which is quite significant - that's over 60 million DLSS systems, more than all Xbox Series and PS5 consoles sold combined.
 
Soldato
Joined
16 Sep 2018
Posts
12,726
And when you factor in how many games actually support it, how many of those estimated 60 million people are actually playing games with DLSS?

Because to me, 30% is not getting the hardware into as many hands as possible - quite the opposite, in fact. (e: Also, having just checked the numbers, 30% of 132m is more like 40m.)

e: Like I said, as a developer would you rather spend your time and money doing something like squashing bugs that will benefit 100% of your customers, or implementing a feature that's only going to be used by a small percentage?
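For reference, the back-of-the-envelope maths behind that correction (the ~132 million Steam user base and the ~30% DLSS-capable share are the figures quoted in the posts above, not official numbers):

```python
# Rough estimate of DLSS-capable systems from the Steam hardware survey share.
STEAM_USERS_M = 132   # assumed Steam user base, in millions (figure quoted above)
DLSS_SHARE = 0.30     # ~30% of surveyed PCs report a DLSS-capable (RTX) GPU

dlss_systems_m = STEAM_USERS_M * DLSS_SHARE
print(f"~{dlss_systems_m:.0f} million DLSS-capable systems")  # ~40 million, not 60+
```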
 
Soldato
Joined
14 Aug 2009
Posts
2,932
Didn't they also say that that was what they suspect was happening? I'm not saying they're wrong, as their logic and reasoning seem without fault to me. Are devs going to spend time and money implementing DLSS/XeSS or squashing bugs? The answer seems pretty straightforward, more so when you take into account cross-platform titles.

I said it before, but costs are the biggest barrier Nvidia face when it comes to the adoption of DLSS and RT (maybe less so RT): if the hardware to run it and the development costs to implement it are too high, it's going to die a slow and painful death. Not because it's technologically bad, but because people simply see it as not being worth the extra cost. History is littered with the corpses of great technology that simply proved too expensive for the majority of people - G-Sync and Optane are two recent examples that spring to mind.

If Nvidia truly wanted DLSS & RT to become the standard, they'd be trying to get the hardware that can run it into as many hands as possible, even making a loss on the hardware, so a developer has little choice but to implement those features because 80-90% of people are using hardware that supports it.
DLSS has been supported since the RTX 2000 series - that's almost five years back.
Another way to read it is that 70% of PCs don't support DLSS, which is even more significant.

And probably a huge chunk of that would not play modern games either way. I think I've read somewhere that once you do the work of implementing one tech, there isn't that much left to do for the others. After all, even modders got DLSS working, so it's just bad practice from developers.
 
Associate
Joined
27 Jan 2020
Posts
1,404
Location
West Sussex
HU also sent them a similar question and described in their latest video how they see it. It's a rather simple situation: when Nvidia sponsors a game, the devs still want it to be well supported on as many platforms as possible (to sell more), so they add at least FSR to it, but neither Nvidia nor AMD help with implementing it properly, so it's not always optimal.

When AMD helps implement technologies in a game (sponsoring it), they also have no say in what else the devs will implement, but as FSR works on everything and Nvidia won't assist with games that AMD is sponsoring, the devs simply can't be bothered to add and then support other tech like DLSS or XeSS - it all costs money.

In short, it's on the devs what they implement and neither vendor blocks anything, but cost cutting seems to be the main limitation behind what we see in the game in the end. Penny-pinching devs are the real culprit here.
FSR is used on consoles as well, isn't it?

I think I remember seeing the option on both Modern Warfare 2 and Jedi Survivor - so it's probably more efficient for developers, if they are going to implement/focus on one, to focus on the one that has the much wider reach across platforms (pretty much exactly like you said).
 
Soldato
Joined
12 May 2014
Posts
5,291
And probably a huge chunk of that would not play modern games either way. I think I've read somewhere that once you do the work of implementing one tech, there isn't that much left to do for the others. After all, even modders got DLSS working, so it's just bad practice from developers.
Fair point on the first part.
One complaint I've seen on this forum and elsewhere is that some games poorly implement DLSS/FSR. If your statement were true and it was that simple, there would be no such thing as a good or bad implementation of DLSS/FSR.
 
Associate
Joined
12 Jun 2021
Posts
1,663
Location
Leeds
HU also sent them a similar question and described in their latest video how they see it. It's a rather simple situation: when Nvidia sponsors a game, the devs still want it to be well supported on as many platforms as possible (to sell more), so they add at least FSR to it, but neither Nvidia nor AMD help with implementing it properly, so it's not always optimal.

When AMD helps implement technologies in a game (sponsoring it), they also have no say in what else the devs will implement, but as FSR works on everything and Nvidia won't assist with games that AMD is sponsoring, the devs simply can't be bothered to add and then support other tech like DLSS or XeSS - it all costs money.

In short, it's on the devs what they implement and neither vendor blocks anything, but cost cutting seems to be the main limitation behind what we see in the game in the end. Penny-pinching devs are the real culprit here.

If Nvidia want DLSS to be in all new games, then integrate it into the DX standard. Problem solved.
 
Soldato
Joined
19 Oct 2008
Posts
5,954
If next gen is 2025, maybe we'll get a good bump in performance, especially at the mid to lower end, via a 4000 series refresh this year, hopefully giving gamers what the 4000 series should have been. I think NV will want to boost sales mid-life if it's gonna be dragged out to a three-year life.
 
Associate
Joined
22 Nov 2020
Posts
1,472
If next gen is 2025, maybe we'll get a good bump in performance, especially at the mid to lower end, via a 4000 series refresh this year, hopefully giving gamers what the 4000 series should have been. I think NV will want to boost sales mid-life if it's gonna be dragged out to a three-year life.
What would a good bump be compared to a 3080, performance-wise?
 