NVIDIA 4000 Series

They only care about their profits and the resulting share price.

In the early days of capitalism you would price low and try to sell in large volumes to maximise profits. That's not really the case any more; more and more companies are following the Apple blueprint of pricing high for higher margins per unit sold.

Nvidia can crunch the numbers how they like and whether their profitability and share price increases is of no concern to me. I will buy one of their products when it suits me and my wallet.
 
So do you think Nvidia care?
Yes.

Nvidia wants to make money and there's money to be made by selling graphics cards to gamers. Sure, they sell to other markets (I know AI is big now), but I just can't see any CEO being happy making "most" of the money instead of "all" of it.

AI + Gamers is more money than AI alone.

Also, after the way they got bitten by mining, diversification is more likely to keep profits high through changing markets.
 
If they keep releasing incremental expensive upgrades then people will just keep their 3080 or whatever card they currently have.
But then there will be DLSS 4, or whatever they want to call it, and no, it won't be compatible with all the existing cards out there, so you'll have to sell up and buy a new one. Sorry about that.
 
Is there any reason that DLSS 3 / frame gen won't work on 3000 series cards, or is this another artificial Nvidia FkU?
 
Technically it can work, because some of the components exist on RTX 30. The issue is they are a generation old and not as efficient, so whilst Nvidia could enable it, the performance would be worse and everyone would complain, and that doesn't look good to investors... A few devs have commented on it in interviews with the tech media.

DLSS 2 (currently at version 3.x+) will continue to evolve and improve DLSS performance on all RTX cards, though. So as DLSS 3 implements new super resolution techniques, the DLSS 2 part of the system will get updated too, so all cards benefit. Frame gen is the part that requires a specific minimum hardware level to work efficiently.
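To make that split concrete, here's a minimal sketch of how a game might gate frame generation on GPU generation while offering the super resolution part to every RTX card. None of these names are NVIDIA's real SDK; the struct fields, function and architecture numbers are just placeholders for illustration.

```cpp
// Hypothetical sketch: gate Frame Generation on GPU generation while
// keeping the DLSS super resolution path available to all RTX cards.
// All names here are made up for illustration, not from NVIDIA's SDK.
#include <iostream>
#include <string>

struct GpuInfo {
    std::string name;
    int architecture;                // 20 = Turing, 30 = Ampere, 40 = Ada (assumed labels)
    bool hasOpticalFlowAccelerator;  // the unit Frame Generation leans on
};

struct DlssFeatures {
    bool superResolution = false;    // the "DLSS 2" part: works on all RTX cards
    bool frameGeneration = false;    // the "DLSS 3" addition: 40 series only
};

DlssFeatures querySupportedFeatures(const GpuInfo& gpu) {
    DlssFeatures f;
    f.superResolution = gpu.architecture >= 20;                              // any RTX card
    f.frameGeneration = gpu.architecture >= 40 && gpu.hasOpticalFlowAccelerator;
    return f;
}

int main() {
    GpuInfo rtx3080{"RTX 3080", 30, true};  // has an OFA, but an older, slower one
    GpuInfo rtx4070{"RTX 4070", 40, true};

    for (const auto& gpu : {rtx3080, rtx4070}) {
        const DlssFeatures f = querySupportedFeatures(gpu);
        std::cout << gpu.name << ": super resolution = " << f.superResolution
                  << ", frame generation = " << f.frameGeneration << "\n";
    }
}
```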
 

Ok. Thanks for the reply and taking the time to explain that. So there will be some backwards-compatible improvements to DLSS on older cards too, and Nvidia isn't abandoning them completely?
 
DLSS 2 will always support older cards and continue to get updates. DLSS 3 is a superset of DLSS 2, the major additions in 3 being SER (shader execution reordering, to speed up RT) and Frame Generation. If you disable FG in a game but leave "DLSS 3" enabled, you're just using DLSS 2; it runs a bit better on the 40 series because of the newer-gen Tensor cores, which is to be expected. Assuming it's not a clapped-out 40 series model of course :cry:
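A rough way to picture the "superset" point: with frame generation switched off, the "DLSS 3" and "DLSS 2" paths collapse to the same upscaler. The settings names below are invented for illustration, not taken from any real game or SDK.

```cpp
// Hypothetical sketch of "DLSS 3 is a superset of DLSS 2".
// Setting names are made up for illustration.
#include <iostream>

enum class DlssMode { Off, Dlss2, Dlss3 };

struct GraphicsSettings {
    DlssMode dlss = DlssMode::Dlss3;
    bool frameGeneration = false;    // the extra DLSS 3 feature on top of DLSS 2
};

// Describe what would actually run each frame, given the settings.
void describeEffectivePipeline(const GraphicsSettings& s) {
    if (s.dlss == DlssMode::Off) {
        std::cout << "Native rendering only\n";
        return;
    }
    // The super resolution pass is shared: "DLSS 3" with frame
    // generation disabled behaves like plain DLSS 2.
    std::cout << "DLSS super resolution pass (shared by DLSS 2 and DLSS 3)\n";
    if (s.dlss == DlssMode::Dlss3 && s.frameGeneration) {
        std::cout << "+ Frame Generation (interpolated frames, 40 series only)\n";
    }
}

int main() {
    GraphicsSettings dlss3NoFg{DlssMode::Dlss3, false};
    GraphicsSettings dlss3WithFg{DlssMode::Dlss3, true};
    describeEffectivePipeline(dlss3NoFg);    // effectively DLSS 2
    describeEffectivePipeline(dlss3WithFg);  // DLSS 2 + frame generation
}
```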
 

Lol. Understood. Much appreciated!
 
Keep in mind Nvidia has still to enable that neural filtering tech on RTX cards, which is said to further improve RT performance. I suspect they are holding off on this until AMD finally gets round to doing RT in a proper way, for obvious reasons. This should apply to the RTX 30 series too, so it will be interesting to see what happens there.
 

I wasn't aware such a tech was in the pipeline. Clearly I have some reading to do. To Google...
 
We should be seeing it in Cyberpunk via an update soon :p

 
Nvidia has been doing that every generation now for the last few - supporting tech at the hardware level that won't be used for quite a while. RTX owners are still waiting for GPU-based DirectStorage, which was promised years ago now; the GPUs support it, but software and games support is lacking. It's probably the same with this neural rendering thing: it's a way to speed up RT calculations to improve performance, but RTX 4000 has been out for almost a year now and no game yet uses neural rendering.

It will be interesting to see how this works in practice, because the research makes it sound like:

1) It adds an additional 2-3 ms of rendering time, which means it's not going to work for games with high framerates to begin with. It's going to work best when, with neural rendering off, you're sitting at under 30 fps, so the extra latency from neural rendering is more than offset by the latency reduction from the relative framerate boost (the rough arithmetic sketched below shows why a fixed cost hurts high framerates much more).

2) Neural rendering works by precaching data in real time as the game runs, so to me it sounds like there could be a delay for the end user. For example, you're playing Cyberpunk and you enter a new building; the game has to calculate all this data and gives you, let's say, 30 fps, and then over the next couple of seconds the framerate shoots up to 60 as the neural network caches the data for future frames. That's what I'm worried about: having a more unstable framerate. But perhaps all this is done fast enough that there is no framerate impact - although Digital Foundry has shown how this already happens in some games with the graphics quality, where you turn the camera and stand still and the RT graphics quality improves over the next second or two as it gets more data; it's not fast enough to feel real-time to me and is noticeable.
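To put point 1 in perspective, here's my own back-of-the-envelope calculation, simply assuming a flat 3 ms per-frame cost and ignoring whatever RT speed-up the technique delivers; the figures are illustrative, not measured.

```cpp
// Rough arithmetic: what a fixed ~3 ms per-frame cost does at different
// base framerates, ignoring any speed-up the technique itself provides.
#include <cstdio>

int main() {
    const double overheadMs = 3.0;                 // assumed neural-rendering cost
    const double baseFps[] = {30.0, 60.0, 144.0};

    for (double fps : baseFps) {
        double frameTimeMs = 1000.0 / fps;         // e.g. ~33.3 ms at 30 fps
        double newFrameTimeMs = frameTimeMs + overheadMs;
        double newFps = 1000.0 / newFrameTimeMs;
        std::printf("%6.1f fps -> %5.1f fps (overhead is %.0f%% of the frame budget)\n",
                    fps, newFps, 100.0 * overheadMs / frameTimeMs);
    }
    // At 30 fps the 3 ms is roughly 9% of the frame; at 144 fps it is roughly 43%,
    // so the technique only pays off if its RT speed-up outweighs that fixed cost.
}
```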
 
Is there any reason that DLSS 3 / frame gen won't work on 3000 series cards, or is this another artificial Nvidia FkU?
DLSS 3 is probably artificially locked out on the highest-end RTX 3000 series dGPUs, unless someone really believes the tensor core output of an RTX 3090 Ti is less than that of an RTX 4050... sorry, RTX 4060, or the even more cut-down RTX 4050 which will be released!
 
Exactly, BS limitations so they can sell poor hardware with new software gimmicks: 50 series with DLSS 4 locked to that gen, and so on. This is Nvidia now. In reality, gamers should be asking for better hardware, not software gimmicks, in their purchase. Nvidia is now a software company and even forces reviewers to highlight the gimmicks in their reviews or they may lose free review hardware. You just have to watch some reviewers: they're either super shills or will state "Nvidia asked us to add this in".

Nvidia are slowly going to hang themselves, as other tech companies have done in the past and gone the way of the dodo. Nvidia seems to forget tech industry history and the famous huge companies before it that don't even exist now.
 
Nvidia seem to treat consumers with irritation nowadays. How dare they ask for bang for buck or the best price-to-performance GPUs :rolleyes:?!!!
 

One of the problems with AI at the moment, and neural rendering will probably be no different, is that it gives you an interpretation of the thing but often not the thing itself. Techniques like this might seem to produce a decent result if you don't look too closely, so to speak, but will fall apart when you really push them.
 
We should be seeing it in Cyberpunk via an update soon :p


I expect we'll see about 1fps average improvement if that lol
 