• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA 4000 Series

We shouldn't be needing FSR and DLSS to run titles on higher-end dGPUs. It also highlights how poor a lot of these dGPUs are, and how much performance and VRAM stagnation we have had over the last 5 years. I can understand consoles needing them because they are built to a cost and are relatively affordable. But a lot of these dGPUs will soon fall off a cliff, especially with the stingy amount of VRAM, which is even worse in the under £600 market. By now any card over £200 should have at least 12GB~16GB of VRAM. We went from 2GB to 4GB to 8GB under £400 between 2012 and 2016! AMD has been a bit better in this regard, but it's quite clear they are both in on the act.
 
We shouldn't be needing FSR and DLSS to run titles on higher end dGPUs.
So much this.... I can understand cards two generations old needing the help, but top-end previous-gen and current-gen cards should not need to 'fake it'....

VRAM is such a volatile topic we could get this thread closed if we hit it too hard, but no high-end GPU should have less than 16GB imo; hell, even mid-tier should be 16GB.
 
Telling that the 4090 FE is still available for sale, what, almost twenty hours after the latest drop?
 
We shouldn't be needing FSR and DLSS to run titles on higher-end dGPUs. It also highlights how poor a lot of these dGPUs are, and how much performance and VRAM stagnation we have had over the last 5 years. I can understand consoles needing them because they are built to a cost and are relatively affordable. But a lot of these dGPUs will soon fall off a cliff, especially with the stingy amount of VRAM, which is even worse in the under £600 market. By now any card over £200 should have at least 12GB~16GB of VRAM. We went from 2GB to 4GB to 8GB under £400 between 2012 and 2016! AMD has been a bit better in this regard, but it's quite clear they are both in on the act.

True. But I still really like DLSS. It allows me to run DLDSR, going from 3440x1440 to 5160x2160, which in some games makes a huge difference to image quality. You then use DLSS Quality at this resolution, which essentially brings the rendered resolution back to 3440x1440.

I get the benefit of much better image quality than I would at native for around the same fps.
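For anyone who wants the rough maths behind that, here's a minimal sketch, assuming DLDSR 2.25x (1.5x per axis) and the commonly cited ~2/3 per-axis scale for DLSS Quality; the exact factors are assumptions rather than official figures:

```python
# Rough sketch of the DLDSR + DLSS Quality resolution maths (scale factors are assumptions).
def scale(res, factor):
    w, h = res
    return round(w * factor), round(h * factor)

native = (3440, 1440)

# DLDSR 2.25x renders at 1.5x the native resolution on each axis.
dldsr_target = scale(native, 1.5)        # -> (5160, 2160)

# DLSS Quality then renders internally at roughly 2/3 of that target per axis.
internal = scale(dldsr_target, 2 / 3)    # -> (3440, 1440), i.e. back at native

print(f"Native:        {native}")
print(f"DLDSR target:  {dldsr_target}")
print(f"DLSS internal: {internal}")
```

So you end up shading roughly the same number of pixels as native, while the extra detail comes from downsampling the 5160x2160 image.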

It is one of the reasons I am even considering staying at this resolution rather than going back to 4K when QD-OLED 4K monitors come out.
 
12GB is enough, change my mind.

** Checks the locked threads **

Plenty of info in there; put a handful on ignore, and then when viewing, 90% of the content is filtered for you!

We shouldn't be needing FSR and DLSS to run titles on higher-end dGPUs. It also highlights ...how much performance and VRAM stagnation we have had over the last 5 years. But a lot of these dGPUs will soon fall off a cliff, especially with the stingy amount of VRAM, which is even worse in the under £600 market. By now any card over £200 should have at least 12GB~16GB of VRAM. We went from 2GB to 4GB to 8GB under £400 between 2012 and 2016! AMD has been a bit better in this regard, but it's quite clear they are both in on the act.

Agree. I also said a few years ago, when people were recommending the 2060s over alternatives, that it was a waste of time adding the "...and it can do ray tracing", but what can you do. People know my stance on the VRAM, so no comments; I'm not getting suckered into that debacle anymore. :cry:
 
It is one of the reasons I am even considering staying at this resolution rather than going back to 4K when QD-OLED 4K monitors come out.
Yeah I wouldn't go 4K tbh, especially not with the method you're using. It's still an absolute bitch to run sometimes and just swallows graphics funds!!!
 
So much this.... I can understand cards two generations old needing the help, but top-end previous-gen and current-gen cards should not need to 'fake it'....

VRAM is such a volatile topic we could get this thread closed if we hit it too hard, but no high-end GPU should have less than 16GB imo; hell, even mid-tier should be 16GB.

The reason both of these companies are getting so excited about DLSS/FSR is that it makes them more money.

Reducing the internal rendering resolution and upscaling it means (rough numbers are sketched just after this list):
1.) Less VRAM is required, as lower-resolution textures are generated internally
2.) Memory bandwidth requirements are reduced internally
3.) A weaker dGPU chip can be used, so a lower bill of materials
4.) They can lock newer versions to new models, forcing you to upgrade
5.) Once the new generation is out, they can prioritise software support for the newer models, forcing the older dGPUs to stand on their own merits. This will mean they start to age badly.
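To put very rough numbers on points 1 and 2 above, here's a minimal sketch, assuming a 4K output and the commonly quoted per-axis render scales for the usual upscaler quality modes (the exact ratios are assumptions):

```python
# Rough sketch of how much rendering work upscaling removes (mode ratios are assumptions).
OUTPUT = (3840, 2160)  # 4K output resolution

# Approximate per-axis render scales for typical upscaler quality modes.
MODES = {"Native": 1.0, "Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

out_pixels = OUTPUT[0] * OUTPUT[1]
for mode, axis_scale in MODES.items():
    w, h = round(OUTPUT[0] * axis_scale), round(OUTPUT[1] * axis_scale)
    # Shading cost, render-target memory and bandwidth pressure all track pixel
    # count to a first approximation, so this ratio is the rough saving.
    print(f"{mode:>12}: {w}x{h} -> {w * h / out_pixels:.0%} of the output pixel count")
```

Performance mode at 4K, for example, shades only about a quarter of the output pixels, which is why a much smaller, cheaper chip can hit the same headline resolution.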

True. But I still really like DLSS. It allows me to run DLDSR, going from 3440x1440 to 5160x2160, which in some games makes a huge difference to image quality. You then use DLSS Quality at this resolution, which essentially brings the rendered resolution back to 3440x1440.

I get the benefit of much better image quality than I would at native for around the same fps.

It is one of the reasons I am even considering staying at this resolution rather than going back to 4K when QD-OLED 4K monitors come out.

But it is starting to apply to QHD and 4K with relatively expensive graphics cards, and we all know these new technologies are junk at lower resolutions. We have had QHD monitors for well over a decade, and 4K monitors for a decade. El-cheapo TVs have had 4K for years now. Phones increasingly have QHD- and 4K-equivalent screens.

Yet PC monitor technology has stagnated for years now, and desktop dGPUs still seem to be well behind the curve, especially under £600. People keep moaning that consoles are the reason PC games are not moving forward in graphics, etc. PCMR laughed at consoles using upscaling, saying PC was all about maximum fidelity. Now PC dGPUs are essentially using all the tricks consoles were mocked for, and we are using fake inserted-frame tech, like TVs use to smooth motion, but marketing it as something revolutionary, when it isn't.

This wouldn't matter if we still had good generational improvements top to bottom, but we don't anymore. At best we seem to have one good generation, then one or two useless ones.

But years ago, when PCs started getting 3D accelerators and the PC finally became better at 3D games than consoles, this didn't seem to be a problem (we had games like Unreal, Far Cry, HL2, Crysis, etc. during an era of consoles). The bigger problem seems to be that the under £600 market has started to stagnate, and with Intel pushing only quad cores for the mainstream for a decade, core counts stagnated too. Now we are starting to stagnate at 6~8 cores as well, which is no better than a console.

This means most normal gamers don't have the hardware for devs to really push stuff forward, which makes consoles far closer to the average gaming PC than in, say, the 2000~2010 period. This stagnation is not all down to consoles; it's mostly down to Nvidia/AMD being more worried about what their accountants are saying, unlike earlier, when it was engineers genuinely wanting to outdo their competition. Game devs want to maximise sales, so they target the lowest common denominator.

This is barely a decade ago:

JHH in 2011 said:
After a brief intro, Jensen outlined his vision beginning at 8:20 in the video below, where he talked about the importance of making gaming graphics cards affordable. “We started a company and the business plan basically read something like this,” stated the Nvidia CEO. “We are going to take technology that is only available only in the most expensive workstations. We’re going… to try and reinvent the technology and make it inexpensive.” He went on to explain to the attentive audience that his success was largely that “the killer app was video games.”

Fast forward to 2023, and he is saying Moore's Law is dead, trying to justify overpriced graphics cards and pricing out the average gamer. The money printer can just go brr, right? Where is all this extra money coming from?


Everything we do in the West is financialised for the short term now, even to our own detriment. This is why we have problems with energy prices, food prices, etc.: short-term profits trump long-term planning. It's why we have so much inflation. Then comes more and more QE, cheap credit and low interest rates, so people, companies and countries can borrow more and more to pay for stuff, driving corporate profits higher.

Nobody thinks about what happens once that tap gets switched off, or when we finally have to pay all this off. It's all about the short-term, quick-buck mentality, and then we moan when the rest of the world takes advantage of this to their long-term benefit.
 
Is there any downside to the Founders Edition? Other than a lower boost clock out of the box, is the cooling adequate?

I only have the 3080 Ti FE version, but the build quality is superb and the cooling is great.

If I was ever going to get a 4090, it would be the FE.
 
The reason both of these companies are getting so excited about DLSS/FSR is that it makes them more money.

Reducing the internal rendering resolution and upscaling it means:
1.) Less VRAM is required, as lower-grade textures are generated internally
2.) Memory bandwidth requirements are reduced internally
3.) A weaker dGPU chip can be used, so a lower bill of materials
4.) They can lock newer versions to new models, forcing you to upgrade
5.) Once the new generation is out, they can prioritise software support for the newer models, forcing the older dGPUs to stand on their own merits. This will mean they start to age badly.
Oh yeah, from a business perspective utilising 'fake frames' saves them a huge amount of cash; you can see how Nvidia is trying to 'normalise' DLSS etc.... It's completely wrong, but unless users push back (and let's be honest, outside of people on forums like this they won't know better) they'll just get away with it.
 
Oh yeah, from a business perspective utilising 'fake frames' saves them a huge amount of cash; you can see how Nvidia is trying to 'normalise' DLSS etc.... It's completely wrong, but unless users push back (and let's be honest, outside of people on forums like this they won't know better) they'll just get away with it.

My attitude is, if I have to spend £400~£500 on a dGPU, but then need to use upscaling to run games "natively" without AA at a lowly QHD resolution from day one, then use inserted frames so I can get 60FPS, whilst having to spend £300 on a motherboard for a £200 CPU just to get modern connectivity (because of the gimped mainstream dGPUs), what is the point?

If I am compromising already, whilst spending more and more, then as a mainstream gamer I might as well get a console. This is on top of a lot of AAA games charging more and more upfront, whilst charging even more for "Battle Passes" or even cosmetics, which is even starting to infect single-player games. For older games you don't need the latest hardware, and they tend to be more CPU-limited; with laptop CPUs getting better and better, a laptop would be good enough. For modern AAA games, unless you intend to spend huge amounts on a top-tier dGPU, it makes little economic sense.

Even Steve from HUB was talking to MLID about how well these dGPUs will age. Considering most gamers keep dGPUs for at least two years, and in reality longer in my experience, how is an RTX 4060/RTX 4060 Ti 8GB (really an RTX 4050 8GB and an RTX 4060 8GB) going to do in a few years? More and more people will end up getting put off, as their shiny new £500 card, which cost 2X their previous card, doesn't last anywhere near as long.

In fact, more and more of my friends are either going to consoles, or just playing older/indie games and avoiding newer games if their systems can't run them. As time progresses, I seem to be going that way too. So realistically I will start spending less and less overall, because the need for upgrades will reduce and my interest in the hobby will start to wane.

Edit!!

You are seeing this with smartphones. As Apple/Samsung jacked up prices, people started to keep smartphones for far longer instead of frivolously upgrading. Then a whole load of Chinese companies started taking up market share in the lower-end tiers.
 
Interesting article about the lower-end RTX 4000s being used in gaming laptops and the TDP limits not really making a difference above 100 watts.


Also a YouTuber giving his views on what's happening.

 
My attitude is, if I have to spend £400~£500 on a dGPU, but then need to use upscaling to run games "natively" without AA at a lowly QHD resolution from day one, then use inserted frames so I can get 60FPS, whilst having to spend £300 on a motherboard for a £200 CPU just to get modern connectivity (because of the gimped mainstream dGPUs), what is the point?

If I am compromising already, whilst spending more and more, then as a mainstream gamer I might as well get a console. This is on top of a lot of AAA games charging more and more upfront, whilst charging even more for "Battle Passes" or even cosmetics, which is even starting to infect single-player games. For older games you don't need the latest hardware, and they tend to be more CPU-limited; with laptop CPUs getting better and better, a laptop would be good enough. For modern AAA games, unless you intend to spend huge amounts on a top-tier dGPU, it makes little economic sense.

Even Steve from HUB was talking to MLID about how well these dGPUs will age. Considering most gamers keep dGPUs for at least two years, and in reality longer in my experience, how is an RTX 4060/RTX 4060 Ti 8GB (really an RTX 4050 8GB and an RTX 4060 8GB) going to do in a few years? More and more people will end up getting put off, as their shiny new £500 card, which cost 2X their previous card, doesn't last anywhere near as long.

In fact, more and more of my friends are either going to consoles, or just playing older/indie games and avoiding newer games if their systems can't run them. As time progresses, I seem to be going that way too. So realistically I will start spending less and less overall, because the need for upgrades will reduce and my interest in the hobby will start to wane.
I'm in a strange place where I can use the GPU for work as well as gaming, but at the same time I don't want to 'waste money'. I've said elsewhere that I feel a 4080 is a better match for my 5950X, but the 4090 is far better value at that price range... the 4070 Ti is a no-go; I'd need to replace that far sooner because of the lack of VRAM, not to mention the performance drop-off with newer titles is going to be far sharper than with a card with more VRAM....

You could argue a lot of the current GPU decisions are deliberate, by design, so we buy GPUs more regularly due to forced obsolescence (can't knock Jensen's business brain, even if it is anti-consumer); it's not like PCs need updating as often these days. It really didn't help when we had (insert 'stupid' term of choice) buying GPUs at ludicrous markups during lockdown....

Oh I know that feeling about consoles. I've told myself I'm so close to just buying a console instead of getting a new GPU (I do need an upgrade, just holding out due to stupid costs and cable options...), but then I think about how much gaming I actually do. Do I want to be paying £70-80 for a game, or subscribing (better deal imo), when I probably play an hour a day... and ultimately I just say to myself, at this moment in time... no, I'm still PC.



You are seeing this with smartphones. As Apple/Samsung jacked up prices, people started to keep smartphones for far longer instead of frivolously upgrading. Then a whole load of Chinese companies started taking up market share in the lower-end tiers.
Honestly I haven't seen a need to buy a phone above £200-300 for years. Apart from 'tier snobbery' and 'social pressures', I don't see anything on a top-tier phone I can't do on a good mid-tier phone... but then I'm not playing games on it, and my primary use is a bit of browsing, maps and using it as a phone lol.
 
Honestly I haven't seen a need to buy a phone above £200-300 for years. Apart from 'tier snobbery' and 'social pressures', I don't see anything on a top-tier phone I can't do on a good mid-tier phone... but then I'm not playing games on it, and my primary use is a bit of browsing, maps and using it as a phone lol.
If all you do with your phone is browse the internet, then absolutely, even cheap phones can do that perfectly fine these days.
Now go and try to record low-light video or take low-light pictures.
Most people these days buy high-end smartphones for the cameras alone.
 
If all you do with your phone is browse the internet, then absolutely, even cheap phones can do that perfectly fine these days.
Now go and try to record low-light video or take low-light pictures.
Most people these days buy high-end smartphones for the cameras alone.
I wouldn't exactly say the camera on my phone is bad (Samsung A52s); it has a night mode and it seems to take OK pictures. It's no DSLR, and it has the same issue most cameras have in that the picture takes longer to capture in low light, so camera shake can be an issue (so yes, OIS would be nice to have). Having said that, I'm not expecting a huge amount from a camera on a phone, and I don't like AI-'fixed' images, which seems to be the solution for a lot of 'high-end' cameras.
 
I'm in a strange place where I can use the GPU for work as well as gaming, but at the same time I don't want to 'waste money'. I've said elsewhere that I feel a 4080 is a better match for my 5950X, but the 4090 is far better value at that price range... the 4070 Ti is a no-go; I'd need to replace that far sooner because of the lack of VRAM, not to mention the performance drop-off with newer titles is going to be far sharper than with a card with more VRAM....

You could argue a lot of the current GPU decisions are deliberate, by design, so we buy GPUs more regularly due to forced obsolescence (can't knock Jensen's business brain, even if it is anti-consumer); it's not like PCs need updating as often these days. It really didn't help when we had (insert 'stupid' term of choice) buying GPUs at ludicrous markups during lockdown....

Oh I know that feeling about consoles. I've told myself I'm so close to just buying a console instead of getting a new GPU (I do need an upgrade, just holding out due to stupid costs and cable options...), but then I think about how much gaming I actually do. Do I want to be paying £70-80 for a game, or subscribing (better deal imo), when I probably play an hour a day... and ultimately I just say to myself, at this moment in time... no, I'm still PC.




Honestly I haven't seen a need to buy a phone above £200-300 for years. Apart from 'tier snobbery' and 'social pressures', I don't see anything on a top-tier phone I can't do on a good mid-tier phone... but then I'm not playing games on it, and my primary use is a bit of browsing, maps and using it as a phone lol.

The big issue with my usage habits is that the non-gaming stuff I run will now run fine on a laptop. After all, I have had nothing but SFF systems since 2005, so I already prioritise lower-power, more compact parts anyway. Once we started getting 8- and 16-core CPUs in laptops, with high single-core boosts, these would easily last me 4~5 years for non-gaming usage.

So most of my PC upgrades are driven by gaming, but if the GPUs are an issue, then I have to wonder why I still keep a reasonably up-to-date desktop.

Also, the difference in cost between console and PC versions of AAA games isn't as big now. So if I am spending £500 on a console for three years, instead of a few times that on a PC, the difference in price for the few AAA games I buy won't change the fact that the PC costs more. Plus, no doubt a laptop will run all the older PC games I still play fine, and I already have a laptop, so instead of upgrading two machines, I just upgrade one.
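As a back-of-the-envelope illustration of that trade-off, here's a minimal sketch; every figure in it (hardware prices, game prices, games per year) is a made-up assumption purely to show the shape of the sums:

```python
# Back-of-the-envelope console vs PC cost over a few years (all figures are assumptions).
YEARS = 3
CONSOLE_HW, PC_HW = 500, 1500      # assumed console price vs "a few times that" for a PC
GAMES_PER_YEAR = 4                 # assumed AAA purchases per year
CONSOLE_GAME, PC_GAME = 65, 55     # assumed average AAA price on each platform (£)

console_total = CONSOLE_HW + YEARS * GAMES_PER_YEAR * CONSOLE_GAME
pc_total = PC_HW + YEARS * GAMES_PER_YEAR * PC_GAME
game_saving = YEARS * GAMES_PER_YEAR * (CONSOLE_GAME - PC_GAME)

print(f"Console over {YEARS} years: £{console_total}")
print(f"PC over {YEARS} years:      £{pc_total}")
print(f"Cheaper PC games recoup only £{game_saving} of the £{PC_HW - CONSOLE_HW} hardware premium")
```

With numbers anywhere in that ballpark, the cheaper PC game prices never come close to covering the hardware gap, which is the point being made.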

With smartphones, Apple and Samsung attempted an Nvidia/AMD-style move. But Chinese companies started making better and better smartphones, and one of the areas they innovated in was introducing OIS and telephoto lenses to cheaper smartphones.

This meant companies like Samsung had to respond, as they started to come under huge pressure in markets such as India, etc.


If all you do with your phone is browse the internet, then absolutely, even cheap phones can do that perfectly fine these days.
Now go and try to record low-light video or take low-light pictures.
Most people these days buy high-end smartphones for the cameras alone.
The problem is that, with Chinese companies pushing OIS, etc. into cheaper smartphones, even that reason isn't as big a deal now. Loads of sub-£500 smartphones have OIS and use pixel binning. Many have telephoto lenses. My Samsung S20 FE cost me well under £400, and I have a stabilised main camera, a stabilised 75mm-equivalent lens and a UW lens. Yes, a £1000+ S23 Ultra is better, but it's not like the S20 FE isn't sufficient for most average users who want to take snapshots. And my S20 FE and a brand-new Fuji X-T20 with a few lenses cost me less than a £1000+ smartphone. I have the best of both worlds now.

But TBF, I still consider a £1000 smartphone better value than a £1000 dGPU. It's a multi-functional device, which can serve as a primary computing and photography device for a lot of people for a few years.

I wouldn't exactly say the camera on my phone is bad (Samsung A52s); it has a night mode and it seems to take OK pictures. It's no DSLR, and it has the same issue most cameras have in that the picture takes longer to capture in low light, so camera shake can be an issue (so yes, OIS would be nice to have). Having said that, I'm not expecting a huge amount from a camera on a phone, and I don't like AI-'fixed' images, which seems to be the solution for a lot of 'high-end' cameras.

But considering the cost of the Samsung A52s, it's still decent as a snapshot camera.
 