NVIDIA 4000 Series

Closer to the start of the 4000 series product cycle rather than the end, and sub-£800, would have been nice. But I had a gift card (Christmas bonus) burning a hole in my pocket and nothing else I wanted or needed.

Not sure about the product life cycle bit. Unless you mean because the super just came out? Lol.

I completely understand about the gift card bit though. Assuming it was at least a 3 figure sum.

Anyway. None of that really matters. You have the card now. Enjoy :D
 
Assuming it was at least a 3 figure sum.

Look at the normal price of the GB 4080 Super Gaming OC compared to what I paid :s

Not sure about the product life cycle bit. Unless you mean because the super just came out? Lol.

I mean, if it were back in 2022 and not relatively close to the next gen I'd have been happier. My cut-off point for the 4080 really was £800, and that was back then. But sadly this was the best deal I've managed to come up with.
 
Look at the normal price of the GB 4080 Super Gaming OC compared to what I paid :s

No need. I just find the 4080 very overpriced. To me it was always an £800 card. I mean, I paid £575 for my 4070 Ti (brand new). Is the 4080 worth £325 on top? Not in my book.

Again, just my thoughts. More to do with Nvidia and their pricing than anything else.
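
Just to frame that "worth £325 on top?" question in rough numbers, here's a quick price-per-frame sketch in Python. The prices are the ones above (£575, and £575 + £325 = £900); the average fps figures are purely hypothetical placeholders, so swap in numbers from whichever reviews you trust.

Code:
# Rough price-per-frame comparison.
# Prices are from this thread; the fps values are HYPOTHETICAL
# placeholders, not real benchmark results.
cards = {
    "RTX 4070 Ti": {"price_gbp": 575, "avg_fps_1440p": 100},  # placeholder fps
    "RTX 4080":    {"price_gbp": 900, "avg_fps_1440p": 125},  # placeholder fps
}

for name, card in cards.items():
    pounds_per_frame = card["price_gbp"] / card["avg_fps_1440p"]
    print(f"{name}: £{card['price_gbp']} at {card['avg_fps_1440p']} fps "
          f"= £{pounds_per_frame:.2f} per frame")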
 
No need. I just find the 4080 very overpriced. To me it was always an £800 card. I mean, I paid £575 for my 4070 Ti (brand new). Is the 4080 worth £325 on top? Not in my book.

Again, just my thoughts. More to do with Nvidia and their pricing than anything else.

One of the games I'm playing a bit is Hogwarts Legacy, which can easily push past 12GB of VRAM use, so I'd be looking at 16GB VRAM cards at a minimum. Availability was pretty hit and miss at the retailer I bought from - I did consider the 4090, but I think I'll just keep the extra cash and see what turns up with the 5000 series.
 
One of the games I'm playing a bit is Hogwarts Legacy, which can easily push past 12GB of VRAM use, so I'd be looking at 16GB VRAM cards at a minimum. Availability was pretty hit and miss at the retailer I bought from - I did consider the 4090, but I think I'll just keep the extra cash and see what turns up with the 5000 series.


I don't get why anyone would pay more than £400 for a 12GB GPU. The other odd thing is that these GPUs are marketed as 1440p GPUs - in terms of horsepower they easily are, but even with a couple of existing games you're already having to make compromises because of that VRAM. And I include the RX 7700 XT in this: it's over £400, and these cards sit at anything up to £800.

This is stupid. These things are already becoming obsolete as you buy them; in 2 years, 3 years, even 1 year, they are going to be RTX 3070s all over again.

12GB GPUs are lower mid-tier: the sort of GPUs where, in a year or two, you expect to be turning down textures to stop texture pop-in, blurring and erratic 1% lows causing stuttering. GPUs costing £400+ should be 16GB, and 4K-capable GPUs should be 20GB+. Instead, what we have is lower midrange GPUs with enthusiast-tier pricing.
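
To put some very rough numbers on why 12GB gets tight, here's a back-of-the-envelope VRAM budget sketch in Python. Every pool size in it is an assumption picked for illustration, not a measurement from any real game, but it shows how quickly render targets, a texture pool and OS overhead eat into 12GB.

Code:
# Back-of-the-envelope VRAM budget. All pool sizes below are
# illustrative assumptions, not measurements from a real game.

def render_targets_mb(width, height, bytes_per_pixel=16, num_buffers=8):
    # Rough cost of G-buffer / post-processing / history buffers.
    return width * height * bytes_per_pixel * num_buffers / (1024 ** 2)

TEXTURE_POOL_MB = 8000   # assumed streaming texture pool
GEOMETRY_MISC_MB = 1500  # assumed meshes, BVH, shaders, etc.
OS_OVERHEAD_MB = 800     # assumed desktop / compositor usage

for label, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    total_mb = (render_targets_mb(w, h) + TEXTURE_POOL_MB
                + GEOMETRY_MISC_MB + OS_OVERHEAD_MB)
    print(f"{label}: ~{total_mb / 1024:.1f} GB needed vs 12 GB on the card")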

There needs to be more pushback or they will just keep taking the _____ out of us.

I don't consider the 4070 a true midrange GPU. To me, while not a low-end GPU, it is lower tier, like an RTX 3060 or RX 6600 XT of the previous gen, just horrendously overpriced.
 
Remember though, Hogwarts doesn't represent the wider gaming scope. Its engine was buggy at launch and they never really fixed the memory usage: both VRAM and RAM can exceed 16GB apiece for the game process, meaning you will easily go above 16GB once you factor in background OS resourcing. When the VRAM limit is reached, frametimes drop, leading to stutter as VRAM tasks cache to disk/RAM etc. Hogwarts' engine is still bugged for this; I recently reinstalled the game just to see if anything had changed in 6 months.

I point once again to Cyberpunk as the halo example of how memory management is done: under 8GB of system RAM use, and 10-12GB of VRAM use for the process when path tracing, or 14GB with 2GB of 4K texture mods installed. I simply cannot fathom how and why Hogwarts is so memory hungry when the textures and game world are nowhere near as densely detailed or high quality.
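
If anyone wants to sanity-check these sorts of figures on their own machine, here's a minimal Python sketch that polls overall VRAM use through NVIDIA's NVML bindings (assuming an NVIDIA card and the pynvml package installed). Note it reports whole-GPU usage; per-process numbers like the ones above come from tools such as Task Manager or RTSS instead.

Code:
# Minimal whole-GPU VRAM poller via NVIDIA's NVML bindings.
# Assumes an NVIDIA GPU and `pip install pynvml`.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        print(f"VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()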
 
Remember though, Hogwarts doesn't represent the wider gaming scope. Its engine was buggy at launch and they never really fixed the memory usage: both VRAM and RAM can exceed 16GB apiece for the game process, meaning you will easily go above 16GB once you factor in background OS resourcing. When the VRAM limit is reached, frametimes drop, leading to stutter as VRAM tasks cache to disk/RAM etc. Hogwarts' engine is still bugged for this; I recently reinstalled the game just to see if anything had changed in 6 months.

I point once again to Cyberpunk as the halo example of how memory management is done: under 8GB of system RAM use, and 10-12GB of VRAM use for the process when path tracing, or 14GB with 2GB of 4K texture mods installed. I simply cannot fathom how and why Hogwarts is so memory hungry when the textures and game world are nowhere near as densely detailed or high quality.

You know, this exact argument was used when the 3070 started to show lack-of-VRAM symptoms in games. First it was just a couple of games: "oh, it's because the engine for that game is broken, nothing wrong with the 3070, it's brilliant". Quite soon more and more "broken game engines" started to appear, and it didn't take long before everything but the 3070 was broken.
 
8GB of VRAM is absolutely not enough to run games at 1440p though, that much is a fact, and the 3070 is old enough that it hit the transition period where games started using more VRAM.

There's a difference between game engines in general becoming heavier and so needing a bit more VRAM, and "these 2 games are outright busted and needlessly consume both RAM and VRAM for no good reason", the second game being Starfield, which Bethesda took several months to accept needed patching before they sorted the engine out in terms of optimisation.

From the past 3 years I cannot recall any game other than those 2 eating up (V)RAM like nothing else. The Last of Us Part I did have some optimisation issues too, but that was mostly related to shader compilation rather than outright memory leaking.

In my view 16GB should be the bottom end baseline for VRAM in 2024. Any card aimed at 1440P gaming coming out with less is fit for the toilet.
 
It was 8GB/10GB excuses, now it's 12GB excuses.

My 7900 XTX has seen >19GB of VRAM used with 18GB of system RAM; comparable usage in the same game on my 4070 is 12GB of VRAM with 22.5GB of system RAM and extremely bad texture pop-in.

For years now there have been far too many in the community making all sorts of excuses for, and defending, **** poor, paltry VRAM allocation. It doesn't help when it's the Apple technique of pricing high to begin with, then adding an even higher entry price just for VRAM.

This gen, until recently, entry to 16GB at the high end was round about £1200 for NV. :(

The No.1 GPU on Steam is a 12GB 3060. NV tried to replace it with a barely faster 8GB 4060 (when it doesn't run out of VRAM) plus added software, and people wonder why GPU sales are down.

They even marketed the 4060's 8GB as enough because of the extra cache, but when asked why there was a 16GB version, the answer was more or less that sometimes 8GB isn't enough.
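
On the cache argument: a big L2 can help a narrow memory bus by serving repeated reads, but it does nothing for capacity once the working set no longer fits in 8GB. A rough sketch of that distinction in Python below; the bandwidth and hit-rate numbers are assumptions for illustration, not measured 4060 figures.

Code:
# Why a large cache helps effective bandwidth but not VRAM capacity.
# All numbers are illustrative assumptions, not measured 4060 figures.

def effective_bandwidth(dram_gbps, cache_gbps, hit_rate):
    # Blended bandwidth seen by the shaders for a given L2 hit rate.
    return hit_rate * cache_gbps + (1 - hit_rate) * dram_gbps

DRAM_GBPS = 270    # assumed bandwidth of a narrow 128-bit bus
CACHE_GBPS = 1000  # assumed on-die L2 bandwidth
print(f"Effective bandwidth at 60% hit rate: "
      f"{effective_bandwidth(DRAM_GBPS, CACHE_GBPS, 0.6):.0f} GB/s")

# Capacity, however, is a hard limit regardless of cache:
working_set_gb = 11  # assumed VRAM demand of a heavy 1440p game
card_vram_gb = 8
spill_gb = max(0, working_set_gb - card_vram_gb)
print(f"Overflow into system RAM / paging: {spill_gb} GB -> stutter risk")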
 
It used to be that a texture was a single image representing the surface of an asset: the Albedo or Colour Texture.

GPU shaders then became a little more advanced: with a Specular Map, the shader could read a greyscale representing the full range of light reflected off the asset. Now we have two image files for one texture.

Then GPU shaders became a little more advanced again: now they can read a map representing the bump scale of an asset, so you get actual texture in your texture. Now you have 3 images in your texture...

I could go on. GPU shaders are still getting better and better, reading more and more aspects of a texture to display more detail in the asset. Each such aspect requires its own image file in a now extensive texture stack, maybe 6 or more... and each image takes up its own space, i.e. VRAM.
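
To put rough numbers on that stack, here's a small Python sketch estimating the VRAM footprint of one material built from several 4K maps. The map list, resolution and compression rates are assumptions for illustration; real engines mix resolutions, pack channels and stream mips, but the point stands that every extra map costs memory.

Code:
# Rough VRAM footprint of one material's texture stack.
# Map list, resolution and compression rates are illustrative assumptions.

MIP_CHAIN_FACTOR = 4 / 3  # a full mip chain adds roughly a third
BYTES_PER_TEXEL = {       # typical block-compressed rates
    "albedo": 1.0,              # e.g. BC7
    "normal": 1.0,              # e.g. BC5/BC7
    "roughness_metal_ao": 1.0,  # packed into one map
    "specular": 0.5,            # e.g. BC4 greyscale
    "emissive": 1.0,
    "height": 0.5,
}

def material_mb(width=4096, height=4096):
    texels = width * height
    total_bytes = sum(texels * bpt * MIP_CHAIN_FACTOR
                      for bpt in BYTES_PER_TEXEL.values())
    return total_bytes / (1024 ** 2)

per_material = material_mb()
print(f"One 4K material (6 maps): ~{per_material:.0f} MB")
print(f"100 such materials resident: ~{100 * per_material / 1024:.1f} GB")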

What people think of as a broken engine, because their favourite GPU struggles to run it, is in fact just an engine that's too demanding for their overpriced, crappy GPU.

 
Just want to make sure: is this the cable I need? The one in the box is ugly as hell to me.
I have:

Asus GeForce RTX 4080 SUPER ROG Strix OC White

ASUS ROG Thor 850 W Platinum



Yes, it's for Asus / Seasonic PSUs, same thing: the PSU is probably Seasonic inside, like my EVGA and quite a lot of high-end PSUs. CableMod are also a good company.
 
Just want to make sure: is this the cable I need? The one in the box is ugly as hell to me.
I have:

Asus GeForce RTX 4080 SUPER ROG Strix OC White

ASUS ROG Thor 850 W Platinum


Are the CableMod ones safe to use after the problems reported with their angled connectors?
 
Remember though, Hogwarts doesn't represent the wider gaming scope. Its engine was buggy at launch and they never really fixed the memory usage: both VRAM and RAM can exceed 16GB apiece for the game process, meaning you will easily go above 16GB once you factor in background OS resourcing. When the VRAM limit is reached, frametimes drop, leading to stutter as VRAM tasks cache to disk/RAM etc. Hogwarts' engine is still bugged for this; I recently reinstalled the game just to see if anything had changed in 6 months.

I point once again to Cyberpunk as the halo example of how memory management is done: under 8GB of system RAM use, and 10-12GB of VRAM use for the process when path tracing, or 14GB with 2GB of 4K texture mods installed. I simply cannot fathom how and why Hogwarts is so memory hungry when the textures and game world are nowhere near as densely detailed or high quality.

One thing I would say: Hogwarts Legacy is actually quite detailed in places, though the overall graphics fidelity is a bit mixed. Some parts of the world are modelled extensively on the movie version of the setting with lots of fiddly detail. Ironically, they are also parts of the game which are quite underused, like the station, but they are often loaded in anyway, chugging a lot of VRAM which isn't beneficial to what is being seen on screen - a bit of a curse of the franchise, I guess.
 
GPUs across the range have been too expensive for 3+ generations now in my opinion.
1000 series Nvidia GPUs weren't cheap but look reasonable by today's standards. AMD's offerings at that time were probably overpriced in comparison too.
The 2000 series Nvidia cards and their AMD counterparts didn't really seem to offer much.
The 3000 series Nvidia cards were just hard to get hold of because of the mining craze, and as I recall this was the start of the price hikes. Again, AMD wasn't really doing any better.
The 4000 series Nvidia cards are all just silly prices, and there are only about 2 cards that might seem somewhat good (the 4070 for its relative price/performance and the 4090 for its pure performance). AMD cards weren't really any better, but did have a bit more VRAM; I don't feel that justified their prices either.

*Gently pats his 1080 Tis* (not really, as the PC's on and flaky enough as it is)

It's easy to say the current gen is too expensive and that you'll stick with what you have when you're on a last-gen card, but it's a real head-scratcher trying to decide what to get if you're a few generations out of date and "needing" to get something.
I don't imagine next gen will be any better; the situation will be the same. It's easier to skip and bemoan if you're on the current gen, or maybe the previous gen, but for those who want or need a new GPU, what choice will you have?
 