
NVIDIA 4000 Series

Simple question: upgrading from an OG 3080, does an 8700K running at 3440x1440 have enough grunt to power a 4080 Super?
You are likely to see some benefit in more GPU-heavy games, so I wouldn't call it pointless, but you really need something like a 7800X3D to get the most out of those cards.

This can give you an idea, if you take the 9900K as equivalent to yours:
 
Depends on the upgrade price ultimately. A 12600K or above will comfortably satisfy even a 4090 at 1440p or above, and where it doesn't, you can just use DLDSR to push the load back onto the GPU. I'd say the baseline is at least a 12th-gen i5 to not have to worry about the CPU being the limiter for some years to come. A bigger X3D chip will further lift those percentage lows for sure, and with that comes higher averages.
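As a rough illustration of that CPU-vs-GPU limiter point, here is a minimal Python sketch; the frame times and the 2.25x DLDSR scaling factor are made-up assumptions for illustration only, not benchmarks. The frame rate is roughly capped by whichever of the CPU or GPU takes longer per frame, and raising the internal render resolution with DLDSR adds almost purely GPU work, which is why it can move the bottleneck back onto the GPU.

```python
# Rough bottleneck sketch -- all numbers below are made-up illustrations,
# not benchmarks. Frame rate is roughly capped by whichever of the CPU or
# GPU takes longer per frame.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frame rate when CPU and GPU work largely in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

def scaled_gpu_ms(gpu_ms: float, base_res: tuple, target_res: tuple) -> float:
    """Scale GPU frame time with pixel count (ignores fixed per-frame costs)."""
    return gpu_ms * (target_res[0] * target_res[1]) / (base_res[0] * base_res[1])

# Hypothetical case: an older CPU needs 10 ms/frame, a 4080-class GPU needs
# 7 ms/frame at native 3440x1440 -- the CPU is the limiter (~100 fps).
cpu_ms, gpu_ms = 10.0, 7.0
print(f"Native 3440x1440: {fps(cpu_ms, gpu_ms):.0f} fps (CPU-bound)")

# DLDSR 2.25x renders internally at 1.5x per axis (5160x2160); the extra
# pixels land on the GPU, so it becomes the limiter again (~63 fps here).
gpu_dldsr = scaled_gpu_ms(gpu_ms, (3440, 1440), (5160, 2160))
print(f"DLDSR 2.25x:      {fps(cpu_ms, gpu_dldsr):.0f} fps (GPU-bound)")
```

With these assumed numbers the CPU caps things at about 100 fps natively, whereas under DLDSR the GPU is back in charge at roughly 63 fps with a sharper image, which is the trade-off being described above.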
 
It'll run it well enough for you to get a decent fps boost imo, but yes you will be bottlenecked in certain games/situations.

Worth doing if you plan on a platform upgrade at some point soon though imo.

I have the start of a new system that has been gathering dust in the corner of my room: ROG Strix X670E, 32GB DDR5, and a Lancool 216 case. I just need a CPU and PSU. Truth be told, it feels like a gigantic effort to build a system these days.
 
Right? I've built loads of systems in my time but these days it just feels like a bigger thing even though it's technically getting easier to do. Maybe I'm getting old or maybe it's the stress of GPUs costing £1000+...
 
Yup, I feel the same way. I know I need to upgrade my CPU, but I really can't be bothered to get it done.
I think I may end up just getting a new case, PSU, CPU and cooler all at the same time, so I don't have to faff around with the current one.
 
The problem is typically not knowing which kit to get; even for something as simple as the CPU cooler there's an endless choice of AIO vs air cooling.

For me it was a case of stalking what's out and knowing my budget, then fixating on a particular one based on what some choice reviewers like Gamers Nexus say about it. So if it was an AIO at the time, then the Freezer II Pro was the go-to given its low-price, high-performance standing.

GPU on the other hand... It took a year+ after getting the 3080 Ti FE to figure out that a 24GB card was necessary, as much of my editing consumes most of that VRAM, and there was only one logical option there, so not really a hard choice :o
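If you want to check whether your own workload really fills the VRAM before committing to a 16GB or 24GB card, one simple approach is to poll the driver while the job runs. A minimal sketch, assuming an NVIDIA driver with nvidia-smi on the PATH; the GPU index and the 5-second polling interval are arbitrary choices:

```python
# Minimal sketch: poll VRAM usage while an editing/rendering job runs, so you
# can see whether a 12/16/24GB card is actually needed for your workload.
# Assumes nvidia-smi is on PATH; the 5-second interval is arbitrary.
import subprocess
import time

def vram_used_mib(gpu_index: int = 0) -> tuple[int, int]:
    """Return (used, total) VRAM in MiB for one GPU via nvidia-smi's CSV query."""
    out = subprocess.check_output([
        "nvidia-smi",
        f"--id={gpu_index}",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ], text=True)
    used, total = (int(v.strip()) for v in out.split(","))
    return used, total

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used, total = vram_used_mib()
            peak = max(peak, used)
            print(f"VRAM: {used}/{total} MiB (peak {peak} MiB)")
            time.sleep(5)
    except KeyboardInterrupt:
        print(f"Peak VRAM observed: {peak} MiB")
```

Run it in a second terminal while exporting or scrubbing a heavy project and note the peak; if it regularly sits near the card's total, the larger-VRAM option is the one that pays off.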

I would not want to go through motherboard selection again though; being an early adopter of 12th gen at launch meant a whole year+ of BIOS bugs to deal with before Gigabyte sorted it out. And it's not like other brands were much better off either. I just preferred the hardware on GB boards, like the multi-phase power circuitry and so on.
 
If Nvidia uses 8GB again for its mid-range cards on the 5000 series, those are going to be DOA. 12GB really needs to be the bare minimum for mid-range cards in 2024. 8GB should only be used for extremely low-end cards now, as it hasn't been enough for a while, especially now.
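To put a rough number on why 8GB runs out so quickly, here is a purely illustrative back-of-envelope in Python; every figure in it (render-target count, texture pool, geometry and overhead budgets) is an assumption picked for illustration, not measured data from any specific game:

```python
# Purely illustrative back-of-envelope (all figures are assumptions, not
# measurements): roughly where VRAM goes at 1440p with modern settings.

def mib(num_bytes: float) -> float:
    """Convert bytes to MiB."""
    return num_bytes / (1024 ** 2)

width, height = 2560, 1440
pixels = width * height

# A handful of render targets (G-buffer layers, HDR buffer, motion vectors,
# etc.); assume ~10 targets averaging 8 bytes per pixel.
render_targets_mib = mib(pixels * 8 * 10)

# Assumed budgets for everything else (again, illustrative only):
textures_mib = 5500   # high-res texture pool in a current AAA title
geometry_mib = 800    # meshes, BVH for ray tracing, etc.
other_mib = 700       # shaders, driver/OS overhead, frame queue

total_mib = render_targets_mib + textures_mib + geometry_mib + other_mib
print(f"Render targets: ~{render_targets_mib:.0f} MiB")
print(f"Estimated total: ~{total_mib:.0f} MiB "
      f"({total_mib / 1024:.1f} GiB vs 8 GiB = 8192 MiB)")
```

At these assumed budgets an 8GB card is already at the edge, with no headroom left for higher texture settings, frame-generation buffers or background apps, which is the thrust of the 12GB-minimum argument.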
 
Agreed, but no one forces a gamer to buy such a crap product, and we all knew this was going to happen. I hardly blame Nvidia for selling a cut-down card for daft money; it's in the business of making money. Blame those that bought, and continue to buy, 8GB cards.
The 4060 Ti is a 1080p card imo, that just happens to cost mid-range money ;)
 
Yup, 8GB was great when it released 4 years ago... but buying any new gen now for £500+? You really shouldn't be charging any more than that, at most, for 8GB. Nvidia seem to love to bend people over to get them to pay through the nose for more VRAM :cry:
 
Nvidia markets this as a 1080p card, and I am seeing a lot of 1440p benchmarks in the video...
and there's still a $100 difference between the 16GB and 8GB variants,
so the basis of the benchmarking isn't clear enough.

It's HUB doing what they do best: clickbait and stirring up controversy :)

Their points are valid, but they often miss the point/context, and when called out by more experienced/knowledgeable folk like DF/Alex, they throw their toys out of the pram, e.g. HUB blaming the 8GB GPUs for the **** show of how TLOU ran at launch, when DF expertly explained the underlying issues present in the PC version, which simply weren't there in the PS5 version and were then resolved by patches.
 
That's where I am fundamentally undecided. Are they saying 16GB should be the minimum VRAM for all cards, and that if a company wants to offer a lower-priced 8GB version it just shouldn't bother? Effectively they are advocating reducing the span of choices available to the consumer... because dictating absolute prices to Nvidia is just not going to work.
 