NVIDIA 4000 Series

There is no upside to any of this for anyone. You're basically saying, oh good, I can hang on to this overpriced 4080 because the 5080 is even more overpriced. Then I can buy a 6080, which is going to be even worse. The last 'not terrible' (I'm reluctant to say 'good') Nvidia GPU was the 3080, assuming you could get one at RRP. Developers have been releasing games that run terribly on most gamers' PCs for ages now (because people keep buying them); the 50 series makes no difference to that.

I hate to sound like a zoomer, but this is massive cope.

Graphics cards are like consoles now: you can just keep your old card for 5-7 years and get playable performance.

In the old days you had to upgrade every generation or games were basically unplayable, but now it's all much of a muchness, especially over the last few years.
 
Too right, I am not fond of the dark days of getting something like a 9700 Pro and having it struggle to run typical games only a year later.
Sometimes you'd get cards that were current but didn't properly support the latest hotness at all. The Nvidia FX 5000 series was infamous for this: it claimed Shader Model 2.0 support, but Half-Life 2 and Far Cry came out the same year and showed that to be a massive lie.
To be fair, progress in PC graphics was absolutely breakneck back then, but it was a horribly expensive hobby at the time.
 
Those were fun times though man. Used to love reading about new cards. Doing benchmarks etc.

These days you can barely overclock, and benching for more than 5 minutes is a waste of time; it's not the fun it used to be.
 
So I thought I'd bench my new 4090 for a bit, see what it's capable of. I played around with the power limit because I heard it was an incredibly efficient card. Benched Speed Way at the 100% power limit and then the 70% power limit:

100% power limit: https://www.3dmark.com/sw/1845136 average fps of 100.23
70% power limit: https://www.3dmark.com/sw/1845149 average fps of 91.38

30% less power for just 8.8% fewer frames per second; I am very impressed. I will probably keep it this way.
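
For anyone who wants to check the maths, here's a quick back-of-the-envelope in Python. The 450 W stock limit is an assumption (the reference 4090 figure; your card's cap may differ), and actual in-game draw can sit below the cap:

    # Worked numbers from the two Speed Way runs above.
    stock_watts = 450.0                 # assumed 100% power limit
    fps_100, fps_70 = 100.23, 91.38     # average fps at 100% and 70%

    fps_loss = 1 - fps_70 / fps_100
    print(f"fps lost at the 70% limit: {fps_loss:.1%}")           # ~8.8%

    # frames per watt of allowed power at each cap
    print(f"fps/W at 100%: {fps_100 / stock_watts:.3f}")          # ~0.223
    print(f"fps/W at  70%: {fps_70 / (0.70 * stock_watts):.3f}")  # ~0.290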
 
Power limit is OK, but an undervolt via the curve is even better!
 
Personally I think it's easier to overclock (i.e. raise the curve with a core increase), make sure that's stable, and then 'undervolt' by dropping the power limit. You get the exact same outcome without altering the factory curve. It's also much less fiddly.
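
If you'd rather script the power-limit route than click through Afterburner, nvidia-smi can set it directly. A minimal sketch, assuming a 450 W stock limit (so 315 W is 70%); it needs admin rights and reverts on reboot:

    import subprocess

    # Cap the board power at 315 W (~70% of an assumed 450 W stock limit)
    # without touching the voltage/frequency curve.
    subprocess.run(["nvidia-smi", "-pl", "315"], check=True)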
 
A UV typically also means adding an OC, and you still use less power. My 4090 is +1100 on VRAM and the core is locked to its 2730 MHz boost clock. It uses around 100-150 watts less power in games as a result and actually gains fps lol.
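
A rough command-line sketch of that clock lock (2730 MHz is this particular card's boost clock, not a universal value, and the VRAM offset still needs Afterburner or similar; note this only caps the clock, the driver still picks the voltage point):

    import subprocess

    # Pin the core clock range so it tops out at the boost clock.
    # Undo with: nvidia-smi -rgc
    subprocess.run(["nvidia-smi", "--lock-gpu-clocks=210,2730"], check=True)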
 
Not all cards work well with that though; mine hates me touching voltages at all with a passion (as in the UV itself). I'd have to take a different approach, but I can't be bothered that much, so I usually just power-limit it by 20%, with barely any performance loss. I reckon I'll play with it more in summer; currently I am happy to unleash the power beast to supplement my room's heating. :)
 
'Undervolting' is kind of a misleading phrase imo. You're not really under-volting, you're correcting the factory OVER volting :cry:
 
Interesting thread re UV etc. Never tried it, but I noticed while pushing up the OC on the 4090 (what else do you do whilst waiting on that other thing with 090 in it) that there was no increase in fps. So I'm going to try this, really as practice for the next one.
 

It's just because cards are not binned anymore. They don't take the time to test individual voltage curves for the cards; they just slap a high stock voltage on everything and ship it out.

In the old days when binning was still a thing, if you bought an Asus Strix it would either overclock very high or be able to run on a very low voltage.

These days you can buy a Strix that is garbage and unstable if you UV; it's now a lottery. A £2k 5090 may be a better undervolter than a £3.5k 5090, and you won't know till you buy it.
 
Anyone noticed the overlay/HUD thing is wrong since the update? Mine shows 99% GPU usage in games but reckons it's running at 860 MHz lol.

edit: I just checked MSI Afterburner and for some reason it had changed to a 33% power limit... no idea how I've been playing games fine like that! That said, I'm playing FO76 atm and it keeps flicking from 1590 to 2800 MHz and other random frequencies, so something is definitely up.
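
One way to see what the card is actually doing, independent of the overlay, is to poll nvidia-smi directly; a minimal sketch (the query fields are standard ones, and -l 1 repeats every second):

    import subprocess

    # Poll the real core clock, power draw and the currently enforced
    # power limit once a second, to cross-check the overlay.
    subprocess.run([
        "nvidia-smi",
        "--query-gpu=clocks.gr,power.draw,power.limit",
        "--format=csv", "-l", "1",
    ])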
 
Depends what level of GPU you're looking at, but GPU core grunt is way more important than VRAM. 10GB is still perfectly fine at 1440p unless you're trying to run 'psycho' level textures (looking at you, Cyberpunk), but at that level of texture settings you'd always need a more powerful card anyway (for decent performance), which by default will have more VRAM, i.e. a higher grade card.

4K I think is where VRAM starts to matter more, but again you need a high-performing card for decent performance at 4K anyway.

There will always be the freak game that is sensitive to VRAM; I think Hogwarts was one of those, but that was only an issue on low-end GPUs if I remember rightly.
 
Is it safe to say that it's not worth buying a GPU with less than 16GB of VRAM for a 1440p screen? Some cards have 10GB, which is a laughable amount.
Imo it's only worth buying a GPU with less than 16GB at 1440p if you get it on the cheap, or if you turn on upscaling (accepting the image-quality cost) to reduce VRAM use and enable higher texture settings that would otherwise push past 8/10/12GB (textures are cheap on GPU grunt but expensive on VRAM).

But getting it on the cheap doesn't stop it running out of VRAM if it only had 10GB or 12GB in the first place.
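
If you want to see whether a given title actually hits the limit rather than guessing, you can log usage while playing; a minimal sketch (bear in mind allocated VRAM isn't always VRAM the game truly needs):

    import subprocess

    # Log VRAM usage once a second while the game runs; usage pinned at
    # the card's total is the warning sign.
    subprocess.run([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv", "-l", "1",
    ])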
 
Got 12GB. Playing so many new games. Not running out :D
 
Despite saying previously that you've run out with 12GB :shrugs:

Besides, not running out at X settings doesn't mean VRAM won't run out at Y settings, as clearly shown by the multitude of users reporting running out of 8/10/12GB of VRAM; even some 4080 users have reported running out of 16GB in here and in the Games sub.

The GPU-grunt-to-VRAM balance went out the window on 70 and even 80 class GPUs back in 2022; VRAM is the problem. Turning textures down from max is a must in some titles - Avatar, SWO, Indy, SM2 + texture pack - which I've run on the 4070 at 1440p. And that's not counting the games you have to restart because of performance drop-off during a heavy session, or automatic texture degradation and slower texture loading. Plenty of tech vids have been reporting it for years at this point.

Last year DF eventually jumped on reporting VRAM deficiencies even in Nvidia titles; Nvidia clearly don't care at this point.


A 16GB 70 Ti is likely going to cost upwards of £800; they've successfully VRAM-locked the performance tiers because we'll buy anything they throw at us.
 