£350 for an 8GB card is hard to swallow in 2023 though. Same with the 4070's 12GB: versus the 7800 XT, in theory 4-5% slower for 50W less isn't a bad trade, but 12GB is too little for £500+ at this stage.
So I'm an AMD guy but own a 4070. Compared to AMD (which I also own in my 2nd rig, along with both machines' CPUs), the 4070 uses literally 3.5-4GB less VRAM than a 6800 XT in EVERY game me and my mate have compared at the exact same settings/res. Whether that's games being better optimised for Nvidia than for AMD, I don't know. But my point is: if I use 4GB less than a 16GB card, I've still got the same amount of VRAM left in reserve as that 16GB card, because it's burning through an extra 3.5-4GB more than me.
Even the 'allocated' amount differs by around the same margin, and neither brand's card ever seems to come close to actually using what it allocates. Pair that with how good DLSS 3.5/frame generation is and I don't think I'd have a problem using this card for 3 years...
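To put some rough numbers on that logic (these are made-up illustration figures, not my logged results): if both cards are running the same game and the AMD card is using ~4GB more to do it, the leftover VRAM comes out about the same on a 12GB 4070 and a 16GB 6800 XT.

```python
# Toy headroom arithmetic with hypothetical example numbers (not my logged figures).
CARD_VRAM = {"RTX 4070": 12.0, "RX 6800 XT": 16.0}  # capacity in GB

def headroom(card: str, used_gb: float) -> float:
    """Free VRAM left on the card after the game's actual usage."""
    return CARD_VRAM[card] - used_gb

# Hypothetical game: 6.5GB actually used on the 4070, ~4GB more on the 6800 XT.
used_4070 = 6.5
used_6800xt = used_4070 + 4.0

print(f"4070 headroom:    {headroom('RTX 4070', used_4070):.1f} GB")      # 5.5 GB
print(f"6800 XT headroom: {headroom('RX 6800 XT', used_6800xt):.1f} GB")  # 5.5 GB
```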
Regarding the wattage, a 4070 undervolts to a real-world usage of 105-145W: 145W being native 1440p with RT on at max settings in Control, and around 105-125W in TLOU at native 1440p ultra (in the most demanding areas it'll sometimes go to 135W). So in reality that's nearly 120W less consumption. From my experience undervolting my AMD cards, I'd imagine it wouldn't be unreasonable to get the 7800 XT down to, say, 180-190W, but not much less, seeing as the 6800 XT doesn't undervolt that far below its stock draw.
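If anyone wants to sanity-check their own card's real draw rather than take my numbers on trust, here's a rough Python sketch that polls nvidia-smi once a second while you play. It assumes an Nvidia card with nvidia-smi on the PATH (it ships with the driver); on the AMD side you'd watch the Adrenalin overlay or HWiNFO instead.

```python
import subprocess
import time

def log_power_draw(seconds: int = 60, interval: float = 1.0) -> None:
    """Poll nvidia-smi and print the board power draw in watts (first GPU only)."""
    samples = []
    for _ in range(int(seconds / interval)):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        watts = float(out.splitlines()[0])
        samples.append(watts)
        print(f"{watts:.1f} W")
        time.sleep(interval)
    print(f"avg {sum(samples) / len(samples):.1f} W, peak {max(samples):.1f} W")

if __name__ == "__main__":
    log_power_draw(seconds=120)  # run this while the game is loaded and busy
```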
The 4070, IMHO, has that RX 6600 XT kind of power-consumption-to-performance factor, which is why I chose it. You could get them for £520 last month as well, so £40 more than a 7800 XT; I wouldn't call it a rip-off at this point.
FWIW mine never even turns its fans on (65°C fan threshold at stock) and runs on average at 53-57°C at native 1440p ultra.
So when your Nvidia card uses 3.5-4GB less VRAM in reality vs AMD cards, it's the equivalent of a 16GB card wasting an extra 3.5-4GB in all current and previous games. I've tested 40 games since buying it in July and compared them to my mate's 6800 XT, and we've yet to find a single game where his 6800 XT isn't the hungrier card for VRAM... I'd put money on a 7800 XT/7900 XT/XTX doing the same at the same res/settings...
This seems to be something people don't want to admit when banging on and on about having another 4GB over these cards or the 3080 etc... And remember, I'm an AMD guy myself, but I won't be biased, hence why I gave this card a chance - or it'd have been sent straight back under the 14-day no-questions-asked return policy!
I'm in no way saying this is AMD's fault, FWIW. Maybe games are just written to be better optimised on Nvidia in terms of VRAM usage vs performance achieved? Who knows, but it is a fact that this happens with my specific card at least.
I'll stress this point: I'm talking about actual usage while playing, not the allocation predicted in the settings menu...
So TL;DR, when you're playing a game on my card or a 6800 XT, both cards' actual VRAM usage (NOT the predicted allocated amount in the settings menu) ends up leaving the same amount of 'free' VRAM, because the 16GB 6800 XT uses 3.5-4GB more per game than my 4070 at the same settings/res natively. A 16GB card therefore ends up with the same remaining VRAM as my 12GB 4070... That means the whole 12GB worry isn't a factor with my card vs, say, a 6700 XT...
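For anyone who wants to log it themselves, here's a rough Python sketch of one way to sample actual VRAM in use once a second via nvidia-smi. This isn't the exact tooling we used (we just watched an overlay, which reports a similar figure), and it assumes an Nvidia card with nvidia-smi on the PATH; the number it reports covers everything resident on the GPU, not just the game, but that's the figure that matters for running out.

```python
import subprocess
import time

def log_vram(seconds: int = 60, interval: float = 1.0) -> None:
    """Sample actual VRAM in use as reported by the driver (first GPU only, MiB)."""
    peak_used = 0.0
    for _ in range(int(seconds / interval)):
        line = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip().splitlines()[0]
        used, total = (float(x) for x in line.split(","))
        peak_used = max(peak_used, used)
        print(f"used {used:.0f} MiB / {total:.0f} MiB  (free {total - used:.0f} MiB)")
        time.sleep(interval)
    print(f"peak usage: {peak_used:.0f} MiB")

if __name__ == "__main__":
    log_vram(seconds=300)  # leave it running while you play a demanding area
```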
I have tested this in pretty much every current and previous game (bar BG3, as that's not my cup of tea) and have had the same consistency throughout.
I am not in any way trolling; in reality that IS what my card/system uses at 1440p ultra native, and I've tested it against my mate's system with a 6800 XT... So I can't really be any fairer than that when it comes to actual ownership/real-world testing.
The first thing I did was test all this when I tried out the card. If the actual in-game usage had been sky high, I'd have used the 14-day returns policy, sent it packing and got a different card.