No mate, as I said in my original post, it's paid for itself in 3 years and given me a whisker under £600 back, and it will live in my SFF 2nd rig and replace the AMD card in there at that point.
I wasn't looking for validation. I was just making a point that no one seems to mention while slating lower-VRAM cards like mine: the actual utilisation ends up lower than what the game's settings predict/allocate, and more efficient than on the thirstier, less optimised rival 6800 XT, so ironically we both end up using about the same amount of VRAM and neither card should have a problem for a few years. I've tested this in nearly 40 games, new and slightly older, alongside a friend with pretty much the same game library, so it isn't a fluke, and none of it was with DLSS or RT on.
I just thought this was a valid point, seeing as I'm an AMD guy who took a walk on the wild side, and I was shocked at the bias I saw against my card considering it was only £520, which is £40 more than a 7800 XT. That extra £40 could be worth considering for some people for the RT, DLSS 3.5, frame generation, Reflex low-latency mode, etc.
I'm just genuinely shocked no one said, "Oh, that's decent, at least you're not worrying about how much it'll use. The 4070 must be better optimised, or games are written/optimised better for Nvidia (through admitted bias/sponsorship), so that negates the lower VRAM."
There was literally zero bias or argument intended. It just sounded like people were confusing the 'allocated/predicted' figure in the game's settings with what you actually use, and then telling me all the readouts are wrong while pointing me to a video that uses the same readouts/programs is ridiculous. That's like saying all the sensors/temps are lying too. Then why have them?
If anything, it came across as biased against me, versus someone saying, "Oh, that's interesting real-world usage, you shouldn't have a problem given how well optimised that appears to be."
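
For anyone who'd rather check the numbers themselves than argue about whether the readouts lie, here's a rough sketch of how you could read a card's total VRAM against what's actually in use at a given moment, plus per-process usage. It uses Nvidia's NVML library through the pynvml Python package, which is just my choice of illustration, not what I used for the testing above, and it obviously only covers the Nvidia side; the overlay programs people mentioned are reading the same kind of driver counters.

```python
# Rough sketch (not from my original testing): compare the card's total VRAM
# with what is actually in use right now, using NVML via the pynvml package.
# Per-process figures can read as N/A on Windows depending on driver mode,
# so treat this as an illustration of allocated-pool vs actual-usage,
# not a benchmark tool.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first Nvidia GPU

    # Whole-card numbers, in bytes.
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Total VRAM : {mem.total / 2**30:.1f} GiB")
    print(f"In use now : {mem.used / 2**30:.1f} GiB")

    # Per-process usage, e.g. the game itself vs background apps.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        used_str = f"{used / 2**30:.1f} GiB" if used is not None else "N/A"
        print(f"  PID {proc.pid}: {used_str}")
finally:
    pynvml.nvmlShutdown()
```

The gap between the "in use" figure while a game is running and what its settings screen claims it will need is exactly the allocated-vs-actual difference I was talking about.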