Keep in mind that the 4070 is also the best case value-wise, but even that, if you look at it, isn't really faster than a 6800 XT in raster, retailed at the same price (3 years later), and has 4 GB less VRAM. What you do get over it is the Nvidia niceties (RT performance, DLSS et al). Even compared to the 3080 it's really just +2 GB VRAM and Frame Gen, 3 years later. And that's THE BEST value proposition Nvidia has put forth (excl. the 4090). Look at any other card and it's nothing but a bloodbath. And btw this is without taking into account the nice returns from mining while you slept during the previous gen, which you may argue is an oddity but was nonetheless a real boon for anyone with a GPU (so the real value of that generation was even higher; we can argue whether it's "fair" to count that as value added or not).
So don't misunderstand me, the cards can still put in a lot of work, so you can enjoy using one with zero issues. But if we compare generational progression and pricing, it's clear Nvidia chose to keep profit margins super high and just sell on brand value, and since they have no real competition, why wouldn't they? If you look at the 4090 it all makes sense: that one's a behemoth and demolishes the 3090, but everything else is pretty much running in place for the most part. There's just no way your flagship card ends up with insane price/perf unless you're also selling gimped cards below it, because there are diminishing returns on performance as you scale up.
I kind of get what you're saying, but none of that is relevant to me, as I went from a 2nd-hand RX 580 8GB mITX rig I was casually using as a Hackintosh dual boot on my TV, playing emulation and older titles...
So no offense intended, but what last gen or the gen before did means nothing to me, as I don't upgrade every generation. I had another £400 in my budget I could have spent going AM5, but I didn't see the point, same as why I didn't choose a 5800X3D: I wanted a cool, silent-running rig that sipped power, which, paired with my undervolt, I've achieved, and without watercooling. So for my needs it suits me well.
Regarding the whole 3-year-old RX 6800 XT vs 4070 thing though: for someone like me who hasn't got a 5700XT/6600/6600XT/6700/6700XT/6800 to step up from to a 6800 XT, it'd need to be, as you say, worth the jump. So a friend and I ran the following experiment, as we were doing new builds at the same time thanks to a sale from the same supplier... So check this:
We both bought a 5700X / 32GB Corsair LPX / 850W PSU; he bought a 6800 XT, I bought my 4070...
In EVERY game we ran, both at the same 1440p res/settings, mine used between 2.3 and 3.6 GB LESS actual VRAM than his...
Whether that's down to some sneaky compression, better optimisation for Nvidia, or devs coding more favourably for Nvidia hardware, I've no idea, but I don't care, because the reality is simple: if I use around 3-4 GB less than him (actual usage, not allocation), and my card has 4 GB less VRAM to begin with, then the remaining amount of VRAM on both cards, the headroom left to be allocated IF required, is about the same. TL;DR: neither card ever got down to less than about 3.3-4 GB of VRAM left free, so it's a win-win on both cards regardless of the total VRAM you start with.
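To put rough numbers on that, here's a quick sketch. The usage figures below are just ballpark averages of what we saw, not exact per-game measurements, so treat it as illustrative only:

```python
# Rough headroom sums. The "used_gb" numbers are ballpark figures from our
# 1440p testing, not exact measurements, and vary per game.
cards = {
    "RTX 4070 (12 GB)":   {"total_gb": 12.0, "used_gb": 8.5},   # ~3 GB less actual usage
    "RX 6800 XT (16 GB)": {"total_gb": 16.0, "used_gb": 11.5},
}

for name, c in cards.items():
    headroom = c["total_gb"] - c["used_gb"]
    print(f"{name}: using ~{c['used_gb']} GB, ~{headroom:.1f} GB still free")

# Both land on roughly 3.5-4.5 GB free: the 4070's lower measured usage
# cancels out the 6800 XT's bigger pool, which is the whole point.
```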
See my point?
So both cards would 'run out' at roughly the same point in the future... The only difference is I can switch on DLSS 3.5 (or whatever future version) plus Frame Generation, and he can't...
As others have shown, you gain around 50% extra FPS with FG turned on, so you can't lose, and you also use less VRAM with DLSS 3.5 on. Even @mrk has shown in earlier posts that maxing out a 4090 at 4K in Alan Wake 2 only uses 11-ish GB...
No game I've found has used more than 9.6 GB, and that's with RT on and maxed graphics settings at 1440p... Most stuff uses WAY less!
As I say, it's probably down to heavy Nvidia influence/sponsorship/backhanders to devs, but it works for me if it means the physical VRAM I have left over matches what a hungrier 6800 XT has left... It means I'm not missing out on the 3-4 GB extra my mate's card is basically using up to take the slack of being the less-optimised-for hardware in said games.
I did heavily consider a 6800 XT, but it's pretty much 3 years old at this point, and the extremely low power usage of the 4070, especially when undervolted (I sit at 115-135 W, or 145 W max with RT on), along with the modern-day feature set of the 4XXX series, for what was £80-100 more at the time than a 6800 XT, made it a no-brainer for me. I'd seen what DLSS 3 onwards could do, as with Frame Generation, something my friend will never get, so he probably won't have the future-proofing fallback I'll have when settings/res start to chug... Also, from my experience, with DLSS on I use even less VRAM, so win-win. Having tried FSR on both my 580 8GB and my 4070, it's not even questionable which is better, let alone when paired with FG...
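For anyone who wants to log power draw and VRAM usage themselves rather than eyeballing an overlay, here's a quick sketch using the nvidia-ml-py (pynvml) Python bindings; Nvidia cards only, so no use on the 6800 XT side:

```python
# Quick sketch: read current board power draw and VRAM usage via NVML.
# Needs the nvidia-ml-py package (pip install nvidia-ml-py). Nvidia GPUs only.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)               # first GPU in the system

power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports milliwatts
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)                # values in bytes

print(f"Power draw: {power_w:.0f} W")
print(f"VRAM used:  {mem.used / 1024**3:.1f} GB of {mem.total / 1024**3:.1f} GB")

pynvml.nvmlShutdown()
```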
FWIW though, this is just my own experience, no trolling/offense intended. It just suited my needs well, seeing as all I had was a 2nd-hand mITX build (3500X / RX 580 8GB / 16GB 2666MHz / 500GB PCIe 3.0 NVMe / 450W mITX PSU), so the jump was BIG!