Well, I always consider the amount of VRAM to be a good indicator of longevity.
In which case, whether it's the 8GB 3070 vs the 12GB 6700 XT, or the 10GB 3080 vs the 16GB 6800 XT, the Nvidia cards looked the worse value to me.
Of course, plenty of buyers don't care about longevity, especially on forums like this.
But all those cards were too expensive for me at launch anyway, and since then prices have gone totally crazy. For me, even if prices were back at MSRP, there's little value in this generation.
8GB of VRAM definitely isn't enough imo, but I don't think the 3070 has the grunt anyway (definitely not at 4K), so settings need to be reduced, which in turn reduces VRAM usage as well. If you have to turn down 1-2 settings to avoid issues from the lack of VRAM, is it really any different to having to drop settings because of a lack of grunt? The card will still do incredibly well if you know its limitations and how to work around them, i.e. don't be like the guy on here who tried running Cyberpunk maxed out on everything with a 3070 and complained about it... bearing in mind even a 3090 can't do max settings in Cyberpunk with an acceptable fps experience...
10GB-wise, we're all still waiting, 1 year and 5 months on, to see any real evidence of issues here. Will it be an issue at some point? Perhaps, and then the question is how many games we'll be talking about. I'll be surprised if we see more than a handful, if that...
Things like DLSS (which in turn also reduces VRAM usage) are only going to become more important as games push graphical effects further, especially on the ray tracing front.
Then you have DirectStorage, which in theory will change how VRAM management works entirely, so VRAM capacity shouldn't be as much of an issue.
Factor in that Nvidia are regarded as being better with their VRAM management than AMD, i.e. they use less VRAM, whereas AMD generally gobbles up as much as it can; whether this is a design decision or an optimisation problem is another question.
In terms of AMD's longevity, given that ray tracing is being adopted in the vast majority of games now, and AMD is considerably slower at it (even when the effects are limited, and in their own sponsored games too) as well as unable to handle RT scenes that are as complex, there's no doubt AMD will suffer in the long run here imo, especially since we've already seen a 3070 matching or besting a 6900 XT in several RT titles because of the latter's weaker RT ability.
TLDR: either way, with time, every card is going to run into some limitation. The question to ask is: why will people be upgrading in a year or so?
Upgrading for better ray tracing performance?
Upgrading for better rasterization performance?
Upgrading for more VRAM?
No doubt the next gen of GPUs will be far more interesting, and at this stage you might as well wait; however, they're likely to be even more expensive.