Wrong.
Alan Wake 2 DLC: Benchmarks with Path Tracing, Ray Tracing & Rasterization
www.pcgameshardware.de
Also, regardless of the scene or how it was benchmarked, the point still stands: it's the same old selective quoting to fit a narrative, i.e. (whether intentionally or not) using a launch-day benchmark to show an out-of-date, no-longer-relevant version of that game's performance. If anything, this just further proves that games need a patch or two to properly address issues, rather than us brute forcing past them. Next thing we'll have people pointing to the likes of TLOU and Hogwarts to show VRAM issues, when those were also solved with patches. A bit like humbug putting the texture issues in Forza down to VRAM, when the devs themselves stated that changing drivers solves that issue.
I'm also not really seeing any evidence from anyone that the 3080 suffers more than it should because of VRAM. Going by Daniel Owen's videos, dedicated VRAM usage doesn't come close to 10 GB either, unless, again, you play at 4K without upscaling, and/or add loads of texture packs, and/or look at games on launch day that were widely regarded as poorly optimised, only to be fixed with a couple of patches (rough sketch below if you want to check the usage on your own card).
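If anyone wants to sanity-check the numbers themselves rather than take a YouTube overlay's word for it, here's a rough Python sketch using NVIDIA's pynvml bindings. The device index and polling interval are just placeholders, and note it logs whole-card VRAM in use rather than the per-game "dedicated" figure an overlay reports, but it's enough to see whether you're anywhere near the 10 GB ceiling while playing:

```python
# Rough sketch, not a proper tool: polls device-level VRAM usage on GPU 0 via NVML.
# Assumes the nvidia-ml-py / pynvml package is installed (pip install nvidia-ml-py).
import time
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # GPU index 0 is an assumption
    for _ in range(60):                     # ~5 minutes at a 5-second interval
        mem = nvmlDeviceGetMemoryInfo(handle)
        used_gib = mem.used / 1024**3
        total_gib = mem.total / 1024**3
        print(f"VRAM in use: {used_gib:.2f} / {total_gib:.2f} GiB")
        time.sleep(5)
finally:
    nvmlShutdown()
```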
No doubt it would have been nice to have more VRAM from the get-go to cover all bases, but again, what alternatives were there?
- RDNA 2 GPUs at the time, which, as evidenced, have aged considerably worse by not having consistently good upscaling tech and by suffering big time in RT workloads (which are becoming far more common now as ray tracing evolves)
- Pay the extra £750 for the only other Ampere GPU that had more VRAM at the time? Given the lack of answers to the question "has the extra VRAM proved beneficial enough to justify the extra cost?" (except hum, who said he didn't go for the 2080 Ti over the 2070S and 5700 XT because it was too expensive....), it's safe to guess what the answer would be.