10GB of GDDR6X has 760GB/s of bandwidth, which is extremely high and way overkill for every game out right now at 4K. I don't see 10GB of GDDR6X not being enough for maybe 4 years, at which point you'd want to upgrade anyway. VRAM amount is not the same as VRAM bandwidth.
So why, then, is pretty much every card coming out over the next few years (including the inevitable Supers/Tis, the already confirmed higher-VRAM 3070s and 3080s, the 3090, the 16GB+ AMD cards) shipping with much more? Seems like it'd be a gigantic waste of cash.
I'm aware bandwidth matters, but 760GB/s isn't "extremely" high when the last-gen RTX 2080 does ~448GB/s and the 2080 Ti ~616GB/s, which is pretty close. And I'm sure I've maxed memory usage in over 5 titles even today: BF5, Control and the Resident Evil remake, off the top of my head. A small handful, but I don't even play much.
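For what it's worth, those bandwidth figures fall straight out of the per-pin data rate times the bus width, so here's a quick sanity check (card specs taken from NVIDIA's published numbers):

```python
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Memory bandwidth in GB/s = per-pin rate (Gbps) * bus width (bits) / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 3080: 19 Gbps GDDR6X on a 320-bit bus
print(bandwidth_gbs(19, 320))  # 760.0 GB/s
# RTX 2080: 14 Gbps GDDR6 on a 256-bit bus
print(bandwidth_gbs(14, 256))  # 448.0 GB/s
# RTX 2080 Ti: 14 Gbps GDDR6 on a 352-bit bus
print(bandwidth_gbs(14, 352))  # 616.0 GB/s
```

So the 3080 is only about a 23% bandwidth bump over the 2080 Ti, not some unprecedented leap.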
And this is just at 4K; how the hell is an RTX 3080 going to manage the advertised 8K, or even 5K? Control has been shown to consume around 20GB at 8K.
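To put that resolution jump in perspective, a rough pixel-count comparison (this only captures how render targets scale, not textures or anything else a game keeps in VRAM):

```python
# Pixel counts at each resolution, and the ratio relative to 4K
resolutions = {"4K": (3840, 2160), "5K": (5120, 2880), "8K": (7680, 4320)}
px_4k = 3840 * 2160

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} megapixels, {px / px_4k:.2f}x the pixels of 4K")
```

8K is exactly 4x the pixels of 4K, so anything resolution-dependent in the framebuffer roughly quadruples, which is at least consistent with Control's memory footprint blowing up at 8K.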
I've seen some benchmarks where the 3080, despite being 30-50% faster than the 2080 Ti in most titles, barely beats it in some others. Perhaps it's being held back by having to spill over into system memory? Despite having higher bandwidth? Doesn't make sense, but something's holding it back in some titles.
Granted, most of these games would be unplayable at 8K anyway, which makes it rather pointless (I mean, how close are you really going to sit to your screen?), and not many games will need it, even over 4 years.