You asked for proof and you got proof. I have totally blown all your arguments out of the water.
Nope, I didn't.
What you think of the game is totally unimportant; it does not have to be pretty.
To you, to me, and to 99.999999% of gamers it isn't. If a game isn't pretty, it shouldn't need £3200 of graphics power to run at 65fps; that points to the horrifically bad optimisation of the game. When it looks massively worse than other games that have come out in the past two years, and runs massively worse as well, it is NOT a good representation of 4k gaming; in fact it's exactly the opposite, a single game that stands out as the worst possible representation. The massive majority of gamers don't use 4x Titan Xs, and the massive majority of gamers would disable poor-quality effects, and certainly any settings that kill performance with absolutely no IQ improvement, for the sake of smoother gaming. For most gamers, how pretty a game is is directly linked to what performance they will put up with.
The argument was why a Fury X and a GTX 980 Ti were stuck at 4fps and 6fps respectively at max settings, and whether Titan Xs could do any better.
No, that wasn't the argument; the discussion was you insisting the Fury X sucks at 4k due to memory. I provided a list of about a dozen widely played games, most of which look FAR better than this, all of which run great at 4k, and in which the 6GB/12GB options provide zero benefit; you insist that this single game is representative. I never once asked whether Titans could do better in this game; I asked what that has to do with 4k gaming in general when every other game doesn't share this result.
The one thing I will say about XCOM 2 is that it can use 10.5GB of memory. It may be the first game to do so, but it certainly won't be the last, and there will be plenty more in the works.
A severely unoptimised game that, probably like SoM, uses uncompressed textures for absolutely no reason except to mess with AMD is not representative of how much memory 4k gaming needs.
I didn't ask for proof, so I didn't get proof. You're running 8xMSAA at 4k, which is pretty much pointless. Are you running 'max settings', i.e. stuff that reduces IQ and produces unrealistic effects like DOF?
The game runs like absolute filth at 4k on 4x Titans, and the percentage of gamers who run 4x Titans is... 0.000000001%. £3200 or so for four cards to achieve 65fps in a game that looks this bad and should be runnable at 4k with max settings on a GTX 680... what a truly epic example of how the Fury X can't run 4k.
Again, I DIDN'T ASK FOR PROOF. I just said that one game means nothing in terms of whether 4GB is enough. If 99% of games run great at 4k but one game that requires four Titans to produce 65fps doesn't, I don't care.
If running max settings is so important, why aren't you running 8k with DSR, or 32xSSAA? Why 8xMSAA... is that the setting you found that broke the Fury X but merely embarrassed £3200 of graphics cards into 65fps in a game that has all the graphical quality of a 5-6 year old title?
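For anyone who wants rough numbers on why 8xMSAA at 4k eats memory for so little visual gain, here's a back-of-envelope sketch. The buffer formats (RGBA8 colour, 32-bit depth/stencil) are my assumptions, not the game's actual render targets, which aren't public:

```python
# Back-of-envelope VRAM cost of MSAA render targets at 4k.
# Assumes RGBA8 colour and a 32-bit depth/stencil buffer
# (4 bytes per pixel each); illustrative formats only.

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # both colour and depth/stencil

def msaa_target_mib(samples: int) -> float:
    """Colour + depth for one multisampled render target, in MiB."""
    per_buffer = WIDTH * HEIGHT * BYTES_PER_PIXEL * samples
    return 2 * per_buffer / 2**20  # colour + depth

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA: {msaa_target_mib(samples):,.0f} MiB")

# Output:
# 1x MSAA: 63 MiB
# 2x MSAA: 127 MiB
# 4x MSAA: 253 MiB
# 8x MSAA: 506 MiB
```

Roughly half a gigabyte for a single colour/depth pair, before G-buffers, shadow maps, or textures, is a big bite out of a 4GB card for an AA mode whose benefit is near-invisible at 4k pixel densities.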
EDIT:- Having looked at some videos and some performance charts, the memory usage goes from minimal at High, with zero impact on performance, to completely crippling a 6GB 980 Ti at Max as badly as it hurts the Fury X... this is absolutely another Shadow of Mordor. High to Max gives zero visual gain but a sudden massive jump in memory, from under 4GB to over 6GB. The only other game I've seen display this behaviour is Shadow of Mordor, where they randomly, pointlessly uncompressed textures for no reason at all. There is no benefit to doing this; it doesn't improve graphics, it is simply less efficient for the sake of using memory. You spent £3200 to get better performance than cards with 4GB of memory, and the only way Nvidia achieves the difference is by paying a couple of devs to throw in a 'Max' option with uncompressed textures and zero IQ improvement, to pretend that more than 4GB matters.
If Max textures vs High gave a genuine IQ improvement, that would be one thing; but zero improvement, with the setting added for the sole purpose of using more memory, is more Nvidia sabotage... though in this case you could call it a therapy setting, there to make people who bought their 12GB cards feel less bad about it.
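To show the scale of what uncompressed textures do to a memory budget, here's a rough sketch comparing uncompressed RGBA8 against the BC block-compression formats GPUs decode for free in hardware. I obviously don't know which formats or texture counts XCOM 2 actually ships, so treat the numbers as illustrative:

```python
# Rough texture-memory comparison: uncompressed RGBA8 vs BC formats.
# Illustrative only -- the game's real texture formats are unknown.

def texture_mib(size: int, bytes_per_pixel: float, mips: bool = True) -> float:
    """Memory for a size x size texture in MiB; a full mip chain adds ~1/3."""
    base = size * size * bytes_per_pixel
    return base * (4 / 3 if mips else 1) / 2**20

for size in (2048, 4096):
    rgba8 = texture_mib(size, 4.0)   # uncompressed, 4 bytes/pixel
    bc7   = texture_mib(size, 1.0)   # BC7: 16 bytes per 4x4 block
    bc1   = texture_mib(size, 0.5)   # BC1: 8 bytes per 4x4 block
    print(f"{size}px: RGBA8 {rgba8:.0f} MiB, BC7 {bc7:.0f} MiB, BC1 {bc1:.0f} MiB")

# Output:
# 2048px: RGBA8 21 MiB, BC7 5 MiB, BC1 3 MiB
# 4096px: RGBA8 85 MiB, BC7 21 MiB, BC1 11 MiB
```

A 4x to 8x multiplier on every texture, when BC7 in particular is close to visually transparent, is exactly the kind of change that takes usage from under 4GB at High to over 6GB at Max with nothing to show for it on screen.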
You repeatedly and consistently use extreme rare-case games, with pointless settings and known massive performance problems, to generalise about 4k gaming. If one, or now two, games out of hundreds run poorly on sub-12GB cards while the rest all run fine, the only sensible generalisation is that those two games have problems and that 4GB is currently absolutely fine for 4k gaming.