It's refreshing to see some new technology break through in the graphics card sector, and I look forward to seeing it evolve.
I don't think 4GB of VRAM is going to be that much of an issue; it isn't as if a game's entire texture library is in use simultaneously, in every area, all at once. Even if it does become a problem, the worst I can envisage is a stutter or two when streaming assets between areas in massive open-world titles.
Whilst the lack of HDMI 2.0 is a problem for me, and there doesn't seem to be a solution for the time being, it obviously isn't impossible to overcome. I'm not an expert, but my observation is this:
4K PC gaming in the living room is fast becoming a reality. We're at a point now where graphics card hardware is powerful enough (for the record, I don't consider not being able to run a game at Ultra to be the same as not being able to run it at all; PC gaming has always been about trading off image quality against performance, and it probably always will be). Small form factor is becoming popular, and TV manufacturers seem to be taking gaming much more seriously than they did even a couple of years ago. Most 4K TVs released in 2015 have acceptably low input lag, whereas even just a year earlier you had to cherry-pick which models and manufacturers were suitable. Pretty much all TVs released now have full (4:4:4) chroma support, and they all have anywhere between competent and excellent scaling and processing engines. This increased emphasis on the importance of gaming as a home cinema application is also reflected in enthusiast and review sites.
At the moment, for whatever reason (probably commercial), home cinema just doesn't feature DisplayPort; it's not only TVs, it's AV receivers, soundbars and pretty much everything else. It's all HDMI, and to carry 4K video over HDMI at anything beyond 30Hz, it needs to be HDMI 2.0.
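To put rough numbers on that, here's a napkin-math sketch. The link rates are the published TMDS maximums for each HDMI version, and I'm assuming standard 8-bit RGB and ignoring blanking intervals (which push the real requirement higher, but the conclusion is the same):

```python
# Napkin maths: why 4K at 60Hz needs HDMI 2.0.
# Link rates are the TMDS maximums for each HDMI version;
# 8b/10b encoding means only 80% of the link carries pixel data.
HDMI_1_4_EFFECTIVE = 10.2e9 * 0.8   # ~8.16 Gbit/s of usable video data
HDMI_2_0_EFFECTIVE = 18.0e9 * 0.8   # ~14.4 Gbit/s of usable video data

def video_bitrate(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in bits/s (blanking intervals ignored)."""
    return width * height * refresh_hz * bits_per_pixel

for hz in (30, 60):
    rate = video_bitrate(3840, 2160, hz)   # 8-bit RGB, full 4:4:4 chroma
    print(f"4K @ {hz}Hz: {rate / 1e9:.1f} Gbit/s -> "
          f"HDMI 1.4: {'ok' if rate <= HDMI_1_4_EFFECTIVE else 'no'}, "
          f"HDMI 2.0: {'ok' if rate <= HDMI_2_0_EFFECTIVE else 'no'}")
# 4K @ 30Hz:  6.0 Gbit/s -> HDMI 1.4: ok, HDMI 2.0: ok
# 4K @ 60Hz: 11.9 Gbit/s -> HDMI 1.4: no, HDMI 2.0: ok
```

(The 4:2:0 trick some devices use halves the bits per pixel to 12, which does squeeze 4K60 under the HDMI 1.4 limit, but it sacrifices exactly the full chroma support I mentioned above.)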
The important thing for me, however, is not that AMD have neglected to equip their cards with HDMI 2.0 ports; it's how they react to the potential for this to be a problem. They have created what looks to me like some really competent display technology in Fiji, and it's genuinely refreshing for an enthusiast to see the cycle of "more cores, more memory, more heat, more space" that we've been locked into for what feels like ages now, broken. These cards represent a change in direction, which was needed; they are ideally suited to small form factor and they are ideally suited to the living room.
AMDMatt, I don't expect you to be able to make any big statements on behalf of AMD here, but over the next few weeks these cards will be getting reviewed, and I have a feeling that AMD will be asked what their position on "4K Fury in the living room" is. Given how well suited these cards are to that environment, it would be nice for AMD's position to be either bundling a DisplayPort-to-HDMI 2.0 converter capable of carrying a 4K 60Hz signal to home cinema equipment, or releasing a suitably priced converter aftermarket. I don't know, the ship might have sailed for the Fury and Fury X, but maybe a converter could be sold separately while being bundled with the Fury Nano, and of course with Project Quantum, if AMD are dead set on bringing it to market.
I think if AMD's position ends up being "our cards aren't designed for 4K in the living room", that would be a real shame. Given the investment AMD have clearly made in this technology, and how refreshing it is, it would be nice to see it realize its full potential in an environment it seems built for.