NVIDIA Volta with GDDR6 in early 2018?

I have been gaming at 4K for nearly 3 years now. It's not half as hard as people make it out to be, imo. I don't use AA, which saves fps, and I also turn off all the undesirable effects like depth of field, motion blur etc., which brings fps to between 30-60 in most recent games. 99% of all games I can play at 60fps+ anyway :)

Worst case scenario, all one needs to do is the above, plus drop a few settings from Ultra to Very High, and it will still look better than 1440p with everything on :D

With Volta, 4K power will be here imo. But there will always be a handful of people who need a minimum of 100fps, and they will have to wait another 2-3 years.

It is all relative at the end of the day.
 
Kind of a point.
DDR3 isn't much worse than DDR4.
A five-year-old i7 isn't much worse than a brand new one.

At least in real gaming terms anyway. Turn a few settings down from Ultra to High; sometimes you can't tell the difference, other than the frames no longer chugging.

Conversely, there are few if any PC solutions that can max out all games at 4K.
 
Whilst I agree 4K is the new marketing gimmick, it's worth pointing out Netflix still streams at 720p to browsers, so it hasn't even jumped to 1080p :)

For me, pixel count is far from a prime need. I game at 1440p and I don't notice anything improved over 1050p (which I used to game at); I only game at 1440p because I got the higher resolution for more desktop real estate, and I simply game at the native resolution to avoid scaling issues. Adding pixels chews up processing resources; I'd rather have things like SGSSAA, longer draw distances, denser foliage etc. than pixel count in games.
 
It is possible you just have not seen the correct games to appreciate the difference. I see a very noticeable difference in Witcher 3 and even a simple game like FIFA 17.
 
They run it 'ok'. They're borderline if you want 60fps in nearly everything with good settings. In some games you're going to have to make more notable compromises.

And games will get more demanding. These two cards you mention might still be 'ok' for a couple of years, but for those who really want to push 4K *comfortably*, we definitely need more. Extra horsepower will obviously be the biggest thing, but if super-high-bandwidth HBM offers notable benefits at these high resolutions, I would imagine consumers, especially enthusiast consumers like those here, would want that. I find it absolutely bizarre that some of y'all are pushing against this.

Combined with GDDR6 for the x70/x80 cards, I think it would create a pretty great 'high resolution-capable' lineup.

Maybe GDDR6 will still be used, though. Of course it will still be 'enough', and even GDDR5X will be 'enough', but who pays top dollar for 'enough', ya know? A bit of overkill, à la the Kepler and Maxwell Titans, is not necessarily a bad thing.
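
To put rough numbers on that, peak memory bandwidth is just bus width × per-pin data rate. A quick Python sketch below; the GDDR6 per-pin rate and bus width are assumptions (final speeds weren't confirmed), the others are shipping parts:

```python
# Rough peak-bandwidth comparison: bandwidth (GB/s) = bus_width_bits * data_rate_gbps / 8
# The GDDR6 entry is an assumption (widely reported ~14 Gbps); the rest are shipping cards.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

configs = {
    "GDDR5  (GTX 1070, 256-bit @ 8 Gbps)":      (256, 8.0),
    "GDDR5X (1080 Ti, 352-bit @ 11 Gbps)":      (352, 11.0),
    "GDDR6  (assumed 384-bit @ 14 Gbps)":       (384, 14.0),
    "HBM2   (Tesla P100, 4096-bit @ ~1.4 Gbps)": (4096, 1.4),
}

for name, (width, rate) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(width, rate):.0f} GB/s")
```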

Something most people don't even consider when talking about memory capability is that GDDR5X is more than fast enough @2160p for 4-way SLI with a stack of Titans. Don't forget that in SLI the memory is mirrored on all cards, and with 4 cards in play the fps even at 2160p is very high. The biggest problem I find with mGPU is not the memory but how the cards are connected when using 4 cards: AMD's solution using the PCI-E slot is bad as it has to share that bandwidth with normal GPU traffic, and Nvidia are not much better since there are no HB 4-way SLI bridges.
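
As a back-of-envelope illustration of 'more than fast enough' (the 480 GB/s figure and the 32-bit framebuffer are assumptions; real rendering traffic with G-buffers, textures and shadow maps is many times more than the final framebuffer alone):

```python
# Back-of-envelope: how much of GDDR5X's peak does the 2160p framebuffer itself consume?
# Assumed figures: 480 GB/s (Titan X Pascal, 384-bit @ 10 Gbps), plain RGBA8 target.
GDDR5X_BANDWIDTH_GBS = 480
WIDTH, HEIGHT = 3840, 2160          # 2160p ("4K")
BYTES_PER_PIXEL = 4                 # RGBA8 framebuffer

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6   # ~33 MB per frame
for fps in (60, 120, 240):
    traffic_gbs = frame_mb * fps / 1000
    share = traffic_gbs / GDDR5X_BANDWIDTH_GBS * 100
    print(f"{fps:>3} fps: ~{traffic_gbs:.1f} GB/s of framebuffer writes "
          f"(~{share:.1f}% of GDDR5X peak)")
```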
 
I think the market for 4-way SLI, or even 2-way SLI for that matter, is so low it's just not worth their time.
 
That is not the point I am making.

I am pointing out that GDDR5X is capable of dealing with very high fps @2160p, far more than a single card can produce on its own.
 
All this 4K vs high refresh rate debate.

Seriously though, for gaming I don't mind 1080p but for media I do enjoy the extra pixels.

Early 2018 does sound about right for Volta, considering Nvidia has only recently launched the 1080 Ti and the Titan Xp. I don't think Nvidia will jump to HBM2 until it matures further, but it wouldn't surprise me if they release an HBM2 version further down the line for a more "premium" price (maybe for the Tesla/Quadro line like they did with Pascal).
 
That is not the point I am making.

I am pointing out that GDDR5X is capable of dealing with very high fps @2160p, far more than a single card can produce on its own.

That is the benefit of mirroring the memory, effectively giving you a RAID-like system, and also why the concept of memory pooling, so two 8GB cards can work as one 16GB card, just doesn't work: you are not increasing the memory bandwidth in line with the increase in GPU power. Let alone the fact that it is basically impossible to do effectively in DX12 in any real-world software.
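
A crude sketch of that bandwidth argument, with the link speeds as rough assumptions (roughly PCI-E 3.0 x16 and an HB SLI bridge), just to show the order of magnitude:

```python
# Why pooled memory across two cards doesn't scale like mirrored memory:
# mirrored -> every GPU reads its own copy at full local bandwidth;
# pooled   -> anything living on the other card comes over a comparatively tiny link.
# Link speeds below are rough assumptions for illustration only.
LOCAL_BANDWIDTH_GBS = 480   # GDDR5X, per card (Titan X Pascal class)
PCIE3_X16_GBS       = 16    # shared with all the normal PCI-E traffic
HB_SLI_BRIDGE_GBS   = 4     # rough order of magnitude for an HB bridge

def effective_bandwidth(remote_fraction, link_gbs):
    """Time-weighted bandwidth when `remote_fraction` of accesses go over the link."""
    local_fraction = 1 - remote_fraction
    # Total time for one unit of data = local share / local BW + remote share / link BW.
    return 1 / (local_fraction / LOCAL_BANDWIDTH_GBS + remote_fraction / link_gbs)

for name, link in (("PCI-E 3.0 x16", PCIE3_X16_GBS), ("HB SLI bridge", HB_SLI_BRIDGE_GBS)):
    print(f"{name}: 25% remote accesses -> ~{effective_bandwidth(0.25, link):.0f} GB/s effective "
          f"(vs {LOCAL_BANDWIDTH_GBS} GB/s mirrored)")
```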
 
All this 4K vs high refresh rate debate.

The original argument wasn't 4K vs high refresh. My claim is that, compared to previous jumps in resolution, where people almost universally appreciated the extra screen real estate, opinion is much more split on 4K, and that goes beyond just the refresh rate requirements or the hardware needed to power it.
 
Whilst I agree 4K is the new marketing gimmick, it's worth pointing out Netflix still streams at 720p to browsers, so it hasn't even jumped to 1080p :)

FYI, for weird and complicated reasons, Netflix actually does 1080p on Microsoft Edge. But no other browser.

  • 720p 3500 Kbps on all other browsers
  • 1080p 5100 Kbps on Edge

And you can definitely tell the difference. So do all your browsing, etc. on Chrome, then watch Netflix on Edge.

If you're interested in seeing the exact difference, watch this test video on different browsers: https://www.netflix.com/title/70136810
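
For anyone curious, the quoted bitrates work out to roughly the following data per hour; this is just straight arithmetic on the figures above, nothing Netflix-specific:

```python
# Rough data usage per hour of streaming at the bitrates quoted above.
# Bitrate is in kilobits per second; result is in gigabytes per hour.
def gb_per_hour(kbps):
    return kbps * 1000 / 8 * 3600 / 1e9

streams = {"720p on other browsers": 3500, "1080p on Edge": 5100}
for name, kbps in streams.items():
    print(f"{name}: {kbps} Kbps ≈ {gb_per_hour(kbps):.2f} GB/hour")
```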
 