
Is 4GB of VRAM enough? ExtremeTech article

With DX12 games coming along soon, multi-GPU support is much more in the hands of the developers. It's going to be interesting to see how the big developers handle this, as it could make or break 3-4-way setups. I think 2-way will be well served, but we shall see.

I'm not yet convinced by D3D native multi-GPU support; it remains to be seen which rendering method will be more widely adopted. Leaving these decisions in the hands of the developer creates an even larger divide than before, and will likely require heavier involvement (a push) from vendors than we see today. These techniques won't appear just yet anyway.
 
Great couple of posts, Silent. Do you think DX12 mGPU support is going to get very messy, the way game vendor tie-ins are going?
 
There are rumours circulating that NVIDIA is dropping support for certain configurations anyway...

There is a lot to this that isn't as transparent as some would have you believe. Three- to four-way configurations just aren't supported to a standard where they're even worth recommending. A lot of that doesn't show in a three-day article using a single testing methodology. This is something that needs a much wider timeline and comparison of results to get a sensible picture of why these configurations aren't worth the investment.

I'm talking about a heavy usage period going back four to five years. When SLI was worth upselling by vendors, these things got the devotion they required. Today it's a different business. Talking about which vendor scales better than the other and listing the failings of both serves no purpose. The way of the dinosaur is edging closer. We all like to think there is a market for such configurations, but NVIDIA at least does not, and it won't be long before this becomes a reality.
I think the increased demands of VR and the inevitable slowing of GPU power gains will create a situation where dual-GPU setups actually become more desirable.

Especially with DX12 and things like VR SLI and split screen rendering making multi-GPU setups more efficient and effective.

Think back to how CPUs inevitably had to go multi-core because the road they were going down with single cores was hitting major roadblocks. It's possible we could see something like that again unless we achieve some properly major breakthroughs with silicon-based chips or other materials in the next 10-15 years.
 
Great couple of posts, Silent. Do you think DX12 mGPU support is going to get very messy, the way game vendor tie-ins are going?

It has the potential to. Again, it depends on how invested the developers are. It leaves a lot open to how advantageous they believe it will be to the game in particular, and how many GPUs you may or may not have is even less of a concern.
 
I think the increased demands of VR and the inevitable slowing of GPU power gains will create a situation where dual-GPU setups actually become more desirable.

Especially with DX12 and things like VR SLI and split screen rendering making multi-GPU setups more efficient and effective.

Think back to how CPUs inevitably had to go multi-core because the road they were going down with single cores was hitting major roadblocks. It's possible we could see something like that again unless we achieve some properly major breakthroughs with silicon-based chips or other materials in the next 10-15 years.

There are and will be uses, much in the sense there are uses today. But the industry drives itself, and SFR might be fantastic when implemented correctly - but if the quantity isn't there, then people aren't likely to adopt these setups any more than they do today. Not to mention price point is a huge factor for the majority of buyers in these cases. If quad-card configurations were cheap, we would all have them anyway.
 
My answer to the original question of... "is 4GB of VRAM enough?"... is...

Currently for a single card and the games available?... Generally yes...

For multiple cards and future games?... Generally no...

Would having MORE than 4GB of VRAM be better?... Obviously yes...
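For a rough sense of the numbers, here's a back-of-envelope sketch. The buffer counts are illustrative assumptions, not measurements from any real game:

```python
# Rough VRAM use of full-screen render targets at 4K (3840x2160), RGBA8.
# The buffer counts below are illustrative guesses, not real game data.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 3840, 2160, 4

def buffer_mib(count=1):
    """Size in MiB of `count` full-screen 4-byte-per-pixel buffers."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * count / (1024 ** 2)

swap_chain = buffer_mib(2)   # double-buffered back buffer
depth      = buffer_mib(1)   # depth/stencil
g_buffer   = buffer_mib(4)   # hypothetical 4-target deferred G-buffer
total = swap_chain + depth + g_buffer
print(f"one 4K buffer: {buffer_mib(1):.1f} MiB, all of the above: {total:.1f} MiB")
```

Even with MSAA multiplying some of those buffers, the framebuffers themselves are a small slice of 4GB; it's textures and geometry that fill the rest, and that varies entirely by game.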

You're welcome! :D
 
There are and will be uses, much in the sense there are uses today. But the industry drives itself, and SFR might be fantastic when implemented correctly - but if the quantity isn't there, then people aren't likely to adopt these setups any more than they do today. Not to mention price point is a huge factor for the majority of buyers in these cases. If quad-card configurations were cheap, we would all have them anyway.
But it wouldn't be like today. That's my point. Increase how efficient and effective multi-GPU setups are, and they become more attractive. Provide more reason for needing a ton of horsepower (like with VR), and they become more attractive. And from a business standpoint, when GPU progress slows down more, promoting dual-card solutions could be more attractive.

I mean, I can easily see a day where cards like the 7990 become more typical, it's just they wouldn't all be using dual high-end cards anymore. We'd have *ranges* of dual-card setups like we have ranges of multi-core CPUs. Further progress might start to prioritize advancements in multi-GPU processing, making them even more efficient and effective, and hopefully smaller.

I'm not saying this would happen overnight, obviously. Or is even definitely going to happen. We never know what technological breakthroughs could bring us. But I do think this is one possibility for the future.
 
But it wouldn't be like today. That's my point. Increase how efficient and effective multi-GPU setups are, and they become more attractive. Provide more reason for needing a ton of horsepower (like with VR), and they become more attractive. And from a business standpoint, when GPU progress slows down more, promoting dual-card solutions could be more attractive.

I mean, I can easily see a day where cards like the 7990 become more typical, it's just they wouldn't all be using dual high-end cards anymore. We'd have *ranges* of dual-card setups like we have ranges of multi-core CPUs. Further progress might start to prioritize advancements in multi-GPU processing, making them even more efficient and effective, and hopefully smaller.

I'm not saying this would happen overnight, obviously. Or is even definitely going to happen. We never know what technological breakthroughs could bring us. But I do think this is one possibility for the future.

I don't think VR is going to have a sizeable impact on multi-GPU sales, and that's because I don't think market adoption of the two technologies will follow the same path. I can't say whether or not that's going to be the case though, as I don't have a crystal ball lol. New rendering methods may take priority over these types of things... it's too early to say.
 
I've not had issues running 4K on my pair of 4GB 290Xs yet; the only game that suffered, out of all the games I tested, was Watch_Dogs.
 
I don't think VR is going to have a sizeable impact on multi-GPU sales, and that's because I don't think market adoption of the two technologies will follow the same path. I can't say whether or not that's going to be the case though, as I don't have a crystal ball lol. New rendering methods may take priority over these types of things... it's too early to say.
I don't think VR is going to make a seriously huge impact in the short term(1-4 years). But beyond that, I do think VR will certainly start to become something more popular among PC users.

I mean, my argument would still stand even without VR, but VR just improves it. Split frame rendering is useful even without VR, as it negates the inherent latency from dual-GPU setups.
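A toy way to picture that difference (purely a sketch of the scheduling idea, nothing to do with any real driver):

```python
# Toy model of how frames map to GPUs under AFR vs SFR.
def afr_gpus(frame, num_gpus=2):
    """Alternate Frame Rendering: each whole frame goes to one GPU in turn,
    so the driver queues extra frames ahead -> added input latency."""
    return [frame % num_gpus]

def sfr_gpus(frame, num_gpus=2):
    """Split Frame Rendering: every GPU renders a slice of the current frame,
    so no extra frames need to be queued ahead."""
    return list(range(num_gpus))

for f in range(3):
    print(f"frame {f}: AFR uses GPU {afr_gpus(f)}, SFR uses GPUs {sfr_gpus(f)}")
```

With AFR the second GPU is always working a frame ahead of what's on screen, which is where the extra latency comes from; with SFR both GPUs contribute to the frame being displayed now.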

Of course something like true foveated rendering might make all this invalid. That could potentially create a situation where VR graphics/performance is actually superior to 2D display graphics/performance, which would be wild. But it seems like the eye-tracking speeds necessary for that are still a way off. Plus it would need time to be built into the rendering engines and all.

But yea, you're not necessarily wrong. The future is always hard to predict. The march of technology almost inherently creates unpredictable applications and results.
 
...And since at 4k MSAA is absolutely irrelevant...
You don't need AA at 4K to deal with jaggies (assuming you don't have a hugenormous monitor), but other aliasing artefacts, e.g. moiré patterns, can still be very noticeable under certain circumstances. I'm happy to play games at 4K without AA, but I prefer to turn it on if I can.
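A one-dimensional toy shows why: point-sampling once per pixel cannot distinguish a pattern finer than half a cycle per pixel from a much coarser one, which is exactly where moiré comes from. This is just the sampling maths, not renderer code:

```python
import math

def point_sampled(freq_cycles_per_px, num_px):
    """Sample a sine pattern once per pixel (at integer pixel centres)."""
    return [math.sin(2 * math.pi * freq_cycles_per_px * i) for i in range(num_px)]

fine   = point_sampled(0.9, 8)    # detail finer than the pixel grid
coarse = point_sampled(-0.1, 8)   # the low-frequency alias it collapses into
# The two are identical sample-for-sample: the fine pattern shows up as a
# coarse false pattern (moiré), no matter how high the resolution is.
```

More pixels push the problem to finer detail but never remove it, which is why MSAA (extra samples per pixel) can still help at 4K.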
 
It has the potential to. Again, it depends on how invested the developers are. It leaves a lot open to how advantageous they believe it will be to the game in particular, and how many GPUs you may or may not have is even less of a concern.

:cool:

SE3 mGPU got heavily plugged by Rebellion PR; they shouted louder than any other Mantle partner about the great things they could do with Mantle mGPU, yet didn't implement it at all. :eek:

I think there will be mGPU grief coming no matter the vendor; I just don't think it's deemed important enough.
 
Playing at 1440p with a 4GB card, I have yet to see usage creep past 2.50GB in the games I play.

4GB is enough for 99.99% of customers out there ATM :)
 