
VRAM: single vs dual cards

Another video memory question from me; I'm getting myself into a confused fuddle...

I see some people say that, for example, a card with 4GB of VRAM would be enough if only using a single card at 4K, but if using SLI or CrossFire with two cards, it won't be, and you 'need' cards with more VRAM.

Now, I have no idea why this would be so. Can someone explain why, if you're using more than one card, you need more VRAM? Or are they talking bollards?
 
I believe it is a bit like this:

When you have multiple cards, you are generally processing more frames because you have more GPU power. As it stands, for multiple cards to work together, the same data has to be stored on each card, so no matter how many 4GB cards you have in your machine, you only really have 4GB of usable VRAM.
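A minimal sketch of that mirroring point, assuming an SLI/CrossFire-style alternate-frame-rendering setup where every card has to hold a full copy of the frame data (the card sizes below are made-up examples):

```python
# Sketch only: VRAM is mirrored across cards, so the usable pool is the
# per-card capacity, not the sum the spec sheets might suggest.

def usable_vram_gb(cards_gb):
    """Each card needs its own copy of the data, so the smallest card sets the cap."""
    return min(cards_gb)

cards = [4, 4]                  # two hypothetical 4GB cards
print(sum(cards))               # 8 -> the number on the box(es)
print(usable_vram_gb(cards))    # 4 -> what a game can actually use
```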
 
Because with a single card you won't be getting high enough framerates to actually make use of the memory.

Let's say you like playing above 60FPS with your 4GB card. When you go to 1440p or 4K, for example, and the game is really intensive, you are getting sub-60FPS even on medium to high settings.

What you would need to do is add another GPU to get past 60FPS again. Now you get 95FPS, so you can actually run higher settings (Ultra, maybe?) and still hold 60FPS.

But Ultra needs 4.5GB to run properly... and you only have 4GB.
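Running those hypothetical figures through quickly (nothing here is a benchmark, it just follows the example above):

```python
# Purely illustrative figures from the example above, not measurements.
fps_single = 50            # 4GB card at 1440p/4K, medium-high settings, sub-60
sli_scaling = 1.9          # assumed uplift from a second card
fps_dual = fps_single * sli_scaling
print(round(fps_dual))     # ~95fps -> headroom to raise settings to Ultra

vram_per_card_gb = 4.0
ultra_needs_gb = 4.5       # the hypothetical Ultra requirement above
print(ultra_needs_gb > vram_per_card_gb)  # True -> the extra grunt runs into a VRAM wall
```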
 
It's more a case of what is going to run out first: the "grunt" of the GPU or the VRAM. The theory is that if you're running a single card, by the time you're running at a high enough resolution and settings to use all the VRAM, the GPU is running out of steam and will be the thing slowing you down. If you're running multiple GPUs, you are likely to run out of VRAM at higher resolutions and settings while still having some power left in the GPU cores, so your performance will drop because you haven't got enough VRAM.

That's the theory, but in practice it's obviously not as clear-cut, with some games being heavier on VRAM usage and some lighter, and some games needing a lot of GPU power at higher resolutions and some not.
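One way to phrase that "what runs out first" logic, again with invented placeholder numbers and an assumed imperfect scaling factor for the second GPU:

```python
# Rough model: GPU grunt roughly adds up across cards, mirrored VRAM does not.
# Every number here is a placeholder, not a benchmark.

def what_runs_out_first(gpu_count, fps_one_card, scaling, vram_needed_gb,
                        vram_per_card_gb, target_fps=60):
    fps = fps_one_card * (1 + (gpu_count - 1) * scaling)  # extra cards add grunt
    if fps < target_fps:
        return f"~{fps:.0f}fps: the GPU runs out of steam before VRAM matters"
    if vram_needed_gb > vram_per_card_gb:                 # VRAM stays capped per card
        return f"~{fps:.0f}fps of grunt, but not enough VRAM -> stutter"
    return f"~{fps:.0f}fps and the VRAM fits: no bottleneck"

print(what_runs_out_first(1, 45, 0.8, 4.5, 4))  # single card: grunt-limited first
print(what_runs_out_first(2, 45, 0.8, 4.5, 4))  # dual cards: hit the VRAM wall instead
```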
 
Yeah, as above.
If you have, say, high settings running at 40fps and you add a second card and are now getting 80fps, you think "hey, I can go to ultra settings" - only now the high-res textures push you over 4GB, so it becomes a stutter fest.

In practice there aren't many games that need more than 4GB of VRAM. It is just bizarrely funny how AMD really pushed their 8GB cards, and still are, yet their flagship now has 4GB and that's supposedly fine. It is a really mixed-up message from AMD, as per usual, and it is hardly any surprise that there is a lot of back-pedalling going on.
 
Thanks guys, I think it makes more sense now. It still hasn't gone 'ding' in my addled brain though :-)

So, it's more a case of: add a second card, get higher frames, like it, then want to turn up all the details and still have higher frames, but the VRAM can't keep up?

But what about if you're already running games at their max settings? I mostly do, or did until very recently, for most games. I think this is why I'm not getting it.

Edit: I think I get it. It's about 'want' vs 'need' on a sliding scale of framerates, screen res and game details. Someone needs to make a graph!
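For what it's worth, a quick stab at the graph being asked for, with entirely made-up numbers just to show the shape of the trade-off (the per-setting VRAM use and the fps scaling are assumptions):

```python
# Hypothetical fps and VRAM usage for one card vs two as settings go up.
import matplotlib.pyplot as plt

settings = ["Low", "Medium", "High", "Ultra"]
vram_needed_gb = [2.0, 2.8, 3.6, 4.5]        # assumed per-setting VRAM use
fps_single = [75, 60, 48, 35]                # assumed single-card fps
fps_dual = [f * 1.8 for f in fps_single]     # assumed ~1.8x SLI/CF scaling
vram_per_card_gb = 4.0                       # mirrored, so the cap stays per card

fig, (ax_fps, ax_vram) = plt.subplots(1, 2, figsize=(10, 4))

ax_fps.plot(settings, fps_single, marker="o", label="1 card")
ax_fps.plot(settings, fps_dual, marker="o", label="2 cards")
ax_fps.axhline(60, linestyle="--", label="60fps target")
ax_fps.set_title("GPU grunt")
ax_fps.set_ylabel("fps")
ax_fps.legend()

ax_vram.plot(settings, vram_needed_gb, marker="o", label="VRAM needed")
ax_vram.axhline(vram_per_card_gb, linestyle="--", label="4GB per-card limit")
ax_vram.set_title("VRAM (does not double with a 2nd card)")
ax_vram.set_ylabel("GB")
ax_vram.legend()

plt.tight_layout()
plt.show()
```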
 
It all boils down to what settings you want to play at. Say you had a 4K monitor and you were playing a PC game on it:

Single card - 2.8GB/4GB VRAM usage - medium/high settings - 50-55fps

Here, if you're only ever using one card, you only expect to play on medium to high settings, so it's fine and worth your money; one card is not powerful enough to run comfortably on ultra, so you compromise. But once you add a second card your fps goes up to around 90-95, and that is a waste of money, because one card was enough for those settings - unless you put it on ultra.

SLI/CF - 5GB/4GB VRAM usage - ultra settings - 50-55fps

Now it has more grunt, so you will generally put it on ultra, but that requires 5GB of VRAM and your cards only have 4GB, so it will run terribly. That leaves us with two scenarios:

a single card that can do medium/high at playable settings
a dual-card setup that can't really run ultra due to VRAM limitations and is overkill for medium/high

Which would you choose?
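Those two scenarios written out as data, using the poster's illustrative usage/fps figures (not measurements), with the constraint that bites flagged for each:

```python
VRAM_PER_CARD_GB = 4.0   # mirrored across cards, so this is the real cap

scenarios = [
    {"setup": "Single card",   "settings": "medium/high", "vram_gb": 2.8, "fps": 52},
    {"setup": "SLI/CrossFire", "settings": "ultra",       "vram_gb": 5.0, "fps": 52},
]

for s in scenarios:
    verdict = ("stutters: needs more VRAM than one card holds"
               if s["vram_gb"] > VRAM_PER_CARD_GB
               else "runs fine within what the grunt allows")
    print(f"{s['setup']:13} @ {s['settings']:11}: {s['fps']}fps, {verdict}")
```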
 
No single card can do 50-55fps on medium settings in Witcher 3 at 4K.
You'd need to go 980 Ti/Fury X/Titan X to get into those numbers.
A Titan X can't sustain 60fps in Witcher 3 at 1080p.

It's a moot point.
 

Forgot to write that they were just an example and not accurate, but my explanation about how the VRAM is being utilised is correct.
 

The settings example above is the best answer so far, lol.

Simply:

At 4K, get 6GB of VRAM minimum, as newer games will need more; 8GB preferred.

In 75% of games a single card won't use more than 4GB, but a few will:

Shadow of Mordor, GTA V, Evolve.
 