2GB or 4GB GPU

Anyway, I don't think we need to start all this again; we already have one thread half filled with it.
Point is, the OP wanted advice on a new Nvidia card, and even the SweClockers graphs show that a 2GB Nvidia card should be fine for Ultra settings (unless the final version is more demanding than the beta).

It has also been pointed out that AMD cards offer better value.

2GB should be fine for 1080p. Above that remains to be seen. The final build should clear that up.
 
2GB should be fine for 1080p. Above that remains to be seen. The final build should clear that up.

2GB won't last. This time next year the next-generation machines will be out and settled, and then ports will start to push over 2GB of VRAM.

3GB minimum is what's needed to last a decent amount of time.
 
but this is just your guess, right? :)

Nope... It's called common sense....

Look at any time a new console has launched: PS2... Xbox... PS3... X360...

Memory requirements have always jumped straight after.

Developers have ~350MB to use for VRAM on the current consoles... and now they have 10x that on the next-generation machines, and people don't expect VRAM requirements on PC to go up? :rolleyes:
 
2GB won't last. This time next year the next-generation machines will be out and settled, and then ports will start to push over 2GB of VRAM.

3GB minimum is what's needed to last a decent amount of time.

3GB won't last either. Give it time and you will see 3GB become redundant....

That said, I am very surprised that AMD are launching GPUs with only 2GB of VRAM. Maybe they know something we don't.

What is often missed is GPU grunt vs VRAM. There's no point putting 6GB of VRAM on a 560 Ti or a 6950... you will run out of GPU grunt long before that VRAM is needed. The same will be true of the 2GB limit. Already we are seeing games like Crysis 3/BF3/Tomb Raider requiring more GPU grunt, and if they push for more VRAM-demanding AA/textures, you can bet you will need more powerful GPUs to run that detail.

Of course, I have no evidence or proof; these are just my thoughts :)
 
3GB won't last either. Give it time and you will see 3GB become redundant....

That said, I am very surprised that AMD are launching GPUs with only 2GB of VRAM. Maybe they know something we don't.

What is often missed is GPU grunt vs VRAM. There's no point putting 6GB of VRAM on a 560 Ti or a 6950... you will run out of GPU grunt long before that VRAM is needed. The same will be true of the 2GB limit. Already we are seeing games like Crysis 3/BF3/Tomb Raider requiring more GPU grunt, and if they push for more VRAM-demanding AA/textures, you can bet you will need more powerful GPUs to run that detail.

Of course, I have no evidence or proof; these are just my thoughts :)

Oh I completely agree, it's all a balancing act... but there's no point in buying a 2GB GTX 670/680 or going SLI when in a year's time they'll be VRAM bound.

The GTX 570 and 580 were classic examples of an unbalanced GPU... they had more shader grunt than they did memory, which meant that settings had to be turned down even though the core itself had plenty left in the tank.
 
I was curious: what fps do you get in Crysis 3 with everything fully maxed out at 1080p?

I play with V-Sync 60fps and it very rarely drops below it tbh....

That's everything on Ultra and 2xSMAA

4xMSAA is playable but can drop down to the low 30s during action scenes...
 
Oh I completely agree, it's all a balancing act... but there's no point in buying a 2GB GTX 670/680 or going SLI when in a year's time they'll be VRAM bound.

The GTX 570 and 580 were classic examples of an unbalanced GPU... they had more shader grunt than they did memory, which meant that settings had to be turned down even though the core itself had plenty left in the tank.

Why will they suddenly become VRAM bound? Surely they'll still run out of grunt before VRAM?

So are we saying the new consoles have 3GB VRAM? I thought they had 8GB shared RAM? If it is 8GB shared, then surely it could use more than 3GB as VRAM, so even 3GB cards wouldn't be enough?

I imagine 1080p is the most common resolution for PC gaming, and it seems reasonable to assume that a lot of gamers have 2GB VRAM or less (currently only the top 2 or 3 AMD cards and the top 4 (arguably 6) Nvidia cards are available with more than 2GB), so I can't imagine games companies will want to push requirements much higher than this.
 
Why? What performance are you getting?

Because I can see, in a year's time, games being that demanding more often than not, and you can see from your own testing that you may need three 7950s to play the game smoothly at full detail. The VRAM isn't the issue here; it is GPU grunt.

For a single-screen 1080p gamer, he probably doesn't want to upgrade his PC to get SLI/CF working (some will, of course). But PSU/CPU/heat/space/electricity are all considerations for SLI/CF, as well as whether games work fully with CF/SLI enabled.

For those who intend to go SLI/CF, I would certainly recommend more than 2GB, but to claim 2GB is finished is basically wrong.
 
Nope... It's called common sense....

Look at any time a new console has launched: PS2... Xbox... PS3... X360...

Memory requirements have always jumped straight after.

Developers have ~350MB to use for VRAM on the current consoles... and now they have 10x that on the next-generation machines, and people don't expect VRAM requirements on PC to go up? :rolleyes:


Things that require more VRAM also tend to require more GPU grunt. Just because the new consoles have a large amount of memory doesn't mean it will be used the way you assume. If it were, the games would be a slideshow.
 
Oh I completely agree, it's all a balancing act... but there's no point in buying a 2GB GTX 670/680 or going SLI when in a year's time they'll be VRAM bound.

You can't be certain this is the case. While I agree with your point in principle, you're making it seem a certainty when that is far from true.
 
You can't be certain this is the case. While I agree with your point in principle, you're making it seem a certainty when that is far from true.

Textures aren't ALU bound... they're TMU bound, and most cards have more than enough TMU performance for textures.

I only have to bench games with and without texture packs to see there's virtually no performance drop from running high-resolution textures.

What they do consume is memory....
 
Textures aren't ALU bound... they're TMU bound, and most cards have more than enough TMU performance for textures.

I only have to bench games with and without texture packs to see there's virtually no performance drop from running high-resolution textures.

What they do consume is memory....

I never said that wasn't the case - I just took issue with your statement of certainty when that isn't necessarily true. As I said, I agree with your sentiment that usage will increase but you're guessing as much as the next guy until we see some "next-gen" games come out.
 
Why will they suddenly become VRAM bound? Surely they'll still run out of grunt before VRAM?

So are we saying the new consoles have 3GB VRAM? I thought they had 8GB shared RAM? If it is 8GB shared, then surely it could use more than 3GB as VRAM, so even 3GB cards wouldn't be enough?

I imagine 1080p is the most common resolution for PC gaming, and it seems reasonable to assume that a lot of gamers have 2GB VRAM or less (currently only the top 2 or 3 AMD cards and the top 4 (arguably 6) Nvidia cards are available with more than 2GB), so I can't imagine games companies will want to push requirements much higher than this.

They have 8GB, yes, but the Xbox One's operating system consumes 3GB and the PS4's consumes 2.5GB.

So the Xbox One has 5GB for games and the PS4 has 5.5GB for games.

So my figure of 3.5GB for VRAM use is not that unrealistic, as that leaves 1.5-2GB free for general game data and other assets.

And there are games out now that can push 2GB, and they're only current-generation games...

Killzone Shadow Fall uses 1.5GB of VRAM, and that's only because Guerrilla made the game against the originally planned 4GB of GDDR5.

It was only later on that they learned Sony had doubled the PS4's memory to 8GB.

So if they were using 1.5GB back when they thought they had 4GB, how much do you think games in 12 months' time will use?

Especially as Killzone Shadow Fall is a launch game that's using the same engine they used on the PS3....
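Rough back-of-the-envelope with those figures (the OS reservation numbers are the ones claimed in this thread, not official specs):

```python
# Console memory budgets as claimed in the thread -- not official figures.
TOTAL_RAM_GB = 8.0
os_reserved = {"Xbox One": 3.0, "PS4": 2.5}  # claimed OS footprints
assumed_vram_gb = 3.5                        # the VRAM figure argued above

for console, reserved in os_reserved.items():
    for_games = TOTAL_RAM_GB - reserved
    left_for_data = for_games - assumed_vram_gb
    print(f"{console}: {for_games:.1f}GB for games, "
          f"{left_for_data:.1f}GB left for game data after "
          f"{assumed_vram_gb}GB of VRAM")
# Xbox One: 5.0GB for games, 1.5GB left for game data after 3.5GB of VRAM
# PS4: 5.5GB for games, 2.0GB left for game data after 3.5GB of VRAM
```

Which is where the "1.5-2GB free" figure comes from.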
 
They have 8GB, yes, but the Xbox One's operating system consumes 3GB and the PS4's consumes 2.5GB.

So the Xbox One has 5GB for games and the PS4 has 5.5GB for games.

So my figure of 3.5GB for VRAM use is not that unrealistic, as that leaves 1.5-2GB free for general game data and other assets.

And there are games out now that can push 2GB, and they're only current-generation games...

Killzone Shadow Fall uses 1.5GB of VRAM, and that's only because Guerrilla made the game against the originally planned 4GB of GDDR5.

It was only later on that they learned Sony had doubled the PS4's memory to 8GB.

So if they were using 1.5GB back when they thought they had 4GB, how much do you think games in 12 months' time will use?

Especially as Killzone Shadow Fall is a launch game that's using the same engine they used on the PS3....

That's largely speculation.

Also, if they thought they had 4GB to work with and 2.5GB was used for the OS, then they were leaving 0GB free for general game data and other assets. If that's how they planned to use 4GB, why wouldn't they use 5.5GB for VRAM now that they have 8GB, since apparently they didn't need to leave any memory for game data?

So maybe 3.5GB for VRAM is low; maybe it's more like 5.5GB of VRAM.
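To make that counter-argument's arithmetic explicit (same in-thread figures: 2.5GB PS4 OS reservation, 1.5GB VRAM in Killzone Shadow Fall; all of these are the thread's claims, not official numbers):

```python
# Counter-argument sketch: if the originally planned 4GB budget really left
# nothing for game data after the OS and VRAM, the same logic lets VRAM eat
# the whole game budget under 8GB. Figures are the ones claimed in-thread.
os_gb = 2.5       # claimed PS4 OS reservation
kz_vram_gb = 1.5  # claimed Killzone Shadow Fall VRAM use

planned_budget = 4.0 - os_gb             # 1.5GB left under the planned 4GB
data_left = planned_budget - kz_vram_gb  # 0.0GB for general game data

vram_ceiling = 8.0 - os_gb               # 5.5GB if VRAM can take it all
print(planned_budget, data_left, vram_ceiling)  # 1.5 0.0 5.5
```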
 