
VRAM - AMD/Nvidia, why does it differ?

I'm stuck running 1440p with a HD7870 @ 1200/1450.
It's as fast as a stock 7970 (non-GHz).
It's horrible. Horrible, horrible, horrible.
So stuttering. Much VRAM usage.
I can't even mine Dogecoins without it crying.

I'm ordering an AIB R9 290 because I want the power and the 4GB Buffer. The consoles have a truckload of usable VRAM, and we're gonna need it too.


I have the 7870XT @ 1200/1575 and it runs very well, but when I tried turning the Resolution Scale percentage up in BF4, my VRAM filled to 1950MB and it borked.
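For a rough sense of why the resolution scale slider blows through a 2GB card so quickly, here is a back-of-envelope sketch. The render-target count and per-pixel format below are assumptions for illustration, not BF4's actual pipeline, and textures, geometry and driver overhead all come on top:

```python
# Back-of-envelope estimate of render-target memory vs. resolution scale.
# NUM_TARGETS and BYTES_PER_PIXEL are assumed values, not BF4's real pipeline.

BASE_W, BASE_H = 2560, 1440      # native 1440p
BYTES_PER_PIXEL = 4              # assume 32-bit per render target
NUM_TARGETS = 6                  # assumed G-buffer + depth + post-process buffers

for scale in (1.0, 1.25, 1.5, 2.0):          # resolution scale 100%..200%
    w, h = int(BASE_W * scale), int(BASE_H * scale)
    mb = w * h * BYTES_PER_PIXEL * NUM_TARGETS / 1024**2
    print(f"{int(scale * 100)}% scale: {w}x{h} -> ~{mb:.0f} MB of render targets")
```

The render-target cost grows with the square of the scale factor, on top of an already large texture pool, which is why a notch or two on the slider can be the difference between fitting in 2GB and not.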

I can see what's going to happen once Mantle gets here: I may well have the power to run BF4 absolutely maxed, but my 2GB of VRAM says NO!
 
^ Precisely. I honestly don't get why or how people cannot see this. The PS4 has 8GB of GDDR5, and most of that is going to be at the GPU's disposal.

Any bells ringing yet?

We're going to need it...

Playing devil's advocate: the console life cycle is HUGE. PC will see many generational iterations before the consoles see their next one.

So, yes - 8GB could be a sign of "YO GONNA NEED IT SON, SOOOOON", but it's also a sign of console future-proofing, whereby not putting 8GB in would be stupid... because it has to last 5+ years.

In 5 years it's a comfortable bet that the PS4/Xbox One will still be current console tech. In the PC arena, nobody has any idea what the landscape will be in 5 years.

So directly comparing console hardware decisions with PC ones is pretty difficult and largely moot.

I'm not saying VRAM is irrelevant; it clearly is not. What I am saying is that for a very large percentage of normal usage, the VRAM on current-gen cards is perfectly fine for current titles (and for some years beyond that).

Yes, some users (SLI/CrossFire, large resolutions, modding, surround/Eyefinity) will see a big benefit now, but most won't. So why bump the price? Those who want large-VRAM solutions can buy the niche cards for niche applications. Everyone else can buy cards with an appropriate amount of VRAM for their usage.
 
VRAM isn't normally a large part of the cost of GPU manufacture; it only is at the moment because memory prices are at an all-time high.

What you're saying is true: nobody expects these games to be making full use of an entire 8GB within the next couple of years. But I am currently using in excess of 90% of 3GB, and that is a genuine need for more VRAM. PC is the pinnacle platform, so I'm not sure where all the whining is coming from. It isn't a case of more is less or less is more.

Of course, if the Maxwell 8GB rumours are true, that puts that to rest.
 
I know they scaled back the amount of RAM taken by the OS a bit, but the consoles still reserve a decent chunk for non-framebuffer usage. Something like 2GB?

I have a semi-relevant musing... will stacked VRAM affect how much you can have? Look at all the modules soldered onto a Titan or 290X; can you really shrink that down and layer it on top of another chip?
 

That's a good point about console future-proofing, and we all know how development of games for consoles only gets more refined over time (usually due to the devs learning how to use every single microflop of the console's power :P).

The thing is, these consoles run on x86. Devs already know how to write to every single bit of RAM on the consoles. I'm betting we see on-the-fly quota switching in games, where the code allocates 6GB to the CPU (a 2GB GPU / 6GB CPU split) in one section where loads of AI and physics are needed, and 4GB to the GPU (a 4GB / 4GB split) in a different section where there are loads of super pretty textures (like a "vista" moment or a real-time-rendered cut scene).
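Purely as an illustration of that on-the-fly quota switching idea (the scene names and splits below are invented for the example; real console SDKs manage the unified pool very differently), it would amount to something like this:

```python
# Illustrative sketch only: per-scene CPU/GPU splits of a unified 8GB pool.
# Scene names and budget numbers are made up for the example.

UNIFIED_POOL_GB = 8

# (gpu_gb, cpu_gb) the engine would aim for in each kind of scene
SCENE_BUDGETS = {
    "ai_physics_heavy": (2, 6),   # big battles: the CPU gets the lion's share
    "vista_cutscene":   (4, 4),   # huge textures on screen: the GPU gets more
}

def rebudget(scene):
    gpu_gb, cpu_gb = SCENE_BUDGETS[scene]
    assert gpu_gb + cpu_gb <= UNIFIED_POOL_GB, "split must fit in the unified pool"
    return gpu_gb, cpu_gb

for scene in SCENE_BUDGETS:
    gpu_gb, cpu_gb = rebudget(scene)
    print(f"{scene}: {gpu_gb} GB to the GPU, {cpu_gb} GB to the CPU")
```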

In this case we will have to be prepared for console games to be utilising a LOT of VRAM very soon - this just means the ports will be a lot smoother for those with tons of VRAM on their PC GPU.

These devs aren't working with RISC processors and weird architectures anymore... all of this is familiar to them. Heck, the PCs they use to render and code these games are cut from the same cloth as the consoles now.
 
There is 5.5GB available for developers' use, according to a press release from Sony earlier in the year.

Bang bang...he shot me down bang bang...I hit the ground...

Anyway - we all have to remember that these devs are, on occasion, going to be using 2GB of VRAM to render a 720p image. It's going to be a very pretty 720p image, but 720p nonetheless. Imagine the power it's going to require to render that same image at 1440p, with high-quality textures and tessellation.
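To put a number on that, here is the raw pixel-count difference. This only covers resolution; better textures, AA and tessellation add cost on top and don't scale this simply:

```python
# Raw pixel counts only; texture quality, AA and tessellation cost extra.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440)}
base = resolutions["720p"][0] * resolutions["720p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 720p")
```

A 1440p frame is four times the pixels of a 720p one before any quality settings change at all.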
 
lol, that is the one thing which does grind on me personally - when people say it's OK for 1080p. I've been using 1080p for almost a decade; if you want to play at 1080p, that's your prerogative. Frankly, I want to get the most out of my GPUs without the bottleneck.

If your usage is fairly moderate at 1080p and you're happy with that, then happy days. Some of us, however, go that little bit further, which is where these walls are being hit at surround and higher resolutions.

Unless the argument is that making these demands is going to put the price up on flagship items for those of you on lower resolutions, to which I say tough ****. :D
 

Where is this 720p coming from? Most PS4 games at the moment are 1080p or 900p; the Xbox One is a different matter. Killzone: Shadow Fall is supposedly the best-looking of the PS4 launch titles and runs at 1080p/60fps.

http://www.computerandvideogames.com/436467/killzone-shadow-fall-supports-native-1080p-at-60fps-says-sony/
 
I think the majority in here will have upgraded their GPUs within 1-1½ years anyway, so I can't really understand why people put so much into it. So yeah, when you "need it" you're also going to have it by default.
 
It's actually the Xbox One that currently has a few games running at 720p:

http://uk.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates
 
Someone I was talking to who used to be in the industry was adamant that the core clock and bandwidth differences wouldn't really be noticed, but that list of games says otherwise. Although it may also be down to developer choice. I hear Ghosts on PS4 has serious frame rate problems.
 

It's not the core clock and bandwidth, it's the fact that the PS4 has around 50% more shaders than the Xbox One.
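For reference, the widely reported GPU figures behind that "around 50% more shaders" point work out like this. These are peak theoretical numbers only and say nothing about real-world frame rates:

```python
# Widely reported GPU specs for the two consoles; peak theoretical numbers only.
ps4_shaders, ps4_clock_mhz = 1152, 800   # 18 CUs x 64 shaders
xb1_shaders, xb1_clock_mhz = 768, 853    # 12 CUs x 64 shaders

# 2 FLOPs per shader per clock (fused multiply-add)
ps4_tflops = ps4_shaders * 2 * ps4_clock_mhz * 1e6 / 1e12
xb1_tflops = xb1_shaders * 2 * xb1_clock_mhz * 1e6 / 1e12

print(f"PS4:      {ps4_shaders} shaders -> {ps4_tflops:.2f} TFLOPS")
print(f"Xbox One: {xb1_shaders} shaders -> {xb1_tflops:.2f} TFLOPS")
print(f"Shader advantage: {ps4_shaders / xb1_shaders:.1f}x")
```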

It's a funny one with Ghosts: the PS4 frame rate is actually too high. Check this out.

http://www.kotaku.com.au/2013/11/call-of-duty-ghosts-frame-rate-on-playstation-4-is-too-high/
 
So just what is going on? Well, a close look at our captures reveals that Call of Duty: Ghosts actually runs at higher frame-rates than 60fps on a fairly frequent basis, despite the video output being limited to 60Hz. In scenes where we experienced judder and perceived frame-rate loss, what we are actually seeing is the appearance of skipped and incomplete frames - an effect that is arguably far more noticeable than a few prolonged drops down to 50fps or so seen in the 360 version of the game.

G-Sync would have put paid to that little problem :p
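A toy way to see what the article is describing: with a fixed 60Hz scanout, anything rendered faster than the refresh simply never makes it to the screen, and the uneven pattern of dropped frames is what reads as judder. A variable-refresh display lets the scanout follow the GPU (within the panel's range), which is the point of the G-Sync remark. The frame rates below are made up for illustration:

```python
# Toy model: a game rendering at 75 fps onto a fixed 60 Hz output.
# Frames finished between refreshes but overwritten before scanout never appear.

REFRESH_HZ = 60
RENDER_FPS = 75          # internal frame rate "too high" for the output
DURATION_S = 1.0

frame_done = [i / RENDER_FPS for i in range(int(RENDER_FPS * DURATION_S))]
displayed = set()

for v in range(int(REFRESH_HZ * DURATION_S)):
    t = (v + 1) / REFRESH_HZ
    # at each vblank the most recently completed frame is scanned out
    displayed.add(max(i for i, done in enumerate(frame_done) if done <= t))

dropped = len(frame_done) - len(displayed)
print(f"Rendered {len(frame_done)} frames, displayed {len(displayed)}, dropped {dropped}")
```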
 