6GB 780Ti confirmed

You guys aren't seeing the long-term benefits of more memory.

Most games use engines that were built with the consoles in mind, so those engines were designed around 512MB of RAM and the limitations that brings.

Even a console port of a game can consume way more than 512MB of RAM on PC, and the latest set of consoles have 5 and 5.5GB of memory available to games. Using logic and common sense, you could argue that developers using these next-generation machines could dedicate 3.5GB of that memory to VRAM alone, maybe even 4GB!
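
To put rough numbers on that, here's a quick back-of-envelope sketch; the 5.5GB pool and the 3.5GB graphics share are assumptions from this post, not published specs.

```python
# Back-of-envelope split of a console's game-visible memory.
# Assumption: ~5.5GB usable by games, with 3.5GB given over to graphics.
game_memory_gb = 5.5   # memory a game can see on the newer consoles
vram_share_gb = 3.5    # hypothetical slice dedicated to graphics data

other_gb = game_memory_gb - vram_share_gb
print(f"Graphics data: {vram_share_gb}GB, everything else: {other_gb}GB")
print(f"A 3GB card is already {vram_share_gb - 3.0:.1f}GB short of that.")
```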

Still think 3GB is enough?

And then there's the whole argument that a card will run out of power before 3GB limits its performance; load of rubbish.

Texture detail is completely irrelevant to, and independent of, shader and ALU performance.

Take the first Crysis game, for example: if you benchmark the game maxed out in stock form and again with a 4GB texture pack installed, the results will be IDENTICAL... all that despite the higher texture detail.
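
Some quick arithmetic shows why: texture memory grows with the square of the resolution, while the shader still takes the same number of samples per screen pixel. The sizes and format below are illustrative, not Crysis's actual assets.

```python
# Rough VRAM cost of one uncompressed RGBA8 texture, including the
# ~1/3 extra that a full mip chain adds. Sizes are illustrative only.
def texture_mb(size, bytes_per_texel=4, mip_overhead=4 / 3):
    return size * size * bytes_per_texel * mip_overhead / 1024**2

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mb(size):.0f}MB")
# 1024: ~5MB, 2048: ~21MB, 4096: ~85MB. Double the resolution and the
# memory quadruples, but the shader still takes one sample per pixel.
```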

You also have to factor in engine technology: everything is moving over to deferred rendering, which by itself causes a big increase in memory use due to the multiple buffers it keeps.
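
As a rough feel for that cost, here's a minimal sketch; the four-target RGBA16F layout is a made-up example, not any particular engine's G-buffer.

```python
# VRAM for a full-resolution G-buffer: several render targets that all
# live in memory at once. The layout here is a hypothetical example.
def target_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 1024**2

w, h = 2560, 1440
colour_mb = 4 * target_mb(w, h, 8)  # four RGBA16F colour targets
depth_mb = target_mb(w, h, 4)       # one D24S8 depth/stencil target
print(f"G-buffer at {w}x{h}: ~{colour_mb + depth_mb:.0f}MB")
# ~127MB before shadow maps, post-processing buffers and the rest.
```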

People dropping £500-600 on a graphics card obviously want to keep it for a few years and get their value from it. What they don't want is to have to turn settings down in 12 months' time because the card doesn't have enough memory, even though it has the shader power to handle the game itself.
 
If the card's DX 11.2 feature-ready, then the amount of on-board memory isn't really that relevant. I don't quite get why people want 4K either, for gaming or movies.
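
For context, the DX 11.2 feature in question is tiled resources: only the parts of a huge texture that actually get sampled need to sit in VRAM. A rough sketch, with made-up texture size and visibility figures:

```python
# D3D11.2 tiled resources use a standard 64KB tile, and only tiles
# that are actually sampled need to be resident in VRAM.
# The texture size and visible fraction below are made-up figures.
TILE_KB = 64
virtual_texture_mb = 3 * 1024   # a hypothetical 3GB virtual texture
visible_fraction = 0.05         # assume ~5% of its tiles used per frame

resident_mb = virtual_texture_mb * visible_fraction
tiles = resident_mb * 1024 / TILE_KB
print(f"Resident: ~{resident_mb:.0f}MB ({tiles:.0f} tiles) "
      f"instead of {virtual_texture_mb}MB")
```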

Let's face it, you're not going to see that much of a difference.

I'm pretty sure it can't be too hard to hack the firmware on the card and unlock the cores like on a Titan anyway; hell, I wouldn't be surprised if you could just rip a Ti firmware and flash it over.

@Mr, AMD won't; there's no need for it, to be honest, with DX 11.2 - as soon as games start using that, it's the end of insane-memory cards.



Nah, they're disabled during manufacture via laser cutting, so unless you've got a time machine and live in the Far East, you're stumped :)
 
Give it another few months and there will be talk of the next Nvidia monster :)

Bet that card arrives with 6GB as standard.
 
Some good points, and I agree completely that, given next gen, without the extra VRAM it's inevitable that PC will fall behind sooner rather than later. I will be interested to see just what scaling is available in games such as The Division with Ubisoft's true next-gen engine, and whether we might see lower-VRAM cards stretched.
 
Gregster, do you guys pick up a cut of the advert fees wccftech and videocardz.com get from this forum? :)

You forgot to read the small print at the bottom of the wccftech article.

Headline says: "GTX780ti 6GB Confirmed*"

Small print says: "*By LtMatt's mum, after she had been fed a couple of shandies".

Seriously though, it is truly a 780Ti 6GB card, just with two 780Ti GPUs on the board ;)

Haha, you could be right, but I do love wccftech's and videocardz's rumours :D It does seem to coincide with the Titan going EOL now, though, and a 6GB 780Ti would be an option for some.

Haha, I will have to enjoy this one vicariously through Kaap's and Gregster's forum posts :D

We know Kaap's getting at least 4 and Gregster probably 2...

No chance for me, and unless Maxwell does have significant jumps, I shall be skipping that as well... Probably :D
 
It's not nonsense, because that is how games are working at the moment: cards don't have enough processing power to push the settings that would cause them to run out of memory.

Sure, you can mod a game with texture packs, which, as you say, don't have a massive impact on performance other than loading up the VRAM, but until game developers begin to do this themselves, the point still stands.

The last I heard of game devs implementing high-res textures was BioShock Infinite with its "movie style" textures. The game comes out and the textures are standard.
 
Games are running silly high-precision settings on PC just for the sake of it...

And textures are going to see a much bigger jump on the next set of consoles than lighting and other effects are.
 
For those ambitious 4K HD dreamers, maybe. Not for me though; I'm sure that extra RAM will make overclocking a fair bit harder, with no benefit at lower resolutions.
 
I doubt they will. The 7970 with 6GB isn't something you see for sale anywhere, and the AMD penny-pinchers wouldn't pay the extra for another 4GB anyway :p

Penny-pinchers? AMD have always had more VRAM than Nvidia; it's Nvidia who penny-pinch. :p

I think they will. If Nvidia do a 6GB 780Ti at, what, £650-700? AMD will slot in a competing 8GB card below that price.

I think AMD are just waiting for Nvidia to create the opportunity. Probably for £550-£600, which is what the 3GB 780Ti costs now.
 
No good for benching!!

Next!!!

Lol :D

Waiting for what Maxwell has to offer then?

I think he meant it's AMD customers who are penny-pinchers and wouldn't pay extra for a bit more VRAM, but it was a tongue-in-cheek comment anyway, of course ;)
 
My response to him was also tongue in cheek. ;)

Anyway, AMD users are not as susceptible to marketing that gets you to pay more for less :p
 