7950 vs 660 Ti

Yes it is. BF3 with the Frostbite 2 engine doesn't render textures if you don't have enough VRAM, and doesn't render trees or some of the surroundings, so you don't stutter at all.

I bet you have never even had stuttering from having low VRAM; you just go by what the rumors are on the internet.

"The GPU will bottleneck its Vram anyway"


lol, simply not the case unless you're speaking to someone running super resolutions or high levels of AA.

The 480 plays BF3 very, very well at 80+ fps with ultra textures/effects/terrain decoration, using up its VRAM.

You have to turn down shadows, terrain and mesh, and the game doesn't look much different; just less draw distance and less rounded mountains in the distance.

The 570, on the other hand, can't play at those same settings because its VRAM is too low and some textures render black.

There is something wrong if it is rendering textures as black. I had a 560 Ti prior to my 680 and if I put it to full ultra, it would play like a slideshow. After all VRAM had been used up, it never went into showing black textures.


I never said a 2-year-old GPU will be able to play the latest games at max settings... that's dumb, of course it will not lol.

I said if you are keeping the card for a long time and not upgrading, then I recommend more than a 2GB card from the example above, to still be able to play games at decent framerates and make them look half decent with nice textures (textures aren't a performance hog, but a RAM hog).

But if you are turning settings down, you will be using less VRAM and 2GB stands a very good chance of being enough.
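Quick back-of-the-envelope sketch of why textures are a RAM hog rather than a performance hog (hypothetical numbers, assuming uncompressed RGBA8 with a full mipmap chain; real engines use DXT/BC compression, which cuts this by roughly 4-8x):

```python
# Rough VRAM footprint of a single texture, assuming uncompressed
# RGBA8 (4 bytes per pixel). A full mipmap chain adds about 1/3
# on top of the base level (1 + 1/4 + 1/16 + ... -> 4/3).

def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmapped=True):
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmapped else base

# One 2048x2048 texture with mips is ~21 MiB; a hundred of them in a
# level and you can see how an "ultra textures" pack eats VRAM while
# costing the shaders almost nothing.
print(texture_vram_bytes(2048, 2048) / 2**20)  # ~21.3 MiB
```

The maths is the point: texture detail scales memory, not shader work, which is why turning textures down frees VRAM without much changing the GPU load.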
 
Yes it is. BF3 with the Frostbite 2 engine doesn't render textures if you don't have enough VRAM, and doesn't render trees or some of the surroundings, so you don't stutter at all.

I bet you have never even had stuttering from having low VRAM; you just go by what the rumors are on the internet.

A search on "VRAM limit black textures" pointed me to... this thread. Oh dear - just you on the whole internet.

If I run maxed-out settings with Windows Aero enabled (which uses around 100-150MB of VRAM) I can max out my card's VRAM. I either crash or stutter to a halt, but the game itself looks no different.

Tommybhoooy got the same symptoms in a modded version of Skyrim with his 6950->70's.

Ooopsie.

@Gregster - I think we are in - what did you call it... Groundhog day? - regarding VRAM :p.
 
I didn't want to get involved, but common sense has to kick in, along with my own experience of hitting the VRAM ceiling. I bet it isn't the last time this subject gets brought up either.

I have never heard of black textures and would go out on a limb and say the memory is clocked too high, especially if they were triangles that are long and pointy :p
 
I didn't want to get involved, but common sense has to kick in, along with my own experience of hitting the VRAM ceiling. I bet it isn't the last time this subject gets brought up either.

I have never heard of black textures and would go out on a limb and say the memory is clocked too high, especially if they were triangles that are long and pointy :p

I have had the black textures, which is really no texture at all, a few times over the years, but not since being on the gfx cards I have now, and it was due to VRAM.
I have been on 2560x1600 since 2006.
 
I have had the black textures, which is really no texture at all, a few times over the years, but not since being on the gfx cards I have now, and it was due to VRAM.
I have been on 2560x1600 since 2006.

I can honestly say I have never heard of VRAM running out causing no textures. In all my experience of running out of VRAM, it just went into a slideshow. I did some quite extensive testing when I had the 560 because this debate of usage was going strong then and not once did I ever get black/no textures. I would look at it being something else tbh.

I was just playing Rusty's game lol, only this time it was him on the receiving end :D

You Sir, are a troll. How childish.
 
I remember the olden days and owning an 8500 GT with 1GB of VRAM. Games used a max of 500MB back then, but the card was so bad it couldn't cope with anything half decent at a playable rate.

When we finally see the newer monitors coming out (maybe 2 years), the 7970 and 680 will look like that poor 8500 GT.

It's a scenario that best suits not having more VRAM; that does not make it fact.

You don't know that. And even if that were true, what do a lot of people do two years on? Get another of the same GPU when they are cheap.

More is always better than less ;)
 
I can honestly say I have never heard of VRAM running out causing no textures. In all my experience of running out of VRAM, it just went into a slideshow. I did some quite extensive testing when I had the 560 because this debate of usage was going strong then and not once did I ever get black/no textures. I would look at it being something else tbh.

There is always a first time, and being at my res for so long has meant I have seen all aspects. The plummeting flips and stuttering are the most common, due to swap out, but some games don't use swap-out streaming and will just leave some textures black, as everything is loaded at once. But as I have said, since being on 2GB I have not seen any of the three symptoms.
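A toy sketch of those two failure modes (this is not any real engine, just an illustration): a streaming engine evicts resident textures when VRAM fills, which costs you stutter on re-upload, while an engine that loads everything up front simply fails the allocation and the material renders untextured (often black):

```python
# Toy model of a fixed-size VRAM pool and the two behaviours when it fills.

class Vram:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.resident = []  # (name, size_mb), oldest first

    def used(self):
        return sum(size for _, size in self.resident)

    def alloc_streaming(self, name, size_mb):
        """Streaming engine: evict oldest textures until this one fits.
        Each eviction means a later re-upload over PCIe -> stutter."""
        evictions = 0
        while self.used() + size_mb > self.capacity and self.resident:
            self.resident.pop(0)
            evictions += 1
        self.resident.append((name, size_mb))
        return evictions

    def alloc_preload(self, name, size_mb):
        """Load-everything engine: no eviction path, so a failed
        allocation leaves the texture unbound -> black/missing."""
        if self.used() + size_mb > self.capacity:
            return False
        self.resident.append((name, size_mb))
        return True

vram = Vram(capacity_mb=100)
vram.alloc_preload("rocks", 60)
print(vram.alloc_preload("trees", 60))   # False -> renders black
```

Both symptoms come from the same ceiling; which one you see depends on how the engine handles the failed allocation, which would square with it being game dependent.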

Typically the purple _____ whatever you're seeing in your game is because you have most likely run out of video RAM. While the 4GB patch will help with your frame rates for processing what you see on the screen, it does nothing at all for loading the textures to be shown. Put simply, you probably shouldn't use it if you're running a 1GB card. I have 6GB of VRAM on my 580 Classified Ultra SLI setup (3GB per card) and I have not seen any missing textures at all. The RAM on the card is dedicated video RAM and is pretty much used for rendering objects and the objects' textures; that info is then handed off to your installed RAM sticks via the north/south bridge, and final touches for physics are handled by the CPU if no dedicated card is present.

If you use the right search parameters you can find loads of examples.

https://www.google.co.uk/search?q=m...s=org.mozilla:en-GB:official&client=firefox-a
 
More is always better than less ;)

Again true. But this is ignoring the £££ factor. The extra VRAM costs a considerable premium over the standard model when in all likelihood (not impossible, I know) it's only going to be utilised in multi-GPU setups.

You could argue in the future that a game could run close to 2GB at "OK" FPS on a single screen/single GPU and a second 4GB card would release the card from its dual GPU/VRAM bottleneck.

Is that what you're getting at humbug?

I still think it's unlikely, but it's a hell of a lot more likely than single card. And it's a large up-front investment to get the first 4GB now on an improbability.
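One thing worth spelling out on the dual-GPU/VRAM point (a simplified sketch, assuming alternate-frame rendering, where each card holds its own copy of the same data):

```python
def effective_resources(cards):
    """cards: list of (compute_units, vram_gb) per card.
    In AFR SLI/CrossFire, compute roughly adds up (minus scaling losses),
    but the frame data is mirrored on every card, so the usable VRAM pool
    is the smallest card's, not the sum."""
    compute = sum(c for c, _ in cards)
    vram = min(v for _, v in cards)
    return compute, vram

# Two 2GB cards: roughly twice the grunt, still only 2GB of usable VRAM.
print(effective_resources([(100, 2), (100, 2)]))  # (200, 2)
```

That mirroring is exactly why the second GPU raises the chance of hitting the VRAM ceiling: the grunt doubles but the memory doesn't, so a 4GB card only pays off once the combined grunt can actually push past 2GB of working set.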

There is always a first time, and being at my res for so long has meant I have seen all aspects. The plummeting flips and stuttering are the most common, due to swap out, but some games don't need swap out and will just leave some textures black, as everything is loaded at once.

As I said above the only relevant information I can find on black textures and VRAM limit is this thread. It doesn't sound like VRAM. Any other information on it?

Everything else I can read on VRAM limits, in addition to my own experience, relates to crashes to desktop or single-digit FPS stuttering.
 
Again true. But this is ignoring the £££ factor. The extra VRAM costs a considerable premium over the standard model when in all likelihood (not impossible, I know) it's only going to be utilised in multi-GPU setups.

You could argue in the future that a game could run close to 2GB at "OK" FPS on a single screen/single GPU and a second 4GB card would release the card from its dual GPU/VRAM bottleneck.

Is that what you're getting at humbug?

I still think it's unlikely, but it's a hell of a lot more likely than single card. And it's a large up-front investment to get the first 4GB now on an improbability.

No, that's not what I'm getting at; I don't think 2GB is enough for a £400 GPU. I know the 4GB costs a premium, and the 6GB AMDs do too. 6GB is too much unless you're running 6 screens and four 7990s.
A bit more than 2GB for £300/£400 on the base models seems reasonable to me; I expect 2GB for my £200 GPU, but not for a £400 GPU.

I guess what i'm saying is they should bring the 4GB model down to where the 2GB model is now.
 
No, that's not what I'm getting at; I don't think 2GB is enough for a £400 GPU. I know the 4GB costs a premium, and the 6GB AMDs do too. 6GB is too much unless you're running 6 screens and four 7990s. A bit more than 2GB for £300/£400 on the base models seems reasonable to me; I expect 2GB for my £200 GPU, not for a £400 GPU.

I guess what i'm saying is they should bring the 4GB model down to where the 2GB model is now.

Well then that makes less sense.

All the evidence (from people that have tested) suggests that 2GB is more than enough to match the grunt of 2 GPUs in even multi-monitor setups.

It makes sense that the premium edition models (i.e. 4GB) are for tri-SLI and above. It's not just users saying this either; EVGA said this themselves.
 
Again true. But this is ignoring the £££ factor. The extra VRAM costs a considerable premium over the standard model when in all likelihood (not impossible, I know) it's only going to be utilised in multi-GPU setups.

You could argue in the future that a game could run close to 2GB at "OK" FPS on a single screen/single GPU and a second 4GB card would release the card from its dual GPU/VRAM bottleneck.

Is that what you're getting at humbug?

I still think it's unlikely, but it's a hell of a lot more likely than single card. And it's a large up-front investment to get the first 4GB now on an improbability.

As I said above the only relevant information I can find on black textures and VRAM limit is this thread. It doesn't sound like VRAM. Any other information on it?

Everything else I can read on VRAM limits, in addition to my own experience, relates to crashes to desktop or single-digit FPS stuttering.

https://www.google.co.uk/search?q=m...s=org.mozilla:en-GB:official&client=firefox-a

I have even managed to black-texture QuakeWorld back when I used to play it, if I put the settings too high.
 
I'm not sure that you can really conclude from that quote that that is the VRAM limit being hit.

The first hits I read on that search revealed somebody with a 7950 GO trying to game on today's games, and people talking about the slideshow as textures are read from RAM/pagefile.

Then I stopped reading as it doesn't appear to be an everyday symptom of running out of VRAM - perhaps game dependent?
 
Well then that makes less sense.

All the evidence (from people that have tested) suggests that 2GB is more than enough to match the grunt of 2 GPUs in even multi-monitor setups.

It makes sense that the premium edition models (i.e. 4GB) are for tri-SLI and above. It's not just users saying this either; EVGA said this themselves.

Arguing against more VRAM for less money is what makes no sense; maybe that's why Nvidia feel they can get away with less for more :p

EVGA WOULD say it themselves, they pay for the ICs!!!!!!

Sorry, I could not resist that :)
 
You can't "match the grunt" of the GPU with VRAM; it's not as simple as that and doesn't work like that. A GPU can use lots of VRAM (or potentially more than it has) in different situations where it has the grunt to do so :)
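A rough illustration of the super-resolution/AA point from earlier in the thread: the render targets alone grow with pixel count and MSAA samples, regardless of how fast the GPU is. This counts only one colour target and one depth target, ignores compression, and the numbers are ballpark:

```python
def framebuffer_mb(width, height, msaa_samples=1, bytes_per_pixel=4):
    # One RGBA8 colour target plus one 32-bit depth/stencil target,
    # both multisampled. Real engines add more targets on top.
    color = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples
    return (color + depth) / 2**20

print(framebuffer_mb(1920, 1080))                   # ~15.8 MB
print(framebuffer_mb(2560, 1600, msaa_samples=4))   # ~125 MB
```

So the same card needs roughly 8x the framebuffer memory at 2560x1600 with 4x MSAA as at plain 1080p, which is why resolution and AA, not grunt, decide whether the VRAM ceiling matters.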
 
Arguing against more VRAM for less money is what makes no sense; maybe that's why Nvidia feel they can get away with less for more :p

EVGA WOULD say it themselves, they pay for the ICs!!!!!!

Sorry, I could not resist that :)

To be fair to EVGA it's only what has since been proven by actual end users.

Ideal world I'd like more VRAM but it wouldn't make any difference :).

You can't "match the grunt" of the GPU with VRAM; it's not as simple as that and doesn't work like that. A GPU can use lots of VRAM (or potentially more than it has) in different situations where it has the grunt to do so :)

Try telling that to the numerous people who have done so on this forum :rolleyes:
 