The Myths Of Graphics Card Performance: Debunked

Ah yes, the age-old performance envelope. Take note, Lt Matt. No more graph wars. :D

Ambient temps are interesting... 43°C for an overclocked GTX 690. I'd be lucky to see 40°C with three Ti's.
 
It's been obvious for years that a lower-spec CPU/GPU overclocked is never as good long term as a higher-priced model. They are speed-binned for a reason, usually...
 
I think the author of the article does not know much about Skyrim modding - he is running the official HD texture pack only. The other popular high-res texture packs on NMM take up more VRAM. Maybe he should try other packs, including the additional character and landscape texture packs, which use at least 2K textures (a number use 4K textures). Running at 1680x1050 I am hitting at least 1.5GB with low amounts of AA on my GTX 660 (at least that is what I think it was - might need to fire it up again and have a looksy), and I am using the official HD texture pack as it uses less VRAM. People tend to run dozens of mods (me included), and VRAM usage can be quite high because the third-party mods are poorly optimised.

Modding is why people really need a lot of VRAM, and most games are previous-generation console ports with poor textures, so I do wonder what will happen in the next year or so. The Xbox 360 and PS3, with their limited amount of RAM, really held back texture quality on many multi-platform games.

IMHO, 2GB is the minimum any reasonably powerful gaming card should have, and as time progresses I expect we will start to see more 3GB and 4GB cards from AMD and Nvidia once the next generation is launched.
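
If anyone wants to put numbers on what their own mod list actually uses, here's a rough sketch of how I'd log it - Python, and it assumes an NVIDIA card with nvidia-smi on the PATH (the query flags are standard nvidia-smi ones, nothing from the article):

```python
# Poll VRAM usage every few seconds while the modded game runs and keep the peak.
# Assumes nvidia-smi is installed and on the PATH (it ships with the NVIDIA driver).
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())   # first line = first GPU

if __name__ == "__main__":
    peak = 0
    for _ in range(60):                        # roughly five minutes at 5 s per sample
        used = vram_used_mib()
        peak = max(peak, used)
        print(f"current: {used} MiB, peak so far: {peak} MiB")
        time.sleep(5)
```

Run it in the background, play for a bit, and the peak figure tells you whether a 2GB card is actually being filled.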
 
I don't get the fuss with all the modded Skyrim nonsense and VRAM usage. I've got all the 2K and a few 4K texture packs, 4x MSAA and SSAO via an ENB preset, all the whistles and bells, indirect lighting etc., and at 1440p my usage barely tops 2.5GB.

So at 1080p there shouldn't be too much of an issue with texture packs on a 2GB card. I think some people have a few conflicting mods, as that is what brought my VRAM usage to silly levels previously.
 
When Bethesda releases Fallout 4 it is going to be even worse for VRAM usage, especially with mods for that. People use mods with Skyrim to make the game look the way they want, and it's far from nonsense. You cannot predict how much these mods use, especially since they are changed often when the modders add new features.

It's the reason why the GTX 780 and AMD's high-end cards have 3GB+ of VRAM, and why people buy cards with that amount of VRAM. Words are cheap; actions speak much louder.

People said 256MB of VRAM was enough when 512MB cards were available - 8800GT 256MB, anyone? People said 512MB was enough when 1GB cards were available - HD4870 512MB, anyone? GTX 580 1.5GB against GTX 580 3GB, anyone? GTX 570 1.28GB vs GTX 580 1.5GB in BF3?

With the next-generation consoles increasing the texture quality of multi-platform games, anyone can see where this is going when the new consoles have 16 times the RAM of the old ones. Come back in 12 months' time and see what happens. This is why so many multi-platform games, and games based on such engines, have rubbish textures - they are optimised to run on consoles first.

Any graphics card with a half-decent GPU should have at least 2GB, especially if you don't upgrade every six months.

Edit!!

The other aspect people are ignoring is resolution. Higher-resolution monitors are getting cheaper and cheaper. 2560x1600 and 2560x1440 monitors used to cost around £1000 only 5 years ago. You can get 2560x1440 and 2560x1080 monitors for under £400 now, and 1920x1080 monitors can cost as little as £100, meaning surround gaming is cheaper than ever. Those Korean 27" 2560x1440 monitors are under £300.
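
Just to put rough numbers on the resolution point, here's a back-of-the-envelope sketch - Python, with made-up buffer-format assumptions, and it only counts the render targets (textures, which are the real VRAM hog with modded games, are ignored entirely):

```python
# Rough scaling of render-target memory with resolution and MSAA.
# Illustrative only: ignores textures, geometry, compression and driver overhead.
def framebuffer_mib(width, height, msaa=1, bytes_per_pixel=4):
    colour  = width * height * bytes_per_pixel * msaa   # multisampled colour buffer
    depth   = width * height * 4 * msaa                 # assumed 32-bit depth/stencil per sample
    resolve = width * height * bytes_per_pixel          # resolved back buffer
    return (colour + depth + resolve) / 2**20

for w, h in [(1920, 1080), (2560, 1440), (2560, 1600)]:
    print(f"{w}x{h} with 4x MSAA: ~{framebuffer_mib(w, h, msaa=4):.0f} MiB of render targets")
```

The absolute numbers are small next to what texture packs eat, but the scaling is the point: everything tied to the frame grows with pixel count, and surround or 1440p-plus setups multiply that again.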
 
Still living in the past? Lots of games require much more memory for 3D.

Running Skyrim or Crysis 3 with decent 3D equipment is fun for those who have enough memory to do so.
 
I like the dB breakdown. It shows what a load of tosh it is to buy a fan on the value of its dB rating, which is something I have never understood when people do it - buying fan A instead of fan B, which performs 50% better but is 5dB louder. In reality, both fans will be masked by other noises, even room ambience.
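
For anyone curious, the arithmetic behind the masking point is simple enough - a quick sketch in Python, where the 35dB room figure is just an assumed example:

```python
# Incoherent noise sources add on a power basis, not arithmetically:
# L_total = 10 * log10(sum(10^(L_i / 10))).
import math

def combine_db(*levels):
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

room = 35                      # assumed ambient room noise in dB(A), illustrative only
for fan in (25, 30):
    print(f"{fan} dB fan + {room} dB room = {combine_db(fan, room):.1f} dB total")
# -> about 35.4 vs 36.2 dB: the fan that is 5 dB 'louder' on the box
#    changes what you actually hear by well under 1 dB.
```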
 
A very interesting read, thanks.

And note to self.

There has been much debate about how hot is too hot for a GPU. However, higher temperatures, if they're tolerated by the equipment, are actually desirable as they result in better heat dissipation overall (as the difference with ambient temperature, and thus amount of heat that can be transferred, is higher). At least from a technical perspective, AMD's frustration over reactions to the Hawaii GPU's thermal ceiling is understandable. There are no long-term studies that I'm aware of speaking to the viability of given temperature set points. From my own experiences with device stability, I have to rely on manufacturer specifications.
Fine, but that reference cooler was also stopping it from running at peak clocks consistently.
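
To be fair, the delta-T part of that quote is just Newton's law of cooling. A minimal sketch, with made-up temperatures and a fixed cooler assumed (constant h*A in Q = h*A*(T_die - T_ambient)):

```python
# With the cooler and fan speed held constant, the heat it can move scales
# with the difference between die and ambient temperature. Example numbers only.
def relative_heat_flow(t_die, t_ambient=25.0):
    return t_die - t_ambient      # proportional to Q = h*A*(T_die - T_ambient)

q_cool = relative_heat_flow(70)
q_hot  = relative_heat_flow(95)
print(f"a 95C die sheds ~{q_hot / q_cool:.2f}x the heat of a 70C die at the same fan speed")
```

Which is the article's point: if the silicon tolerates it, letting the die sit hotter means the same cooler can dump more watts, or the same watts more quietly.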
 
Some good analysis there, apart from the bit on the 'VSync' page, which referred extensively to ghosting and based the explanation on what a static photograph of BioShock shows ('ghosting'). That is very misleading. The concept of motion blur caused by eye movement is an absolutely essential part of why increasing the refresh rate can be beneficial from a visual-fluidity perspective. All that talk of '2ms' vs. '8ms' just muddies the water. Some further and slightly more technical reading for you - http://forums.overclockers.co.uk/showthread.php?t=18566417. The section on input lag sort of missed the mark as well, brushing over its importance. The time taken to react physically to a stimulus (the fighter pilot example) and being able to detect something visually (i.e. a certain disconnect between input and screen output) are two different things.
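
To put a rough number on the eye-tracking blur point: on a plain sample-and-hold panel the perceived smear is roughly how fast your eye is tracking multiplied by how long each frame is held. A quick sketch, where the 960 px/s pan speed is just an assumed example:

```python
# Sample-and-hold motion blur: blur width ~= tracking speed * frame persistence.
# Assumes full persistence (no strobing or backlight blanking).
def blur_px(speed_px_per_s, refresh_hz):
    persistence_s = 1.0 / refresh_hz
    return speed_px_per_s * persistence_s

speed = 960                                  # example: an object panning at 960 px/s
for hz in (60, 120, 144):
    print(f"{hz} Hz: ~{blur_px(speed, hz):.1f} px of smear while eye-tracking")
```

That halving of smear from 60Hz to 120Hz is visible regardless of any '2ms vs 8ms' pixel-response talk, which is why conflating the two muddies the water.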
 
Interesting read. The overclocking of the 690 producing a lower frame rate was interesting.

As were his opinions on input lag and whether it really mattered.
 
Interesting read. The overclocking of the 690 producing a lower frame rate was interesting.

As were his opinions on input lag and whether it really mattered.

I think the article is rubbish

I currently own 290Xs, Titans and GTX 690s

On air with limited fan speeds:

The 290Xs do throttle a lot

The Titans throttle a bit

The GTX 690s hardly throttle at all, even at 90°C

I fixed the 290Xs and Titans with water cooling but have never got round to the GTX 690s as they don't have a problem.

With no extra volts the GTX 690s are also the best overclockers; they have a stock clock of 915MHz and can boost to over 1260MHz. :D
 
A very interesting read, thanks.

And note to self.

Fine, but that reference cooler was also stopping it from running at peak clocks consistently.

I don't really buy that quoted bit (well, the science is correct but doesn't really apply) - my 780 with the WF3 runs very cool (as long as you have the airflow for it) because it has such good heat dissipation. It's still generating the same amount of heat, but it's being transferred away from the core very efficiently. Granted, IIRC heat pipes, etc. work more efficiently as the heat increases until you hit their saturation point, but that doesn't really help if the core itself is still increasing in temperature.
 
I think the article is rubbish

I currently own 290Xs, Titans and GTX 690s

On air with limited fan speeds:

The 290Xs do throttle a lot

The Titans throttle a bit

The GTX 690s hardly throttle at all, even at 90°C

I fixed the 290Xs and Titans with water cooling but have never got round to the GTX 690s as they don't have a problem.

With no extra volts the GTX 690s are also the best overclockers; they have a stock clock of 915MHz and can boost to over 1260MHz. :D

Maybe it was just a bit of a dud 690. I'm not all too sure about their comments on higher temps being desirable. High temps are never desirable - the lower the operating temperatures, the better; you can't mess with that lol.
 
Maybe it was just a bit of a dud 690. I'm not all too sure about their comments on higher temps being desirable. High temps are never desirable - the lower the operating temperatures, the better; you can't mess with that lol.

The GTX 690s behave differently to the rest of the Kepler cards; as they are dual-GPU cards they are always going to run hotter than their single-GPU stablemates. When they reach 80°C they will lose 13MHz of the max possible boost, and the same again at 90°C, but what they won't do is throttle like a 290X when it goes over temp and lose huge amounts of MHz. I quite often see my GTX 690s going over 1250MHz on boost, and it is the hottest core (the primary one) running the highest overclock. GTX 690s are designed to run that hot and, unlike the 290Xs, don't nosedive on the MHz.
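
Roughly, as a toy model - Python, and this is just my reading of the behaviour, not NVIDIA's actual GPU Boost algorithm; the 1163MHz figure is only an example boost clock:

```python
# Toy model of the bin behaviour described above: one 13 MHz boost bin comes off
# at 80C and another at 90C, rather than the clock falling off a cliff.
def gtx690_boost(max_boost_mhz, temp_c):
    clock = max_boost_mhz
    if temp_c >= 80:
        clock -= 13
    if temp_c >= 90:
        clock -= 13
    return clock

for temp in (75, 82, 91):
    print(f"{temp}C -> {gtx690_boost(1163, temp)} MHz")   # 1163 MHz is an example, not a spec
```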

I often thought about watercooling my GTX 690s, as they were often fighting it out with the top watercooled quad-SLI GTX 680s on the benches and the few extra MHz would have helped a lot.
 
Not sure what's going on with their GTX 690.

Mine throttles 13MHz at 80°C, but overclocked it sits just below that playing BF4 at 99% usage :/ I get 1137MHz on core 1 and 1150MHz on core 2, and at 100% fan speed it's quieter than a 7970 Matrix at 50% fan speed.

Actually, come to think of it, I have hit 82-85°C in the summer but it's still at 1137MHz :/

Would definitely want to know how they overclocked their card and whether or not they had it up against a radiator at the time! Though it must just be the GeForce GTX 690 fan at 61%, which is damn near silent. 40dB sounds loud, surely?...
 