GTX 480 Faster than GTX 295??????

I have 1 32" 1080p set up, but even with Quad Sli with AA forced on I get a lot of slowdown!

Maybe I should try 3 single cards this time.

I am also surprised that you are having problems with that setup, as a pair of cards should be able to play all current games. Could you tell us the rest of your system? The problem you are having could be due to other factors.
 
If the 480 is equal to or just a bit slower than the 295, then having three 480s would be roughly 50% better than two 295s.
 
At least on average. I'd have thought in a lot of cases, assuming a decently high screen res, you'd see more like 80-90%.
 
I have 1 32" 1080p set up, but even with Quad Sli with AA forced on I get a lot of slowdown!

Maybe I should try 3 single cards this time.

Have you ever thought the CPU might be bottlenecking you?
Also, it may not be faster than the dual-GPU 295, but it's definitely faster than the 285.
 
It's almost impossible to say how much difference you would see, because multi-GPU scaling is such a mixed bag. Once you move past dual GPUs, scaling can get pretty poor TBH.

The GTX480 is going up against the GTX285 and the HD5870 / its replacement.
 
It's almost impossible to say how much difference you would see, because multi-GPU scaling is such a mixed bag. Once you move past dual GPUs, scaling can get pretty poor TBH.

The GTX480 is going up against the GTX285 and the HD5870 / its replacement.

As a rule of thumb - assuming optimal resolution and not one of those rare games that can fully max out the potential of AFR - I usually reckon on average gains in an average game of:

Card 1 100%
Card 2 +60%
Card 3 +30%
Card 4 +10%

Balanced out by the games that either suck with multi-GPU or do very well with 3-4 card multi-GPU.
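
To put rough numbers on that rule of thumb, here's a minimal sketch (Python, purely illustrative - the per-card gains are just the figures above, not measured data) of what each extra card buys you:

Code:
# Rule-of-thumb AFR scaling: baseline card plus diminishing add-ons.
GAINS = [1.00, 0.60, 0.30, 0.10]  # card 1, then each extra card's contribution

def relative_performance(n_cards: int) -> float:
    """Total performance of n cards relative to a single card."""
    return sum(GAINS[:n_cards])

for n in range(1, 5):
    print(f"{n} card(s): {relative_performance(n):.2f}x a single card")

# How much the third and fourth cards add over a dual setup:
print(f"3 vs 2 cards: +{relative_performance(3) / relative_performance(2) - 1:.0%}")
print(f"4 vs 2 cards: +{relative_performance(4) / relative_performance(2) - 1:.0%}")

By those figures three cards are 90% faster than one, but only ~19% faster than two, and even a fourth card only gets you ~25% over a dual setup - which is why going past two cards is such marginal value.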
 
Yes, you are right Rroff, that's the kind of level I am getting - sometimes better, and a lot of the time the same.

MW2 stutters on a few levels with Quad SLI, but less so with one GTX 295.
Batman's PhysX works very badly with Quad SLI - I ended up using one GTX 295, half of which is effectively a GTX 275!
Crysis does benefit from more cards, it seems, but even that is a hog of a game!
 
You'd probably be better off dropping down to a single GTX 295 and setting max pre-render to 2. Ironically for an nVidia headline title, Batman AA doesn't make the best use of SLI.
 
As a rule of thumb - assuming optimal resolution and not one of those rare games that can fully max out the potential of AFR - I usually reckon on average gains in an average game of:

Card 1 100%
Card 2 +60%
Card 3 +30%
Card 4 +10%

Balanced out by the games that either suck with multi-GPU or do very well with 3-4 card multi-GPU.

Are three GPUs 90% faster than one? Are four 40% faster than two? How many games support more than two GPUs, and how many only use one? It's all just guessing really.
 
CPU bottleneck, especially at the res you are running. In fact, the overhead of running four cards effectively in SLI can make four cards slower than three.

http://www.benchmarkextreme.com/Articles/I7 920 Bottleneck Analysis/P2.html

Notice that in some games at 1920x1200, even tri-SLI 285s only give you a 10% gain over dual GTX 285s, and that is with your CPU at 4.2GHz.

I'm sure there was another review somewhere which showed you need your CPU at 5GHz+ to unleash the full power of two GTX 295s, but I can't find it at the minute.

http://www.tomshardware.com/reviews/geforce-gtx-295,2123-7.html

GTX 295s in SLI lose to GTX 280 tri-SLI there.

TBH, GTX 295s in SLI are only worth it if gaming at very high res. At your res it's a 10% gain at best, or even a drop in framerates.
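
To see how that can happen, here's a toy frame-time model (my own illustration, not from the linked reviews, with made-up numbers): each frame costs some CPU work plus GPU work split across the cards via AFR, and every extra card adds a little driver/sync cost on the CPU side. Past a certain point the CPU side becomes the limit and an extra card makes things worse:

Code:
# Toy AFR throughput model - hypothetical per-frame costs, purely illustrative.
CPU_MS = 10.0   # CPU cost per frame (game logic, draw calls)
GPU_MS = 40.0   # GPU cost per frame on a single card
SYNC_MS = 1.5   # extra CPU/driver cost per additional card

def fps(n_cards: int) -> float:
    cpu = CPU_MS + SYNC_MS * (n_cards - 1)  # CPU side grows with card count
    gpu = GPU_MS / n_cards                  # GPU side shrinks with card count
    return 1000.0 / max(cpu, gpu)           # the slower side sets the frame rate

for n in range(1, 5):
    print(f"{n} card(s): {fps(n):.0f} fps")

With these numbers you get roughly 25, 50, 75 and then 69 fps - the fourth card actually loses frames, exactly the pattern described above. A faster CPU (smaller CPU_MS) pushes that crossover point out, which is why those reviews point at 4.2GHz and even 5GHz+ overclocks to feed two GTX 295s.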
 
Tri-SLI GTX 480?! Just don't blame Nvidia when your rig is on fire lol... Seriously, tri-SLI drivers have never been great, and it's overkill for most things (other than Crysis and ArmA 2) anyway.
 
I have 1 32" 1080p set up, but even with Quad Sli with AA forced on I get a lot of slowdown!

Maybe I should try 3 single cards this time.

32-inch 1080p televisions are still 1920x1080 resolution as far as I'm aware.

Meaning if you're having slowdown using 2x 295s at that res, even with everything on max, there is something very wrong with your PC.

As others have said, it could be a bottleneck due to the CPU, or possibly the type of PCIe slot you are using.

With a 5850 (a lot slower than 2x 295s?) I can use a resolution of 5040x1080 with everything on max, full AA, everything I can get my mouse on - and get no slowdown at all in games like AvP, Bad Company 2, Batman AA etc.

So you're sitting on enough graphics power to do way more than you need - as someone else said, that's the sort of graphics power you need for 3x 30-inch monitors!
 
This is what I'm worried about - the GTX 480 should be a little bit faster than the GTX 295! (I hope!)

It is a 'little' faster if this has any truth in it.

benchmarktotals.jpg

Also

cjgd1.jpg

http://img686.imageshack.us/img686/6831/vei74ghz58701gb8501200c.jpg
 
Also, in the past, say when the 8800GTX came out, SLI/Crossfire were pretty crap, so it's no wonder the card beat them (e.g. the 7950GX2).

It didn't - X1950 XTs in Crossfire beat the 8800GTX in the vast majority of games. It ONLY beat the 7950GX2 for one very simple reason: it was 256MB per core, and the 8800GTX got a MASSIVE bump in memory. It wasn't that much faster until the res knocked out the performance of the 7950GX2, at which point it dropped 30-40% below the 8800GTX, which is generally what happens to any and all cards when they become memory-limited versus a card that isn't.

A single X1950 XT could quite comfortably beat an 8800GTS 320MB in almost any semi-high-res situation, and that's with 256MB of memory; Nvidia have always been light years behind in memory usage. The 8800GTS comfortably spanked a single X1950 XT when it wasn't memory-limited.

The 8800GTX was no bigger a jump in performance than any other generation in the past 7-8 years. It just managed to go for three times the memory, with the last gen mostly based on 256MB cards, just as 512MB cards were really becoming required for higher resolutions.
 
The 7950GX2 was 512MB per core, and while X1950 XTs in Crossfire could beat the 8800GTX in games that were current around the time of release, in any more recent title the 8800GTX completely decimates it.
 
It is a 'little' faster if this has any truth in it.

benchmarktotals.jpg

I've yet to run Vantage on an Nvidia card. Don't they get a massive boost from the CPU scores and from the use of PhysX? I'm honestly not sure which specific tests it touches or how much it can affect results.

In other words, if it's marginally beating a 5870, how much of the extra score is down to the bonus score PhysX gets them?

Does Vantage score higher on a 5870 with some kind of Nvidia PhysX going on alongside it?

Gah, I've not really run any synthetic benchmarks for years now; in fact I've barely run any game-based benchmarks either.
 
If they are comparing with an ATI card, they would probably have been running CPU PhysX for both - otherwise you'd see a massive performance increase, not just a narrow advantage.
 
The 7950GX2 was 512MB per core, and while X1950 XTs in Crossfire could beat the 8800GTX in games that were current around the time of release, in any more recent title the 8800GTX completely decimates it.

http://www.bit-tech.net/hardware/graphics/2007/05/02/nvidia_geforce_8800_ultra/8

http://www.bit-tech.net/hardware/graphics/2007/05/02/nvidia_geforce_8800_ultra/9

Considering I can't, at a quick glance, even find Crysis benchmarks easily, because the game came out so long after the X1950 series, it's a bit of a bizarre game to claim as a performance guide.

Throughout most of that review the 8800GTX simply doesn't get more than around 50% ahead, sometimes less, and the X1950 XTX actually manages to beat the 8800GTS 640MB quite a few times - and that's not even in Crossfire.

To pretend the 8800GTX was some absolutely massive performance jump is insane. It did very well compared to a lower-memory 7900GTX, but was helped by the 7900 series not being great, and as you can see the X1950 is quite comfortably ahead of the 7900, often by massive amounts.

In CoD 2 and Prey, the latter an Nvidia-favouring game, they don't come close to doubling ATI's previous gen; at the more reasonable resolutions of several years ago, 1600x1200 and even 1920x1200, it's barely 50% faster, and that's the Ultra version.

It didn't come anywhere near close to smashing the X1950 XTs in Crossfire, and no, in reality, every single time I check up on this and provide links, Crossfire far more often than not would beat an 8800GTX. Then another few months later you'll claim, despite having had it proved to you time and time again, that the 8800GTX easily beat the X1950s in Crossfire.

http://www.anandtech.com/video/showdoc.aspx?i=2870&p=22

Yes, that's it beating the 8800GTX by 50% - admittedly its best game by far, and also at the highest resolution and IQ, no less. But it wins the majority of games there, with Oblivion (always better on Nvidia than ATI, always has been) and one other game going to Nvidia, and the rest to X1950 Crossfire. Every single review I see says that.

The 8800GTX is nowhere near as good as people think it was. Its biggest improvement was moving from the 7800/7900 gen, where the majority of cards sold were 256MB models, to a generation where most people spent the marginal extra and went for the 640MB model and above.
 
You have to look at the bigger picture... sure, in old-style raster-heavy/simple-shader games it was faster, but a few years on, with its unified shader architecture, the 8800GTX can still cut it, and in any game that uses more modern shader features it will pull away.
 