
My mini review - GTX 280 vastly better than 9800GX2 for high res gaming

Sooo I bought a GTX 280 when I already had a 9800GX2, pretty crazy you might think... but when you have a 2560x1600 res monitor the GX2 is an uber turd, due to the memory issues below. Even at 1920x1200 it sometimes doesn't cut it.

This post is about the stuff you don't see in reviews with performance numbers and charts... and I highly recommend you take a moment to read this if you have a 1920x1200 or higher res monitor and are thinking of getting a 512MB graphics card. Plus it took ages to do all this!

But first, here are my two cards:

280GX2.jpg


The GTX 280 is considerably lighter than the GX2.




Things worth mentioning about the 280 with its 1GB memory, over a GX2 and other cards with 512MB of usable VRAM

> Certain games now have quicker loading times, especially Unreal Engine 3 based games (like Mass Effect, about 50% faster loading)

> More complex games with high res textures have less random pausing from things like hard drive paging and so on (Oblivion, Mass Effect, Crysis and so on...)

> With the 280's 1GB, texture-heavy games with AA/AF applied at very high resolutions do not run into the 512MB memory limit and/or 256-bit bus bandwidth issues, which cripple frame rates severely.

> Even though my 3DMark Vantage score has only increased slightly on the standard Performance preset (big increase on Extreme, see below), some scenes in the first two big tests of the benchmark, Jane Nash and New Calico, clearly run better in places; some areas run at almost double the frame rate.

The GX2 might also have 1GB, but only 512MB is usable for games because of how SLI works (each GPU keeps its own copy of everything), but I'm sure most of you already know this.

Because of these memory issues with the GX2's 512MB usable VRAM over a 256-bit memory bus with GDDR3, high res gaming is not a good idea! See the tests below for why in more detail. If possible stay away from the GX2 or any other 512MB cards with monitors of 1920x1200 res and higher. I'd like to make it clear that this issue is not GX2 specific, and has nothing to do with SLI; it will be exactly the same with any other card that has 512MB VRAM, regardless of setup.
The GX2 is just a good example to use for these tests, as it has around equal GPU power to the GTX 280 and so should be getting around the same frame rates at high res, but is completely let down by the amount of usable memory.
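To give a rough idea of why 512MB runs out so fast at 2560x1600, here's a back-of-envelope sketch I knocked up. It's a simplified model (32-bit colour plus 32-bit depth/stencil per MSAA sample, plus resolved front/back buffers); real drivers allocate things differently, and all the game's textures come on top of this:

```python
# Rough estimate of render-target memory at a given res with MSAA.
# Simplified model for illustration only: RGBA8 colour + D24S8 depth
# per sample, plus a resolved front and back buffer.

def render_target_mb(width, height, msaa_samples=1):
    bytes_per_sample = 4 + 4                 # 32-bit colour + 32-bit depth/stencil
    sampled = width * height * msaa_samples * bytes_per_sample
    resolved = width * height * 4 * 2        # resolved front + back buffer
    return (sampled + resolved) / (1024 ** 2)

for (w, h), aa in [((1920, 1200), 4), ((2560, 1600), 4)]:
    print(f"{w}x{h} @ {aa}x MSAA: ~{render_target_mb(w, h, aa):.0f} MB before textures")
```

So at 2560x1600 with 4x AA the buffers alone eat roughly 150MB+ of a 512MB card before a single texture loads, which is why AA at that res tips it over the edge.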





A comparison of games at high res between the two cards:
These are my own tests. The GX2 should be able to run all these games just as well as the 280, as it has the GPU power, but the memory issues cripple it in all the following tests...
When you run into this memory problem you can normally tell easily, as you go from great frames per second to unplayable or a total slideshow.

2560x1600 tests


Enemy Territory: Quake Wars

2560x1600 + maxed out settings + Vsync

9800GX2 = Unplayable, nearly smooth but not quite, with random pauses because there's not enough VRAM for the res plus the large textures used.

GTX 280 = Playable with 4X AA + 16x AF


GRID

2560x1600 + maxed out settings + Vsync

9800GX2 = Complete slideshow, single digit FPS, even on the menu screens.

GTX 280 = Playable with 4x AA + 16x AF


Unreal Tournament 3

2560x1600 + maxed out settings + Vsync

9800GX2 = Unplayable, and with any AA/AF turned on even the menu screen is a slide show

GTX 280 = Playable with 2x AA + 16x AF


Mass Effect

2560x1600 + maxed out settings + Vsync

9800GX2 = Completely unplayable

GTX 280 = Playable with 2x AA + 16x AF


Oblivion + High res texture mods

2560x1600 + maxed out settings + Vsync

9800GX2 = Unplayable (under 30FPS), random stuttering, especially in open areas.

GTX 280 = Playable with 2x AA + 16x AF


COD 4

2560x1600 + maxed out settings + Vsync

9800GX2 = Playable with no AA, 4x AF (a few rare micro stutters here and there, but nothing big)

GTX 280 = Playable with 4x AA + 16x AF



Source Engine based games

2560x1600 + maxed out settings + Vsync

9800GX2 :
Portal = Playable with 2x AA + 16x AF
Team Fortress 2 = Playable with 2x AA + 8x AF
CS:S = Playable with 4x AA + 16x AF
HL2 : EP2 = Playable with 2x AA + 8x AF
Dreamfall: TLJ = Playable with 4x AA + 16x AF

GTX 280 :
Portal = Playable with 8x AA + 16x AF
Team Fortress 2 = Playable with 16x AA + 16x AF :cool:
CS:S = Playable with 16x AA + 16x AF + SuperSampling
HL2 : EP2 = Playable with 8x AA + 16x AF
Dreamfall: TLJ = Playable with 16x AA + 16x AF + SuperSampling :cool:





1920x1200 resolution game tests
At this res the 512MB limit is not as severe; with the games out at the moment you will need to have AA and AF turned on to run into this problem.


GRID

1920x1200 + maxed out settings + Vsync

9800GX2 = Playable with 2x AA + 16x AF

GTX 280 = Playable with 4x or 16x AA + 16x AF



Mass Effect

1920x1200 + maxed out settings + Vsync

9800GX2 = Playable with 16x AF

GTX 280 = Playable with 4x or 16x AA + 16x AF

Some other games where the 512MB limit kicks in at 1920x1200, and you enter the realm of the slideshow, are:
UT3
Oblivion + High res texture mods
GRID
Bioshock
Crysis (although it's harder to tell with this game as it already cripples the GPU with AA at this res anyway)
- AA/AF needs to be enabled (for now) at this res to run into the problem; I've not played a game so far that will do this unless at least 2x AA + 16x AF is enabled @ 1920x1200.





Other stuff


3D Mark Vantage : 'Extreme' setting
(Extreme is 1920x1200 res)

9800GX2 : Score = 3295
GTX 280 : Score = 5410


Nvidia's GTX 200 series Medusa Demo
(This demo makes use of the 280's and 260's extra memory)

1680x1050 + maxed out settings + 4x AA

9800GX2 = Average of around 15 FPS

GTX 280 = Average of over 30 FPS





Conclusion:

Basically if you have a 1920x1200 res monitor or higher, and/or like to have AA/AF enabled at higher res, then stay away from the 9800GX2!! Or any other 512MB cards, especially ones with a 256-bit memory bus (unless they use GDDR5 on that 256-bit bus). 512MB does not cut it these days for high res monitors, and it will only continue to get worse with newer games, until it starts affecting 1600x1200 / 1680x1050 resolutions and so on.
This is something that very few reviews mention, and if I had known about this before I would not have bothered with the GX2 on my monitor :mad:

> All these tests were done by me on 2 different installs of Vista Ultimate 64-Bit, both on the same system (in my sig).
> The GX2 tests were done using many drivers over a period of months; all these different drivers did very little or nothing to improve these memory issues.


A funny thing worth mentioning is that with the GX2, when not in SLI mode and gaming at high res, it will still run most new games almost as well as when it is in SLI, because once you hit this 512MB memory limit, enabling SLI has no effect on performance at all.
 
Good review and valid points. The only thing wrong is your comment about the 256-bit memory bus.

Yes, when running with GDDR3, 256-bit is limiting and you need 512-bit for high res, but with GDDR5, 256-bit is fine.

Just wanted to point that out, because a blanket statement that all 256-bit is bad for high res is not true.
Cheers! And I forgot about GDDR5, I will correct the post for that...
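For anyone curious, the bandwidth difference is just arithmetic: peak bandwidth = bus width in bytes x effective transfer rate. A rough sketch with approximate launch clocks (the MT/s figures are from memory, so double-check them):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Clock figures are approximate launch specs, for illustration only.

def bandwidth_gbs(bus_bits, mega_transfers_per_sec):
    return bus_bits / 8 * mega_transfers_per_sec * 1e6 / 1e9

configs = {
    "9800GX2 (256-bit GDDR3, ~2000 MT/s, per GPU)": (256, 2000),
    "GTX 280 (512-bit GDDR3, ~2214 MT/s)":          (512, 2214),
    "HD 4870 (256-bit GDDR5, ~3600 MT/s)":          (256, 3600),
}
for name, (bits, rate) in configs.items():
    print(f"{name}: {bandwidth_gbs(bits, rate):.1f} GB/s")
```

Which is the point above in numbers: GDDR5's higher transfer rate lets a 256-bit bus get into the same ballpark as 512-bit GDDR3.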
 
Damn you for this thread, I have succumbed to the temptation! :(

Should (hopefully) be a good upgrade from the 8800GTX since I game at 1920x1200.

Well the 8800GTX does have a 512-bit bus and 768MB memory, so it would be pretty hard to run into the limiting memory/bandwidth issues @ 1920x1200 (for now) that I mentioned with 512MB cards.
I had an 8800GTX before the 9800GX2... and that was hardly an upgrade at all at the high resolutions I play at! My 280 is a good upgrade over both.
 
Ahh yeah, 384-Bit!

80 year olds have better memory than me :rolleyes: if only you could upgrade brain memory...
 
Thanks Mr B.

Did you notice microstuttering when playing Crysis with the GX2? I had SLI 8800 GTs for a short while but changed to the 8800GTS because it was a lot smoother overall during gameplay and didn't suffer from microstuttering, due to bandwidth issues I'm guessing.

Yeah I did notice this with the GX2 and Crysis; there was also a weird issue I had where, after playing Crysis for a while, the performance would continue to get worse until the stuttering was bad and the FPS dropped.

Just a small thing, but is it really that much faster at high resolutions, to justify the £150 extra you'd have to put to it?

Yes it's WAY faster at 2560x1600. I mean, going from not being able to play some games at that res on the GX2 with no AA/AF, to being able to run them with 2x/4x AA + 16x AF on the GTX 280... big difference. Obviously though, if you had something like a 1680x1050 monitor then the GX2 would be the better choice.


Me too, not had any problems with the BFG GX2 at 1920x1200.

Now when I had a 3870X2, that was a stutter fest.
It's still quite hard to run into it at 1920x1200; you need to turn up the AA and AF enough to hit it at this res. But some games where the 512MB memory limit kicks in with enough AA/AF are:

GRID
UT3
Mass Effect
Oblivion + High res texture mods
ET:QW
Crysis (although it's harder to tell with this game as it already cripples the GPU with AA at this res anyway)

Edit: I'll just add this last bit to my original post before more people say this :p
 
Nah, the 280 is a noisy *******. Only problem I have with mine.

When it's idle it's exactly the same as my GX2 was, which in turn was also the same as my 8800GTX.

But when you start a game the 280 makes more noise than other cards I've had in recent years (6800 Ultra, 7800GTX, 7900 Ultra, 8800GTX, 9800GX2), and after you've stopped playing a game the fan never spins back down to its original idle speed.

I have water cooling, so everything else is quiet and I can easily pick out the noise from the card. Guess I'll just get a block for it at some point...
 
The worst thing about the jump from 512MB to 1GB is the cost involved, especially with GDDR5: as with anything new it will be ridiculously expensive compared to older stuff. The reason the increased cost is such a pain is that you don't necessarily need 1GB if you run out of memory with a 512MB card; you might only need 513MB, or more likely somewhat higher, but I doubt anything comes close to using a full GB yet, so you essentially pay for wastage, which is a shame.

1GB at this time is overkill. I get that there are a few users with a 30" screen who it will help, but only in a small minority of games; increasing the price of the final card for the 99.9999999999% of users who don't need 1GB is fairly silly. At the end of the day graphics makers need to aim a range of products at the widest range of people and be affordable in each segment. If ATI went ahead with 1GB of GDDR5 on their 4870 no doubt it would increase the price significantly, when only a very small portion of buyers would see it used. Nvidia playing on epeen again. It's always a shame to lose out because you've got a better screen, but it's very silly to increase the cost so much. They probably should have stuck with the old 8800GTX setup with 768MB memory and a midrange bus for a much cheaper card and the best of both worlds, but then, Nvidia have always been dumb.

I disagree, I'd like to see more 1GB cards. It's not just the 512MB texture limit that affects high res monitors; games also load quicker and generally have less random stuttering. It's a better, smoother overall experience. ATI and NV should at least do more 1GB versions of the higher end cards, which after all are aimed at serious gamers who will likely have high res monitors anyway and use AA/AF. A lot of people would be willing to pay extra for 1GB, but they should also still keep the 512MB versions, so you at least have options.
And when buying a card it's best to think ahead, not buy for what's currently available, and in that case 1GB would be more useful for future games, as the 512MB memory limit will only get worse.
 
Would like to know that also, as some people say it does spin down and others say it doesn't. It was Tom's Hardware who pointed this out.

http://www.tomshardware.com/reviews/nvidia-gtx-280,1953-26.html
"During Windows startup, the GT200 fan was quiet (running at 516 rpm, or 30% of its maximum rate). Then, once a game was started, it suddenly turned into a washing machine, reaching a noise level that was frankly unbearable – especially the GTX 280......We should tell you, however, that our at-idle readings are taken after all our benchmarks have been run, after just a few minutes at idle. The problem is that the GTX 280 never really goes back to its minimum level......."


I assume all 280s are the same, whether made by Asus or whoever.

I've tried the EVGA Precision tool to turn the fan speed down, but it does not work with this card.

What other software is there that can change fan speed? I don't think the noise issue is a big one; the card does not get hot at idle compared to most high-end GPUs, so all that's needed is software to change the fan speed. Or it could just be a problem with current drivers... the fan really doesn't need to be spinning this fast considering the temp.
 
By the time the 4870X2 is out I wouldn't be surprised if NV had shrunk the 280 core enough to do a GX2 280 or something, which would likely be faster.

And the 4870X2 would also need 2GB memory so that games have 1GB usable VRAM... unless both cores share the same 1GB.
 
HardOCP also reported on microstuttering on the GX2; well, they didn't call it that, but they reported how COD4 slowed down in places due to lack of memory/bandwidth.

This is worrying
http://evga.com/forums/tm.asp?m=411571
The core is throttling while gaming, causing slowdown... one guy's core dropped down to 400MHz while playing Age of Conan. Looks like it's not a heat issue, and Nvidia's p-states will drop it to 300MHz or so on the desktop, so that's normal too. Maybe it's due to EVGA's Precision tool not playing well with the Nvidia control panel??

Mr. B, I take it you have not experienced this weird slowdown effect in gaming?

I can't say for sure that it happened, but it definitely seemed to have happened one time when I was getting lower FPS than usual in a game; but I had overclocked the card and memory highly, so I'm guessing it got too hot and throttled the clocks. It's never happened at stock or at the speed I'm running now: 660MHz core, 1335MHz shaders, and 2.5GHz memory.
When it happened to me the memory was at 2.6GHz and the shaders at 1450+MHz.

...and it's funny seeing people pay £500+ for pre-overclocked 280s when mine, at under £400, will overclock higher than every single one of the pre-overclocked 280s available on OCUK. L O ****** L :cool:


I have to take issue with that: Crysis performance is literally cut in half with SLI disabled on my system.
As I mentioned, you have to run into the 512MB limit before enabling SLI has no effect. With a 2560x1600 monitor it's very easy to do this. For example, GRID is unplayable on the GX2 at 2560x1600 without even any AA or AF, so enabling SLI does nothing at all. Sometimes I found it just makes the issue even worse.
 
Here's some interesting info...

I've just run 3DMark Vantage on the 'Extreme' setting, which is 1920x1200 res. Here are the scores between my 280 and GX2:

9800GX2 = 3295
GTX 280 = 5410

...Big difference. Again this looks like it's because of the 512MB memory limit on the GX2. I'll add it to my OP.
 
Did you use the same 177.39 drivers for both cards ?

I shall run the same test and give you my results soon

I used 177.35 for the 280; they're the latest WHQL drivers... do the 177.39s support the 280, and do they have PhysX support for this card too??

The GX2 used 175.16 WHQL drivers. I also tried it with the 175.70 BETA drivers; hardly any difference to the score, and the same with any previous driver versions.
Can't do any more testing with the GX2 as I sold it Tom|Nbk :D

There's no way a driver will get the GX2 near the 280 on the Extreme setting though, too big a difference. And I'm sure it's the 512MB limit kicking in @ 1920x1200 anyway.
 
Hurrah for Overclockers!

Just a quick thank you for helping me out in my hour of need. ;)

That ASUS card I bought from another not-well-known site didn't happen; they declined my order because I wanted it delivered to work, which I thought was reasonable considering I'm IN WORK during the daytime. Anyway! I bought a Tagan 700W modular from OcUK last night which was dispatched this morning. I rang up and spoke to a helpful geezer called Sam and tried to get the GTX280 dispatched with the PSU, but was too late... so I thought I would have to pay postage twice, but oh no! He let me off and sent it with postage free of charge. :)

Cool, let us know what you think when you have the card.
Went for the BFG GTX280 OC so roll on tomorrow!!

Noooo! Waste of money!...
i'm running at now - 660MHz core, 1335MHz shaders, and 2.5GHz memory.
When it happened with me the memory was at 2.6GHz and 1450+MHz shaders.

..and it's funny seeing people pay £500+ for pre-overclocked 280s when mine, that was under £400, will overclock higher than every single one of these pre-overclocked 280s that's available on OCUK. L O ****** L :cool:
 
:P I actually chose the BFG OC as I wanted a top brand really, preferably EVGA or BFG (ASUS would do) and the BFG was the cheapest and OcUK are pretty reliable to order from.

As long as I get 650MHz out of it I'm easy. :P

Brands rarely matter with these things, especially with the 280, as they all have the same Nvidia reference cooling and are manufactured in the same places. These companies just stick their name on 'em ;)

...some do have better warranties and technical support though. But in all the 15+ graphics cards I've had over the years I've never needed to use either.
 
Yeah, the whole point of this thread really was for me to point out the problem with 512MB cards and high res gaming. The GX2's issues are not because it runs in SLI;
the stuttering and memory issues are purely from not enough VRAM, which will affect any 512MB card.
 
I've always found those Anandtech benches extremely dodgy. For instance they run ET:QW @ 2560x1600 with 4x AA and get 62 FPS on the 9800GX2... but it's not even playable at those settings with that card. Even with no AA I used to get random stuttering in places with my GX2 @ 2560x1600, with any drivers, and other people had this too.

I've commented on these dodgy articles before, to try and get a reply from the author about his seemingly magical GX2/512MB cards that can run games at 2560x1600 with AA, but have never got a reply...

At least some other sites are starting to show the problems with 512MB at high res.

What res monitor do you have, Flanno?
 
Anyone with a GTX 280 care to comment on this :

FROM NVIDIA -
"quote:

There are 2 types of 6-pin to 8-pin power adapters.

One type converts from two 6-pin plugs to a single 8-pin plug. These adapters could damage your computer and should not be used under any circumstances.

The second type converts a single 6-pin plug to a single 8-pin plug. NVIDIA also advises against its use since many power supplies will not provide sufficient current over the 6-pin power cable. However, this type of adaptor could potentially support normal operation as long as the customer checks their PSU manuals and ensures that its 6-pin PCI-E rails can handle the same current rating as an 8-pin power cable, which is 150 watts.

The recommended solution is to use a power supply with native 8-pin power cables. "

Now as I understand it, for those of us with only 2 PCI-E connectors on our PSUs who don't want to go out and buy a new PSU, there are 2 adaptors provided: for the 6-pin power on the card you get a dual 4-pin molex to single 6-pin PCI-E adaptor, and for the 8-pin power on the card you get a dual 6-pin PCI-E to single 8-pin PCI-E adaptor. So why are the likes of BFG, EVGA etc. supplying these if Nvidia say they are dangerous to use?

I assume it's because a 6-pin PCI-E cable is rated for 75 watts, and the 8-pin (with 2 extra pins for ground) doubles that to 150W, so if you use the single 6-pin to 8-pin adaptor variety your PSU may not be able to provide that 150 watts down what is essentially a 6-pin cable. Fair enough for a single 6-pin to 8-pin cable. But why is there an issue using a dual 6-pin to single 8-pin? Is there excessive ripple on each 6-pin, possibly, which can damage hardware? I'd really love to know, as there's no way I'm buying a new PSU.
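The wattage maths for the single 6-pin to 8-pin case, as I understand it (spec limits only; the 236W TDP is Nvidia's published figure for the GTX 280, so treat this as a back-of-envelope sketch, not gospel):

```python
# PCI Express power budget sanity check.
# Spec limits: slot delivers 75 W, 6-pin cable 75 W, 8-pin cable 150 W.

PCIE_SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
GTX280_TDP_W = 236  # Nvidia's published TDP for the GTX 280

# Proper setup: native 6-pin + native 8-pin.
native = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W

# Single 6-pin -> 8-pin adaptor: the "8-pin" input is still fed by a
# cable only rated for 75 W, so the rated budget drops.
adapted = PCIE_SLOT_W + SIX_PIN_W + SIX_PIN_W

print(f"native budget: {native} W vs TDP {GTX280_TDP_W} W -> ok")
print(f"6->8 adaptor:  {adapted} W rated, but the card can draw up to "
      f"{GTX280_TDP_W} W -> that 6-pin cable can be pushed past its rating")
```

Which would explain Nvidia's warning about the single-adaptor type at least; the dual 6-pin to 8-pin danger they mention must be something else.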

Not heard that before...

Neither my GX2 nor my 280 came with an 8-pin adapter, so best to make sure one is definitely included. I've not heard of anyone who's had trouble with these adapters as long as the power supply is good enough.

When I first got my GX2 I bought a Thermaltake Tough Power 1200W Modular PSU, because I'm p*ssed off with upgrading my PSU all the time.

Maybe start a thread about this just to make sure an 8-pin adapter is ok?
 
Just curious..

Does anyone know if you'd run into the same bandwidth issues running two cards in SLi, as well as the GX2 2-cards-one-slot solution?

The setup does not matter. If the cards only have a 256-bit bus with GDDR3 then you'll run into the problem at higher resolutions. Same with 512MB of memory not being enough for high res. SLI in any form will not affect these problems in any way.
 
I recently did a review of Quad SLI (9800GX2 x2) with CrossfireX (3870X2 x2) as a comparison, at both 2560x1600 and 1920x1200 with 4x AA/AF, and I must confess I didn't notice a lot of the symptoms/problems you have described here. I can't comment on the difference between the GX2 and the 280 as I didn't have a 280 at the time, but I never noticed the stuttering, and FPS is much higher than what you got, it seems, in COD4 at least (unless of course you deem 100fps unplayable)!

I never said the FPS ;) COD4 was playable for me at that res, but with no AA. And the micro stutters were rare like I said, and happened mostly in outdoor areas, but even then were still quite rare.
For the most part the frame rate would be over 50 FPS @ 2560x1600 with no AA and a little AF on the GX2; the rare stutters were mostly just an annoyance.

But once any AA was enabled it became too bad, though it would be ok with some AF. Did you try COD4 @ 2560x1600 with 2x or 4x AA + AF? At those settings it's nowhere near playable. I found that once any AA is enabled in that game at that res you run out of memory. Even my 8800GTX handled it better at that res with some AA, as it had enough memory for it.

I just wish I could find software that displays how much VRAM is being used by a game. As far as I know there isn't any that works with Vista.
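On the VRAM-monitoring wish above: these days Nvidia's nvidia-smi tool can report per-GPU memory use (nothing like it shipped with Vista-era drivers, so this is a present-day aside). A minimal sketch, assuming nvidia-smi is on the PATH:

```python
# Query used VRAM via nvidia-smi (assumes the tool is installed and on PATH).
import subprocess

def parse_used_mib(smi_output):
    """First line of 'csv,noheader,nounits' memory.used output, e.g. '612'."""
    return int(smi_output.strip().splitlines()[0])

def vram_used_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return parse_used_mib(out)

# e.g. call vram_used_mib() in a loop while a game runs to watch usage
# climb as textures stream in.
```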
 