My mini review - GTX 280 vastly better than 9800GX2 for high res gaming

Here's some interesting info...

I've just run 3DMark Vantage on the 'Extreme' setting, which is 1920x1200 res. Here are the scores for my 280 and GX2:

9800GX2 = 3295
GTX 280 = 5410

...Big difference. Again, this looks like it's down to the 512MB memory limit on the GX2. I'll add it to my OP.
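If anyone wants that gap as a percentage, here's a quick bit of Python using nothing but the two scores above (nothing more scientific than that):

Code:
# Quick percentage difference between the two Vantage Extreme scores above.
gx2, gtx280 = 3295, 5410
gain = (gtx280 - gx2) / gx2 * 100
print(f"GTX 280 scores about {gain:.0f}% higher than the 9800GX2")  # ~64%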
 


Did you use the same 177.39 drivers for both cards?

I shall run the same test and give you my results soon.
 

I used 177.35 for the 280, they're the latest WHQL drivers... do the 177.39s support the 280, and do they have PhysX support for this card too?

The GX2 used the 175.16 WHQL drivers. I also tried it with the 175.70 BETA drivers - hardly any difference in score, and the same with any previous driver versions.
Can't do any more testing with the GX2 as I sold it Tom|Nbk :D

There's no way a driver will get the GX2 near the 280 on the Extreme setting though, the difference is too big. And I'm sure it's the 512MB limit kicking in @ 1920x1200 anyway.
 
Hurrah for Overclockers!

Just a quick thank you for helping me out in my hour of need. ;)

That ASUS card I bought from another not-well-known site didn't happen; they declined my order because I wanted it delivered to work, which I thought was reasonable considering I'm IN WORK during the daytime. Anyway! I bought a Tagan 700W modular from OcUK last night, which was dispatched this morning. I rang up and spoke to a helpful geezer called Sam and tried to get the GTX280 dispatched with the PSU, but it was too late. So I thought I would have to pay postage twice, but oh no! He let me off the delivery charge and sent it postage free. :)

Went for the BFG GTX280 OC so roll on tomorrow!!
 

Cool, let us know what you think when you have the card.

Noooo! Waste of money!...
I'm now running at 660MHz core, 1335MHz shaders and 2.5GHz memory.
When it happened to me, the memory was at 2.6GHz with 1450+MHz shaders.

...and it's funny seeing people pay £500+ for pre-overclocked 280s when mine, which was under £400, will overclock higher than every single one of the pre-overclocked 280s that's available on OCUK. L O ****** L :cool:
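Out of interest, here's a rough sketch of how far those clocks sit over stock. The reference figures in it (602MHz core / 1296MHz shader / 1107MHz memory, i.e. ~2.2GHz effective) are my assumption for a standard GTX 280, and 2.5GHz effective memory is treated as a 1250MHz real clock, so check your own card before reading too much into the percentages:

Code:
# Sketch only: overclock headroom vs ASSUMED GTX 280 reference clocks.
# The reference figures (602/1296/1107 MHz) are an assumption, not from this thread.
reference = {"core": 602, "shader": 1296, "memory": 1107}  # MHz (memory = real clock, ~2214 effective)
running   = {"core": 660, "shader": 1335, "memory": 1250}  # MHz (2.5GHz effective memory)

for domain, stock in reference.items():
    pct = (running[domain] - stock) / stock * 100
    print(f"{domain}: {running[domain]}MHz, roughly {pct:+.1f}% over reference")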
 

:P I actually chose the BFG OC as I really wanted a top brand, preferably EVGA or BFG (ASUS would do); the BFG was the cheapest, and OcUK are pretty reliable to order from.

As long as I get 650MHz out of it I'm easy. :P
 

Brands rarely matter with these things, especially with the 280, as they all have the same Nvidia reference cooling and are manufactured in the same places. These companies just stick their name on 'em ;)

...Some do have better warranty and technical support, though. But in all the 15+ graphics cards I've had over the years, I've never needed to use either.
 
My point was that SLI/Crossfire are useless. Get the single fastest GPU available and be happy with it.
Micro-stutter, memory issues and compatibility issues, to name a few of the problems multi-GPU users have to face, whereas ever since I bought the 8800GTX I've not had any of those issues with AA turned on in almost all the games I play.


Crossfire is not useless, or there would not be so many people on OCUK going Crossfire.
You sound like someone who has never had a multi-GPU setup.

Memory limit issues have nothing to do with multi GPU, like I have said before, & everything to do with res, AA, AF & detail; going single GPU would not solve the memory limit issue.
And the GRID example shows that multi GPU can overcome the close-to-the-memory-limit problem, but add more AA & the memory limit has been well & truly shot.
Your GTX would be no match for 2560x1600, 2xAA, 8xAF at a 60FPS Vsync minimum in most games I play.

The problems with Crossfire over the years that I have used it are virtually none. I have one game that does not scale with multi GPU; that is not a problem, I'm not worse off for it, & it's no different from software not making use of all the cores of a quad CPU.

A single GPU would have given me far more problems on the 30", with having to make compromises in most games I play.
The benefits far outweigh the possible negatives, & I mean possible, as they hardly ever arise.
 
Yeah, the whole point of this thread really was for me to point out the problem with 512MB cards and high res gaming. The GX2's issues aren't because it runs in SLI; the stuttering and memory problems are purely from not enough VRAM, which will affect any 512MB card.
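To put some very rough numbers on that, here's a sketch that counts render targets and nothing else; the byte sizes, AA level and buffer counts are assumptions, so treat it as an illustration of why high res plus AA eats VRAM rather than a measurement:

Code:
# Rough render-target-only VRAM estimate. Ignores textures, shadow maps,
# geometry and driver overhead; byte sizes/buffer counts are assumptions.
def render_targets_mb(width, height, msaa=4, bytes_colour=4, bytes_depth=4, swap_buffers=2):
    msaa_surface = width * height * msaa * (bytes_colour + bytes_depth)  # AA colour + depth/stencil
    resolved = width * height * bytes_colour * swap_buffers              # front/back buffers
    return (msaa_surface + resolved) / (1024 ** 2)

for w, h in [(1920, 1200), (2560, 1600)]:
    print(f"{w}x{h} 4xAA: ~{render_targets_mb(w, h):.0f} MB before a single texture is loaded")

On those assumptions that's roughly 88MB at 1920x1200 and 156MB at 2560x1600 gone before textures, which goes some way to explaining why a 512MB card starts stuttering at the higher res with AA on.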
 

Exactly, because most single-GPU cards will run at too low a frame rate way before they could run at a res or detail setting that could hit the 512MB limit & get VRAM refresh stutter.
A good example was the X1950 XT 512MB v 1024MB: a 0-2 FPS gain.
But things are changing now though, with GPUs having more power & games using more VRAM at the highest settings than older games could.

In comparison, 1024MB is the new 512MB, 512 the new 256, and 256 the new 128.
 
You sound like someone who has never had a multi-GPU setup.
.......................

I actually had many friends going Crossfire and SLI, most of whom encountered problems and sold the 2nd card as a result.
Sure, owning such cards would mean great scores in benchmarks, but from experience on friends' computers, gameplay isn't as good as with a single powerful card.

Things changed with the 8800GTX, and the whole G92 series is just a joke seeing how easily people can hit the memory limit. So the GTX280 IMHO is the only way forward (or GDDR5).
 
So I need to decide whether to get a GTX 280 this week. I am running 2 8800GT cards on an Nvidia 650i board and have zero intention of changing board, as it's the first NV board that has given me ZERO problems. Lots of problems with my 680i resulted in me switching back to my trusty 650i.

With that in mind, I am so tempted by the next gen. The 4850 is obviously out as I can't use Crossfire. I could wait for the 4870X2 but won't, as it's too long a wait :). So I'm thinking GTX 280 after reading Mr. B's review. The better playability, plus the extra headroom of the 1GB of RAM and the better bandwidth, can't be ignored either. HardOCP has commented on how a lack of bandwidth/memory makes games less playable compared with the GTX 280, e.g. Assassin's Creed.

http://enthusiast.hardocp.com/article.html?art=MTUxOCw1LCxoZW50aHVzaWFzdA==
"GeForce 9800 GTX SLI results are very interesting in this game. We found 2560x1600 playable but with No AA. You will notice though that the framerates are very high, averaging 58 FPS. However, when we tried to enable 2X AA we saw a dramatic decrease in performance that rendered the game unplayable on the GeForce 9800 GTX SLI. We feel that the narrower memory bus and low memory capacity on the GeForce 9800 GTX is keeping it from being able to run with AA at 2560x1600. "

But then again, Anand has shown GTs in SLI matching or beating the GTX 280:

Bioshock
http://www.anandtech.com/video/showdoc.aspx?i=3334&p=17

Assassins Creed
http://www.anandtech.com/video/showdoc.aspx?i=3334&p=14

Crysis
http://www.anandtech.com/video/showdoc.aspx?i=3334&p=11


With that in mind, is it worth upgrading over what I have? I'm thinking probably not, but I don't know for sure. :rolleyes: I want to do it this week if possible, as I have decent offers (100 euro+) on each of my GTs.
 
I've always found those Anandtech benches extremely dodgy. For instance, they run ET:QW @ 2560x1600 with 4x AA and get 62 FPS on the 9800GX2... but it's not even playable at those settings with that card. Even with no AA I used to get random stuttering in places with my GX2 @ 2560x1600, with any drivers, and other people had this too.

I've commented on these dodgy articles before, to try and get a reply from the author about his seemingly magical GX2/512MB cards that can run games at 2560x1600 with AA, but I've never got a reply...

At least some other sites are starting to show the problems with 512MB at high res.

What res monitor do you have, Flanno?
 
After reading your great review I'm seriously thinking of buying a GTX280, as it seems the best single-card performer. I generally buy graphics cards every 18 months and I believe the GTX280 could last that long. I've been waiting for a new card for more than a month now, as I've sold my faithful 7900GTO, returned a 3870X2 (terrible stuttering in games) and I'm left with a crappy 7300SE :P.
 
I actually had many friends going Crossfire and SLI, most of whom encountered problems and sold the 2nd card as a result.
Sure, owning such cards would mean great scores in benchmarks, but from experience on friends' computers, gameplay isn't as good as with a single powerful card.

Things changed with the 8800GTX, and the whole G92 series is just a joke seeing how easily people can hit the memory limit. So the GTX280 IMHO is the only way forward (or GDDR5).

It's always friends & not yourself... I can't speak for SLI, but I would say that a lot of user error can account for problems with Crossfire, & the biggest one is the A.I. setting, which used to have a greater effect on the outcome of Crossfire, to the point that ATI had to put it in the guide as too many people had it on Advanced when it should have been on Standard for Crossfire.

I can't believe that I have gone through 5 Crossfire setups with a vast number of games & just been lucky every time.

If everything runs fine on a single GPU at the settings you like then there is no point in going multi GPU, but that is clearly not the case for others here, or there would not be such an upturn in the take-up of multi-GPU Crossfire since the 3000 series.

As for me, if I was going to use Nvidia as my next card it would need to be 2x280 in SLI, & that ain't ever going to happen with everything that's involved.

I can see 2x4870X2 with 1GB shared as a minimum for me.
 
Anyone with a GTX 280 care to comment on this:

FROM NVIDIA:

"There are 2 types of 6-pin to 8-pin power adapters.

One type converts from two 6-pin plugs to a single 8-pin plug. These adapters could damage your computer and should not be used under any circumstances.

The second type converts a single 6-pin plug to a single 8-pin plug. NVIDIA also advises against its use since many power supplies will not provide sufficient current over the 6-pin power cable. However, this type of adaptor could potentially support normal operation as long as the customer checks their PSU manuals and ensures that its 6-pin PCI-E rails can handle the same current rating as an 8-pin power cable, which is 150 watts.

The recommended solution is to use a power supply with native 8-pin power cables."

Now, as I understand it, for those of us with only 2 PCI-E connectors on our PSUs who don't want to go out and buy a new PSU, there are 2 adaptors provided: for the 6-pin power on the card you get a dual 4-pin molex to single 6-pin PCI-E adaptor, and for the 8-pin power on the card you get a dual 6-pin PCI-E to single 8-pin PCI-E adaptor. So why are the likes of BFG, EVGA etc. supplying these if Nvidia say they are dangerous to use? I assume it's because a 6-pin PCI-E cable is rated for 75 watts, and the 8-pin (with 2 extra pins for ground) doubles it to 150W, so if you use the single 6-pin to 8-pin adaptor variety your PSU may not be able to provide that 150 watts down what is essentially a 6-pin cable. Fair enough for a single 6-pin to 8-pin cable. But why is there an issue using a dual 6-pin to single 8-pin? Is there possibly excessive ripple on each 6-pin which can damage hardware? I'd really love to know, as there's no way I'm buying a new PSU.
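Just to lay out the wattage arithmetic behind that (the 6-pin and 8-pin figures are the ones quoted above, and the 75W slot figure is the commonly quoted PCI-E number; this is only a sketch of the sums and doesn't answer the ripple question):

Code:
# Wattage arithmetic only -- connector ratings as commonly quoted for PCI-E.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

# Native cabling: slot + one 6-pin + one 8-pin lead
print("native 6-pin + 8-pin + slot:", SLOT_W + SIX_PIN_W + EIGHT_PIN_W, "W")  # 300 W

# Single 6-pin -> 8-pin adaptor: the 8-pin socket asks for 150 W down one 6-pin lead
print("single 6->8 adaptor asks for", EIGHT_PIN_W - SIX_PIN_W, "W more than the lead is rated for")

# Dual 6-pin -> 8-pin adaptor: two 6-pin leads share the 8-pin load
print("dual 6->8 adaptor puts", EIGHT_PIN_W / 2, "W on each 6-pin lead")  # within rating on paper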
 

Not heard that before...

Neither my GX2 nor my 280 came with an 8-pin adapter, so best to make sure one is definitely included. I've not heard of anyone who's had trouble with these adapters as long as the power supply is good enough.

When I first got my GX2 I bought a Thermaltake Toughpower 1200W Modular PSU, because I'm p*ssed off with upgrading my PSU all the time.

Maybe start a thread about this just to make sure an 8-pin adapter is OK?
 
Just curious...

Does anyone know if you'd run into the same bandwidth issues running two separate cards in SLI as with the GX2's two-cards-in-one-slot solution?
 