***Official*** 9800GX2 thread (reviews and benches inside)

The game that needs the biggest boost in performance is Crysis, and neither ATI nor NVIDIA seem able to offer a solution that can run it with 'Very High' quality settings at 1920x1200. One would think that two $589 XFX GeForce 9800 GX2 video cards in Quad-SLI, along with a $1,059 Intel QX9650 processor, a $372 eVGA 790i Ultra SLI motherboard and two $608 Corsair DOMINATOR 1800MHz C7 memory kits, would be able to play a video game at 24" LCD monitor resolutions with no AA or AF enabled. With the test system going north of $5,500 once you add the LCD monitor, power supply, water cooler, hard drives, case, cooling fans and optical drives into the equation, it is hard to suggest a platform like this unless you have some serious coin, and to be honest, many do.

:eek: :eek:
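For what it's worth, adding up just the parts quoted above (prices from the review excerpt; the remainder is simple arithmetic against the $5,500 total):

[code]
# Quick sum of the quoted prices (all figures from the review excerpt above)
gpus  = 2 * 589      # two XFX GeForce 9800 GX2 cards
cpu   = 1059         # Intel QX9650
board = 372          # eVGA 790i Ultra SLI
ram   = 2 * 608      # two Corsair DOMINATOR 1800MHz C7 kits

core_parts = gpus + cpu + board + ram
print(core_parts)          # 3825
print(5500 - core_parts)   # 1675 - monitor, PSU, cooling, drives, case on top
[/code]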
 
I think a lot of people don't care about Crysis at this point. What worries me is that Legit Reviews are using canned benches, if I am not mistaken, while HardOCP take great pains to point out they actually play the game and bench it. Is Quad-SLI a turkey then?
 
One more

http://www.anandtech.com/video/showdoc.aspx?i=3271&p=1

So we tested Crysis on Very High settings with our single 9800 GX2 and Quad setup.
This indicates that the higher the graphical quality, the MORE CPU bound we are. Crazy, isn't it? Counterintuitive, but pure fact. In speaking with NVIDIA about this (they have helped us a lot in understanding some of our issues here), the belief is that more accurate and higher quality physics at higher graphical quality settings is what causes this overhead.
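That's easy enough to check on your own rig, incidentally: run the same settings at a low and a high resolution, and if the frame rate barely moves you're CPU (or physics) bound rather than GPU bound. A minimal sketch of the idea, with made-up FPS numbers standing in for real benchmark runs:

[code]
# Rough sketch of the "drop the resolution" test for CPU-bound behaviour.
# The FPS numbers below are made-up placeholders - substitute your own
# benchmark results taken at identical quality settings.

results = {            # resolution -> average FPS (hypothetical figures)
    (1280, 800): 31.0,
    (1680, 1050): 30.5,
    (1920, 1200): 29.8,
}

low_res_fps = results[(1280, 800)]
high_res_fps = results[(1920, 1200)]

# GPU-bound: FPS should roughly track pixel count, so it should fall a lot
# going from ~1M to ~2.3M pixels. CPU-bound: FPS barely changes.
drop = (low_res_fps - high_res_fps) / low_res_fps
print(f"FPS drop across resolutions: {drop:.0%}")
print("CPU bound" if drop < 0.10 else "GPU bound")
[/code]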
 
Fornowagain, are you able to shed any light on my question on the previous page regarding overclocking failures relating to specific clocks? You seem to be a man who knows his stuff.
 
For those using 6-to-8-pin adaptors, please read this:
http://evga.com/forums/tm.asp?m=297186&mpage=1&key=&#297186
"The new 9800 GX2 uses an 8-pin PCI-E plug and a 6-pin PCI-E plug. Therefore an 8-pin PCI-E cabled PSU + 6-pin PCI-E is required, or a 6-pin + 2-pin (8-pin) PCI-E + 6-pin cabled PSU.

Use of a converter from 6-pin PCI-E to 8-pin PCI-E may cause damage to the video card or become a fire hazard, as well as not being UL spec approved."
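The numbers behind that warning, for anyone curious: the PCI-E spec rates a 6-pin plug at 75W and an 8-pin at 150W, so an adapter asks the 6-pin wiring to carry up to double its rating. A rough sketch (the ~197W board power is an approximate figure for the GX2, not from the EVGA post):

[code]
# Why a 6-to-8-pin adapter is risky on this card. Connector ratings are the
# standard PCI-E spec figures; the ~197W board power is an approximate
# figure for the 9800 GX2, used here purely for illustration.

SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150   # rated delivery in watts
BOARD_POWER_W = 197                            # approx. 9800 GX2 board power

proper  = SLOT_W + SIX_PIN_W + EIGHT_PIN_W     # 300W of rated headroom
adapter = SLOT_W + SIX_PIN_W + SIX_PIN_W       # 8-pin fed from a 6-pin line

print(f"proper cabling budget: {proper}W vs draw ~{BOARD_POWER_W}W")
print(f"adapter budget:        {adapter}W vs draw ~{BOARD_POWER_W}W")
# 225W rated vs ~197W draw leaves almost no margin, and the adapter end can
# pull up to 150W through wiring rated for 75W - hence the fire-hazard warning.
[/code]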
 
I'm so skipping this gen. Not a great leap over the 8800s, and not as many games on the horizon this year. Summer's coming too. I'll save my cash for 6 months :)
 
I'm so skipping this gen. Not a great leap over the 8800s, and not as many games on the horizon this year. Summer's coming too. I'll save my cash for 6 months :)

Summer's coming, yeah, but it's going to be raining and snowing, so unless you want to be stuck inside you'd best be jetting off into the sun :D
 
My 8800GTS 640 is dying; I keep getting random screen corruption. Would the GX2 work with my motherboard, the ASUS P5W DH Deluxe?
 
Man, I love a new bit of kit, but has anyone done a comparison between an 8800GTX, an 8800 Ultra and the new GX2?

I game at 1920x1200 and currently have Gerard's old Ultra that I bought from him via MM. If I can flog the Ultra for £200 on the bay then I would be happy to upgrade.
 
But of course. Just make sure you have the 8-pin required for it.


...or use the 6-to-8-pin adapter provided.


Man, I love a new bit of kit, but has anyone done a comparison between an 8800GTX, an 8800 Ultra and the new GX2?

If you wanted a rough idea you could just look at the Crysis benchmark thread. A very rough idea, ofc; everyone's using different gear, but still.
 
If you wanted a rough idea you could just look at the Crysis benchmark thread. A very rough idea, ofc; everyone's using different gear, but still.


Thx for that, but I can't afford a QX9-whatever-it-is. A Q6600 at 3.8GHz is all I can muster.
 
Can anyone help? My 3DMark06 score with the card at stock is 16.5k, and last night I decided to have a fettle, so I downloaded the following:

GPU-Z (crap temp monitoring; I ran Everest alongside it, which seemed much more accurate)
nTune
RivaTuner

I just used nTune for now, as RivaTuner throws an error saying it needs to be updated to work properly, even though it's the latest version.

So anyway, the stock clocks are 600MHz (core) and 1000MHz (mem).
To cut a long story short, I tried about 3 overclocks, each scoring worse than the stock score.
The highest, IIRC, was around 750MHz core and 1050MHz memory.

Interestingly, the highest overclock gave the poorest score (14.5k!!!). It was clocked right up to the nads (still able to complete 3DMark, but artifacting quite badly in places).

Temps in Everest peaked at about 70-75C IIRC, so it's not getting too hot, I don't think.
Temps in GPU-Z peaked at about 50C.
(I monitored temps by peeking at the desktop quickly between the sub-tests.)

And that's about as far as I've got so far; I've managed to lose about 2000 3DMarks lol.

I'm going to run another test tonight at stock clocks to see if I can reclaim the missing score.

It's the first time I've ever o/c'd a graphics card, so I dunno what's going on really.
Maybe it's driver issues, or maybe 3DMark can't interpret the new performance, what with the cards being so new and 3DMark06 being quite old now? Is 3DMark even still relevant? With hardware and drivers having come a long way in the last year, is it able to properly assess new tech?

Gawd knows, but I was using an o/c spec that I saw online as a target figure; they claimed to have it stable at 768MHz core and 1106MHz memory.

Any advice or comments?
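One general approach, while waiting for better answers: step the core clock up in small increments and back off as soon as the score stops improving, since a card that's artifacting is often losing more performance to errors than it gains from the extra MHz. A minimal sketch of that loop; both helpers are placeholders, with the clock change and the 3DMark run assumed to be done by hand in nTune and 3DMark06:

[code]
# Step-and-verify overclocking sketch. Both helper functions are stand-ins:
# set the clock manually in nTune, run 3DMark06, and type in the score.

def set_core_clock(mhz):
    print(f"(set core clock to {mhz}MHz by hand, then run the benchmark)")

def run_3dmark():
    return float(input("3DMark06 score: "))      # enter the score manually

def find_sweet_spot(start=600, step=25, limit=800):
    best_clock, best_score = start, 0.0
    clock = start
    while clock <= limit:
        set_core_clock(clock)
        score = run_3dmark()
        if score <= best_score:                  # score regressed: back off,
            break                                # unstable clocks can silently
        best_clock, best_score = clock, score    # error-correct and lose more
        clock += step                            # than the extra MHz gains
    return best_clock, best_score

print(find_sweet_spot())
[/code]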
 