• Competitor rules

    Please remember that any mention of competitors, any hinting at competitors, or any offer to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom-right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Best GPU for the new Conroe??

The 7900s are quieter and draw less power, while the X1900s have better image quality and are cheaper; it really comes down to your needs, TBH. There's a card for everyone. :) Welcome to the forums.
 
I think I would go for the X1900XT/XTX. I have a 7800GTX 512MB @ 610/1800, which isn't far off a 7900GTX, and I'm not overly impressed by the performance. I'm going to volt-mod it though, and bring it up to 670-680/1900, which should beat a 7900GTX.

So, I'd say go ATi ;)
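
Rough numbers behind that claim, if anyone's curious — a quick back-of-the-envelope C++ sketch (the bus width, ROP count and the 7900GTX clocks are assumed stock-era specs, not measurements, so treat it purely as an illustration):

#include <cstdio>

// Back-of-the-envelope: memory bandwidth and pixel fill rate for a
// volt-modded 7800GTX 512 vs a stock 7900GTX. All figures are assumed
// stock-era specs (256-bit bus, 16 ROPs on both cards), not measurements.
int main()
{
    const int busBits = 256;  // 256-bit memory bus on both cards (assumed)
    const int rops    = 16;   // G70/G71 ROP count (assumed)

    auto bandwidthGBs = [&](int memMHz)  { return memMHz  * 1e6 * busBits / 8 / 1e9; };
    auto fillGPixels  = [&](int coreMHz) { return coreMHz * 1e6 * rops / 1e9; };

    // Volt-modded 7800GTX 512 target: 680MHz core / 1900MHz effective memory.
    printf("7800GTX 512 @ 680/1900: %.1f GB/s, %.1f Gpix/s\n",
           bandwidthGBs(1900), fillGPixels(680));
    // Stock 7900GTX: 650MHz core / 1600MHz effective memory (assumed).
    printf("7900GTX     @ 650/1600: %.1f GB/s, %.1f Gpix/s\n",
           bandwidthGBs(1600), fillGPixels(650));
    return 0;
}

On those assumed figures the modded card comes out ahead on both raw bandwidth (60.8 vs 51.2 GB/s) and fill rate (10.9 vs 10.4 Gpix/s), which is where the "should beat a 7900GTX" claim comes from.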
 
Yeah, X1900s use more power, produce more heat, and are louder, but they can do AA+HDR.

The 7900GTX uses less power, produces less heat, and is quieter, but it can't do AA+HDR.

:)
 
LoadsaMoney said:
Yeah, X1900s use more power, produce more heat, and are louder, but they can do AA+HDR.

The 7900GTX uses less power, produces less heat, and is quieter, but it can't do AA+HDR.

:)

After 14k++ posts I'd expected more :p

Because that is partly wrong: NVIDIA does support INT16 HDR + AA (like Source, Half-Life 2: Episode One etc.) and supersampling AA + INT or FP HDR. NVIDIA does NOT support multisampling AA + FP HDR ;)
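
For anyone who wants to check what their own card reports, here's a minimal C++ sketch against the Direct3D 9 API (it assumes you have the DirectX 9 SDK headers and link against d3d9.lib; the format and sample-count constants are the real D3D9 ones, the rest is just illustration). It asks whether 4x multisampling is available on an FP16 render target (the Oblivion/Far Cry style of HDR) versus an INT16 one (the Source style):

#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // FP16 (64-bit floating-point) render target, as used for FP HDR.
    HRESULT fp16 = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F, TRUE, D3DMULTISAMPLE_4_SAMPLES, NULL);

    // INT16 (64-bit integer) render target, as used for INT16 HDR.
    HRESULT int16 = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16, TRUE, D3DMULTISAMPLE_4_SAMPLES, NULL);

    printf("4xMSAA on FP16 target:  %s\n", SUCCEEDED(fp16)  ? "yes" : "no");
    printf("4xMSAA on INT16 target: %s\n", SUCCEEDED(int16) ? "yes" : "no");

    d3d->Release();
    return 0;
}

On a GeForce 7 you'd expect the FP16 check to fail and the INT16 one to pass, which is exactly the distinction above.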
 
sablabra said:
After 14k++ posts I'd expected more :p

Because that is partly wrong: NVIDIA does support INT16 HDR + AA (like Source, Half-Life 2: Episode One etc.) and supersampling AA + INT or FP HDR. NVIDIA does NOT support multisampling AA + FP HDR ;)

Bah my bad, sorry. :( :p
 
vdechpokket said:
What is the best graphics card for Conroe, ATI X1900XTX or NVIDIA 7900GTX?


None, you bought Intel and you're going to hell in a handbasket... :mad: :p


My personal opinions aside... if you want a card to last you 12 months, then I personally wouldn't go top-of-the-range this late in the year. I would buy an X1800XT to tide you over, as DirectX 10 will be out in January and the DX10 cards will be out before the end of the year, which will kill the value of your card and shorten its useful life. If you can afford it and you don't mind changing again in a few months, then go for an X1900XT/XTX.


But that doesn't change the fact that you've become a minion of Beelzebub.

[image: 16921245763.jpg]
 
That moth was originally pictured on a booth!! Not a curtain, lol.

But the X1900XT would be my choice.




[image: zonkey6uu.jpg]
 
sablabra said:
After 14k++ posts I'd expected more :p

Because that is partly wrong: NVIDIA does support INT16 HDR + AA (like Source, Half-Life 2: Episode One etc.) and supersampling AA + INT or FP HDR. NVIDIA does NOT support multisampling AA + FP HDR ;)
I think it's obvious to anyone that LoadsaMoney meant FP16-blended HDR and antialiasing together, a la Oblivion and Far Cry. :p No need to start picking at extremely tiny nits now.
 
Úlfhednar said:
I think it's obvious to anyone that LoadsaMoney meant FP16-blended HDR and antialiasing together, a la Oblivion and Far Cry. :p No need to start picking at extremely tiny nits now.

I wouldn't say I'm extremely picky! :D But someone may think that NVIDIA can't handle any kind of HDR + AA (true for FP16), and CS: Source DOES look pretty amazing! Great coding, it doesn't need too much power, and it looks just as good as BF2 and probably better! And I've tried Far Cry with my X1800XT, and it does look pretty cool when looking through a window or something :p
 
sablabra said:
I wouldn't say I'm extremely picky! :D But someone may think that NVIDIA can't handle any kind of HDR + AA (true for FP16), and CS: Source DOES look pretty amazing! Great coding, it doesn't need too much power, and it looks just as good as BF2 and probably better! And I've tried Far Cry with my X1800XT, and it does look pretty cool when looking through a window or something :p
Did you never get around to trying Oblivion on your X1800XT? Looks absolutely stunning with HDR and 4xAA. :D
 
Úlfhednar said:
Did you never get around to trying Oblivion on your X1800XT? Looks absolutely stunning with HDR and 4xAA. :D

Nope, but I'm downloading it now :D. The game looks crap, but I have to try it for the graphics. :p My X1800XT wouldn't stand a chance at 1680x1050, 4xAA, 16xAF + HDR anyway... :D
I'm gonna see how it runs with my 7800GTX 512, but it performs poorly in Oblivion, right?

But Oblivion seems to be pretty poorly coded compared to Half-Life 2: Episode One and Source...
 
sablabra said:
Nope, but I'm downloading it now :D
Direct2Drive or something, I hope. ;)

sablabra said:
The game looks crap, but I have to try it for the graphics. :p
Each to their own, I guess. I would say it's easily game of the year, if not game of the last five years, but if you don't like RPGs or something then it's easy to imagine why it might look crap.

sablabra said:
My X1800XT wouldn't stand a chance at 1680x1050, 4xAA, 16xAF + HDR anyway... :D
Yeah, if you're stuck in that res you're going to be turning a lot of settings down. My monitor only goes up to 1600x1200, so I don't have any problems.

sablabra said:
I'm gonna see how it runs with my 7800GTX 512, but it performs poorly in Oblivion, right?

But Oblivion seems to be pretty poorly coded compared to Half-Life 2: Episode One and Source...
Should be about the same as on the X1800XT: the 256MB 7800GTX is a little behind the X1800XT, but the 512MB model had its clock speeds ramped up, etc.

Like I said though, you'll be turning stuff down at that resolution. :( This game is definitely meant to be played on 7900GTX/X1900XT systems; you're probably right about terrible coding.
 
Úlfhednar said:
Direct2Drive or something, I hope. ;)

Each to their own, I guess. I would say it's easily game of the year, if not game of the last five years, but if you don't like RPGs or something then it's easy to imagine why it might look crap.

Yeah, if you're stuck in that res you're going to be turning a lot of settings down. My monitor only goes up to 1600x1200, so I don't have any problems.

Should be about the same as on the X1800XT: the 256MB 7800GTX is a little behind the X1800XT, but the 512MB model had its clock speeds ramped up, etc.

Like I said though, you'll be turning stuff down at that resolution. :( This game is definitely meant to be played on 7900GTX/X1900XT systems; you're probably right about terrible coding.

Direct2Drive... Hmm, yes, indeed :D
Well no, I don't like RPGs.

But 1600x1200 should be more demanding than 1680x1050, right? :confused:
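
(Quick arithmetic on that — a tiny C++ sketch with no card-specific assumptions, just raw pixel counts:)

#include <cstdio>

// Raw pixel counts: is 1600x1200 heavier to render than 1680x1050?
int main()
{
    const long px1600x1200 = 1600L * 1200;  // 1,920,000 pixels
    const long px1680x1050 = 1680L * 1050;  // 1,764,000 pixels
    printf("1600x1200 = %ld pixels\n", px1600x1200);
    printf("1680x1050 = %ld pixels\n", px1680x1050);
    printf("1600x1200 pushes %.1f%% more pixels per frame\n",
           100.0 * (px1600x1200 - px1680x1050) / px1680x1050);
    return 0;
}

So yes, 1600x1200 pushes roughly 9% more pixels per frame than 1680x1050.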

Yup, but my X1800XT was overclocked to hell on water! Still, the 7800GTX 512 usually performs better... I'm just gonna volt-mod it so I can hopefully up the speed to 680/1900 (it does 610/1800 on stock volts); that should make it a lot faster than a 7900GTX :). I'm looking forward to it (unless I kill my card!)
 
sablabra said:
Direct2Drive... Hmm, yes, indeed :D
Naughty, naughty. :p

sablabra said:
But 1600x1200 should be more demanding than 1680x1050, right? :confused:
My monitor goes up to 1600x1200, but I play Oblivion at 1280x960. The only game I play at 1600x1200 with FP16 HDR+AA is Far Cry, since that's a hell of a lot less taxing on any system than Oblivion is.

sablabra said:
Yup, but my X1800XT was overclocked to hell on water! Still, the 7800GTX 512 usually performs better... I'm just gonna volt-mod it so I can hopefully up the speed to 680/1900 (it does 610/1800 on stock volts); that should make it a lot faster than a 7900GTX :). I'm looking forward to it (unless I kill my card!)
The Oblivion engine is just ATi-biased, especially in outdoor environments. Expect to see equal or better frame rates on your X1800XT than on your 512MB 7800GTX.
 