ATI X1900XT running @ 1900x1600 - would it run?

Now then, folks... if I was going to run a 24" Dell, would a single HIS ATI Radeon X1900 XT IceQ 3 Silent Heatpipe 512MB GDDR3 cope, or would my PC die when games such as Crysis come out?

Simple question: can I run such a high-res display on a single card and play games in all their glory?
 
I would say no. That card will struggle a lot on its own at that res, especially with all the eye candy. You might get away with less eye candy, but ideally you want Crossfire, as that would most likely handle it without thinking twice.
 
I doubt even an X1900XT or a 7900GTX would cope with gaming at 1600x1200 with full eye candy on, tbh.
 
I wouldn't bother with Crossfire. Wait and get a new DX10 card; they should offer a huge performance leap, especially for high-res gaming.
 
Snoops said:
I wouldn't bother with Crossfire. Wait and get a new DX10 card; they should offer a huge performance leap, especially for high-res gaming.

I keep seeing people saying this, and logic would suggest the opposite. Remember, it's going to be all new, with the blind leading the blind. It's going to take a while before we see a huge performance leap. ATI/Nvidia will only give us a bit more performance. It will be drip fed, just like it's always been.

How many people kept banging on about the 48-pipe 7900 monster that would kill ATI in its tracks? Keep it real, people. It's not in ATI's or Nvidia's interests to give huge performance leaps when they don't have to.
 
Two X1900s in Crossfire will last well over a year, and yes, the HIS IceQ 3 is available in Crossfire. :)

As has been said before, there is zero point in anyone with a high-end DX9 card spending over £600 just for Crysis. It runs in DX9 and will not look much different from how it does in DX10, and it will absolutely fly on Crossfire X1900s under XP. The same goes for all DX10 games, as they will all work in DX9 too and not look much different from their DX10 versions. So why on earth would you spend £350+ on a DX10 card and over £230 on Vista just to play one single game? By the time DX10 is used more widely and is worth having, you can get your DX10 card then, and believe me, it won't be R600/G80, as by then stacks of faster, better DX10 cards will have come and gone.
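For what it's worth, the £600+ figure above is roughly the two quoted prices added together. A trivial sketch using only the numbers from the post:

```python
# Where the "over £600" figure comes from, using only the prices quoted in the post above.
dx10_card = 350   # pounds, quoted as "£350+" for a first-generation DX10 card
vista = 230       # pounds, quoted as "over £230" for Vista

total = dx10_card + vista
print(f"Minimum outlay: over £{total}")  # -> over £580, i.e. "£600+" once extras are included
```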
 
The Asgard said:
I keep seeing people saying this, and logic would suggest the opposite. Remember, it's going to be all new, with the blind leading the blind. It's going to take a while before we see a huge performance leap. ATI/Nvidia will only give us a bit more performance. It will be drip fed, just like it's always been.

How many people kept banging on about the 48-pipe 7900 monster that would kill ATI in its tracks? Keep it real, people. It's not in ATI's or Nvidia's interests to give huge performance leaps when they don't have to.
I'm beginning to think this way as well... there has really been nothing in the way of solid information on the G80 and R600 to make any real estimate of their performance in DX9 games.

I am aiming to get a new system up and running by the end of the month, but I'll be using my existing monitor, a 24" Dell. I was looking @ the GX2, but they are expensive, so now I've been looking @ the X1900XT as a possible cheaper card. I could always get another in a couple of months' time if I need more power... would that be a good move or not, though?

If so, which X1900XT would people recommend? 256MB or 512MB? XT or XTX? Gods! It used to be so easy to pick the best card! Now look at this mess! :eek:
 
A_Darkfire said:
If so, which X1900XT would people recommend? 256MB or 512MB? XT or XTX? Gods! It used to be so easy to pick the best card! Now look at this mess! :eek:

HIS X1900XT 512MB.
It's cheap, has the best cooler of all the X1900s, and has the same memory as the XTX, so it will clock the same or higher.

Get the 512MB version as well, because you're running a high-res display :)
 
A_Darkfire said:
I'm beginning to think this way as well... there has really been nothing in the way of solid information on the G80 and R600 to make any real estimate of their performance in DX9 games.

I am aiming to get a new system up and running by the end of the month, but I'll be using my existing monitor, a 24" Dell. I was looking @ the GX2, but they are expensive, so now I've been looking @ the X1900XT as a possible cheaper card. I could always get another in a couple of months' time if I need more power... would that be a good move or not, though?

If so, which X1900XT would people recommend? 256MB or 512MB? XT or XTX? Gods! It used to be so easy to pick the best card! Now look at this mess! :eek:
The HIS cards would be my pick if going Crossfire.
Get the 512MB cards, as the Master cards are all 512MB.
Go for the 512MB Master card first, because they will be harder to get later on. Then you can get any 512MB slave card to complement it.
 
Yep, for some reason some people think that the release of a new programming interface and its feature requirements is suddenly going to allow ATI and Nvidia to release graphics cards that will knock all previous ones out of the water.

I don't know why people think the switch from DX9 to DX10 is suddenly going to unleash all this held-back processing power.
 
Magic Man said:
Yep, for some reason some people think that the release of a new programming interface and its feature requirements is suddenly going to allow ATI and Nvidia to release graphics cards that will knock all previous ones out of the water.

I don't know why people think the switch from DX9 to DX10 is suddenly going to unleash all this held-back processing power.

Probably because of all the reports in the e-press about the huge number of pipelines/pixel shaders these cards are supposed to have, and also the new PSUs that have started rolling out, which are required to power them?

"Reportedly the R600 and the G80 will use max 300 watts per card and any sli or crossfire rig will need 1KW at least"
 
Snoops said:
Probably because of all the reports in the e-press about the huge number of pipelines/pixel shaders these cards are supposed to have, and also the new PSUs that have started rolling out, which are required to power them?

"Reportedly the R600 and the G80 will use max 300 watts per card and any sli or crossfire rig will need 1KW at least"
I seriously doubt the R600 will eat 300W by itself given that it's using GDDR4. If it did, I don't think ATI would bother with a 90nm version at all and would go straight to 65nm...
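As a side note, the "1kW at least" claim does roughly check out if you take the rumoured 300W-per-card figure at face value. A minimal back-of-the-envelope sketch; the CPU and "rest of system" wattages below are assumptions for illustration, not measurements:

```python
# Back-of-the-envelope PSU budget for a dual-card rig, taking the rumoured
# 300W-per-card figure at face value. Non-GPU wattages are assumptions.
components = {
    "GPU 1 (rumoured R600/G80)": 300,
    "GPU 2 (second card for Crossfire/SLI)": 300,
    "CPU": 100,
    "Motherboard, RAM, drives, fans": 100,
}

load = sum(components.values())   # ~800W at full tilt
headroom = 1.2                    # ~20% headroom so the PSU isn't running flat out
recommended = load * headroom

print(f"Estimated load: {load}W, recommended PSU: ~{recommended:.0f}W")
# -> 800W load, ~960W PSU, which is roughly where "1kW at least" comes from
```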

God only knows what Nvidia are going to do with the G80... apart from charging a lot for it, that is...

What is wrong with ATI and Nvidia at the moment? Normally you can't shut them up about their next-gen cards (and I don't mean the retooled graphics cards they have been doing to death over the last six months), but this time there has been nothing but a couple of rumours.

I would have thought the sane idea now would be to shrink the die down to 65nm and possibly try the dual-core trick with the DX10 stuff bolted on. I remember reading that ATI was looking at doing this, but I could have been drunk at the time.
 
I have to say, my X1900XT (albeit heavily overclocked, with a 3GHz single core to back it up) copes very well @ 1600x1200, except in FEAR. Oblivion is very playable (70fps indoors, 40fps out) on max settings with some shadows turned down. Quake 4 and Doom 3 both run at a constant 60fps+.

I guess there are some new games that I wouldn't like to be stuck @ 1600x1200 for, but I'm a WoW addict at the mo, and that runs fine....
 
I'm curious: in his title he said would it run at 1900x1600...

What kind of res is that?

Do you mean:
2560x1600?
1920x1200?
1680x1050?
1600x1200?

Or what?

Because an X1900XT should be fine at 1600x1200 and 1680x1050 with most settings maxed (rough pixel counts below).

Anything higher than that and you'll be pushing your luck.

Why not get an X1950XTX? That'll run 1920x1200 quite nicely and means you don't have to put up with Crossfire and its lack of support in some games, etc.
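For reference, here's a quick pixel-count comparison of the resolutions mentioned above. It's plain arithmetic, not benchmark data, but it gives a rough idea of how the fill-rate load scales from one res to the next:

```python
# Total pixels per frame at each resolution mentioned in the thread,
# relative to 1600x1200. Plain arithmetic, not benchmark data.
resolutions = [(1600, 1200), (1680, 1050), (1920, 1200), (1900, 1600), (2560, 1600)]

baseline = 1600 * 1200
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / baseline:.2f}x the 1600x1200 load)")

# 1680x1050 is actually slightly fewer pixels than 1600x1200, 1920x1200 is ~1.2x,
# and 2560x1600 is ~2.1x, so fill-rate demands scale roughly the same way.
```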
 
OK, that's true: for next-gen games it'll struggle, but for everything at the moment it's fine.

For next gen, just wait, sell it and get a DX10 card or something. Especially for Crysis, it'll probably need it, lol.
 
I think it all depends on what you call acceptable. I found with my X1900XTX I could run anything at 1920x1200 with 2-4x AA, apart from Oblivion, where if I wanted HDR I had to sacrifice AA. I'm currently running an X850 @ 620/620 at 1920x1200, and apart from losing AA it's running most games at max. It even runs Oblivion with Bloom at 1920, where I get 50fps in the city and 25-30 in the forest.
 
Eww, Bloom. Bloom is dreadful; it looks rubbish compared to HDR and destroys your fps even more than HDR does, for some reason.

The thing that annoys me (also having an Nvidia card) is the inability to have HDR and AA at the same time... even if I had an ATI card, though, it would probably kill my fps (in Oblivion).

I usually go for AA instead of HDR, because although HDR looks pretty, I can't stand jaggy edges when everything else looks so nice.
 