Anyone Disappointed With X1900 Performance?

I am loving my X1800XT again. I'm a lowly 1280x1024 user here :)

I remember when I had a GeForce 2 playing at 800x600 :) As it is, I love this card - all I need now is some more games. I've been playing Half-Life 2 (looks awesome with everything ramped up), but COD2 still drags: I had to turn off dynamic lights, set bodies to low, disable smoke softening and shadows, and drop to 2xAA, otherwise in Pointe du Hoc I crawl at about 30 FPS.

I think I need to save for a new HDD and stick the pagefile on a separate drive.
 
Bengbeng said:
Disappointed, lol. It's faster than the GTX512 most of the time, it's cheaper, more readily available, and it has more features (FP HDR + AA & HQ AF).
I would be disappointed if I'd bought a GTX512 last week... :D


My thoughts exactly.

The X1900XT is faster and, importantly, cheaper than the 7800GTX 512MB.
Between the two, the X1900XT would be the easy choice.


[Me, I will be going the X1800XT route I think; much goodness for the price.]
 
You're missing the point: the X1900 isn't really a next-gen card, and it's hardly a speed beast. Yes, it might be fastest by a whole 5-20 fps in a lot of games, but really that's tiny. My settings are not overclocked, they're default, and the X1900 was tested in a top-of-the-range PC far above my system. So if there are only 100-1000 points in it, and my system is at the lower end, the X1900 isn't really a big step up from other cards.

Yes, I know it's cheaper, and I'm not really trying to make all you people who have just got one feel bad. I am just saying that this card isn't really an NVIDIA buster, and with the G71 only 2-3 months away, is it really worth the effort?

Speed apart, the image quality is great - very crisp.
 
It is a next-gen card, specifically designed for next-gen pixel-shader-intensive games. As more shader-heavy games are released, the theoretical increase in performance from an X1800 to an X1900 is up to 3x!!! :eek:
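
For anyone wondering where that "up to 3x" comes from, here's the back-of-the-envelope version (a sketch: it only counts pixel shader units and assumes a fully shader-bound game, which real titles rarely are):

```python
# Theoretical pixel shader speed-up, X1800 -> X1900. Counts shader
# units only; assumes a purely shader-bound workload and equal clocks.
x1800_shader_units = 16   # R520 pixel shader processors
x1900_shader_units = 48   # R580 pixel shader processors
clock_mhz = 625           # both XT parts share the same core clock

ratio = (x1900_shader_units * clock_mhz) / (x1800_shader_units * clock_mhz)
print(f"Theoretical pixel shader speed-up: {ratio:.0f}x")  # -> 3x
```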

ATI have changed tack from pixel pipelines to shader operations. I expect G71 will be 32 pipes, but when you haven't got the bandwidth or pixel shading power to fully satisfy those pipes, it's a bit of a waste (which is why the GTX with 24 pipes doesn't perform a whole lot better than the GT with 20).

I expect NVIDIA to follow suit from G80 onwards, IMO, but for technical forward thinking I think ATI are right on the mark. Smarter rendering rather than pure brute force...

(and this is from a 7800GT owner btw !! :D)
 
tbz_ck said:
I expect NVIDIA to follow suit from G80 onwards, IMO, but for technical forward thinking I think ATI are right on the mark. Smarter rendering rather than pure brute force...
True, but saying that, I think ATI will be outdated sooner than nVidia.
nVidia's got the developed architecture, which just needs improving. It was the same with SLI and Crossfire. That's why, IMO, ATI are a step behind.
 
tbz_ck said:
It is a next-gen card, specifically designed for next-gen pixel-shader-intensive games. As more shader-heavy games are released, the theoretical increase in performance from an X1800 to an X1900 is up to 3x!!!

The problem is that the measured numbers are nowhere near the theoretical ones. Have a look at the shader scores in 3DMark03/05/06, in ShaderMark and RightMark etc., and in shader-intensive FEAR. The X1900XT performs a lot worse than expected in these situations. Quite disappointing to many.

On the flip side, it seems even faster at AA and AF, even though there have been no architecture changes to make this so. Look how many benchmark results indicate very little difference between the X1800XT and X1900XTX, with the 7800 512 well in the lead without AA or AF; when those settings are on, the X1900XTX then takes the lead. The slight increases in core clock and memory bandwidth seem to be having a big effect, but I suspect there is also some driver cleverness going on, because otherwise the results just don't make sense.
And then in games like AO3, where the X1900 is way faster than the X1800, it just makes the X1800 look way underpowered for shaders, as it needs the X1900 to get close to the 7800GTX 256.

Anyway, overall, for the most part there is little difference between the 7800GTX 256, the X1800XT and the X1900XT. Have a look at Guru3D (they are a reliable source) and you'll see there is very little difference in most games. And the thing is, every review looks at 430MHz 7800s; it is very difficult to buy such low-clocked cards anymore. Whack in a 490MHz card and things look very different. Same with the GTX 512MB: reviews have been using 550MHz cards where a great many are 585MHz or more, with an extra 100MHz on the memory. This can make a big difference with AA and AF at high res. A 580/1850 512MB GTX is faster than an X1900XTX overall.
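
Rough arithmetic behind the clock-speed point (a sketch: it assumes performance scales linearly with core clock, which real games only approximate):

```python
# Headroom a factory-overclocked 7800 has over the review samples,
# assuming performance scales linearly with core clock (real games
# scale less than linearly, so treat these as upper bounds).
cards = [
    ("7800GTX 256", 430, 490),   # review clock vs typical retail clock
    ("7800GTX 512", 550, 585),   # plus the extra 100MHz on the memory
]
for name, review_mhz, retail_mhz in cards:
    gain = (retail_mhz / review_mhz - 1) * 100
    print(f"{name}: ~{gain:.0f}% theoretical headroom over review clocks")
# -> GTX 256: ~14%, GTX 512: ~6%
```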
 
tbz_ck said:
It is a next-gen card, specifically designed for next-gen pixel-shader-intensive games. As more shader-heavy games are released, the theoretical increase in performance from an X1800 to an X1900 is up to 3x!!! :eek:

ATI have changed tack from pixel pipelines to shader operations. I expect G71 will be 32 pipes, but when you haven't got the bandwidth or pixel shading power to fully satisfy those pipes, it's a bit of a waste (which is why the GTX with 24 pipes doesn't perform a whole lot better than the GT with 20).

I expect NVIDIA to follow suit from G80 onwards, IMO, but for technical forward thinking I think ATI are right on the mark. Smarter rendering rather than pure brute force...

(and this is from a 7800GT owner btw !! :D)


ATI have the brute-force method. Look at the R420 compared to the NV40, or better still the X1800XT compared to the 7800GTX 256. The X1800 has massively more bandwidth, pixel fillrate, texel fillrate and vertex rate, yet they're practically the same speed. The 7800 must be incredibly efficient, as it is running so much slower and at such a deficit, yet it keeps up. The X1800 was clocked to high heaven to beat it.

As for the future, well, NVIDIA are not following suit. The G71 will be 32 shaders with 2 ALUs each (i.e. 64, equivalent to the X1900's 48 shaders), 32 TMUs compared to ATI's 16, and 16 ROPs. Texturing usage is still going to increase massively; there is a long, long time before shaders will be predominant. John Carmack's next engine has the emphasis on textures: every surface will have an incredibly high-resolution, unique texture. There won't be one texture the same in the level. Texture resolutions will be as big as 40Kx40K(!), with standard sizes of 4096x4096 etc. Even the Unreal 3 engine is a texture hog, with the standard texture size being 2048x2048. The standard texture size in Doom 3 and HL2 (and FEAR) is 512x512. So there will be 16x the texel count in upcoming games like UT2K7 etc., and in the next Carmack engine more like 64x! The X1900XT's texture performance increase = 0%...
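
Those multipliers do check out; this is the arithmetic (a sketch using only the texture edge sizes named above):

```python
# Texel-count multipliers implied by the texture sizes above: a
# texture's texel count grows with the square of its edge length.
baseline_edge = 512            # Doom 3 / HL2 / FEAR standard texture
for edge in (2048, 4096):      # Unreal 3 / next Carmack engine sizes
    ratio = (edge * edge) / (baseline_edge * baseline_edge)
    print(f"{edge}x{edge} vs {baseline_edge}x{baseline_edge}: {ratio:.0f}x the texels")
# -> 16x for 2048x2048, 64x for 4096x4096
```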
 
Well, I currently own both a 1900XTX and a 7800GTX 512 (selling the GTX at present).
I went back to ATI as I have always preferred them for IQ and drivers.
The 1900XTX really shines at high resolutions like 1920x1200, which I use, and the IQ is definitely better in games like COD2, FEAR and Half-Life 2.

In the Lost Coast at 1920x1200, max detail, 16xAF, 4xAA, the GeForce scores 48 fps and the ATI 62 fps.


The only thing I'm disappointed with is the awful bloated control panel and .NET installation that ATI are using! It is fine when up and running, but it has increased my XP boot time by at least another minute!

I am also annoyed that when the 512 GTXs were released I didn't know the 1900 series was so close!... but that serves me right for being obsessed with PC hardware! :eek:
 
D.P. said:
ATI have the brute-force method. Look at the R420 compared to the NV40, or better still the X1800XT compared to the 7800GTX 256. The X1800 has massively more bandwidth, pixel fillrate, texel fillrate and vertex rate, yet they're practically the same speed. The 7800 must be incredibly efficient, as it is running so much slower and at such a deficit, yet it keeps up. The X1800 was clocked to high heaven to beat it.

As for the future, well, NVIDIA are not following suit. The G71 will be 32 shaders with 2 ALUs each (i.e. 64, equivalent to the X1900's 48 shaders), 32 TMUs compared to ATI's 16, and 16 ROPs. Texturing usage is still going to increase massively; there is a long, long time before shaders will be predominant. John Carmack's next engine has the emphasis on textures: every surface will have an incredibly high-resolution, unique texture. There won't be one texture the same in the level. Texture resolutions will be as big as 40Kx40K(!), with standard sizes of 4096x4096 etc. Even the Unreal 3 engine is a texture hog, with the standard texture size being 2048x2048. The standard texture size in Doom 3 and HL2 (and FEAR) is 512x512. So there will be 16x the texel count in upcoming games like UT2K7 etc., and in the next Carmack engine more like 64x! The X1900XT's texture performance increase = 0%...

Yeah, but the X1800 and X1900 are still 16-pipe cards, so hardly inefficient if they keep up with a 24-pipe one, true?

I'm no fanboy, and I can see the merit in both architectures. Both will undoubtedly increase pipeline counts in the future (as process shrinks occur and die size permits). I see where you are going with textures, but I don't think texture-processing grunt is going to be the be-all and end-all. Shading power and pixel processing will be very important in upcoming titles, and I really can't see 4096x4096 textures being the norm. For a start, which monitors can display a texture of that size at its native resolution? Most games will scale/compress them to fit memory constraints, bandwidth and the actual monitor resolution, if I'm understanding correctly. Ergo, shading power will be just as important.
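
To put numbers on the memory-constraint point (a sketch: a single RGBA8 texture, before mipmaps, with a rough 4:1 DXT-style compression ratio):

```python
# Footprint of one RGBA8 texture at the sizes discussed above
# (before mipmaps; 4:1 is a rough DXT-style compression figure).
def texture_mib(edge, bytes_per_texel=4):
    return edge * edge * bytes_per_texel / 2**20

for edge in (512, 2048, 4096):
    raw = texture_mib(edge)
    print(f"{edge}x{edge}: {raw:>4.0f} MiB raw, ~{raw / 4:.0f} MiB compressed")
# -> 4096x4096 is 64 MiB raw; a 256MB card can hold very few of these
```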

If Xenos in the X360 is anything to go by, then ATI will be moving to a more unified shader approach for PC parts within a generation or two. nVidia have also stated as much (IIRC, from looking at the B3D forums).

For me, ultimately it's about bang for buck, baby ;)
 
Guinny said:
I am also annoyed that when the 512 GTXs were released I didn't know the 1900 series was so close!... but that serves me right for being obsessed with PC hardware! :eek:

It's alright for some!! Some of you guys must have money to burn; I'm looking at cards like the 6800GS while you're all talking about 1900XTs etc.!
 
toy_soldier said:
It's alright for some!! Some of you guys must have money to burn; I'm looking at cards like the 6800GS while you're all talking about 1900XTs etc.!


I wouldn't say we have money to burn; we are just more foolish with it! :D
 
D.P. said:
ATI have the brute-force method. Look at the R420 compared to the NV40, or better still the X1800XT compared to the 7800GTX 256. The X1800 has massively more bandwidth, pixel fillrate, texel fillrate and vertex rate, yet they're practically the same speed. The 7800 must be incredibly efficient, as it is running so much slower and at such a deficit, yet it keeps up. The X1800 was clocked to high heaven to beat it.

As for the future, well, NVIDIA are not following suit. The G71 will be 32 shaders with 2 ALUs each (i.e. 64, equivalent to the X1900's 48 shaders), 32 TMUs compared to ATI's 16, and 16 ROPs. Texturing usage is still going to increase massively; there is a long, long time before shaders will be predominant. John Carmack's next engine has the emphasis on textures: every surface will have an incredibly high-resolution, unique texture. There won't be one texture the same in the level. Texture resolutions will be as big as 40Kx40K(!), with standard sizes of 4096x4096 etc. Even the Unreal 3 engine is a texture hog, with the standard texture size being 2048x2048. The standard texture size in Doom 3 and HL2 (and FEAR) is 512x512. So there will be 16x the texel count in upcoming games like UT2K7 etc., and in the next Carmack engine more like 64x! The X1900XT's texture performance increase = 0%...

Just to point out two things in the GTX's favour over the X1800: the GTX has a higher texture fillrate (but only slightly), and it also has a higher pixel shader instruction rate.

Regarding the G71's and X1900's ALUs, remember that the X1900 has a mini-ALU which can only do ADD, and that the X1900 has a separate ALU for texture lookups, whilst the G7x series uses a shared ALU in the shaders for this. I'll do the sums later, but if you take into account the shared NVIDIA ALU, I think the X1900's throughput may be higher. This all depends on the clock speeds though :p
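
A first pass at "the sums" (a sketch: the unit counts are the ones quoted in this thread, the G71 clock is an assumption since the card isn't out, and peak figures ignore both the ADD-only mini-ALU restriction and the cycles the G7x shared ALU loses to texture lookups):

```python
# Peak programmable ALU rate = units x ALUs-per-unit x clock.
# Unit counts are those quoted in the thread; the G71 clock is a
# guess (unreleased card). These are upper bounds on both sides:
# the X1900 mini-ALU only does ADD, and the G7x shared ALU spends
# some cycles on texture lookups instead of shading.
def peak_gops(units, alus_per_unit, clock_mhz):
    return units * alus_per_unit * clock_mhz / 1000.0

x1900xtx = peak_gops(units=48, alus_per_unit=2, clock_mhz=650)  # main + mini ALU
g71_est  = peak_gops(units=32, alus_per_unit=2, clock_mhz=650)  # assumed clock

print(f"X1900XTX peak: {x1900xtx:.0f} G-ops/s, G71 (assumed): {g71_est:.0f} G-ops/s")
# -> 62 vs 42 G-ops/s at equal clocks, before any efficiency losses
```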
 
Both the GTX512 and the X1900 are amazing cards. I think where ATI have won the battle this round is in their pricing structure, especially when people are looking to build CrossFire or SLI systems.

I wouldn't be surprised if the G71 edges towards £600 on release - everybody has a budget, and when cards perform as closely as the current batch do, it would seem foolish to spend hundreds of pounds more for a few %.
 
I would also add, in the 512 GTX's defence, that it is a very quiet and cool-running card compared to the 1900XTX.

The heatsink and build quality on the card are very high, and it just looks more solid and efficient than the ATI coolers!
 
Guinny said:
The only thing I'm disappointed with is the awful bloated control panel and .NET installation that ATI are using! It is fine when up and running, but it has increased my XP boot time by at least another minute!

Install just the driver package and then use ATI Tray Tools... much better than the CCC or the CP, IMO; it doesn't hog much memory and doesn't require that you install .NET.
 
I've always been with NVIDIA... but I recently sold my BFG 7800 GTX OC 256MB and bought a Connect3D X1800 XT and flashed it with the PE BIOS... now at standard clocks (with the PE BIOS) I'm getting 10,056 in 3DMark05, but with the 7800 GTX heavily overclocked to the max I got 9,115?? And I still have plenty of room to overclock the ATI too!! IQ is much better on ATI and it owns in games. I think the X1900 will come into its own in the next one or two driver sets; people are forgetting they are only using beta drivers... but for now I'm sticking with my X1800, as it's very close to, if not better than, the 512 7800 (with the PE BIOS).
 
Street said:
Install just the driver package and then use ATI Tray Tools... much better than the CCC or the CP, IMO; it doesn't hog much memory and doesn't require that you install .NET.


Six months ago I would have agreed with you, but the latest CCC doesn't hog memory at all; it used to take up an extra 120MB or so at boot time and be very laggy to use.

Just out of curiosity, I installed the new 6.1 CCC and was very pleasantly surprised; it takes up less than 10MB and hardly has any lag when using it.
 
OK, performance-wise:

The X1800XT 512 gave me 4486 in 3DMark06 with the card clocked as high as the CCC will go.

The X1800XTX with the latest 6.1 drivers for XP64 from the ATI site gets me 5995 in 3DMark06, again with the CCC maxed out.

Motherboard: Asus A8R-MVP, AMD Athlon 64 X2 4800+ at stock, plenty of RAM.

ATI have released a patch for XP64 because the CD drivers would not install, so XP 32-bit uses 5.13 and XP 64-bit can use the modded 6.1.
 
Answering the OP.

No, I am not disappointed at all; in fact I am really chuffed with the performance of my 1900XTX :)

The only thing that I am a bit miffed about is the £30-£40 price drop that happened two days after release!!! :mad:
 