• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

X2900XT Review is up...

Man of Honour
Joined
19 Oct 2002
Posts
29,861
Location
Surrey
http://vr-zone.com/?i=4946&s=1

(don't think I'm breaking any rules with a link :confused: If I am then mucho apologies :D)

In many non-Anti-Aliasing, High Definition game settings, you have seen the X2900XT push ahead of the performance of its closest competitor, the GeForce 8800GTS 640MB, sometimes by quite a large margin, sometimes falling behind or ahead by a small percentage. In a select few games, the GTS is slightly faster, and vice versa. When Anti-Aliasing is turned on, the X2900XT showed that it carries it off with great efficiency in games that the drivers are optimized for, performing significantly better than the GTS; while the AA efficiency is ****-poor in some games due to the raw driver, which has not fully blossomed to take advantage of ATi's new GPU technology. Just take a look at how performance has boosted from drivers 8.36 to 8.37; that shows the potential for performance growth... a whole lot of it to reap.
 
Wonder if it's maxed out now in performance with the latest driver release it had, or is there room for some more boost?

Either way, it sucks up more power than an 8800GTX and also requires proprietary power connections, and for what? Worse performance? We'll see tomorrow whether it can make up for its shortcomings with a good price.
 
Cyber-Mav said:
Wonder if it's maxed out now in performance with the latest driver release it had, or is there room for some more boost?

Either way, it sucks up more power than an 8800GTX and also requires proprietary power connections, and for what? Worse performance? We'll see tomorrow whether it can make up for its shortcomings with a good price.

Hard to say. The next driver release will be key. If it boosts performance a second time then obviously it will indicate more headroom.

Hehe, just pressing F5 to refresh, and every time I do, someone else has read the thread :)
 
So far I'm not impressed by 320 stream processors at 700+MHz not being able to take down Nvidia's 128 stream processors at 600-odd MHz.
But maybe time will tell. Again it seems like Nvidia has won this round, but that was known from the start. :(
 
Cyber-Mav said:
So far I'm not impressed by 320 stream processors at 700+MHz not being able to take down Nvidia's 128 stream processors at 600-odd MHz.
But maybe time will tell. Again it seems like Nvidia has won this round, but that was known from the start. :(

True. But it depends on two factors in my view:

1) Whether AMD/ATI can significantly improve their drivers. I doubt this will happen to the extent needed: if the drivers were that bad then they would not have released the card. It's late anyway, so another month would not have mattered in the grand scheme of things. This indicates to me that the drivers are not that bad for the card.

2) Price. If the price point is right then it could pull ahead of the GTS and also force the GTX down in price. How could AMD/ATI do this? Well, ATI now has all of AMD's fabs at its disposal, so production costs should be lower than in previous generations.


Assuming the price is similar and performance is similar, then the deciding factor for me is which has the best Linux drivers. So that means Nvidia in my case.
 
I'm pretty disappointed by those results. Looks like the price is going to be the deciding factor in the sales of these cards.
 
tereu5 said:
but MSI shows a watercooled version :)


Shame; with the performance of their card they should not really need water cooling.

One thing I have learnt about this next-generation lot of cards is not to bother touching them, and to wait for the generation above them to see if lower power consumption versions come out with jack all heat output.

The X2600 series looks interesting with only 45W power consumption, but I wonder if they're targeted at 7300GS-class cards?
 
Some things to consider, hardware wise: it's 320 stream processors running at the core clock speed. Nvidia's core clock isn't the speed their stream processors run at; those run at 1.45GHz or so (depends on the model you buy).
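The clock-domain difference above can be put into rough numbers. A hedged back-of-envelope sketch, using the SP counts and clocks quoted in the thread and assuming 2 FLOPs per stream processor per clock (an illustrative simplification; real per-clock throughput differs by architecture):

```python
def peak_gflops(stream_processors, clock_ghz, flops_per_clock=2):
    """Naive theoretical peak: SPs x clock x FLOPs-per-clock (illustrative only)."""
    return stream_processors * clock_ghz * flops_per_clock

# X2900XT: 320 SPs running at the ~742MHz core clock
ati = peak_gflops(320, 0.742)
# 8800-class: 128 SPs on a separate ~1.45GHz shader clock
nv = peak_gflops(128, 1.45)
print(f"ATI ~{ati:.0f} GFLOPS vs NV ~{nv:.0f} GFLOPS")
```

Which is why raw SP counts alone don't settle the comparison: the two designs spend very different clock rates per processor.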

HDR, and the lack of hardware for it: the 8800GTX/GTS seems able to do a far higher number of pixels per clock for HDR instructions, which seems to kill the 2900XT right now. Why would they limit the instructions per clock so badly? Well, DX10 definitely has tweaks to HDR to let it run much faster and easier, so maybe the HDR performance is only crappy under DX9? Just a guess, completely, but they know it's a DX10 GPU at heart, and the cards will be in rigs for 1-2 years and most games will be DX10 by next year, so maybe. Drivers, and AA drivers, seem to be a big thing here as well. The results here weren't on the very latest, which are said to increase HDR performance by 5-30%, among other things, so reviews from sites tomorrow with the alphas (and most likely a beta/full release some time tomorrow, I'd think) might have something different to say.

Also, IQ: they've done every highest-end test at 16xAA wide tent, which is likely to give a more blurred image than narrow tent. I'd want to see the narrow tent comparison; if it gives the same or better IQ and actually uses less power, well, that's something we need to see.

It seems very competitive, with drivers out every month and drivers already seeming to have given good increases so far. The stability of the drivers will be a huge thing to find out as well for Vista users.

As for power usage, look at the numbers: even if it does use 60W more, a rig set up for benchmarking top-end kit, with normal(ish) top-end gear, draws less than 400W under load for the whole lot. A 500W PSU is fine; 60W isn't going to kill anyone or break the bank. It really just doesn't matter at all.
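The power argument above is just arithmetic, so here it is spelled out as a tiny sketch. The 400W load and 500W PSU figures are the ones quoted in the post (the load figure already includes the card), not measurements:

```python
# Figures quoted in the post, not measured values.
total_system_load_w = 400  # whole benchmarking rig under load, card included
psu_rating_w = 500

headroom_w = psu_rating_w - total_system_load_w
print(f"PSU headroom: {headroom_w}W")  # 100W of slack remains on a 500W unit
```

So even a ~60W delta between cards fits comfortably inside the remaining headroom, which is the poster's point.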

One of the reasons for not overly fantastic increases in fps/scores with the GTX/GTS series is the fact that you overclock the CORE speed and NOT the shader speed (I think BIOSes can clock it, not sure how far they go though), and Nvidia have been promising and failing to deliver shader clocking in nTune for 5 months. ATI's shaders run at the core speed, so when you bump from stock to 850MHz you should be getting a LOT larger performance boost than a GTX/GTS gets from a similar overclock. I think the overclocking + performance boost is shown completely by that 16k 3DMark score. Because of the shaders + the ability to overclock them properly, and quite significantly, I see it being a GREAT card.
 
ATI's cards do NOT require proprietary power connections - they will happily run on 1 or 2 6-pin connectors just like any standard higher-end card. The 8-pin connector is for those who have it & for extreme overclocking where power draw may be a concern.

Cyber-Mav said:
Shame; with the performance of their card they should not really need water cooling....

The X2600 series looks interesting with only 45W power consumption, but I wonder if they're targeted at 7300GS-class cards?

Watercooled cards are there for those who want them; they don't need to be watercooled, as the stock ones have an air cooler.

The X2600 Pro & X2600XT will likely compete with the 8600GT/GTS, as the X2400 series are the low-end DX10 parts.
 
Guru3D review is up
http://www.guru3d.com/article/Videocards/431/

There is the Call of Juarez DX10 benchmark included in this too
 
Cyber-Mav said:
Wonder if it's maxed out now in performance with the latest driver release it had, or is there room for some more boost?

Either way, it sucks up more power than an 8800GTX and also requires proprietary power connections, and for what? Worse performance? We'll see tomorrow whether it can make up for its shortcomings with a good price.


Dunno why, but I so feel like saying: 6 months late, drivers still beta/non-optimised, runs too hot, needs more juice, noisy fan, and still only just comes close to the GTX... but I'll wait till the final reviews to say that ;)

Technically I wanna read up on the heat/fan noise and the overall performance with AA/AF used, and the price.

Perhaps it's fair to say it might be more ideal to wait for the 65nm versions or the XTX 1GB version in 2-3 months' time.
That will fix all the issues and without doubt overtake Nvidia easily.
 