Here's some information from some X2900 review:
The X2900 just falls well short of its NVIDIA counterparts in nearly all departments. As far as I know its shader units are also much simpler than the 8800's, which is why there are so many more of them.
The ATI Radeon HD 2900 XT has 16 texture units and can filter 16 bilinear FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units (32) and does 32 FP16 pixels per clock, while the GTS has 50% more (24) and does 24 FP16 pixels per clock.
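The per-clock numbers only become comparable once you multiply by each card's core clock. Here is a quick sketch of that arithmetic; the clock speeds are my assumed reference values (roughly 742 MHz for the HD 2900 XT, 575 MHz for the 8800 GTX, 500 MHz for the 8800 GTS), not figures from the quoted review:

```python
# Peak bilinear FP16 texel fillrate = texture units x core clock.
# Clocks below are assumed reference-card values, not from the review.
def texel_rate_gtexels(units, clock_mhz):
    """Theoretical FP16 bilinear texels per second, in GTexels/s."""
    return units * clock_mhz / 1000.0

cards = {
    "HD 2900 XT": (16, 742),  # 16 units, ~742 MHz core (assumed)
    "8800 GTX":   (32, 575),  # 32 units, ~575 MHz core (assumed)
    "8800 GTS":   (24, 500),  # 24 units, ~500 MHz core (assumed)
}
for name, (units, mhz) in cards.items():
    print(f"{name}: {texel_rate_gtexels(units, mhz):.1f} GTexels/s")
```

Under those assumed clocks, the 2900 XT's higher core frequency narrows the per-clock gap somewhat, but the GTX still comes out well ahead on raw texturing throughput.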
...
There are also 16 ROPs in the ATI Radeon HD 2000 series, while the GeForce 8800 GTS has 20 and the GTX has 24. For Z-only operations, the Radeon HD 2900 XT can perform 32 pixels per clock; the GeForce 8800 GTS can do 40 and the GTX 48.
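The same clock-scaling applies to the Z-only rates. A minimal sketch, again using my assumed core clocks (~742 MHz for the 2900 XT, ~500 MHz for the GTS, ~575 MHz for the GTX), not figures from the review:

```python
# Peak Z-only pixel rate = Z samples per clock x core clock.
# Clocks are assumed reference-card values, not from the review.
def z_rate_gpix(z_per_clock, clock_mhz):
    """Theoretical Z-only pixels per second, in Gpixels/s."""
    return z_per_clock * clock_mhz / 1000.0

print(f"HD 2900 XT: {z_rate_gpix(32, 742):.1f} Gpix/s")
print(f"8800 GTS:   {z_rate_gpix(40, 500):.1f} Gpix/s")
print(f"8800 GTX:   {z_rate_gpix(48, 575):.1f} Gpix/s")
```

Interestingly, on these assumed clocks the 2900 XT's Z rate lands between the GTS and the GTX, so the Z-fill deficit is smaller than the raw per-clock numbers suggest.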
...
All of this sounds great on paper, but the fact is we never really saw any clear examples of this new memory subsystem making a measurable impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed the point is lost.
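To put a number on that "incredible potential": peak memory bandwidth is just bus width times effective transfer rate. A sketch of the calculation, where the 512-bit bus and ~1656 MT/s effective GDDR3 rate for the 2900 XT are my assumptions, not figures from the quoted review:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The 512-bit bus and ~1656 MT/s effective rate are assumed specs.
def bandwidth_gbs(bus_bits, effective_mtps):
    """Theoretical peak bandwidth in GB/s."""
    return (bus_bits / 8) * effective_mtps / 1000.0

print(f"HD 2900 XT (assumed): {bandwidth_gbs(512, 1656):.1f} GB/s")
```

That works out to roughly 106 GB/s of theoretical bandwidth, which is exactly the kind of headroom that goes to waste if the rest of the chip cannot generate enough traffic to use it.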