Why is the ATi 2900XT not as good as the 8800GTS?

Here's some information from an X2900 review:

The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison, the GeForce 8800 GTX has twice as many texture units (32) and does 32 FP16 pixels per clock, while the GTS has 50% more than the Radeon, with 24 FP16 pixels per clock.
...
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
...
All of this sounds great on paper, but the fact is we never really saw any major specific examples of this new memory subsystem making an impact in games with the previous generation. We may be looking at a rather unbalanced GPU. The memory subsystem potential is incredible, but if the GPU cannot keep the memory fed, the point is lost.

The X2900 just falls well short of its NVidia counterparts in nearly all departments. As far as I know its shader units are a lot simpler than the 8800's too, hence there being loads more of them.
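
To put rough numbers on those per-clock figures, here's a quick back-of-envelope sketch in Python (the core clocks are reference values from memory, so treat them as assumptions):

    # Back-of-envelope fill rates from the per-clock figures quoted above.
    # Core clocks are assumed reference values from memory, not measurements:
    cards = {
        "HD 2900 XT": {"mhz": 742, "fp16_tex_per_clk": 16, "z_per_clk": 32},
        "8800 GTS":   {"mhz": 500, "fp16_tex_per_clk": 24, "z_per_clk": 40},
        "8800 GTX":   {"mhz": 575, "fp16_tex_per_clk": 32, "z_per_clk": 48},
    }
    for name, c in cards.items():
        tex = c["mhz"] * c["fp16_tex_per_clk"] / 1000  # Gtexels/s
        z = c["mhz"] * c["z_per_clk"] / 1000           # Gsamples/s
        print(f"{name}: {tex:.1f} GT/s FP16 bilinear, {z:.1f} Gz/s")

If those clocks are right, the XT's higher core clock nearly cancels its per-clock texturing deficit against the GTS (roughly 11.9 vs 12.0 GT/s) and actually puts it ahead on Z rate, but the GTX stays comfortably in front on both.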
 
You can have tonnes of memory bandwidth and pixel pipelines etc, but if the core GPU architecture is inferior you end up with a slower product. NVidia are probably 12 months ahead of ATI as far as technology is concerned, and it does not look like ATI can redress this soon.
 
It's a tad confusing, this, but here goes.

The GTS has 96 stream processors that can only do 1 instruction at a time, while the HD2900 actually has 64, but each can do 5 instructions at once (64 x 5 = 320). So ATI need to keep the processors fed so they can do their 5 instructions at once, and at the moment they cannot do this.
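
A toy sketch of that scheduling problem, per clock and with a made-up workload, just to show how much the packing matters:

    # Toy model: scalar units vs 5-wide units, counted per clock.
    # 'packed' = how many independent instructions actually get issued
    # into each 5-wide unit per clock (an assumption, it varies per shader).
    def clocks_needed(total_ops, units, packed):
        return total_ops / (units * packed)

    ops = 1_000_000  # arbitrary shader workload

    print(clocks_needed(ops, 96, 1))  # GTS: 96 scalar units, always full
    print(clocks_needed(ops, 64, 5))  # HD2900, perfect packing: 3x fewer clocks
    print(clocks_needed(ops, 64, 1))  # HD2900, worst case: behind the GTS

Per clock the HD2900 swings from well ahead of the GTS to behind it, depending entirely on how well those 5 slots get fed, and feeding them is the compiler/driver's job.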

Remind anyone of Netburst and the GHz wars? ;)

Dave
 
Well, if anyone actually pays attention to anything and puts the latest drivers up against each other: in Bioshock the 2900XT is on par with the GTX in DX10; in UT3 the 2900XT is on par with the GTX, both completely ahead of the GTS, and both ahead of the GT.

In fact, in all but a select few Nvidia 'The Way It's Meant To Be Played' games the XT is either equal to the 8800GTS or the GTX. It's rarely below the GTS except in DX10 in World in Conflict and Lost Planet, two Nvidia games, of which Lost Planet isn't playable on any card in DX10 at a decent framerate.

People constantly trash the 2900XT and it's all bull.

People are all over the GT even though the very few reviews which use the GT at real resolutions, 1680x1050 and 1920x1200, show it to be mostly marginally behind the GTS. Just remember, there are DOZENS of the usual sites WITHOUT GT reviews, because Nvidia invited a very small number of journos to this bullcrap conference with that 25-page slide deck showing every trick for making the GT look good, most of which involves using the card at weird resolutions.

For instance, half the reviews use 2560x1600, with unplayable numbers but where it matches the GTX, and Anandtech is reviewing with AA in Oblivion only. So what, does the GT do worse when using 4xAA and textures break past the 512MB buffer, or when the memory bandwidth shows it slower than the GTS?

Biggest media marketing release ever. Now the GT absolutely kills when you can get it for £150, but at the £175 it is at most places, when most places are now selling the GTS 640s for £5-10 more, it's actually not a great card at all.

It's only noticeably cheaper because of the thinner PCB, due to the narrower memory bus, and the cheaper core. The RV670 is probably going to be a little cheaper.


As for the streams: Nvidia use 96-128 actual stream processors. ATI use 64 "units" with 5 stream processors available in each unit, but without really good optimisation not all 5 of those SPs can be used within the unit. So in the worst-case scenario only 64 SPs are being used; in reality using 2/3 is fairly easy, but using 4/5 is very hard. Drivers have recently gotten much better though, with games like Bioshock, and lots of others, getting huge increases.

Remember the SPs run at core clock on ATI, while on Nvidia the SPs run at the shader clock of 1.5GHz, so even when ATI use all 5 they are at half the speed. In effect they could do the same work as maybe 160-180 Nvidia-style stream processors. They aren't the same architecture, so comparing the raw numbers directly doesn't work.
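
Putting that into rough numbers (the clocks here are assumptions from memory, not exact):

    # Rough "G80-equivalent SP" count for the R600, normalised for clocks.
    # Clocks are assumptions from memory: ~742MHz XT core, ~1.5GHz G80 shader.
    ATI_UNITS = 64          # 5-wide units
    ATI_CLOCK = 742         # MHz
    NV_SHADER_CLOCK = 1500  # MHz

    for slots in (1, 3, 5):  # worst case, realistic, perfect packing
        equiv = ATI_UNITS * slots * ATI_CLOCK / NV_SHADER_CLOCK
        print(f"{slots}/5 slots used -> ~{equiv:.0f} G80-style SPs")

That works out at roughly 32 / 95 / 158: packing 3 slots already puts the XT level with the GTS's 96 SPs, and perfect packing lands right in that 160-180 region.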

ROPs are an even more misleading number. Essentially everything goes through the ROPs in the end, but 16 is more than enough tbh, and 24 is great. In complex scenes, when framerates are low, filling 16 ROPs is hard; in less complex scenes filling 16 is so easy that the ROPs become the limit. But this more than anything affects the TOP framerate, not minimum or average framerates. Which means fewer ROPs hurts in benchmarks, but in real gameplay fewer ROPs doesn't really mean anything.
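
A toy model of that, with purely illustrative numbers:

    # Toy model: the framerate you see is capped by whichever limit is
    # lower, shader throughput or ROP fill rate. All numbers here are
    # illustrative assumptions (including the overdraw factor).
    def fps_seen(shader_fps, rops, clock_hz, pixels, overdraw=3):
        rop_cap = rops * clock_hz / (pixels * overdraw)
        return min(shader_fps, rop_cap)

    pixels = 1680 * 1050
    clock = 742e6  # assumed XT core clock

    # Complex scene, shader-limited at 40fps: 16 vs 24 ROPs changes nothing.
    print(fps_seen(40, 16, clock, pixels), fps_seen(40, 24, clock, pixels))
    # Simple scene where shaders could push 5000fps: only now does the ROP
    # count set the ceiling, i.e. the TOP framerate.
    print(fps_seen(5000, 16, clock, pixels), fps_seen(5000, 24, clock, pixels))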
 
You can have tonnes of memory bandwidth and pixel pipelines etc, but if the core GPU architecture is inferior you end up with a slower product. NVidia are probably 12 months ahead of ATI as far as technology is concerned, and it does not look like ATI can redress this soon.

Complete and utter trash, seriously. People need to actually pay attention to the fact that in a large number of newer games, which are shader intensive, the 2900XT is keeping pace with the GTX, not the GTS.
 
You can have tonnes of memory bandwidth and pixel pipelines etc, but if the core GPU architecture is inferior you end up with a slower product. NVidia are probably 12 months ahead of ATI as far as technology is concerned, and it does not look like ATI can redress this soon.

The ATI 9700 vs GeForce 5800 was much worse for Nvidia, yet they turned it around for the next cycle.

Either ATI just underestimated the 8000 series or they just didn't get the 2000 series working like they thought it would. However, at least they did the decent thing, unlike Nvidia with the 5800, and marketed it as a mid-range card and not an 8800GTX/Ultra killer. Let's hope they get it right next time.
 
Complete and utter trash, seriously. People need to actually pay attention to the fact that in a large number of newer games, which are shader intensive, the 2900XT is keeping pace with the GTX, not the GTS.
ATI had 6 months longer than NVidia to develop and release the 2900 series. When it arrived, it was still significantly slower than a GTX. ATI's current gen are not superior, equal or even close to NVidia this time around. Sure, the 2900XT may win in two or three benches, but it gets soundly beaten in the vast majority. I can post links to prove this if you cannot search for yourself. Perhaps ATI's 3xxx series can be competitive, but even that is receiving a lot of negative speculation atm.
 
ATI had 6 months longer than NVidia to develop and release the 2900 series. When it arrived, it was still significantly slower than a GTX. ATI's current gen are not superior, equal or even close to NVidia this time around. Sure, the 2900XT may win in two or three benches, but it gets soundly beaten in the vast majority. I can post links to prove this if you cannot search for yourself. Perhaps ATI's 3xxx series can be competitive, but even that is receiving a lot of negative speculation atm.


Why don't you post a link to a recent benchmarking session then, because it will show UT3, Bioshock and a vast number of other games all with the 2900XT in between the 8800GTS and GTX most of the time. It's only noticeably worse than the GTS in Lost Planet, which must have all of 9 sales on the PC, and World in Conflict (in the benchmark only, not in 'in game' benchmarks).

But no, show me a benchmark where the 2900XT gets SOUNDLY beaten. Go ahead.

On release it was mostly equal to the 8800GTS except with 4xAA enabled; the latest benchmarks show it ahead of the GTS in a large number of games with 4xAA enabled as well.

People are such fanboys it's unbelievable. In terms of technology, the main architecture difference is the ring bus vs crossbar memory bus; everything else is ROPs and stream processor units, which aren't vastly different. They've paired them up and balanced them in different ways, but the technology itself is basically the same. Nvidia WILL be going to a ring bus for memory, or the equivalent Nvidia-named version; it's why Nvidia have a 384-bit memory bus on the GTX, as with a crossbar it's simply an unmanageable number of traces and connections.

If you buy an 8800GTS/GT or a 2900Pro/XT you won't see a significant difference in the resolution or quality settings you play at. It's as simple as that.
 
I've just tested my new 2900Pro, so far it's great. Clocked it to 800MHz core and 825 memory. With 4xAA I only see about a 10fps drop in the 3 games I've tested: Lost Planet, Half-Life 2 Lost Coast, Half-Life 2 Deathmatch...

Are these temps good or bad? 49C idle, 72C load...
 
OK, the Anandtech review, compared against the UT3 GPU performance results they've published previously. In the previous article they ran 3 different maps and showed results for each map: http://anandtech.com/video/showdoc.aspx?i=3127&p=8


In the 8800GT review, the number it states for the GTS, 70fps, was the best-case scenario of the 3 maps, the fastest it can do. The GTX number it uses isn't from that same map; in fact it seems to be about an average across the 3 maps. And the 2900XT number it uses is the LOWEST, the 2900XT on its worst map. But on the map where the 2900XT does its worst, 79.5fps, the GTS gets 60, making the XT over 30% faster. On that same map the GTX gets 83, somewhat short of what's shown in the 8800GT vs GTX numbers.

On average in UT3 and Bioshock (from other reviews around the web, mostly recent 2900Pro reviews with the 7.9/7.10 drivers) the 2900XT is around 30% faster than the GTS, and within 2-3% of the GTX, on all maps and in almost all situations. But almost every single review of the 8800GT seems to use incredibly odd benchmarking.

When was the last time Anandtech did a review without a direct bar graph of all top-end cards, current and last gen? When was the last time they didn't use AA/AF in GPU reviews? When was the last time they refused to give GTS/2900XT/GTX numbers on the same page? Never?

What about the site that only used 2560x1600? What about the missing reviews from a ridiculous number of sites who would all normally have reviews up the second the NDA was up? What about the 25-slide presentation "some reviewers" were invited to that tried to lay out exactly how to benchmark in the way Nvidia wanted?

Almost all the reviews are picking and choosing results to make the GTS look as close as possible, and the 2900XT as slow as possible.

With Anandtech you can plainly see they have used the 2900XT's lowest number, on its hardest-running UT3 map, against the GTS's highest number on the easiest map. There's simply no two ways about it.

Anandtech and HL2: their recent analysis shows the same numbers for the 2900XT as the 8800GT review, but they've come up with another 8-9fps for the GTX, out of nowhere. That makes 3 of the biggest recent releases where the 2900XT matches the GTX, or comes within 3-4fps of it, while sitting 20-30% ahead of the GTS. But no, you're right, the 2900XT is crap, especially as you can (I think) still find them as Pros for £160 in the UK today :o Worth noting that lots of the review sites that do have reviews are comparing 2900Pros with pre-overclocked 8800GTs. It's just the dodgiest release of reviews I've seen in the 7-8 years I've been building computers.


I do give ATI minus points for one thing: getting performance out of the GPU seems to entail waiting for a driver release for almost every game for it to be optimised. Nvidia do the same, but they seem to be going from close-to-final performance to final performance, whereas ATI are going (in some games at least) from absolutely horrific performance to good final performance. At least ATI are fast with releasing new drivers, but it shows their GPU is great with the right optimising, and incredibly hard for a default driver to get good performance out of.
 
I've just tested my new 2900Pro, so far it's great. Clocked it to 800MHz core and 825 memory. With 4xAA I only see about a 10fps drop in the 3 games I've tested: Lost Planet, Half-Life 2 Lost Coast, Half-Life 2 Deathmatch...

Are these temps good or bad? 49C idle, 72C load...

Temp-wise pretty good. Glad to know you're happy with the upgrade from your Crossfire setup!

@drunkenmaster + 555BUK

Whilst both of you seem to be rooting for opposite sides of the coin, it's quite interesting to see some of the actual technical debate over the two cards.

One thing that also must be remembered is that, as far as I remember, Nvidia DID NOT stick to the original DX10 plan: they applied to have it changed because they couldn't make the 8800 series fully compatible, while ATI are meant to have designed to the original spec. Whilst this may not simulate real-life performance, it does leave it open to wonder how the 8800 would have turned out if MS had turned them down for the changes, or if ATI had designed with the shifted goals in mind rather than the original ones.

It would be interesting if both sides ever fully showed the potential of their chipset, as the R600 series does seem to have a lot of potential that's not being uncovered for general usage. The high 3DMark scores show it must have some power somewhere; it's just a matter of making that potential available in real-world situations. As the Anandtech review shows, there obviously is some strong potential in the R600 core; it just needs to be translated to more titles as soon as possible, and if that means a redesign of some of the key driver components, then it's up to ATI to do so.
 
I've just tested my new 2900Pro, so far it's great. Clocked it to 800MHz core and 825 memory. With 4xAA I only see about a 10fps drop in the 3 games I've tested: Lost Planet, Half-Life 2 Lost Coast, Half-Life 2 Deathmatch...

Are these temps good or bad? 49C idle, 72C load...

Glad you're happy with it dude :). Those temps seem OK from what I have read.

Good news for me too :D :D
 
I've just tested my new 2900Pro, so far it's great. Clocked it to 800MHz core and 825 memory. With 4xAA I only see about a 10fps drop in the 3 games I've tested: Lost Planet, Half-Life 2 Lost Coast, Half-Life 2 Deathmatch...

Are these temps good or bad? 49C idle, 72C load...


Damn sure you can hit 900 on the memory. Have you tried higher, or is it crashing on you with further clock increases?
 