Have you had both cards?
I'm not anti-ATi; I thought the last ATi card I had was totally awesome...
I don't think someone is wrong if they say they're playing Lost Planet on high at 1680x1050 with a heavily overclocked GTS; I guess their system must be wrong, then, for giving them that many FPS.
I have an unsold 8800GTX sitting here in this room, while my 2900XT is in my Vista 64 rig. I had the GTX in there for 3-4 months before the 2900XT came out, and when I got the new card the random bluescreens, CTDs with the Nvidia driver-reset popup and other random irritations suddenly stopped. I've also had a GTS, an X1900 (I actually have an X1900XT in a second rig), an X1950 Pro I used between the 8800GTX and the 2900XT (to play LotRO, as the GTX bluescreened after 5 minutes without fail in late beta / early release), etc., etc.
The simple fact is that in that review the 2900XT BEATS both 8800GTSes in MORE than half the benchmarks at anything except 1024x768, which is a resolution I wouldn't drop to even with an X800. So based on one brand-new game, which would be the most susceptible to improvements, and Lost Planet, widely regarded as a crap game in both DX9 and DX10 on BOTH cards, you say the 2900 is useless. The fact is the 8800GTS 320 lost to the 8600GT 512 in the same Lost Planet benchmark, but again, this was used as a reason the 2900 was the much, much worse card.
You can't pick and choose; overall it's VERY close between the GTS and the XT in everything. Take out 1024x768 (be honest, do you think anyone uses that res?) and the XT is ahead, marginally. Yes, it doesn't beat the overclocked GTSes, which take the lead back, marginally again, but you can overclock the XTs too, so it all ends up the same.
Frankly, either card gives a very similar level of performance.
Nvidia screwed up because its drivers made Vista generally useless, with crashes and other issues for months.
Hardware AA is going the way of the dodo. Again, ATi did something architecturally great, and something Nvidia WILL do, just a generation or two too early... d'oh.
Fact is, AA has been dodgy in the huge majority of big releases since and including STALKER; this is not a short-term phase but the long-term intention of game designers. The 16 pipes aren't "hugely" a problem; they don't help in benchmarks because when there's little to do on screen you need raw base power to push the framerate up. So in low-load scenes the GTX has a lot more raw pixel-pushing power to drive the framerate up to 300fps when nothing's going on, and unfortunately 16 pipes is a limit when it comes to maximum framerates, which sucks, but that's life.
But when those 16 or 24 pipes are waiting on pixel processing to happen they become a lot less loaded (think CPU cores waiting on memory accesses, up to a point): they aren't fully loaded, so minimum framerates need fewer pipes. There's a rough illustration of that sketched below.
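To be clear what I mean, here's a very rough back-of-envelope model in Python. The pipe counts, shader counts, clock and per-pixel costs are made-up numbers and a real GPU doesn't schedule work anything like this simply; it just shows why raw pipe count sets the ceiling when per-pixel work is light, while heavy per-pixel work (the minimum-framerate moments) leaves the extra pipes mostly idle.

```python
# Toy model: a frame is limited by whichever stage is slower, pushing pixels
# through the pipes or doing the per-pixel shader work. All numbers are
# made up for illustration; this is not how a real GPU schedules work.

def fps(pipes, shader_units, pixels, shader_cost, clock=600e6):
    """Framerate when the frame is bound by the slower of two stages."""
    pixel_time = pixels / (pipes * clock)                          # writing every pixel
    shader_time = pixels * shader_cost / (shader_units * clock)    # shading every pixel
    return 1.0 / max(pixel_time, shader_time)

pixels = 1680 * 1050

# Light scene: almost no shader work per pixel, so pipe count sets the ceiling
# and the 24-pipe card runs away with the maximum framerate.
print(f"{fps(pipes=16, shader_units=64, pixels=pixels, shader_cost=1):.0f} fps")
print(f"{fps(pipes=24, shader_units=64, pixels=pixels, shader_cost=1):.0f} fps")

# Heavy scene: lots of shader work per pixel, so both cards sit waiting on the
# shaders and the extra pipes barely change the (minimum) framerate.
print(f"{fps(pipes=16, shader_units=64, pixels=pixels, shader_cost=200):.0f} fps")
print(f"{fps(pipes=24, shader_units=64, pixels=pixels, shader_cost=200):.0f} fps")
```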
The ATi architecture is much harder to leverage all the power out of, which is a shame. I'm not sure if what was dropped from DX10 support meant game makers could put less effort into shader AA or anything; afaik it was a virtual-memory-usage type feature that Nvidia didn't support and that MS let drop out of the DX10 spec entirely.
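For anyone unclear what "shader AA" means here versus the fixed-function hardware resolve: very roughly, the smoothing gets done by shader code running over the rendered pixels instead of by dedicated multisample hardware. A toy sketch of the idea in Python/NumPy (nothing like any real game's or driver's implementation, and the threshold/blend numbers are arbitrary): blend pixels toward their neighbours wherever there's a strong edge.

```python
import numpy as np

def toy_shader_aa(img, edge_threshold=0.2, blend=0.5):
    """Toy post-process 'shader AA' on a greyscale image in [0, 1]:
    find strong edges from neighbour differences, then blend those pixels
    toward the average of their 4 neighbours. Real shader-based AA is far
    more sophisticated; border wrap-around is ignored here for simplicity."""
    up    = np.roll(img,  1, axis=0)
    down  = np.roll(img, -1, axis=0)
    left  = np.roll(img,  1, axis=1)
    right = np.roll(img, -1, axis=1)

    neighbour_avg = (up + down + left + right) / 4.0
    edge_strength = np.abs(img - neighbour_avg)

    out = img.copy()
    mask = edge_strength > edge_threshold
    out[mask] = (1 - blend) * img[mask] + blend * neighbour_avg[mask]
    return out

# Hard black/white edge down the middle of a tiny "framebuffer".
frame = np.zeros((4, 8))
frame[:, 4:] = 1.0
print(toy_shader_aa(frame))
```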
The simple fact is that anyone honest who has used a 2900 and an 8800 will tell you that at mid/high resolutions they are very, VERY equal to each other. There are a bunch of games where they perform almost identically, there are lots of games where ATi pulls ahead, and a couple where it has a ridiculous lead, and likewise there are Nvidia-favouring games that let it pull ahead, and a couple where it has a ridiculous lead.
Obviously this means a few people playing a very particular set of games can have a massively better experience on one card than the other. Play a large and varied array of games and you get a similar experience on both cards.
There are some big older-game benchmarks where high AA levels cause the 2900XT to drop behind, but they are old games; for most of them we're talking 150fps instead of 200fps, and there's nothing last-gen cards can't handle there. In most (moving towards all) new games you won't be using much in the way of AA settings on either ATi or Nvidia, so that isn't really relevant anymore.