
*** The Official HD2900XT Thread ***

Obviously now keeping my GTX :) but I'm a little saddened - it seems ATI are leaving the high-end market because they're just losing too much money in that area, and concentrating on cheaper, better-performing cards instead. Anandtech goes on to say the HD2900XT will probably be ATI's highest-end part this year.

Anandtech also got to grips with it and said AMD/ATI should admit they got it wrong and that the card was truly delayed, rather than making poor excuses...

Tis a shame really - one can always pick up a GTX on the For Sale forums here for around 300 quid, making the HD2900XT pointless, or spend £50-60 more and just get a new GTX.

The fact it's hot and noisy was an instant turn-off for me... otherwise, here's hoping ATI learn from their mistakes. Regardless, when Nvidia's next-generation card arrives this fall I think I'll just buy it instantly and won't make the same mistake twice.

Here's hoping, since Nvidia have plenty of headroom for performance/features and so does ATI, that both compete directly on the driver front!
 
Finally found what I was looking for. This is the current list of PSUs that support HD2900XTs in CrossFire and which are recommended by ATI.
It also has a list of connectors, and as people will see, the 1000W+ PSU myth is dead ;)
Click here.
 
fornowagain said:

Impressive. But unfortunately we know now that the XT performs a lot better in synthetic benchmarks than it does in games.

Makes sense I suppose, as the synthetic benchmarks tend to benefit more from fill-rate and memory-bandwidth increases (which the R600 handles very well indeed with its 512-bit ring-bus memory structure and 100GB/s+ bandwidth).
 
Duff-Man said:
Impressive. But unfortunately we know now that the XT performs a lot better in synthetic benchmarks than it does in games.

Makes sense I suppose, as the synthetic benchmarks tend to benefit more from fill-rate and memory-bandwidth increases (which the R600 handles very well indeed with its 512-bit ring-bus memory structure and 100GB/s+ bandwidth).

Well, games are a lot more complicated than 3DMark, and you have loads more variables involved in determining the end performance. I'd be willing to guess that as the drivers for the 2900XT mature, its performance in games will start to reflect its performance in 3DMark06.
 
titaniumx3 said:
Well, games are a lot more complicated than 3DMark, and you have loads more variables involved in determining the end performance. I'd be willing to guess that as the drivers for the 2900XT mature, its performance in games will start to reflect its performance in 3DMark06.

Not necessarily. Drivers have nothing to do with it in this case; it's the nature of the instructions passed to the card. The drivers help improve the internal efficiency of the card and reduce overheads in the way instructions are queued before they are sent to the card. Historically, you will see game performance and synthetic benchmark scores increasing roughly in parallel with improved drivers. There is no particular benefit to games over the 3DMark scores.

The R600's strength lies in its huge memory bandwidth, which helps in situations where fill rate matters most - for example, HDR rendering should perform well on the XT (I'd be interested to see results from the HDR 'floating balls' program; I imagine it will rival if not beat the GTX there). The strength of the 8800 instead lies in raw pixel-shading power. The synthetic benchmarks have tended to benefit more from memory-bandwidth increases than games do.
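The bandwidth figures being thrown around are easy to sanity-check: theoretical peak memory bandwidth is just the bus width in bytes times the effective memory transfer rate. A quick sketch - the transfer rates below are the commonly quoted launch specs for the two cards, so treat them as illustrative assumptions:

```python
# Theoretical peak memory bandwidth = bus width (in bytes) * effective
# memory transfer rate. The specs below (512-bit bus / 1656 MT/s GDDR3 for
# the HD 2900 XT, 384-bit / 1800 MT/s for the 8800 GTX) are the commonly
# quoted launch figures, used here purely as assumptions.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    """Bytes per transfer times transfers per second, in GB/s (10^9 bytes)."""
    bytes_per_transfer = bus_width_bits // 8
    return bytes_per_transfer * effective_mt_s * 1e6 / 1e9

hd2900xt = peak_bandwidth_gb_s(512, 1656)   # ~106 GB/s
gtx8800 = peak_bandwidth_gb_s(384, 1800)    # ~86 GB/s
print(f"HD 2900 XT: {hd2900xt:.1f} GB/s, 8800 GTX: {gtx8800:.1f} GB/s")
```

Which is where the "100GB/s+" figure in the posts above comes from: the 512-bit ring bus more than makes up for the slower GDDR3 clocks.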
 
I see what you mean, but we've already seen a few significant increases in games with the latest drivers, so clearly something in the drivers was holding the hardware itself back. Also, I'm pretty sure drivers can bring performance increases specific to a certain game. As I recall, with one of the Catalyst driver revisions for my X1900XT I saw a fairly large performance boost in Company of Heroes, whereas most other games remained unchanged.

Either way, I think everyone admits that the current results for the HD 2900XT are a bit odd and totally inconsistent, pointing to poor drivers.
 
From the Anandtech review:
First, they refuse to call a spade a spade: this part was absolutely delayed, and it works better to admit this rather than making excuses. Forcing MSAA resolve to run on the shader hardware is less than desirable and degrades both pixel throughput and shader horsepower as opposed to implementing dedicated resolve hardware in the render back ends.
That could very well explain the slow AA performance. It also explains why the HD2900XT is good at benchmarking, as no AA is used in 3DMark.

So this GPU uses too much power due to leakage, and the AA implementation has failed. I wonder how much AMD can do to solve these issues in a new revision?
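The Anandtech point about MSAA resolve is worth unpacking: a resolve is, at its simplest, a box filter that averages each pixel's N subsamples into one final colour. Dedicated hardware does this in the render back-ends; R600 reportedly spends shader cycles on it instead. A toy sketch of what that resolve step computes (illustrative only, not real GPU code):

```python
# MSAA "resolve" as a simple box filter: average each pixel's N subsamples
# into one final colour. On most GPUs dedicated render back-end hardware
# does this; per the Anandtech quote, R600 runs it on the shader core,
# which costs shader throughput. Toy CPU illustration, not real GPU code.

def resolve_pixel(subsamples):
    """Average N (r, g, b) subsamples into one resolved pixel colour."""
    n = len(subsamples)
    return tuple(sum(s[c] for s in subsamples) / n for c in range(3))

# 4x MSAA: an edge pixel half-covered by a white triangle over black.
samples = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0),
           (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_pixel(samples))  # → (0.5, 0.5, 0.5), a smoothed grey edge
```

Doing this per pixel, per frame, on the shader units is extra work that scales with resolution and AA level, which fits the slow-AA results people are seeing.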
 
drak3 said:
From Tom's review "For starters, R600 drew less overall system power than GeForce 8800GTX in the game tests."
The disadvantage of measuring system power is that it is dependent on the rest of the hardware: a GTX is a faster card, so perhaps the CPU can also work faster because it isn't waiting on the GPU, and so uses more power than it would with a slower card.
 
That seems unlikely - I'd have thought that it was at 100% load even if some of that was 'wasted' cpu power.

Easily investigated by simply looking at CPU load though.
 
Aekeron said:
That seems unlikely - I'd have thought that it was at 100% load even if some of that was 'wasted' cpu power.

Easily investigated by simply looking at CPU load though.
It was just a guess.

I have a watt meter on my computer, and the power use varies a lot when playing games, or even within one game: one second it's at 380W, the next it's at 350W.

I don't know if they measured an average over a longer period and did exactly the same with the other card.

I think it is safe to say that the HD2900XT uses more power than a GTX, which is not a good thing as its performance is roughly the same as a GTS, which definitely uses less power.

I also read in a review that the fan alone uses up to 30W, and that the card draws more power when it is hot.
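Given those 30W second-to-second swings, a single glance at a watt meter is pretty noisy; averaging readings taken at a fixed interval over the same benchmark run gives a fairer per-card comparison. A minimal sketch (the readings are made-up example values, not measurements):

```python
# A single wattmeter reading swings by ~30W second to second, so compare
# cards by averaging samples taken at a fixed interval over identical
# benchmark runs. The readings below are hypothetical example values.

def mean_power(samples_watts):
    """Average a series of wattmeter samples taken at a fixed interval."""
    return sum(samples_watts) / len(samples_watts)

run_a = [380, 350, 372, 361, 355]   # hypothetical samples, card A
run_b = [365, 368, 362, 370, 366]   # hypothetical samples, card B
print(f"card A: {mean_power(run_a):.1f} W, card B: {mean_power(run_b):.1f} W")
```

The key point is that both cards need the same sampling interval, the same workload, and the same run length; otherwise the instantaneous variation swamps the real difference between them.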
 
Such a strange offering from AMD. No really amazing performance, but it is good value for money, and once they iron out the bugs in the drivers it should prove to be the best price/performance card around. But it looks like the high-end war is over, or at least delayed, in 2007. Who has £380 to £500 to spend on a PC graphics card any more when a PS3 costs that, and high-end graphics card sales are in free fall?

Now maybe £250 is the new high end. That's good enough considering most people have a 22" monitor at most, and 1680x1050 can easily be handled by cards in this price bracket.
 