• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

2900XT owners/benchmarks thread

Hey, would any of the 2900 owners be able to post some results running Sup Cmdr? That's one game my X1900 struggles to run smoothly at 1600 x 1200, and I'm really considering a new card come the summer (only about two and a half months off now!)

Even just running the benchmark would be useful, especially if someone with an AMD setup could try it... Gerard, you've got an Opteron or similar, don't you?
 
From a competitor's site, so only quotes, I'm afraid.

Source: Cust****.com
AMD explains poor Radeon HD 2900XT AA performance
AMD recently launched its first DirectX 10 graphics card, the Radeon HD 2900XT. Despite arriving six months behind Nvidia's GeForce 8800-series, AMD's new card hasn't fared too well in comparison to its rivals - we spoke to the company to find out why, and learnt why the Radeon team still thinks they made the best design decision for the future.

The R600 is finally here, and in keeping with its mysteriously long gestation, at least in its first incarnation as the HD 2900XT, AMD's new GPU still poses a lot of questions. One of the things we noticed during our in-depth testing of the card is that compared to its principal rival, the Nvidia GeForce 8800 GTS 640MB, the HD 2900XT performs poorly in many games when anti-aliasing is enabled.

In F.E.A.R., at 1,600 x 1,200, with AA and AF disabled, the HD 2900XT easily outstripped the 640MB 8800 GTS, delivering a minimum that was 23fps higher than the latter's. However, with 4x AA, the HD 2900XT's minimum framerate dived from 82fps to 21fps, while the 640MB 8800 GTS produced a minimum of 30fps. Adding 4x AA results in a 74% drop for the Radeon, compared to only a 49% drop for the GeForce.
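For what it's worth, the article's percentages do line up with its framerate figures. A quick sanity check in Python; note the GTS's pre-AA minimum isn't stated directly, so it's inferred here from the 23fps gap mentioned above:

```python
# Sanity-checking the quoted F.E.A.R. minimum-framerate drops.
# The 640MB GTS's no-AA minimum is inferred from the 23fps gap
# the article mentions (82 - 23 = 59fps).

def pct_drop(before, after):
    """Percentage drop from 'before' to 'after', rounded to a whole percent."""
    return round(100 * (before - after) / before)

hd2900xt_no_aa, hd2900xt_4xaa = 82, 21   # HD 2900XT minimum fps
gts_no_aa, gts_4xaa = 82 - 23, 30        # 640MB 8800 GTS minimum fps (59 inferred)

print(pct_drop(hd2900xt_no_aa, hd2900xt_4xaa))  # 74
print(pct_drop(gts_no_aa, gts_4xaa))            # 49
```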

The Radeon's framerates suffer disproportionately with anisotropic filtering, too. Again testing in F.E.A.R. at 1,600 x 1,200, we saw the HD 2900XT's minimum FPS drop by 10 per cent with 16x anisotropic enabled, compared to 3 per cent for the GTS, although the HD 2900XT still had a faster average. It was a slightly different result at 2,560 x 1,600, as the HD 2900XT's massive bandwidth gave it a boost, although adding 16x AF still had more impact than it did on the 640MB GTS.

As most gamers will want AA and AF enabled in games, the HD 2900XT's poor performance with these processing options enabled is a serious problem for the card and ATi. We asked ATi to comment on this surprising result and the company revealed that the HD 2000-series architecture has been optimised for what it calls 'shader-based AA'. Some games, including S.T.A.L.K.E.R., already use shader-based AA, although in our tests the 640MB 8800 GTS proved to be faster than the HD 2900XT.

We asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'

While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance.'
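To make Huddy's HDR point concrete: tone mapping is non-linear, so averaging subsamples before tone mapping (a fixed-function 'linear' resolve) gives a different pixel than averaging after it (a shader-based resolve). A minimal sketch in Python, using an illustrative Reinhard-style curve rather than any actual hardware path:

```python
# Illustrative sketch (not AMD's actual hardware path) of why a linear
# AA resolve misbehaves with HDR: the tone-mapping curve is non-linear,
# so the order of "average" and "tone map" changes the result.

def tone_map(x):
    # Simple Reinhard-style operator, a common HDR-to-display curve.
    return x / (1.0 + x)

samples = [0.1, 8.0]  # two HDR subsamples: dim geometry edge against a bright sky

# Fixed-function "linear" resolve: average in HDR space, then tone map.
linear_resolve = tone_map(sum(samples) / len(samples))

# Shader-based resolve: tone map each sample, then average.
shader_resolve = sum(tone_map(s) for s in samples) / len(samples)

print(round(linear_resolve, 3))  # 0.802 -- edge stays almost as bright as the sky
print(round(shader_resolve, 3))  # 0.49  -- a properly blended edge value
```

The linear resolve leaves the edge pixel nearly as bright as the sky, so the anti-aliased blend effectively vanishes; resolving per-sample after the non-linear curve gives the intermediate value you'd expect.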


I think they took a bit of a gamble this time round and released a card with this concept a bit too early, with real DX10 games that fully utilise the available features still a couple of years away and developers still working towards DX9 compatibility.
But we'll see what happens :)
 
Not sure if this is the right place to put it; could a mod move it if they feel it's necessary?

I found this article pretty interesting. It's not really benchmarks, but it shows how far the core was pushed using extreme cooling, and it has some nice info on using third-party cooling.

http://www.nordichardware.com/news,6331.html

http://www.nordichardware.com/Reviews/?skrivelse=510

Shows a lot of potential for the 65nm version tbh

More importantly:
The one thing that has probably caused the most discussion with the HD 2900XT is the 8-pin PCI-E connector for additional power. The only purpose of this connector is to deliver more stable voltages and, of course, more power when overclocking. You can do without it; in fact, our tests were done with two 6-pin connectors. You don't have to buy a power supply with an 8-pin connector to overclock the HD 2900XT.
 
Bah, shader AA... if it's anything like the alleged AA in Stalker they can keep it, I prefer the old method :)

I haven't seen any of you guys start clocking yet, wassa matter with you all? Get to it :p
 
TheOtherOption said:
Hey, would any of the 2900 owners be able to post some results running Sup Cmdr? That's one game my X1900 struggles to run smoothly at 1600 x 1200, and I'm really considering a new card come the summer (only about two and a half months off now!)

Even just running the benchmark would be useful, especially if someone with an AMD setup could try it... Gerard, you've got an Opteron or similar, don't you?


FX-60; I'll download the demo and give it a run through later.
 
Guys, what we have to remember is Nvidia have had a six-month or so head start on the ATI HD-series cards, in terms of drivers.

So comparing a newly released ATI card to a six-month-old GTS/GTX is hardly fair to begin with.
 
eracer2006 said:
Guys, what we have to remember is Nvidia have had a six-month or so head start on the ATI HD-series cards, in terms of drivers.

So comparing a newly released ATI card to a six-month-old GTS/GTX is hardly fair to begin with.
ATI had two months on Nvidia in terms of development, lol. No excuse, that.

Unless you're saying nobody should work on drivers until a card is launched. ;)
 
Chronictank said:
From a compeitor's site so only quotes i'm afraid

I think they took a bit of a gamble this time round and released a card with this concept a bit too early, with real DX10 games that fully utilise the available features still a couple of years away and developers still working towards DX9 compatibility.
But we'll see what happens :)

Well that explains a lot! I do think it's cool to see they've thought ahead, but in my honest opinion it's a step too far. I think by the time this 'shader-based AA' becomes common among games, Nvidia will already have brought out a more powerful GPU capable of doing this and the R600 will be old tech.

It's a shame really as I was expecting the AA/AF issue to be resolved with new drivers; that seems very unlikely now. :(
 
Two Oblivion screenshots to compare AA modes on my 2900 XT.

Sorry they're not taken in the same place; I couldn't remember where I took the 1st one when I took the 2nd, but yeah, you can tell the big difference anyway.

Screenshots are 1680x1050 PNG format

The 2nd screenshot looks sharper, but personally I prefer the 1st one as the leaves on the trees seem softer.


1st one, 24x Wide Tent AA, 100% Full everything else


2nd one, 16x Box AA, 100% full everything else
 
Ulfhedjinn said:
ATI had two months on Nvidia in terms of development, lol. No excuse, that.

Unless you're saying nobody should work on drivers until a card is launched. ;)
Exactly.

Both companies have been dragging their feet a little though, TBH: ATi in general, and Nvidia with their astonishingly bad Vista drivers!

gt
 
gt_junkie said:
Exactly.

Both companies have been dragging their feet a little though, TBH: ATi in general, and Nvidia with their astonishingly bad Vista drivers!
I agree, except about the Vista drivers. I can actually get 1:1 mapping working in Vista, not in XP. :(
 
eracer2006 said:
Guys, what we have to remember is Nvidia have had a six-month or so head start on the ATI HD-series cards, in terms of drivers.

So comparing a newly released ATI card to a six-month-old GTS/GTX is hardly fair to begin with.
The R600 taped out in June '06, and early samples were stamped August '06. It could be said that the delays gave them more time to optimise drivers.

------------------------

Of interest to me as I have a 975 board, ATI HD 2900XT CrossFire: Intel 975X versus Intel P35

 
titaniumx3 said:
It's a shame really as I was expecting the AA/AF issue to be resolved with new drivers; that seems very unlikely now. :(
Not at all, mate; their new methods mean they can fix it through drivers, unlike previously when such things were done in hardware.
However, there will obviously be somewhat of a bottleneck at the AA and AF resolve at the end.
The question is whether they want to go to such lengths to undo what they designed the card to do.
 
IceShock said:
I'm on an E6600 @ 3GHz; they're on a quad at stock, I think

but I get 19628 on 05 with a stock card and single as well.
3DMark05 is getting on a bit now; it's not optimised for quad-core multithreading. Also, unlike 06, the CPU score is not calculated into the final score (there is an optional CPU test if you run it). Their clock speed is lower at 2.66GHz, and that will affect it. Still seems too low with the 8.37.4.3 drivers; maybe some of the difference could be down to 64-bit Vista.

 