Anandtech: Real World DirectX 10 Performance

Soldato · Joined 3 Nov 2004 · Posts: 9,871 · Location: UK
Real World DirectX 10 Performance: It Ain't Pretty

Takes a while to load, give it a chance. Hopefully they've sorted their server problem, so you can read it.

Intel Core 2 Extreme X6800 (2.93GHz/4MB)
ASUS P5W-DH
Intel 975X
ATI Catalyst 8.38.9.1-rc2
NVIDIA ForceWare 162.18
1280 x 800 - 32-bit @ 60Hz
Windows Vista x86

I'll reproduce this whole, as the page is taking so long to load.

Final Words

For now, AMD does seem to have an advantage in Call of Juarez, while NVIDIA leads the way in Company of Heroes and Lost Planet. But as far as NVIDIA vs. AMD in DirectX 10 performance, we really don't want to call a winner right now. It's just way too early, and there are many different factors behind what we are seeing here. As the dust settles and everyone gets fully optimized DirectX 10 drivers out the door with a wider variety of games, then we'll be happy to take a second look.

The more important fact to realize is that DirectX 10 is finally here. While developers are used to programmable hardware after years with DirectX 9, there is still room for experimentation and learning with geometry shaders, more flexibility, lower state change and object overhead, and (especially) faster hardware. But DirectX 10 isn't an instant pass to huge performance and incredible effects.

Let's look at it like this. There are really three ways a game can come to be in DirectX 10, and almost all games over the next few years will ship with a DX9 path as well. The easiest thing to do would be a straight port of features from DirectX 9 (which should generally be slightly faster than the DirectX 9 counterpart if drivers are of equal quality). We could also see games offer a DirectX 10 version with enhanced features that could still be implemented in DX9 in order to offer an incentive for users to move to a DX 10 capable platform. The most aggressive option is to implement a game focused around effects that can only be effectively achieved through DirectX 10.

Games which could absolutely only be done in DX10 won't hit for quite a while for a number of reasons. The majority of users will still be on DX9 platforms. It is logical to spend the most effort developing for the user base that will actually be paying for the games. Developers are certainly interested in taking advantage of DX10, but all games for the next couple years will definitely have a DX9 path. It doesn't make sense to rewrite everything from the ground up if you don't have to.

We are also hearing that some of the exclusive DX10 features that could enable unique and amazing effects DX9 isn't capable of just don't perform well enough on current hardware. Geometry shader heavy code, especially involving geometry amplification, does not perform equally well on all available platforms (and we're looking at doing some synthetic tests to help demonstrate this). The performance of some DX10 features is lacking to the point where developers are limited in how intensely they can use these new features.
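To make "geometry amplification" concrete: a geometry shader can emit more primitives than it receives, and it is exactly this one-triangle-in, many-triangles-out pattern that stressed early DX10 hardware unevenly. The article doesn't show any shader code, so the following is purely an illustrative Shader Model 4.0 sketch (all names are made up), turning each input triangle into three offset copies:

```hlsl
// Illustrative only: a trivial SM4 geometry shader that amplifies each
// input triangle into three output triangles. Heavy amplification like
// this is the kind of workload said to perform unevenly on early DX10 parts.
struct GSInput  { float4 pos : SV_POSITION; };
struct GSOutput { float4 pos : SV_POSITION; };

[maxvertexcount(9)]                        // 3 triangles out per triangle in
void AmplifyGS(triangle GSInput input[3],
               inout TriangleStream<GSOutput> stream)
{
    // Emit three copies of the input triangle, each nudged in clip space.
    for (int copy = 0; copy < 3; ++copy)
    {
        float4 offset = float4(0.1f * copy, 0.0f, 0.0f, 0.0f);
        for (int v = 0; v < 3; ++v)
        {
            GSOutput o;
            o.pos = input[v].pos + offset;
            stream.Append(o);          // add a vertex to the current strip
        }
        stream.RestartStrip();         // finish this triangle, start the next
    }
}
```

The `[maxvertexcount]` attribute caps how many vertices one invocation may emit; raising that cap (and actually emitting that many) is what drives the amplification cost discussed above.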

Developers won't write code that will work fine on one platform and not at all on another. The decisions on how to implement a game are in the hands of the developer, and that's where gamers rightly look when performance is bad or hardware and feature support is not complete. Building a consistent experience for all gamers is important. It won't be until most users have hardware that can handle all the bells and whistles well that we'll see games start to really push the limits of DX10 and reach beyond what DX9 can do.

In conversations with developers we've had thus far, we get the impression that straight ports of DX9 to DX10 won't be the norm either. After all, why would a developer want to spend extra time and effort developing, testing and debugging multiple code paths that do exactly the same thing? This fact, combined with the lack of performance in key DX10 features on current hardware, means it's very likely that the majority of DX10 titles coming out in the near term will only be slightly enhanced versions of what could have been done through DX9.

Both NVIDIA and AMD were very upset over how little we thought of their DX10 class mainstream hardware. They both argued that graphics cards are no longer just about 3D, and that additional video decode hardware and DX10 support add a lot of value over the previous generation. We certainly don't see it this way. Yes, we can't expect last year's high-end performance to trickle down to the low-end segment. But we should demand that this generation's $150 part always outperforms last generation's.

This is especially important in a generation that defines the baseline of support for a new API. The 2400 and 8400 cards will always be the lowest common denominator in DX10 hardware (until Intel builds a DX10 part, but luckily, most developers will ignore that). We can reasonably expect that people who want to play games will opt for at least an 8600 or a 2600 series card. Going forward, developers will have to take that into account, and for the next couple of years key features of games won't be able to demand more horsepower than these cards offer.

AMD and NVIDIA had the chance to define the minimum performance of a DX10 class part higher than what we can expect from cards that barely get by with DX9 code. Because they chose to design their hardware without a significant, consistent performance advantage over the X1600 and 7600 class of parts, developers have even less incentive (not to mention ability) to push next generation features only possible with DX10 into their games. These cards are just not powerful enough to enable widespread use of any features that reach beyond the capability of DirectX 9.

Even our high end hardware struggled to keep up in some cases, and the highest resolution we tested was 2.3 megapixels. Pushing the resolution up to 4 MP (with 30" display resolutions of 2560x1600) will absolutely bring all of our cards to their knees. We really need to see faster hardware before developers can start doing more incredible things with DirectX 10.
 
Last edited:
DirectX 9 vs. DirectX 10

For Company of Heroes, we see huge performance drops in moving to DirectX 10 from DirectX 9. The new lighting and shadowing techniques combined with liberal geometry shader use are responsible for at least halving performance when running the more detailed DX10 path. NVIDIA seems to handle the new features Relic added better than AMD. These results are especially impressive remembering that NVIDIA already outperformed AMD hardware under DX9.

Lost Planet is a completely different animal. With Capcom going for a performance boost under DX10, we can see that they actually succeeded with the top-of-the-line NVIDIA cards (the green bars represent performance increases in moving to DX10). There isn't much else enticing about the DX10 version of Lost Planet, and it's clear that AMD's drivers haven't been optimized to tackle this game quite yet.

Still, there are some DX10 performance increases in scaling; it can only get better.
 
Well, Company of Heroes and Lost Planet look very playable in DirectX 10 on the 8800s. I have no doubt performance will keep increasing with newer drivers too, and that TRUE DX10 games will make better use of it.
 
I don't know how accurate this is but nVidia recently said their newer, currently unreleased drivers boost DX10 performance by an average of 25% in the games they tested.
 
JamieL said:
I don't know how accurate this is but nVidia recently said their newer, currently unreleased drivers boost DX10 performance by an average of 25% in the games they tested.


That's interesting. People always say that the 2900 drivers will improve, but they seem to forget that the 8-series drivers are also being worked on. If this 25% speed boost is true, it throws everything out the window. :confused:
 
What a surprise, they all suck in DX10, and we all knew this would be the case.
So it looks like a complete new system for DX10: RAM, motherboard and graphics card. Plus, with ATI's little hiccup, I can only see NVIDIA milking the G80 for all it's worth, then releasing a new card that will only just cut the mustard (we badly need a killer card from ATI), so next-gen DX10 games will also suck.
Time for some wallet voting, methinks. Just cancelled my order for an 8800, and my X1950XT is staying put; no way am I going to game at 1280x1024, I don't care how good any DX10 game is.
 
That's pretty scary; surely cards should perform better than that, seeing as the majority of gamers, I would say, only have low- to mid-range graphics cards. Is this the result of poor coding or amazing graphics?
 
One thing I'd just like to say, ignoring early drivers, AMD vs NVIDIA, etc... those results are shocking. I mean 1280x1024 with 4xAA and we see frames below 25fps, and this is on top-of-the-range hardware. :eek: I'll stick with DX9 a little longer, methinks, till I really, really have to switch.
 
I know a lot of people will be upgrading for Crysis in the near future and with the Intel CPU price cuts.

So what is the consensus on Vista and DX10 gaming? Is it justified to buy Vista yet, or are we still on the back foot, awaiting patches, drivers and service packs?
 
"Don't get G80, keep waiting for R600!" - Glad I didn't.

"You'll wish you did, R600 will pwn @ DX10!" - Apparently not.

Either way, looks like both products suck at DirectX 10, and like I've said before I keep being reminded of the SM3 launch every time I read more about DirectX 10. Looks like "operation wait for 65nm/45nm GPUs and then go quad-core at the same time" is steaming ahead nicely.
 
ATI Catalyst 8.38.9.1-rc2

Gah, why did they use these prehistoric drivers? I believe 7.6 improved things a little.
 
That site is full of it anyhow; they lied about some miracle fix for broken PvP on 6800 Ultras, claiming new .dlls for WMP10 etc. I don't even rate them nowadays and haven't visited since then.

I really doubt any new DX10 game will be so bad on ATI or NVIDIA flagship cards, as that would be a kick in the nutz for both companies and for customers buying £300-500+ cards, who would be ******.

They both had access to DX10 demos, and vice versa for game makers who had early DX10 card samples. I think it will work out through more and more drivers. Lost Planet seems OK; not sure if DiRT is DX10, think the demo was (please input).

No swearing
 
The fact of the matter is that developers are still getting to grips with it; it does take time with a whole new API. Along with the drivers, this is clearly not all our cards are capable of. Expect it to improve vastly; I can't remember if the same happened with DX9.
 
There was no real trouble from SM 2.0 to 3.0; it was a patch for Far Cry (1.3), and it ran well on the 6800 Ultra as the first DX9c SM 3.0/PS 3.0 card. But I can't think back to DX8.1 to DX9 (non-b or c).

I had a GeForce4 Ti 4600 (only the Ti's were DX8.1; the FX 5000 ballsups were supposed to be DX9, but the story goes they were borked and actually ran games in DX8.1 :p)
 
DX10 needs CrossFire and SLI by the look of it.

I wish they would report the minimum frame rate of the tests, as that's what's important; gamers hate slowdown.
 
I doubt it, and I will test on a spare HDD with Vista and an 8800 Ultra running the Lost Planet full game (not the buggy demo), though it's still a port.

Again, is DiRT (final or demo) a DX10 game as well as DX9?
 