GEARS pc performance article

Considering Rainbow Six Vegas ran like a crippled tortoise I doubt it. :)
Wasn't really the engine's fault though TBH, just those idiots at Ubisoft who don't know how to code a port to save their lives. I don't believe for one second Epic Games would let one of their ports be that bad.
 
Considering Rainbow Six Vegas ran like a crippled tortoise I doubt it. :)

Dodgy console port made by a lazy company, as opposed to Epic, who made the original, converting it to PC themselves.

Plus Rainbow Six used an early build of the UE3 engine, just like RoboBlitz did.
 
That's how [H] do their reviews; those settings are what the cards are playable at. :)

Aye, if you want to find the settings you can play at, just go with [H]OCP reviews instead of searching through pages of other reviews. However, it does get annoying when you want to know what it's like at higher settings.
 
Aye, if you want to find the settings you can play at, just go with [H]OCP reviews instead of searching through pages of other reviews. However, it does get annoying when you want to know what it's like at higher settings.
Aye, there are pros and cons to both reviewing techniques IMO!

I used to seriously hate [H] for the way they do it, but now I see the sense in it. If I want to see stricter benchmarks, not quite as "real-world" ones in a sense, then I'll go to the plethora of other sites.
 
The only benefit was being able to use “On/Antialiasing” for 4X AA, but you need a very fast video card to use that setting. If you have a GeForce 8800 GTX/Ultra level video card you can enjoy that setting. But on the GeForce 8800 GT/GTS and Radeon HD 2900 XT you will have to sacrifice other in-game options or resolutions in order to use 4X AA. It is rather a shame because we noticed that 4X AA does improve the visual quality of this game in a noticeable way.

Eh, how come they say the "GT/GTS" needs other settings sacrificed?

I thought the GT was nearly a GTX anyhow?
 
No quad-core tests? UT3 loves quad core, so GoW should too, boosting fps even more...
 
I thought the GT was nearly a GTX anyhow?
The GT is a substitute for the GTS line; it's not intended to compete with the GTX, it just happens to.

The reason a GT suffers compared to a GTX with 4xAA on is that it has 512MB of RAM as opposed to 768MB.
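
Quick back-of-the-envelope sketch of why the extra 256MB matters with 4xAA, assuming plain 32-bit colour and 32-bit depth/stencil buffers with no driver-side compression (real numbers will differ, this is just to show the scale):

Code:
# Rough render-target memory with MSAA (illustrative only; drivers add
# compression, padding and extra buffers on top of this).
def msaa_framebuffer_mb(width, height, samples, bpp_colour=4, bpp_depth=4):
    pixels = width * height
    colour = pixels * samples * bpp_colour   # multisampled colour buffer
    depth = pixels * samples * bpp_depth     # multisampled depth/stencil
    resolve = pixels * bpp_colour            # resolved back buffer
    return (colour + depth + resolve) / (1024 ** 2)

for res in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    print(res, "4xAA ~ %.0f MB" % msaa_framebuffer_mb(*res, samples=4))

That works out at roughly 45-80MB just for the colour/depth buffers at those resolutions, before a single texture is loaded, which is a much bigger slice of 512MB than of 768MB once the game's textures are in there too.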
 
It's probably coded with DirectX 9 shaders and made compatible with DirectX 10; that's why there's very little difference between the screenshots.
DX10 is fully backwards compatible with DX9 because Vista uses DirectX 9L to render it, so that's not necessary. Fact is, you'll be hard pushed to find a difference between DX9 and DX10 in most current DX10 titles unless you like magnifying screenshots by 8x and pressing your nose up against the screen. :)

Otherwise I'd actually bother having Vista installed, lol.
 
There is very little difference because so far the only thing DX10 has proven to us is that it can achieve the same things DX9 can, with a performance loss. Epic are probably ticking boxes with the inclusion of DX10.
 
DX10 is fully backwards compatible with DX9 because Vista uses DirectX 9L to render it, so that's not necessary. Fact is, you'll be hard pushed to find a difference between DX9 and DX10 in most current DX10 titles unless you like magnifying screenshots by 8x and pressing your nose up against the screen. :)

Otherwise I'd actually bother having Vista installed, lol.

Yeah, sorry, I should have worded it better. What I mean is the .fx files for Gears are probably all written for DirectX 9, and as far as I can see from the screenshots there's not much utilization of DirectX 10 except perhaps some of the light shaders.

If Gears were written in DirectX 10, there should be a substantial performance boost over DirectX 9, as DirectX 10 can handle god knows how many more instructions.
 
Yeah, sorry, I should have worded it better. What I mean is the .fx files for Gears are probably all written for DirectX 9, and as far as I can see from the screenshots there's not much utilization of DirectX 10 except perhaps some of the light shaders.

If Gears were written in DirectX 10, there should be a substantial performance boost over DirectX 9, as DirectX 10 can handle god knows how many more instructions.
Ah I know what you mean now, and yeah I agree.
 
[H] are ridiculous. As said, oranges to apples is fine, trying to find the max "playable settings" is FINE, in fact it's GOOD, but you can't omit apples to apples as well. On top of that, they're flat out talking crap.

For instance, high texture detail vs medium texture detail in MOST games makes minimal performance impact as long as the textures fit in memory. Running the 2900 XT with medium gives the IMPRESSION it can't do HIGH, when in reality it can probably do it with the SAME numbers; the 320MB GTS would be another question though. But it's simple: they need to do a basic chart showing the numbers for all cards on the same settings. [H] have been ******** basically since they switched to this beyond twisted way of benchmarking.
ATI have also been fairly notorious for better texture compression for a few generations, so higher textures are very likely to give next to no impact, up to a point. The jump to 1600x1200 also takes only a small hit on 2900 XT cards: it has a hard time with max framerates as its 16 ROPs are rather on the low side, but maintaining that same performance at higher resolutions is still rather easy for it. Loads of games show a fairly small resolution impact.
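
To put rough numbers on the "as long as it fits in memory" point, here's a quick sketch of what a single mip-mapped texture costs under common formats (4 bits/texel for DXT1, 8 bits/texel for DXT5, 32 bits uncompressed, plus about a third for the mip chain; generic figures, not Gears-specific ones):

Code:
# Per-texture memory for a mip-mapped square texture under different formats.
def tex_mb(size, bytes_per_texel):
    return size * size * bytes_per_texel * 4 / 3 / (1024 ** 2)  # +~33% for mips

for size in (1024, 2048):
    print("%d^2: RGBA8 %.1f MB, DXT5 %.1f MB, DXT1 %.2f MB"
          % (size, tex_mb(size, 4), tex_mb(size, 1), tex_mb(size, 0.5)))

Compressed, even the jump from 1024^2 to 2048^2 is a few MB per texture rather than tens of MB, so bumping the detail slider mostly costs VRAM rather than shader time; the framerate only tanks once the working set no longer fits and the card starts swapping over the bus.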


Crysis, for instance, shows almost no performance impact if I either run medium settings and up only the texture detail, or run high settings and drop to medium textures; there's virtually no performance difference in either situation.

But of course Gears would run better on a PC even at higher res. Only fools think the 360/PS3 are up to date hardware-wise.
 
Ah I know what you mean now, and yeah I agree.

DX10 is actually doing quite a lot that you can plainly see without resorting to looking in close. However, it's still not going to be vastly different to the DX9 version of the game. How a game looks is more down to the quality of textures, time put into drawing characters and so on. I mean, Hellgate looks like utter crap because they simply can't get a complex texture system working with their stupid random map generator, and because that took so long to make (with lots of bugs in the final part of the beta still) they spent time on that rather than drawing things to a better level of detail.

Textures and colouring, the basics, have the biggest impact. Hellgate with the best lighting in the world or everything turned down doesn't look that different, because the things you see the most, all the buildings and characters, look like crap even on the highest setting. Put good lighting on a crap texture and it's still a crap texture.

The little things, though: in World in Conflict there are quite a lot of differences, better detail on houses, detail drawn much further into the distance. There were shadows on helicopters, like under the "wing" of the helicopter, all of which were absent in DX9. The craters were better, with a wider area of effect as well, but a lot of that is down to a little extra overhead and a few not "better effects" but easier-to-use optimisations to add more stuff in.

But DX10 does improve smoke a lot, and other clipping issues. In BioShock and lots of games already you can see things like clouds of smoke clipping heavily with surfaces under DX9, but looking far better and much more realistic on DX10. In the end, though, it's only the edge of a bit of smoke you spin past for a split second.

It gives a little more realism, and in general makes it easier for designers to just do that little bit more. Textures, and time spent designing and drawing better walls, characters and stuff, are still far and away the biggest things that impact the overall image quality. But those things aren't DX9 either; that's just basic design.
 
The GT is a substitute for the GTS line; it's not intended to compete with the GTX, it just happens to.

The reason a GT suffers compared to a GTX with 4xAA on is that it has 512MB of RAM as opposed to 768MB.

In the demo you'll probs find an 8800 GT can use AA, just like being able to use 16xAA in UT3, so at least 4xAA should be possible in GoW unless its graphics are miles better than UT3's.
 
What settings would be equivalent to the Xbox version?
At a guess: 1280x720 or 1920x1080 (or SD, all depending on your TV) with 2xAA and 0xAF, not forgetting that the PC version of Gears of War has had a bit of a facelift in regard to textures and effects anyway.

So, to make an understatement, playing in 1600x1200/1680x1050 with 4xAA/16xAF is going to be far superior.

[H] are ridiculous. As said, oranges to apples is fine, trying to find the max "playable settings" is FINE, in fact it's GOOD, but you can't omit apples to apples as well. On top of that, they're flat out talking crap.
They include an apples-to-apples page most of the time these days. I'm not sure why they didn't do it for this particular review, but it won't be long until sites like Anandtech do their own. :)

Don't get me wrong, I'm not cheerleading for [H]; in fact I rarely read them these days because Anandtech is the site for me. I just think there are pros and cons to both methods really.
 
In the demo you'll probs find an 8800 GT can use AA, just like being able to use 16xAA in UT3, so at least 4xAA should be possible in GoW unless its graphics are miles better than UT3's.
Yeah, but doesn't 4xMSAA have pretty much the same performance impact as 16xCSAA?

Unless you mean 16xQ, which I don't think any card could run UT3 with. :eek:
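
Pretty much, yeah. As I understand it, on G80 the CSAA modes only store 4 full colour/depth samples (8 for the Q modes) and pad the rest out with cheap coverage samples, which is why 16x CSAA lands close to 4x MSAA in cost. A rough per-pixel storage sketch, assuming 32-bit colour, 32-bit depth and roughly a bit per coverage sample (a simplification):

Code:
# Rough per-pixel storage for NVIDIA AA modes on G80-class cards.
# Stored samples carry full colour+depth; coverage samples are only ~1 bit each.
modes = {
    "4x MSAA":  (4, 4),    # (stored colour/depth samples, coverage samples)
    "8x CSAA":  (4, 8),
    "16x CSAA": (4, 16),
    "8xQ":      (8, 8),
    "16xQ":     (8, 16),
}
for name, (stored, coverage) in modes.items():
    bytes_per_pixel = stored * (4 + 4) + coverage / 8
    print("%-8s ~%.0f bytes/pixel" % (name, bytes_per_pixel))

So 16x CSAA is within a couple of bytes per pixel of plain 4x MSAA, while the Q modes roughly double it, which lines up with 16x being usable in UT3 while 16xQ isn't.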
 