What would I need to max Metro at 2560x1440?

GTX 570s in SLI won't do it if you want a pretty consistent 35fps and any form of AA, and that's probably still with DOF and tessellation off.

Metro 2033 is horribly coded. Even with my 4.4GHz i7 and my GTX 470s in SLI running at 840 core, the benchmark at 1920 x 1200 with 4xAA, DOF and tessellation drops to 9fps in places.

So I reckon tri GTX 580 might just cut the mustard, but even then you may have to clock back from the max settings.

Interestingly enough, with DOF and tessellation off the benchmark flies, and I can average 70 to 90fps with peaks of 400+ fps.
 
Metro 2033 is not a great game at all & Crysis Warhead still looks better.

Metro 2033 flatters to deceive, but play it for a while and you realise it's just a slightly better-looking STALKER with the same old Russian translation issues and boring gameplay. It's certainly not worth upgrading your system to play at that res either!
 
Metro 2033 is horribly coded. Even with my 4.4GHz i7 and my GTX 470s in SLI running at 840 core, the benchmark at 1920 x 1200 with 4xAA, DOF and tessellation drops to 9fps in places.

That'll be the VRAM rather than the coding. It still makes me laugh how people claim to know if a game is badly coded. Some continue to say this about Crysis, a game from 2007 that still looks better than anything launched since.

PC gamers moan about games that don't make use of the latest tech, then moan when they do. Make your minds up.
 
Yeah, Metro is a poor corridor shooter with poor performance for what it offers. Now if it were a Crysis-style free-roaming shooter with the graphics it has, then I could understand the drop in performance, but it's a corridor shooter FFS.
 
That'll be the VRAM rather than the coding. It still makes me laugh how people claim to know if a game is badly coded. Some continue to say this about Crysis, a game from 2007 that still looks better than anything launched since.

PC gamers moan about games that don't make use of the latest tech, then moan when they do. Make your minds up.

I'm sorry, but you shouldn't be running out of VRAM at 1920 x 1200. And how can it be VRAM when the game runs between 30 and 420 fps with no DOF/AA/tessellation? I doubt it's a VRAM limit in that situation.

The point is that certain parts of Metro 2033 can alter your framerate by a factor of 10. I have never seen a game fluctuate as much as that.

So all that happens if you turn everything on max is that you get a range of 9 to 90 fps with an average of 45fps at 1920 x 1200.

That average doesn't make the game playable as when it dips to 9fps it annoys the hell out of you.

In fact even with dof and tessellation off, it's still annoying to be running between 30 and 400 fps despite a nice 90fps average.

At least Crysis is pretty consistent with its framerates.
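To put some numbers on why the average hides the problem, here's a rough sketch of the kind of summary I mean. It assumes you've captured per-frame times in milliseconds to a plain text file, one value per line; the file name and format are just placeholders (FRAPS and similar tools can log something along these lines).

# Rough sketch: summarise a frame-time log to show why "average fps" hides the dips.
# Assumes frametimes.txt holds one frame time in milliseconds per line (name and format are placeholders).

def summarise(path="frametimes.txt"):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / ms for ms in frame_ms]
    fps_sorted = sorted(fps)
    avg = sum(fps) / len(fps)
    low_1pct = fps_sorted[max(0, len(fps_sorted) // 100 - 1)]  # roughly the 1% low
    print(f"min {min(fps):.1f} / avg {avg:.1f} / max {max(fps):.1f} fps")
    print(f"1% low: {low_1pct:.1f} fps")  # a 45fps average can still sit on top of 9fps dips

if __name__ == "__main__":
    summarise()

Run it over a frame-time log from a benchmark pass and you get a healthy-looking average next to the dips that are what actually make it feel unplayable.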
 
At that resolution he's better off with the 69xx series (even though nvidia likes the game better):
HardOCP like to review at that res:
2560x1600 AAA - 16X AF Global quality = very high (DoF disabled)
GTX 580 (Min/Max/Avg FPS) - 23, 49, 35.3
HD 6970 (Min/Max/Avg FPS) - 21, 48, 33.6

Turning the settings down
2560x1600 No AA - 16X AF Global quality = high
GTX 570 (Min/Max/Avg FPS) - 23, 49, 34.7
HD 6950 (Min/Max/Avg FPS) - 21, 48, 34.4

So the GTX 570 is more comparable with the HD 6950 at this res. Neither the 580 nor the 6970 can max out settings, so I imagine your best choice would be to crossfire the 6950 or 6970.
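Since it's the minimums rather than the averages that decide whether "maxed out" is actually playable, here's a quick sketch that checks the figures quoted above against a 30fps floor (the numbers are the ones from the review; the 30fps threshold is my own arbitrary pick):

# Quick check of the quoted min/max/avg figures against a playability floor.
# Figures are the review numbers quoted above; the 30fps floor is an arbitrary choice.

results = {
    # card: (min_fps, max_fps, avg_fps) at 2560x1600
    "GTX 580 (very high, AAA)": (23, 49, 35.3),
    "HD 6970 (very high, AAA)": (21, 48, 33.6),
    "GTX 570 (high, no AA)": (23, 49, 34.7),
    "HD 6950 (high, no AA)": (21, 48, 34.4),
}

FLOOR = 30  # fps

for card, (lo, hi, avg) in results.items():
    verdict = "holds up" if lo >= FLOOR else f"drops {FLOOR - lo}fps below the floor"
    print(f"{card}: min {lo}, avg {avg} -> {verdict}")

Every card in that list dips into the low 20s, which is really why the conclusion ends up being a dual-card setup rather than any single GPU.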
 
TBF I don't think the Metro benchmark is indicative of in-game performance anyway; it seems like it's there to destroy your hardware ;). I can play the game on much higher settings than I can efficiently run the benchmark at.
 
I played it with my system in my sig, at 1920x1200 with everything maxed but MSAA off and DOF off. For about 40% of the game I had DOF on, and it was still playable, but I turned it off for the extra frames and because it didn't really make that much of a visible difference.

Good-looking game, especially on the surface!
 
So the GTX 570 is more comparable with the HD 6950 at this res. Neither the 580 nor the 6970 can max out settings, so I imagine your best choice would be to crossfire the 6950 or 6970.

Are you sure a GTX 580 can max out settings? (Ah, I'm referring to 1920x1200 here...)
 
Are you sure a GTX 580 can max out settings? (Ah, I'm referring to 1920x1200 here...)
I wouldn't have the faintest, I'm just quoting results from a review site. Nvidia does well with Metro, but it looks like AMD is better (pound for pound) with Metro at the resolutions the OP wants, and probably even more so if he goes dual card (although I haven't checked that).

EDIT - OK, I've just checked the AnAnd review.
For this game, at this res, according to these reviews (that's the caveats out of the way, I don't want to start a war), AMD is the better choice.
Single card, the 580 is just ahead of the 6970, which is just ahead of the 570; price-wise the 6970 wins. Dual card, the 6970 wins easily, with the 6950 second (must be a memory thing, as the 580 does better once the resolution is reduced).

So, why is the slower setup, with no anti-aliasing, getting 60% more FPS?

I find this annoying as I use these benchmarks to decide what card I'll want if I want to max out all the games I will be playing now and in the future.
Indeed. Reviews and benchmark comparisons are about all we have to go on to choose a card; we can hardly test them all ourselves, so it's pretty irritating when the reviews are that far out. Anyway, if you really want to run at 2560 x 1440 (with the best settings you can), it looks like you should go AMD.
 
The benchmarks I posted in the 580GTX thread are a fair indication of what to expect. I've found the game very playable at the settings I posted there; no DOF & no PhysX really help. I personally don't think the game would be very playable at 2560x1440 without turning eye candy off; you're talking console fps at best.

The built-in benchmark is: ("C:\Program Files (x86)\Steam\steamapps\common\metro 2033\metro2033benchmark.exe")

(Benchmark result screenshots: metro.png and metro900.png.)
That's with GTX 580 SLI @ 900/1800/2200/1125mV.
 
I run Metro 2033 at max settings @ 1680x1050 with an overclocked GTX 580 and there is no lag, so I would think two overclocked GTX 580s would be able to do the OP's res.
 
Yet it does.

Perhaps they should have just done a straight console port? Then again that would have people moaning about it not making use of modern cards.

Really? I mean with no DOF, no tessellation and no AA, the game is using more than the 1280MB on the GTX 470?

What's a good program to check VRAM usage? I'll have a look tonight.

And even the 2GB cards show the same drop in framerates at 1920 x 1200. Are you seriously saying Metro was written to use 3 or 4GB of VRAM with AA off? The graphics are nice, but they are not that nice, especially compared to Crysis/STALKER etc.

If that is true, it's still bad coding IMO, as there's no way it should use that much VRAM.
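On the "what's a good program" question: GPU-Z or MSI Afterburner will show VRAM usage for you. If you'd rather log it yourself on an NVIDIA card, something like this rough sketch does the job by polling nvidia-smi (which ships with the driver), assuming your driver's nvidia-smi supports the memory query; the interval and output file name are arbitrary.

# Rough sketch: poll nvidia-smi and log VRAM usage while the benchmark runs.
# Assumes nvidia-smi is on the PATH and the driver supports the memory.used query;
# the poll interval and log file name are arbitrary placeholders.
import subprocess
import time

def log_vram(outfile="vram_log.csv", interval_s=1.0):
    with open(outfile, "w") as log:
        log.write("seconds,memory_used\n")
        start = time.time()
        while True:  # stop with Ctrl+C when the run is done
            used = subprocess.run(
                ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
                capture_output=True, text=True,
            ).stdout.strip()
            log.write(f"{time.time() - start:.0f},{used}\n")
            log.flush()
            time.sleep(interval_s)

if __name__ == "__main__":
    log_vram()

If the logged figure sits pinned at the card's limit whenever the framerate tanks, that points at VRAM; if it doesn't, the bad-coding theory looks a lot more plausible.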
 
I run Metro 2033 at max settings @ 1680x1050 with an overclocked GTX 580 and there is no lag, so I would think two overclocked GTX 580s would be able to do the OP's res.
That seems like a reasonable assumption, but according to the reviews it doesn't turn out that way.
From the AnAnd review:
"CF/SLI makes things all the more interesting, as any kind of parity the GTX 400/500 series has goes right out the window at 2560. AMD simply outscales NVIDIA here, leading to the 6970 CF surpassing the mighty 580 SLI by 30%"
And the 6970 is a lot cheaper. So would you recommend the OP buys two 580s?
 