1920x1200 - best GFX?

Before I start, let me say that I have previously used Crossfire to great effect; however, I tried HD6970 Crossfire and it didn't quite work out.

So what I am after is a decent card that will last me ~18 months, present no memory-fill issues, and still allow all the top games to be played at 1920x1200 without any discernible lag or stutter.

Ideally, I would like a card which is quiet-ish (having 2 x HD6970s blaring at 65% fan speed is not fun) but will fit in my HAF-X case without issue.

I don't mean this in a bad way, but price is no issue. I am also put off the GTX590 due to heat, noise and sparky incidents, and likewise the HD6990 due to noise and heat.

Any decent contenders?

I was considering the MSI Lightning N580, but that may present memory-fill issues due to it only having 1.5GB?

Thanks in advance for your suggestions :)
 
I'm pretty confident that a single 6970 will last 18 months fine at 1920x1200. You may not be able to play games in a year on ultra ultra high, but you should easily hit good frame-rates on high settings.

1.5GB should be plenty for your resolution. At the moment, some games just exceed 1GB at 1920x1200.
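
As a rough sanity check (my own made-up numbers, not measurements), the render targets themselves are tiny next to 1.5GB; it's textures and the rest of the game's data that fill the memory:

Code:
# Back-of-envelope framebuffer estimate for 1920x1200 (assumed figures,
# not measured - textures dominate real VRAM usage, not render targets).
width, height = 1920, 1200
bytes_per_pixel = 4                  # 32-bit colour
msaa = 4                             # 4xAA multiplies colour/depth storage

colour = width * height * bytes_per_pixel * msaa
depth = width * height * 4 * msaa    # 32-bit depth/stencil
total_mb = (colour + depth) / 1024 ** 2
print("~%.0f MB for colour + depth at 4xAA" % total_mb)   # ~70 MB

So even with 4xAA the buffers only come to ~70MB; it's texture resolution that decides whether 1.5GB gets filled.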
 
I agree.
 
If money is no object and you're concerned about available graphics RAM, what about the Gainward GTX 580 Phantom? Dead quiet, looks stunning and has 3GB RAM.

But to be honest just one of those 6970s you had should do you fine for the next 18 months as joxang said.
 
and still allow all the top games to be played at 1920x1200 res without any discernible lag or stutter

I can confirm that 580 SLI already gets killed:

[Screenshot: Metro 2033 benchmark results on 580 SLI]


You probably need 2 x 580 3GB versions.
 
Let's be honest, Metro is the new Crysis or Doom 3, in that its maximum quality settings required hardware that didn't realistically exist at the time it was released.

Yes, buying something that can max out Metro will pretty much guarantee that everything else you throw at your GPU will run perfectly, but you'd have to question why, even if you have an unlimited budget.

Don't get me wrong, we all want to play our games with maximum eye candy, but sometimes the differences you'd get are minimal and hardly worth the added investment in order to say "yes, I can do it".

In an unlimited budget scenario, I'd say run with a Radeon 6990 (or 2) and watercool the things - the only real downside to that card is the poor cooling system, something that can be fixed with new coolers (when they arrive) or water.

But that's hardly realistic.
 
LOL, I understand what you mean; it's just that I'm proving to the OP that his requirement is not easy to satisfy :D

Lots of people were saying that Crysis was a cr@p game with **** optimization in 2008, yet today it has been awarded many titles like "best image quality". We can probably expect it to take 28nm graphics cards to deal with Metro 2033.
 
Funny thing is, it's still a crap game at 60 fps on max settings.
 
I can confirm that 580 SLI already gets killed:

[Screenshot: Metro 2033 benchmark results on 580 SLI]

You probably need 2 x 580 3GB versions.
No offense, but where does it imply that a lack of VRAM is what stops it getting the results it should? Looking at the settings, it could well be that PhysX is enabled without a dedicated card (it's known that having the same card do BOTH graphics rendering and PhysX at the same time can cause a performance hit and even stuttering), and that is what causes the lower frame rate. PhysX aside, DOF and tessellation are enabled as well, so what makes you so sure the low frame rate is caused by running out of VRAM? I have yet to see evidence of Metro 2033 using more than 1536MB of VRAM...
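
To put some made-up numbers on that (purely illustrative, not measured from Metro):

Code:
# Hypothetical frame budget: if one GPU has to do rendering AND PhysX,
# the per-frame costs add up (the millisecond figures are made up).
render_ms, physx_ms = 16.0, 9.0

fps_dedicated = 1000 / render_ms               # PhysX offloaded to a second card
fps_shared = 1000 / (render_ms + physx_ms)     # same card does both
print("dedicated: %.0f fps, shared: %.0f fps" % (fps_dedicated, fps_shared))
# dedicated: 62 fps, shared: 40 fps

That kind of drop looks exactly like the benchmark struggling, without VRAM ever being the problem.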

And yeah... like sldsmkd said, Metro 2033 is hardly a 'top game'; in fact it is far from it. It's only famous for being the most demanding game around, and I don't see the point of using it as representative of all future releases.
 

No offense taken :) As long as Metro 2033 is popular across all sorts of graphics card reviews and demoralizes every 40nm graphics card, I won't much care how people criticise it, just like how infamous Crysis used to be.

Various factors can impact the min fps, e.g. the algorithm used to calculate it. Not even 2 x 580 3GB gets a much better min fps in this benchmark, and the benchmark doesn't necessarily use more than 1.5GB of VRAM either. However, a VRAM shortage is already evident in these screenshots at 1080p, and if the player rotates the camera quickly he can capture a much lower fps:

[Screenshots 1-6: Metro 2033 at 1080p showing VRAM usage and frame rates]
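
And on the min fps algorithm point, here's a toy example (made-up frame times) of why the reported 'min' depends on how it's calculated:

Code:
# Made-up frame times in ms; one 90 ms spike among ~16-18 ms frames.
frame_times = [16, 17, 16, 90, 17, 16, 18, 16, 17, 16]

abs_min_fps = 1000 / max(frame_times)          # single worst frame
avg_fps = 1000 * len(frame_times) / sum(frame_times)

# A percentile-based "min" ignores the one-off 90 ms spike
p95 = sorted(frame_times)[int(0.95 * (len(frame_times) - 1))]
p95_min_fps = 1000 / p95

print("abs min: %.0f fps, 95th-percentile min: %.0f fps, avg: %.0f fps"
      % (abs_min_fps, p95_min_fps, avg_fps))
# abs min: 11 fps, 95th-percentile min: 56 fps, avg: 42 fps

Two benchmarks can report very different 'min fps' figures from the same run depending on which of those they use.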
 
I'm running at that resolution and not really having many problems, though admittedly I don't have Metro or DA2. When I do eventually get DA2 I'll let you know how it is ;).

Ultimately, how well a gfx card copes is highly dependent on what you want it to cope with.
 
Just remember to install the official High Resolution Texture Pack for the PC version of Dragon Age II and use at least 4x AA. My 2 x 5870 CF setup used to lag (min fps < 30) when there were many enemies, and also in some complicated scenes in Lowtown.
 
So in conclusion... R6970 Lightning 2GB? yes/no?

Not even 2 x R6970 Lightning in CrossFireX can guarantee a smooth experience in Metro 2033, though. I guess you'll have to wait for the leap to the 28nm node to have a better chance of a card lasting 18 months. For now you'll be hard pressed to find a card as long-lived as the 5870 or even the 8800 were.
 
Dragon Age II would surely eat more than 1GB of VRAM. It was AMD's game, so 6970 CF used to be a lot better than 580 SLI before nVidia had a patch :)

Really, you don't know what you are talking about here.

DAII was the Xbox 360's game, not ATI's, nor Nvidia's. It had absolutely no optimisation done to allow it to work well on PCs; this had nothing to do with either ATI or Nvidia.

Not even 2 x R6970 Lightning in CrossFireX can guarantee a smooth experience in Metro 2033, though. I guess you'll have to wait for the leap to the 28nm node to have a better chance of a card lasting 18 months. For now you'll be hard pressed to find a card as long-lived as the 5870 or even the 8800 were.

If a game is a poorly coded console port like Metro, or simply just badly coded like WoW, a new graphics card won't do anything to magically make the game suddenly run 100% smoothly.
 
Really, you don't know what you are talking about here.

DAII was the Xbox 360's game, not ATI's, nor Nvidia's. It had absolutely no optimisation done to allow it to work well on PCs; this had nothing to do with either ATI or Nvidia.

So according to your theory, how come Dragon Age II was used in AMD's 6990 marketing reviews?

If a game is a poorly coded console port like Metro, or simply just badly coded like WoW, a new graphics card won't do anything to magically make the game suddenly run 100% smoothly.

Then you should only play games like Far Cry 2, which would not be called a poorly coded console port, or maybe even Max Payne. :D I know lots of people used to criticise how crap Crysis's optimization was.
 