
Latest single GPUs just don't cut it (will they ever?)

Wow, 2560x1440!

There was a time when 720p was unthinkable! I appreciate the love for the big monitors, but those who buy them will find that they're on the absolute bleeding edge of what their hardware can deliver. Some games were not even designed with a PC in mind. Fewer still have been optimized for 1440p. It's just the way of the world.

I personally prefer a smaller screen with a reasonably fine dpi. Right now, 23 inches at 1080p is the current sweet spot for gaming imo; the right balance between visual quality, hardware performance and price. But I appreciate that once you get used to higher resolutions, it's hard to go back!
 
Greater AA levels definitely reduce frames at the higher resolutions.
Each new gen does of course improve the performance at the high end.

I play Skyrim @ 2560x1600; the fps is locked at 60 with all settings on max (FXAA instead of AA)
 
OP, what's your GPU/CPU usage in game?


[Attached screenshot: Capture.png]

Napoleon: Total War seems to struggle getting above 30fps
 
:rolleyes: here we go again, eyes don't use "fps" >.<

Yes, I know that, but if the minimum is 24 (or 26) then we see near-zero difference compared to if it were 70,000,000. As long as the frame rate is consistently high, it won't make a difference.

EDIT: just thought about it. I did a derp. It's Hz that dictates smoothness. Ignore me.
 
Wow, 2560x1440!

There was a time when 720p was unthinkable! I appreciate the love for the big monitors, but those who buy them will find that they're on the absolute bleeding edge of what their hardware can deliver. Some games were not even designed with a PC in mind. Fewer still have been optimized for 1440p. It's just the way of the world.

I personally prefer a smaller screen with a reasonably fine dpi. Right now, 23 inches at 1080p is the current sweet spot for gaming imo; the right balance between visual quality, hardware performance and price. But I appreciate that once you get used to higher resolutions, it's hard to go back!

2560x screens have the finest DPI.

Your 23" 1920x1080: 0.266 mm pixel pitch
20" 1600x1200: 0.255 mm pixel pitch
30" 2560x1600: 0.250 mm pixel pitch
27" 2560x1440: 0.233 mm pixel pitch
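
For anyone who wants to check the figures, pixel pitch is just the physical diagonal divided by the diagonal measured in pixels. A quick Python sketch (the rounding comes out a touch different from my numbers above):

import math

def pixel_pitch_mm(diagonal_in, width_px, height_px):
    # physical diagonal (mm) divided by the diagonal measured in pixels
    return diagonal_in * 25.4 / math.hypot(width_px, height_px)

# the screens compared above
for size, w, h in [(23, 1920, 1080), (20, 1600, 1200), (30, 2560, 1600), (27, 2560, 1440)]:
    print(f'{size}" {w}x{h}: {pixel_pitch_mm(size, w, h):.3f} mm')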
 
Surely only a little, if at all. Yes, a 2500K will give higher frame rates, but where do you draw the line? Surely an i7 is worthy of pairing up with a GPU like that?

Exactly; at the higher res the CPU becomes less of an issue, and you're relying more on the GPU to push all those extra pixels.
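
Just to put a number on it (quick sketch, nothing clever):

# pixels the GPU has to push per frame; CPU-side work (draw calls, game logic)
# stays roughly the same regardless of resolution
base = 1920 * 1080
for name, px in [("1920x1080", 1920 * 1080), ("2560x1440", 2560 * 1440), ("2560x1600", 2560 * 1600)]:
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 1080p)")

2560x1440 is about 1.78x the pixels of 1080p, and 2560x1600 is nearly double, while the CPU's share of the work barely changes.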
 
With all the bull that's going on about who makes the fastest GPU, I thought some in-game benchmarks with Fraps at 2560x1440 were needed. Here is ATI's offering.

The results are with everything maxed, no in-game mods.

i7 920 at 4 GHz
12 GB DDR3 memory at 1600 MHz
XFX 7970 Black Edition 3072 MB, core 1125 MHz, memory 1575 MHz, no voltage mod.

All average results

Crysis 45 fps
BF3 multiplayer (64 player) 43 fps
Skyrim 44 fps
Shift 2 Unleashed 25 fps
Napoleon: Total War 30 fps
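
Fraps gives you the average over the run; if you also log frametimes you can pull out the worst case, which matters more for smoothness than the average. A rough sketch, assuming a simple two-column log of frame number and cumulative time in milliseconds (roughly what a Fraps frametimes CSV looks like, but treat the exact format as an assumption), with a made-up filename:

import csv

def fps_stats(path):
    # columns assumed: frame index, cumulative time in ms (header on the first line)
    with open(path, newline="") as f:
        times_ms = [float(row[1]) for row in list(csv.reader(f))[1:]]
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
    avg_fps = 1000.0 * len(deltas) / (times_ms[-1] - times_ms[0])
    min_fps = 1000.0 / max(deltas)
    return avg_fps, min_fps

# e.g. avg, worst = fps_stats("skyrim_frametimes.csv")   # filename is just an example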

Will a single card ever be powerful enough to run Crysis at max settings? It's a 5-year-old game.
Drop the AA a little and you get a steady 60fps; for modern games it will stay future-proof for a long time.

You would be looking at Trifire for that.
 
Drop down to 1920x1080 in games that are too demanding. Cards will catch up, they always do, but resolutions will increase again, no doubt.

Either that or drop some other settings; many games have settings that the user barely notices but which carry huge performance hits.
 
... having a dual GPU card, or Xfire/SLI, is quite terrible to live with; so many people have problems/issues a lot of the time, when most just want to slot the card in, download the drivers and start playing games....

I strongly disagree with this. I've been running SLI 460's for over a year now with no problem. Multi GPU has a lot going for it, and driver problems are largely a thing of the past.
 
When will these bloody threads end????

If you bought a 7970 and don't like its performance, then poor you for not reading reviews. Drivers will mature and performance will get better, but stop using crappily coded games.
If you haven't bought a 7970 and stand there wagging your finger, grow up. You are entitled to your opinion, but it is not as credible as that of someone who owns one.

Rant, sorry
 
I strongly disagree with this. I've been running SLI 460's for over a year now with no problem. Multi GPU has a lot going for it, and driver problems are largely a thing of the past.
But you are talking about your experience of using GTX 460 SLI. I recall people complaining that the performance of Crossfire drivers for AMD cards has been pants for the past few months.
 
Why are people so desperate for 60fps?

We can only see 24...

100fps.com

I remember testing years ago when I had a CRT, and I could see the difference at around 100fps+. However, I'm not sure if that's more the smoothness or the responsiveness of the game.

Go test it yourself...switch between 30 and 60fps and see if you notice the difference :)
 
Why are people so desperate for 60fps?

We can only see 24...

That myth stems from the fact that films use a 24fps playback rate.

The biggest difference is that film incorporates a very realistic motion blur. Each frame displays a weighted average of what has occurred during the previous 1/24th of a second.

This is very different to the case of video games, where each frame is a static image. Static images must be displayed far more quickly in order to fool the eye into seeing a fluid image. Motion blur added as a post-processing effect in games can improve the perception of smoothness, but it isn't nearly so sophisticated a method as used in film (or pre-rendered CGI), and does not have quite the same impact.

Do you remember a decade or so back? Poor quality CGI in TV shows would often fail to add motion blur, which would lead to pretty jerky-looking imagery. Both the live-action footage and the CGI are displayed at 24fps, but the CGI is presented without motion blur, and so looks far less smooth.
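
You can even fake the film effect in a few lines if you want to see what the averaging does. A toy sketch (NumPy, synthetic frames, nothing like a real renderer): a game frame is one sharp instant, while a film-style frame averages many instants across the 1/24 s exposure, so the moving object gets smeared across the frame.

import numpy as np

def render_instant(t, size=64):
    # stand-in for a renderer: a bright 8x8 square whose position depends on time
    frame = np.zeros((size, size))
    x = int(t * 200) % (size - 8)
    frame[28:36, x:x + 8] = 1.0
    return frame

def film_frame(t0, exposure=1.0 / 24, subframes=16):
    # average many instants across the exposure window -> natural motion blur
    return np.mean([render_instant(t) for t in np.linspace(t0, t0 + exposure, subframes)], axis=0)

game = render_instant(0.5)   # one sharp instant, no blur
film = film_frame(0.5)       # smeared across the exposure
print("pixels lit:", int((game > 0).sum()), "vs", int((film > 0).sum()))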

...Anyway, you can see this for yourself. Run a game at 24fps, then run it at 60fps+. You'll see the difference in smoothness.
 