Layte's super-duper 480SLI Benchmark-O-rama!

Soldato
Joined
1 Apr 2010
Posts
3,034
Okay, I got my GTX480 yesterday and I'm playing around with it. It seems stable at 825MHz with stock voltages, and it seems to roast if I up the voltage, so I'm going to stick with that for now.

I've used the ASUS overclocking tool, but it's a bit clunky and doesn't let you adjust memory timings, so I'm looking to use RivaTuner. I can't see the usual driver button though... Hopefully I'm just doing something stupid? (see pic below)

The ASUS overvolting tool is nasty. It lasted a day on my rig before I ditched it. Stick to MSI Afterburner. :)
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
Okay, will grab afterburner. Cheers guys :)

From the apps I've had a chance to play around with, the card seems to be a bit of a beast. I'll probably go SLI when I get back to 2560 res. From the benchmarks in this thread and elsewhere, it seems to scale well with SLI.

I'll see if I can dig out an old program I wrote to measure micro-stutter. It would be interesting to see how much of it there is with SLI and CrossFire in this generation. It can make a big difference to how smooth a game feels, even though it never shows up in average-FPS benchmarks.
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
Okay, here is the microstutter program I wrote a while back:

http://www.mediafire.com/?ioxbd5jz7tigbed

You run a FRAPS benchmark, save the frametimes file in the same directory as the .exe file above, and run the program. It will analyse the data and output an index for microstutter, and also the apparent framerate.

The apparent framerate is the framerate taking microstutter into account, i.e. actual_fps / (1 + ms_index/100). If you were running at 50fps and the two GPUs were spitting out their frames at exactly the same time, with a 40ms gap between each pair, the apparent fps would be 25. If they were output with perfectly even spacing, it would be 50.
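To give an idea of the calculation without having to trust a random .exe, here's a rough Python equivalent. To be clear, this is a simplified sketch rather than the actual Fortran: the FRAPS column layout and the exact "local variation" measure are assumptions on my part, but it does reproduce the 50fps -> 25fps example above.

Code:
# Rough sketch only - NOT the actual program. Assumes the FRAPS 'frametimes'
# log is "frame number, cumulative time in ms" with one header row.
import csv

def microstutter(frametimes_csv="frametimes.csv"):
    with open(frametimes_csv, newline="") as f:
        stamps = [float(row[1]) for row in list(csv.reader(f))[1:] if len(row) >= 2]
    frametimes = [b - a for a, b in zip(stamps, stamps[1:])]   # per-frame times (ms)

    mean_ft = sum(frametimes) / len(frametimes)                # global average frametime
    actual_fps = 1000.0 / mean_ft

    # Local variation: deviation of each pair of consecutive frames from their
    # own average, i.e. half the absolute difference between neighbours.
    local_var = sum(abs(b - a) / 2.0 for a, b in zip(frametimes, frametimes[1:]))
    local_var /= max(1, len(frametimes) - 1)

    ms_index = 100.0 * local_var / mean_ft                     # microstutter index (%)
    apparent_fps = actual_fps / (1.0 + ms_index / 100.0)
    return actual_fps, ms_index, apparent_fps

if __name__ == "__main__":
    fps, idx, app = microstutter()
    print("Average framerate:  %.3f fps" % fps)
    print("Microstutter index: %.3f %%" % idx)
    print("Apparent framerate: %.3f fps" % app)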

It would be interesting to see some CrossFire vs SLI results, particularly as a few people are reporting better gaming experiences with a single GTX480 than with CrossFire 5870s.
 
Permabanned
Joined
22 Feb 2010
Posts
1,629
OK, quad-fire up and running.

Only at 900/1200 because I don't want the air-cooled one to blow up.

35918 on Vantage, with a 44k GPU score.
I required my 5970 & 5870 at 1050/1300 to get 43k :? lol
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
Well, I've got my GTX480 SLI setup working now. I have to say, it's really a bit of a beast. I find it hard to give it enough load to max out both GPUs at 1920*1200. Even with Crysis at 1920 with 8xAA I wasn't able to push above 85-95% GPU load!

Anyway... As you probably figured from my previous posts, one of my interests is in microstutter. For some reason people don't seem to think it's an issue, but that's mainly because a) they don't understand what it is, and b) you can't "just see it" from looking at a game.

Anyway, I did a few tests with the program I wrote to examine the GTX480s:

First, the good news: under less than 100% GPU load there is very little microstutter. For example, Crysis at 1920*1200 with 8xQAA:

[screenshot: microstutter program output - Crysis, 1920x1200, 8xQAA, SLI]

Pushing up to 2560*1600 with 4xAA, we see a bit more microstutter creeping in. Now, I have to say, at full GPU load with my old 4870X2 I was generally seeing a microstutter index of over 20, and sometimes 30+, so in that regard this is an improvement:

[screenshot: microstutter program output - Crysis, 2560x1600, 4xAA, SLI]

As a comparison, I ran the same test with SLI disabled. At first I got less than 50% of the SLI framerate, which was a bit of a :confused: moment, as that would mean more-than-100% SLI scaling, which should be impossible. It turns out that Afterburner resets to default clocks when you disable SLI (even though the GUI still shows the overclock). Anyway, after fixing that I get:

[screenshot: microstutter program output - Crysis, 2560x1600, 4xAA, single GPU]

... So no microstutter (as expected from a single GPU). The small remaining variation is just down to normal game-scene changes.


Now, I know that the microstutter index is a little hard to visualise unless you're used to dealing with statistics, so I plotted the instantaneous framerate variation for the single and dual-GPU setups. I chose a selection of 35 frames to highlight the effect of microstutter. Not all of the SLI game scene is this severe, but you get the picture:

[plot: instantaneous framerate per frame, single GPU vs GTX480 SLI, 35-frame sample]

Anyway, there you have it. Microstutter is alive and well with the GTX480s - if you can manage to push them to their limits, that is! As always, a single GPU will give you a smoother gameplay experience at a given framerate, but adding a second GPU still gives better overall results than a single GPU.

I would really like to see some results from multi-GPU ATI users. I'm not interested for any fanboy gloating type reasons, or even purchase justification - I'd just like to see how the two different technologies handle things. I suspect that they will be very similar.
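If anyone wants to make the same kind of plot from their own framelog, something along these lines will do it. Again, just a rough Python/matplotlib sketch: the file names and the 35-frame window are placeholders, and the FRAPS column layout is assumed.

Code:
# Instantaneous framerate per frame from FRAPS frametimes logs
# (assumed "frame, cumulative ms" layout). File names are placeholders.
import csv
import matplotlib.pyplot as plt

def instantaneous_fps(frametimes_csv):
    with open(frametimes_csv, newline="") as f:
        stamps = [float(row[1]) for row in list(csv.reader(f))[1:] if len(row) >= 2]
    # Instantaneous framerate for each frame = 1000 / frametime in ms.
    return [1000.0 / (b - a) for a, b in zip(stamps, stamps[1:]) if b > a]

single = instantaneous_fps("frametimes_single.csv")
sli = instantaneous_fps("frametimes_sli.csv")

window = slice(200, 235)        # arbitrary 35-frame window
plt.plot(single[window], label="single GPU")
plt.plot(sli[window], label="SLI")
plt.xlabel("frame number")
plt.ylabel("instantaneous fps")
plt.legend()
plt.show()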
 
Permabanned
Joined
1 Aug 2010
Posts
592
Can someone with a 5970 do this microstutter bench please? It would be very much appreciated. I was set on it till I read a bit about microstutter, and I'm worried that it's going to be really noticeable.
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester

Well, I posted about this on hardforums, and someone posted a framelog from 3DMark Vantage for me to test, coming from a 5870 CrossFire setup. Some parts of the log were absolutely shocking (see below):

[plot: instantaneous framerate vs frame number, 5870 CrossFire Vantage framelog]

[I should mention, this is plotting instantaneous framerate (y-axis) against frame number (x) for a small sub-sample of the frametimes. This behaviour cropped up over around half of the benchmark, excluding the areas which were clearly loading screens (fps > 3000).]


I ran Vantage, benchmarking the two main game tests, and didn't see anything so extreme in the logs. I could find only small patches where the variation was as strong as the Crysis example I gave a few posts above.

I'd very much like to have more data though, as this isn't enough to draw any firm conclusions. I'm also going to try and push the GTXs to see if there are any circumstances under which I can replicate this extreme microstutter behaviour.
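On the loading-screen point: if you're processing a log by hand, those frames are easy to strip out before computing anything. A rough sketch, assuming you already have the list of per-frame times in milliseconds:

Code:
# Drop frames faster than a cutoff (e.g. 3000fps = frametimes under ~0.33ms),
# since those are loading screens rather than real rendering.
def drop_loading_screens(frametimes, fps_cutoff=3000.0):
    min_ms = 1000.0 / fps_cutoff
    return [ft for ft in frametimes if ft >= min_ms]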
 
Soldato
Joined
24 Mar 2008
Posts
4,654
Location
High Wycombe
Very interesting stuff Duff-man. I would like to do this test for you, but I'm not sure I will get time to get it all done any time soon. I will see what I can do though; I'd be very interested in seeing the results compared :)

EDIT: I can only go up to 1920x1200 res, but it would be interesting to test that and compare - and, as Raven said, the Heaven benchmark?
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
When I get home, I will run the Heaven benchmark :)

Pauly, to get a solid comparison, shall we say 1920*1200, 4xAA, 16xAF?

We can say one run on "normal" tessellation (so as not to limit the ATI cards in that regard), and one run on extreme tessellation. Start the FRAPS benchmark (F11) as you start benchmarking Heaven, and run for 60s. Don't worry about rushing it - any time you get the chance would be appreciated :)

By the way, once you have FRAPS installed, it's really quick to get the microstutter index. Just select the 'frametimes' tick-box in FRAPS, select "limit to 60s" for consistency, run the benchmark, then stick the framelog into the folder of the microstutter program. Rename it to "input", and run the executable. It should compute in a fraction of a second.

Alternatively just post the framelog from FRAPS and I can do it for you.
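For reference, the frametimes log FRAPS writes should look roughly like this (from memory - a frame number plus a running total in milliseconds; the exact header text may vary between FRAPS versions):

Code:
Frame, Time (ms)
    1,      0.000
    2,     16.683
    3,     33.518
    4,     49.902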


...BTW I only have a 1920*1200 screen at the moment, but the Crysis benchmark is a little unique: it lets you render at a higher resolution, with your monitor downscaling it. That makes reading the green in-game stats a little difficult, but it's very handy for assessing what performance would be like on a 30" screen!
 
Soldato
Joined
24 Mar 2008
Posts
4,654
Location
High Wycombe
Yep, sounds good to me mate, no problem doing that for you. Should hopefully get time tonight, though I really should be packing up my room! :)

Know what you mean about Crysis being able to do that - it's the only benchmark I've found that's able to do it.

Just saw your post over on XS saying much the same, and Oztopher has replied with some results. I actually linked him to your thread here to do some testing for me as well (friend of mine), as I knew he had 4890s. Interestingly, I get the same issue he gets with DiRT2, where I need to enable vsync to make it smooth; otherwise it can seem a little "jittery" with it off. He just sent me a message on MSN saying he gets a 2% microstutter index when running it with vsync enabled.

EDIT: I think it may be best to start a new thread on this to stop this one going too far off topic :)
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
Okay. When I have a few more data points I will make a new thread :) In my experience, making a thread that implies people's hardware isn't performing as well as they think is a recipe for getting flamed on this forum. If I can gather enough evidence, I might not get roasted quite so badly (wishful thinking?).


And yes - with vsync enabled you should see virtually no microstutter, as long as your framerate stays above your refresh rate (60fps here). The frames should come out at very regular intervals, which is what's being shown by his 2% value.
 
Soldato
Joined
24 Mar 2008
Posts
4,654
Location
High Wycombe
Ha, I know what you mean! I find it more interesting from the testing viewpoint, and it will be good to compare and see how it goes. For myself, I have only had DiRT2 cause that problem; generally, for all other games, I run with vsync off and I don't notice any issues.

Will try and get them done tonight for you mate, as I'm quite intrigued myself.
 
Associate
Joined
13 Nov 2007
Posts
1,483
Microstutter!!!!

Unigine Heaven 2.1, default settings - 5970 + 5870 @ 900/1250, res = 1920x1080
==========================================

Number of frames = 5686
Processing data...



**************************************************************************

Global Average Frametime (ms): 10.552
... ie Average Framerate (fps): 94.771

Average Local Frametime Variation (ms): 4.028


*** Average Microstutter Index (%) ***: 38.210

Apparent Framerate is: 68.570

**************************************************************************



Fortran Pause - Enter command<CR> or <CR> to continue.

Unigine Heaven 2.1, default settings - 5970 only @ 900/1250, res = 1920x1080. Tried a few runs; got one where the index was 42 :)
==========================================

Number of frames = 3827
Processing data...



**************************************************************************

Global Average Frametime (ms): 15.681
... ie Average Framerate (fps): 63.772

Average Local Frametime Variation (ms): 7.703


*** Average Microstutter Index (%) ***: 48.818

Apparent Framerate is: 42.852

**************************************************************************



Fortran Pause - Enter command<CR> or <CR> to continue.
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
Thanks for the results, blackninja - much appreciated :)

Well, I think we've found a candidate for "worst-case scenario" as far as microstutter goes. The Heaven benchmark seems to really make the cards sweat (~100% load), and microstutter like crazy while they're at it. I see some pretty heavy microstutter with the GTX480s, although not quite as much as blackninja is getting with the tri-fire r800 setup. But I suppose it's to be expected that a 3-GPU setup will suffer more from microstutter than a two-GPU setup.

It was also suggested to me that setting the "render max frames ahead" to 5, up from the default 3, might help to improve things. So I tried that as well.


Running at 1920*1200, with 4xAA, 16xAF, and starting a 60s FRAPS benchmark right as I start the benchmark in Heaven, I get the following:

Normal tessellation, render max 3 frames ahead:

[results screenshot]

Extreme tessellation, render max 3 frames ahead:

[results screenshot]

Choosing instead to render 5 frames ahead, I see an improvement, albeit very slight:

Normal tessellation:

[results screenshot]

Extreme tessellation:

[results screenshot]

I have to say, as I watched the benchmark unfold I KNEW the value was going to be high. I mean, it seemed pretty smooth overall, but for me 60fps is usually rock-solid (by eye anyway). In this, I saw the framerate counter at ~74-75, and I could still tell that it wasn't perfectly smooth. As objects passed by they seemed just a little jerky. Looking more closely at the frametimes file, there is some pretty poor behaviour in there. Not quite as bad as the plot I posted above from the 5870 CrossFire Vantage run, but still pretty disappointing.


I think it's fair to say that multi-GPU setups are not offering quite the real-world performance increases that their "near-100% scaling" benchmarks would suggest. Perhaps if we can get enough data together I can write something up to send to a few review sites, and just maybe get them to press nvidia / ATI for comment...
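Just to put a rough number on that (purely made-up figures): if an SLI setup averages 90fps with a 40% microstutter index, the apparent framerate is 90 / 1.4 ≈ 64fps. If a single card manages 55fps with essentially no microstutter, the "felt" improvement is about 64 / 55 ≈ 1.16x, even though the raw averages suggest 90 / 55 ≈ 1.64x scaling.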
 