
To SLI or not?

Yes, I should have mentioned the 6000 series, like I have on other threads. I was simply providing a link to one of the ones he listed. OcUK currently stock it.

Scaremongering is most certainly not what I'm trying to do. With regard to microstutter, in short, the 'skipping' etc. I mentioned is what can happen. I can't explain it like Duffman did (nice post btw Duff), but still, a lack of smoothness is what can occur. Recently I tried running a 4890 in a few games and although there wasn't, in my case, a lot of difference on the whole... it was defo smoother. With one or two games the X2 was much worse, though generally speaking I have had a fairly good ride with Crossfire.

I have nothing against SLI or Crossfire; I'm sure many users would be / have been very happy with SLI'd 460's or otherwise.

I don't know when I'll next be purchasing a new card. Apart from AA in some titles (though I suspect the CPU holds it back very slightly in the AA department - not certain yet - I'd like to hang on to it and try it on the next CPU/mobo to see if I can push more out of it), it flattens most games. But I do know that I will be going back to a single GPU. Well, probably. :p
 
I've gone down the SLI route a couple of times; it's good, but I'd still rather have a single great card than a couple of average ones.
Plenty of horsepower with minimum fuss = more time playing games instead of messing around with SLI/CF profiles, drivers etc.

This is basically why I started this thread, to hear more about what issues I might encounter.

Hey Rhodan,

as you're not buying your graphics solution for a while yet, I think you have time to get some homework done and weigh up all the pros and cons . . . you're in a good place right now to be considering your GPU options

I haven't heard from many GTX 460 SLI owners in this thread which is disappointing considering how many times they are quoted in build specs. I want to hear about your experiences!
 
That's exactly when the 768MB should be bought! . . . right now, and at the same time? . . . . the single 1GB card should be bought now if the user intends to add another 1GB card in, say, 12-18 months' time! . . . tis logical no? ;)

But then you're stuck with 2 cards with a lack of texture memory and whatever SLI issues that may exist.
At least with the single GTX470 you have a solid performing card and the option of adding another.

TBH, I wouldn't buy 2 1gb GTX460's at the same time either... The rule of "buy the best performing single card you can" always applies.
 
Hey Duff-Man :)

Ok this is a bit confusing . . . because you stated this? . . .

I'm saying that there will be an identifiable reduction in performance. I'm not making any comment about how one particular setup will "feel" with one particular game. Besides, when you run at a sufficiently high framerate you're going to get good performance, microstutter or not. The only difference is, reaching any given user's threshold of "good performance" will require a slightly higher framerate with a dual-GPU setup.


To me what is being "suggested" is that there is a shadowy user-experience performance "hit" that is generally off-putting to anyone considering going the bang-for-buck SLI route! . . . what is strange about this is that if what you are suggesting is true, it is a kind of FPS "Stealth Tax" which makes actual performance hard to gauge . .

A very vague and wishy-washy way of putting it, but I suppose that's more or less what I'm saying.

According to a new friendly forum user: While the FPS of 460 SLI is impressive, with multiple GPUs the perceived smoothness doesn't translate into FPS the way it does with a single GPU.
As a rule of thumb, you can take SLI/Xfire FPS and reduce it by 10-30% to get an indication of real-world performance...
#19

Well end-user experiences will always vary. I'm interested in the analysis more than the subjective opinion. The "10-30% reduction" comes from my analysis of microstutter in multi-GPU setups under near-full load. This statement is based on solid analysis, and if you've read the thread on XS and hardforums then you understand where it comes from.

edit: they may have the statistics very slightly wrong though. The microstutter correction formula is: apparent_framerate = real_framerate / (1 + MS), where MS is the microstutter value (between zero and 1). So, a 10% microstutter leads to performance ~91% of the original framerate, and a 30% microstutter leads to performance ~77% of the original framerate. So the reduction would better be described as "between 9% and 23%".
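That correction is simple enough to sketch numerically - a minimal illustration of the quoted formula only, not the actual analysis tool:

```python
def apparent_framerate(real_fps, ms):
    """Apparent framerate under microstutter, per the quoted formula.
    ms is the microstutter value (between 0 and 1)."""
    return real_fps / (1 + ms)

# 10% microstutter -> ~91% of the real framerate;
# 30% microstutter -> ~77% of the real framerate.
for ms in (0.10, 0.30):
    print(f"MS = {ms:.0%}: apparent fps = {apparent_framerate(100, ms):.1f}")
```

This is where the "9% to 23%" correction of the rule of thumb comes from: 1/1.1 ≈ 0.91 and 1/1.3 ≈ 0.77.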

I hope you can understand why this is a problem, Duff-Man? . . . if it's true (which I'm not convinced yet it is) then it means any performance chart like the one in post #17 is not valid . . .

Doesn't mean it's invalid - all it means is that it does not tell the whole story. It glosses over some important information. But that has always been the case with tech-reviews to a certain extent.

...You're attempting to quantify a very abstract concept (the way our eyes perceive a set of discrete frames as a smooth moving image) into a simple numerical form. This will always involve glossing over details to form broad conclusions. You can't analyse and discuss every frame output without losing your readership. In this case it's an issue that affects multi-GPU users differently from single-GPU users, and I wish review sites would pay it more attention.

As for convincing you - I have given a scientific analysis of the effect, and explained it in terms of the operation of the GPU with the rest of the system. I have shown that it exists, it is observable, repeatable, and predictable. I can't really do anything more than that without access to a formal testing lab with lots of different hardware configurations.


If a product does not function the way it is meant to function and produce the results it's meant to, then there is a case? . . . for example, referencing the bench results in #17 for Alien vs Predator (DX11) 4xAA - would you say anyone could tell the difference between the SLI'd GTX 460's and the HD 5870 in actual use? . . . or would this "SLI Stealth Tax" make it seem slower . . .

No case whatsoever... The card IS pumping out the correct number of frames per second after all. Nvidia and AMD don't guarantee they're being output regularly... It's just another facet of the technology that the end user should be aware of in order to make an informed decision. Nothing more...

I don't want to get involved in individual cases, unless people are willing to have a go at measuring their own microstutter, but in the example you give:

AVP 4xAA 1920-res:
GTX460 SLI: 61fps av
5870: 45 fps av

In order for the GTX460 SLI setup to seem worse than the 5870 it would need to be showing over 36% microstutter. Although the degree of microstutter does vary over the course of a benchmark, a value of over 30% is rare. In this case it's reasonable to assume that the GTX460 SLI setup would seem smoother in most circumstances. But, comparing to the GTX480 at 56fps, it's reasonable to assume the single GTX480 would appear smoother in this particular case.
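That break-even figure follows directly from the formula quoted earlier - a quick sketch of the arithmetic, using only the benchmark numbers given above:

```python
def ms_breakeven(multi_fps, single_fps):
    """Microstutter level at which multi_fps / (1 + MS) falls to single_fps,
    i.e. the point where the multi-GPU setup stops feeling faster."""
    return multi_fps / single_fps - 1

# AVP 4xAA, 1920-res figures from the post above
print(f"vs HD 5870 (45fps): {ms_breakeven(61, 45):.0%}")  # ~36% MS needed
print(f"vs GTX480 (56fps):  {ms_breakeven(61, 56):.0%}")  # only ~9% MS
```

So the SLI setup only "loses" to a single card whose raw framerate is already close to its own, which is the point being made about the GTX480.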
 
The lads have found an interesting way to measure the observable variance in FRAPS frame-rate output, nothing more. This is a well-known issue since the early days, and has traditionally been offset by frame buffering.

The assumption that these variances are the direct cause of the mysterious phenomenon called microstutter (which apparently some people can see and some can't) is still an assumption. It's a good theory, but variables are not taken into account. For example,
- when exactly is each frame recorded as complete? when it's drawn to the back or front buffer? how many frames are buffered? this alone could throw the 'observable' variance figures up in the air.
- if wider variance is a constant feature of multi-gpu rendering, why do people observe stuttering in only a small number of scenarios? Surely you would be able to 'see' it all the time? Particularly side-by-side with a single gpu machine?
I appreciate the work that's been put into this, but from what I've seen, I'm really sceptical of the conclusions - especially this - "framerate = real_framerate / (1+MS)"!!!

I bought 2 gtx460's over a 480 because it was £40 cheaper at the time. All I can tell you is that I've had no issues and performance is staggering. Games I've been playing lately -

L4D2
BFBC2
Mass Effect
Crysis Warhead
Dragon Age
Aliens vs Predator
Mirror's Edge
Riddick: Dark Athena
Modern Warfare 2
Dawn of War II

Note that I haven't bought any new games in a couple of months. Still waiting for the day I have a driver problem!

A lot depends on prices, and the layout of your case.
 
The assumption that these variances are the direct cause of the mysterious phenomenon called microstutter (which apparently some people can see and some can't) is still an assumption.

"Microstutter" is framerate variance on the frame-by-frame level. If somebody wants to call something else microstutter, then that's fine, but it won't be the issue we're discussing.


- when exactly is each frame recorded as complete? when it's drawn to the back or front buffer? this alone could throw the 'observable' variance figures up in the air.

FRAPS records the time the frame is output to the screen (front end of the buffer).

- if wider variance is a constant feature of multi-gpu rendering, why do people only observe stuttering in only a small number of scenarios? Surely you would be able to 'see' it all the time? Particularly side-by-side with a single gpu machine?

First of all, it only appears like "stuttering" when you have a slow enough framerate to catch individual frames. Any other time it simply acts to reduce the apparent framerate. This is a product of the way the eye works (catching the longer gap between the frames as a measure of smoothness).

If you concocted a scenario where you had two machines running side by side, both at precisely the same framerate, one of which was single-GPU and the other dual-GPU, you would see that the single-GPU setup looked "smoother". Increase the framerate of the multi-GPU setup by 20% or so, and this difference would disappear.

Of course, if you're playing at sufficiently high framerate then the game will seem smooth whether you have microstutter or not.


I appreciate the work that's been put into this, but from what I've seen, I'm really sceptical of the conclusions - especially this - "framerate = real_framerate / (1+MS)"!!!

That little formula is only a result of the base assumption: that apparent smoothness (at any instant) is determined by the largest gap between frames, rather than by counting the number of frames. The "MS", or microstutter percentage, is a statistical measure of variance over the course of a benchmark. It is non-dimensional and consistent. Read the readme that comes with the program - it explains the math. On this side of things the work is solid (I do this kind of thing for a living). There are plenty of things to be skeptical about, but trust me on the math ;)
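The thread never reproduces the readme's exact statistic, so the pairwise measure below is only an assumed stand-in for MS: it scores how much the longer of each pair of frame gaps exceeds the pair's average. A perfectly regular frame stream scores 0; the short-long alternation typical of AFR dual-GPU rendering scores high:

```python
def microstutter(gaps_ms):
    """An assumed stand-in for the MS statistic (not necessarily the tool's
    exact math): for each consecutive pair of frame gaps, score how much the
    longer gap exceeds the pair's average, relative to that average.
    0 = perfectly regular frame delivery."""
    pairs = zip(gaps_ms[::2], gaps_ms[1::2])
    excess = [(max(a, b) - (a + b) / 2) / ((a + b) / 2) for a, b in pairs]
    return sum(excess) / len(excess)

regular = [16.7] * 10       # steady ~60fps, evenly spaced frames
afr     = [8.0, 25.4] * 5   # same average frame time, alternating gaps
print(microstutter(regular))        # 0.0
print(round(microstutter(afr), 2))  # 0.52
```

Under this assumed metric, feeding ~0.52 into the quoted correction gives an apparent framerate of roughly (1000/16.7)/(1+0.52) ≈ 39fps, despite a "real" ~60fps average - which is the whole argument in miniature.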


I bought 2 gtx460's over a 480 because it was £40 cheaper at the time. All I can tell you is that I've had no issues and performance is staggering.
...

That's great, and there is clearly nothing wrong with your setup. But similarly, you cannot say without testing side-by-side that the performance is "better" than a single GTX480.

As I have said many times, multi-GPU setups are still attractive - you just need to take into account framerate variance and appreciate that there will be a small reduction from the performance you might expect if you only look at average-FPS benchmarks.
 
I upgraded my 5850 to a 5870 and now Crysis plays with no lag on max settings and 4xAA at 1920x1200.
I now won't need a better card until a game I want is more GPU intensive.

The card is very quiet.

I got mine for 210 second hand.
 
Hi Duff-Man :)

thanks for taking the time to reply . . . not quite done yet! :p

A very vague and wishy-washy way of putting it, but I suppose that's more or less what I'm saying.
If you read back all the technical stuff that you wrote, you may understand how a simple layman like myself is left scratching his head at what the heck you are actually saying . . . so to feed in all that technical data and come back with a "vague and wishy-washy" understanding of your suggestion is actually quite an achievement! ;)

...You're attempting to quantify a very abstract concept (the way our eyes perceive a set of discrete frames to be a smooth moving image), into a simple numerical form. This will always involve glossing over details to form broad conclusions.
That's what I do . . . take a "very abstract concept" and first attempt to understand it then find a way to explain it to other people in simple layman speak . . . . Once I understand the scenario I can also either help find a fix or attempt to disprove the problem actually exists! . . .

As a budding epistemologist I only deal in Black or White . . . thus so far despite the huge effort on your behalf the subject matter is still clearly "Grey" . . . and I don't do Grey! . . .

As for convincing you - I have given a scientific analysis of the effect, and explained it in terms of the operation of the GPU with the rest of the system. I have shown that it exists, it is observable, repeatable, and predictable. I can't really do anything more than that without access to a formal testing lab with lots of different hardware configurations.
What I would say is your work so far and the conclusions drawn are perhaps half-baked . . . I certainly have not seen what I would consider "compelling" evidence that a normal layman need concern himself with . . . the worry I have is twofold: firstly, you're making claims on the Internet that are perhaps being misinterpreted; this "microstutter" scenario is being exaggerated to some extent thanks to the power of the Interweb-chinese-whispers-rumour-mill (9%-30% reduction . . whisper whisper . . 10%-30% reduction . . whisper whisper . . 20%-40% reduction etc) . .

The second thing I am wondering is to what extent some of this could be down to the tools themselves . . . if you read post #56 you will see a short post by me in reply to a friendly forum user's video, posted as proof that "microstutter" was real . . . if you watch the video linked in the post and then read my reply, I wonder . . . is this "judder, judder, judder" thingy to some extent being generated by the very tools you are using to measure the frames? :confused:

Can I assume that, for the sake of thorough scientific method, you disabled FRAPS and any diagnostic tools that interact with the PC at a hardware level (CPU-Z, RivaTuner etc) and then did some runs again to observe whether you could perceive any issue with the naked eye? . . .

There is a good chance the very tools you are using to "observe" could be the ones creating the problem?

You say the problem increases as GPU load increases? . . . this should be very easy to set up and "observe" any differences (tool-less and with the naked eye) by doing different runs with more and more levels of AA, and maybe working out the right balance of texture settings to push the average framerate down to 60/50/40FPS etc, to see if your theory stands up to the naked eye . . .

This has got to be sorted one way or another and clearly moved into either the black area or the white area . . . videos have to be made and one way or another the case closed . . .

If this was Quantum Physics then I could understand it taking a bit longer to prove either way, but it's not . . . . can you get people to disable any FPS-measuring apps and 3rd-party software like your program and CPU-Z etc and try to test again with the naked eye! . . .

What you are claiming clashes with many, many people who have stated their SLI experience is great? . . . are they running V-Sync maybe, or is it off? . . . are they running at 80FPS and can't tell due to the high FPS, or are they running 40FPS and can't tell because there is no problem? . . . are their settings not enough to fully load their cards, or are they CPU limited? . . . tons and tons of scenarios which really ought to have been tested properly before any conclusions were drawn . . .

I don't think it's reasonable to make claims like the following that attempt to void the validity of classic FPS charts and could put potential punters off an SLI purchase based on "conjecture", "non-exhaustive testing" and the fact this could all be caused by the "Observer Effect"? . . . I appreciate your efforts so far btw and good luck trying to get closure on this one way or another! :cool:

I'm saying that there will be an identifiable reduction in performance
 
I upgraded my 5850 to a 5870 and now Crysis plays with no lag on max settings and 4xAA at 1920x1200.
I now won't need a better card until a game I want is more GPU intensive.

The card is very quiet.

I got mine for 210 second hand.




lol clock the 5850 to 850 core and you would have basically 5870 performance.
 
Hi Duff-Man :)

If you read all the technical stuff that you wrote you may understand how a simple layman like myself is left scratching his head at what the heck you are actually saying

I'm sorry if my writing style is somewhat unintelligible. I'm used to writing articles for scientific journals, or conferences, or technical reports. Sometimes it is difficult to express complex concepts in a way that a "layman" will understand. Sometimes simple analogies can't capture all the relevant facets of a technical problem.



What I would say is your work so far and the conclusions drawn are perhaps half-baked . . . I certainly have not seen what I would consider "compelling" evidence that a normal layman need concern himself with
I have to say I take exception at this statement. I have made a reasonably formal analysis of the phenomenon. You have already explained that you don't understand everything I have said, so to dismiss it as "half baked" is unfair. Please understand the entire set of points before dismissing them. If you have specific questions then ask them, but don't dismiss what I have done simply because you don't understand it.

You are welcome to undertake your own investigations into the phenomenon. No-one is stopping you.

. . . the worry I have is twofold: firstly, you're making claims on the Internet that are perhaps being misinterpreted; this "microstutter" scenario is being exaggerated to some extent thanks to the power of the Interweb-chinese-whispers-rumour-mill (9%-30% reduction . . whisper whisper . . 10%-30% reduction . . whisper whisper . . 20%-40% reduction etc) . .

I measure the quantitative effects, and report them. As with all data reporting, some people will misunderstand the results and their implications, and as a result, unwittingly spread misinformation. There is nothing that I can do about this. I can only report the facts - I can't force everyone to understand them.

The second thing I am wondering is to what extent some of this could be down to the tools themselves . . . if you read post #56 you will see a short post by me in reply to a friendly forum user's video, posted as proof that "microstutter" was real . . . if you watch the video linked in the post and then read my reply, I wonder . . . is this "judder, judder, judder" thingy to some extent being generated by the very tools you are using to measure the frames? :confused:

Just ignore the video post - it's ridiculous. The video is captured at 24fps. How can you observe a phenomenon that occurs over timescales much less than 1/24th of a second? You can't... It isn't something you can see on a video.

Can I assume that, for the sake of thorough scientific method, you disabled FRAPS and any diagnostic tools that interact with the PC at a hardware level (CPU-Z, RivaTuner etc) and then did some runs again to observe whether you could perceive any issue with the naked eye? . . .

There is a good chance the very tools you are using to "observe" could be the ones creating the problem?

Yes to the above, but that is not at all scientific. It is 100% subjective and so is irrelevant. But yes, I have observed the phenomenon in all its forms without FRAPs etc.

You say the problem increases as GPU load increases? . . . this should be very easy to set up and "observe" any differences (tool-less and with the naked eye) by doing different runs with more and more levels of AA, and maybe working out the right balance of texture settings to push the average framerate down to 60/50/40FPS etc, to see if your theory stands up to the naked eye . . .

I have tested many configurations as far as CPU limitation goes. If you have read my threads then you know this. As far as "naked eye" assessments go - these are subjective and irrelevant from a scientific point of view. But yes, if you can give enough load to really make the GPU crawl, you can actually see the framerate irregularity.

This has got to be sorted one way or another and clearly moved into either the black area or the white area . . . videos have to be made and one way or another the case closed . . .
Videos WILL NOT show the microstutter phenomenon. They only take a snapshot of the output every 1/24th second or so. MS occurs much more rapidly.

If this was Quantum Physics then I could understand it taking a bit longer to prove either way, but it's not . . . . can you get people to disable any FPS-measuring apps and 3rd-party software like your program and CPU-Z etc and try to test again with the naked eye! . . .

I have demonstrated the phenomenon exists (as many others have before me), and I have used analytical data to describe the scenarios in which it has an effect, and the magnitude of this effect. If you don't consider this "proof" then that's no issue of mine. Run all the tests you like to convince yourself one way or another - all the tools are available.

What you are claiming clashes with many, many people who have stated their SLI experience is great? . . . are they running V-Sync maybe, or is it off? . . . are they running at 80FPS and can't tell due to the high FPS, or are they running 40FPS and can't tell because there is no problem? . . . are their settings not enough to fully load their cards, or are they CPU limited? . . . tons and tons of scenarios which really ought to have been tested properly before any conclusions were drawn . . .

Okay, this is the last time I will make this point:

- SLI can STILL GIVE "GREAT" PERFORMANCE
- Adding a second card will STILL IMPROVE PERFORMANCE over the single card solution
- BUT... Adding the second card will NOT give the same real-world performance increase as you might expect from the raw FPS increase

NOTHING I have said suggests people can't have a good gaming experience with SLI setups. The only practical effect for the end-user is a slight devaluation of mid-range SLI setups in comparison to single high-end GPU setups.


I don't think it's reasonable to make claims like the following that attempt to void the validity of classic FPS charts and could put potential punters off an SLI purchase based on "conjecture", "non-exhaustive testing" and the fact this could all be caused by the "Observer Effect"? . . . I appreciate your efforts so far btw and good luck trying to get closure on this one way or another! :cool:

I already have closure on the issue, from a scientific point of view. I don't care if people are offended because it makes their hardware look a little less slick. It is what it is. I made my tool publicly available to gather results from a wider range of hardware configurations. As part of this I analysed the results in the hope of letting people make a more informed decision about their GPU upgrades. If you, or anyone else, chooses to ignore the results, disbelieve them, or even misinterpret them, then that's not my concern either.




----------------------------------------------------

Look, I'll answer any more specific questions you have about the phenomenon, the way I analysed it, or the interpretation of the findings. But please - enough with the "I don't buy it", or "your work sucks" type comments. I don't have time to get caught up in such petty arguments. If you think the analysis is incomplete - then complete it. If you believe that what I have shown is false - then disprove it. But really - I do this kind of thing for a living (data analysis of numerical phenomena), and have a PhD in the area, so have a little faith that I know what I'm doing here and that I'm not a complete moron ;)

I have no pre-set agenda to promote or dismiss microstutter. It was an interesting and generally misunderstood phenomenon, and so I wanted to understand its behaviour. Now I do.
 
Hey Emlyn :)

But then you're stuck with 2 cards with a lack of texture memory
What do you mean, "lack of texture memory"? . . . for who - you, me, or everyone? . . . 768MB is enough for me and I'm sure for other people; the benchmarks tell you all you need to know . . . . Let's not forget 672 stream processors having it large, two PCI-E slots "filled" and doing what they're meant to be doing, and more affordable and faster than a HD 5870 and GTX 480 @ 1920x1200 . . . not too bad being "stuck" with that! ;)

12-18 months down the line sell them both, recoup some cash and consider the new options . . .

and whatever SLI issues that may exist
Duff-Man!!!!! :p

At least with the single GTX470 you have a solid performing card and the option of adding another.
Yup, I can at least see some logic in this; such a pity that someone has to pay a £40-£50 premium for naff-all extra performance though, and have a perfectly good PCI-E x16 slot sitting there "empty" on the motherboard . . . will that second slot ever get filled before you upgrade the mobo? . . .

Seems a waste to pay a premium ££ for a dual-slot 8x/8x mobo and then leave it half used? . . . yes, down the line, assuming the end user sticks with the mobo and assuming that in 12-18 months 1024MB really becomes the norm, he can at least stick another 12-18-month-old GPU in to get a 1024MB SLI set-up . . . but wait . . . what did you just say about SLI?

and whatever SLI issues that may exist
:D

TBH, I wouldn't buy 2 1gb GTX460's at the same time either...
Yup, I agree, except if I was going to the moon for 3 years . . . GTX 460 768MB SLI makes a lot of sense right now if one has the two PCI-E 8x/8x slots and £240-odd smackers . . . .

The thing that doesn't appeal to me about buying a 1024MB now is that obviously they're not great value for money. Then what are you gonna do in 12-18 months? . . . buy another? . . . in 12-18 months' time is a GTX 460 1024MB SLI'd system gonna be as exciting as whatever tech exists then? . . .

The rule of "buy the best performing single card you can" always applies.
that "rule" is broken! :D

Situation might be interesting when 2 cards in SLI are cheaper and faster than strongest card on the market #5

When 2x mid-range cards are both faster and substantially cheaper in combined cost than the current top end single card. (Or poor availability of the top end card #6

Yeah this is roughly my view on SLI as well #10

Yes, unless the performance of SLI lower ends beats a higher end equal to roughly the same cash #14

I don't have any favourites, whatever gives me best bang for my buck. #15
 
lol clock the 5850 to 850 core and you would have basically 5870 performance.

I tried, but never managed to get it to play like the 5870 plays. I got to a point where the PC just crashed or I'd get black squares. I thought I might ruin the card, so I decided on the 5870; it didn't cost much more than the 5850 in the end.
 
What do you mean, "lack of texture memory"? . . . for who - you, me, or everyone? . . . 768MB is enough for me and I'm sure for other people; the benchmarks tell you all you need to know . . . . Let's not forget 672 stream processors having it large, two PCI-E slots "filled" and doing what they're meant to be doing, and more affordable and faster than a HD 5870 and GTX 480 @ 1920x1200 . . . not too bad being "stuck" with that! ;)

If you're running SLI, chances are you're running larger resolutions and still wanting all the details.
I can't say I'd want to be "limited" by low(er) texture memory in this situation.

I know that the 2 x 768mb 460's offer very good performance for the cash, but I'd still not buy 2 of them.

A GTX470 would cost less than the 460's. That's a pro right there.

I like budget purchases, but there are times when they are just not needed. If I was running a larger monitor I'd take the 470 any day.
Futureproof isn't really a word that has a place in PC gaming but... The 470 is the better option when looking to the future.

This isn't about filling slots on motherboards, it's just to do with what is the wisest move.

At some stage you have to admit that 768MB of memory on a graphics card IS a limitation - maybe not in a budget situation on a lower-res monitor, but in a thread where SLI is mentioned, 2 low-memory cards are not the way forward.
 
I'm sorry if my writing style is somewhat unintelligible. I'm used to writing articles for scientific journals, or conferences, or technical reports. Sometimes it is difficult to express complex concepts in a way that a "layman" will understand.
Hmmm . . . personally I have found that if I don't fully understand a "complex" situation, I am unable to explain it simply to another "layman" . . . I have no doubt you have a good notion in your head, but I was hoping you understood the scenario well enough to take a complex situation and paint a simple picture! ;)

I have to say I take exception at this statement. I have made a reasonably formal analysis of the phenomenon. You have already explained that you don't understand everything I have said, so to dismiss it as "half baked" is unfair. Please understand the entire set of points before dismissing them.
The reason I don't understand everything you have said is as much your fault for not being able to explain complex scenarios "simply" as it is my fault for not understanding them . . . put that pointy finger back in its holster please! :D

I can only report the facts - I can't force everyone to understand them.
If people can't understand the facts because of the way/style you are reporting them, I'm not sure how useful they are . . . . at least you know, I suppose! . . .

Just ignore the video post - it's ridiculous.
Duly noted!

Ejizz's "microstutter" exclusive video here

It is 100% subjective and so is irrelevant
So you have proven to yourself that something is happening that not everyone will be aware of and certainly not something that most people even need to know about . . . and certainly not something that your budding SLI user needs concern himself with . . . . the more I understand this scenario the more it appears to be a "Storm In A TeaCup" ;)

I have tested many configurations as far as CPU limitation goes. If you have read my threads then you know this. As far as "naked eye" assessments go - these are subjective and irrelevant from a scientific point of view.
I spent an hour carefully reading your one thread . . . and are you basing the conclusions drawn on your specific hardware alone, or on the submissions of the nice folks in that thread too? . . .

I have demonstrated the phenomenon exists (as many others have before me), and I have used analytical data to describe the scenarios in which it has an effect, and the magnitude of this effect. If you don't consider this "proof" then that's no issue of mine.
There is one thing you seem to have forgotten, Duff-Man, and that is . . .

"The Observer effect"

I think there is a chance you may be creating the problem yourself? . . . I put it to you that the very tools you are using to measure this "phenomenon" are the very tools causing it? . . .

I already have closure on the issue, from a scientific point of view

Can you disprove that your theory is flawed? . . . scientifically I'm not sure how you can! :D

As far as "naked eye" assessments go - these are subjective and irrelevant from a scientific point of view - Duff-Man

Adding the second card will NOT give the same real-world performance increase as you might expect from the raw FPS increase
The only two areas of SLI I had heard about before, and accept can affect performance, are:

  • SLI Scaling
  • SLI Overhead

I don't care if people are offended because it makes their hardware look a little less slick.
I'm surprised you said that . . . very surprised indeed? :confused:

I think you may be missing the point of this questioning entirely if you think in anyway my motivation is about peoples hardware "looking a little less slick" . . . that's such a strange thing to say?

The point in me asking you 99 questions is to determine exactly what you know about a possible situation where hardware bought with certain expectations is failing to meet those expectations . . .

If somebody sees a product reviewed at 80FPS across several reviews with the same hardware they own, buys that product and gets 80FPS, but gets hit by an "SLI Stealth Tax" so that in effect the performance is the same as a single GPU giving off, say, 65FPS, then that's not right and proper, is it?

Of course as you say . . . . "It is 100% subjective" hmmmm . . .

But please - enough with the "I don't buy it", or "your work sucks" type comments. I don't have time to get caught up in such petty arguments.
Just remember I am playing "Devil's Advocate", so you need to answer the questions and not buckle under the cross-examination . . . you are making a statement delivered in a technical style that is a bit "smokescreen & mirrors" . . . and I am having to get you to decrypt it into normal everyday terms that can be understood by normal everyday punters . . . If I say I don't buy it then please don't be offended . . . I never said your work "sucks" - that's just you being tetchy? . .

My concern here is only that your claims stand up to close scrutiny . . . these are no "petty arguments", simply "examination" . . . you are having to back up your work, and I now suspect your conclusions may be "flawed" due to the "Observer Effect" . . . it happens and it's nothing personal, but I do think perhaps this whole issue has been blown totally out of proportion by a great many people . . .

You say this whole "microstutter" thing is 100% subjective . . . I think the only way to debunk this myth from a layman's point of view is to arrange two identical configs side by side, one an SLI config running at 40-80FPS and one a single GPU running at 40-80FPS (both with no apps!) and do a bunch of blind A-B testing! . . . if your theories hold true, what you would expect is for some people with a keen eye to pick out which games are running on SLI, right? . . . 70FPS runs, 60FPS runs, 50FPS runs . . . 99.9% GPU loads etc . . .
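For what it's worth, a blind A-B trial like that is easy to score: under the null hypothesis that nobody can tell SLI from single-GPU, correct calls follow a coin-flip binomial. A sketch - the 20-trial count and 15-correct result are hypothetical numbers, purely for illustration:

```python
from math import comb

def p_value(correct, trials):
    """One-sided binomial test: the probability of getting at least this
    many SLI-vs-single calls right by pure coin-flip guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical result: a tester picks the SLI rig correctly 15 times in 20 runs
p = p_value(15, 20)
print(round(p, 3))  # 0.021 -> unlikely to be pure guessing
```

A small p-value would support the keen-eye claim; scores near 10/20 would suggest the effect is invisible at that framerate.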

I have no further questions at this time! :cool:

Nothing wrong with playing devil's advocate.
 