They don't like it and will shoot it down because it tells the real truth about how your game is performing. If your game is running badly and is a stuttery mess, this will tell you that!
The problem here is not in using FRAPS to measure average framerates over the run of a benchmark, but rather when it comes to using FRAPS to measure individual frames. FRAPS is at the very start of the rendering pipeline; it’s before the GPU, it’s before the drivers, it’s even before Direct3D and the context queue. As such FRAPS can tell you all about what goes into the rendering pipeline, but FRAPS cannot tell you what comes out of the rendering pipeline.
So to use FRAPS in this method as a way of measuring frame intervals is problematic. Considering in particular that the application can only pass off a new frame when the context queue is ready for it, what FRAPS is actually measuring is the very start of the rendering pipeline, which not unlike a true pipe is limited by what comes after it. If the pipeline is backed up for whatever reason (context queue, drivers, etc), then FRAPS is essentially reporting on what the pipeline is doing, and not the frame interval on the final displayed frames. Simply put, FRAPS cannot tell you the frame interval at the end of the pipeline, it can only infer it from what it’s seeing.
AMD’s problem then is twofold. Going back to our definitions of latency versus frame intervals, FRAPS cannot measure “latency”. The context queue in particular will throw off any attempt to measure true frame latency. The amount of time between present calls is not the amount of time it took a frame to move through the pipeline, especially if the next Present call was delayed for any reason.
AMD’s second problem then is that even when FRAPS is being used to measure frame intervals, due to the issues we’ve mentioned earlier it’s simply not an accurate representation of what the user is seeing. Not only can FRAPS sometimes encounter anomalies that don’t translate to the end of the rendering pipeline, but FRAPS is going to see stuttering that the user cannot. It’s this last bit that is of particular concern to AMD. If FRAPS is saying that AMD cards are having more stuttering – even if the user cannot see it – then are AMD cards worse?
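To see why the submit-time intervals FRAPS records can differ from the intervals of the frames actually displayed, here's a toy Python simulation of a bounded context queue. This is my own illustration, not FRAPS's or AnandTech's code, and the queue depth, CPU time, and GPU times are made-up numbers, not measurements:

```python
def simulate(gpu_ms, queue_depth=3, cpu_ms=2.0):
    """Toy pipeline model: returns (submit_times, finish_times) in ms.

    A frame is submitted (the Present call FRAPS timestamps) once the
    CPU is done with it AND the context queue has a free slot; it is
    displayed when the GPU finishes rendering it.
    """
    submit, finish = [], []
    for i, g in enumerate(gpu_ms):
        t = submit[-1] + cpu_ms if submit else 0.0
        if i >= queue_depth:
            # Queue is full: the Present call stalls until an older
            # frame drains out the far end of the pipeline.
            t = max(t, finish[i - queue_depth])
        submit.append(t)
        start = max(finish[-1], t) if finish else t
        finish.append(start + g)
    return submit, finish

def intervals(ts):
    return [b - a for a, b in zip(ts, ts[1:])]

# One slow 40 ms frame among otherwise steady 16 ms frames:
submit, finish = simulate([16, 16, 40, 16, 16, 16, 16, 16])
print(intervals(submit))  # front of the pipe (what FRAPS sees)
print(intervals(finish))  # cadence of frames actually displayed
```

In this toy run the display-end intervals show the 40 ms spike exactly where it happened, while the Present-end intervals show it several frames later and smeared by queue backpressure, which is the article's point: FRAPS can only infer the output cadence from the input side.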
The AnandTech article is excellent.
In July you'll have more options in the driver for multi-GPU, and they are on it.
New management helps.
I'm not talking about the spikes.
If your game is stuttering, though, Locky, you can tell. You don't need FRAPS to tell you if a game is smooth.
Yeah, I know that lol.
But FRAPS just confirms it, and gives you a baseline to measure improvement against.
+1
The two work very well together.
That's bad advice in 2013......
Buying 2x 690s is a waste of time for the simple fact that they're gimped by the 2GB VRAM limit, so any game you play will only have 2GB of VRAM available, limiting the excellent performance of the card's core and shaders.
OP, don't waste your money on 2x 690s; stick with one and wait for a more worthwhile upgrade to come along.
I also had 2x 690s when they came out. Waste of money because of the VRAM limit.
I have got what you would consider a worthwhile upgrade (4 Titans) but for 24/7 use I prefer the GTX 690s.
If you use Nvidia Inspector and turn on 8xMSAA+SGSSAA, in 99% of games you will run out of VRAM, I guarantee it; go check for yourselves.
Even at 4xMSAA you're going to be hitting a VRAM bottleneck at resolutions over 1920×1200.
Even playing old games like L4D2 with the above settings hits the VRAM wall.
Then there is downsampling, which eats up even more VRAM.
Then there are the new consoles looming on the horizon, where 2GB of VRAM may not be enough.
So buying a 2nd 690 now is a bad choice and bad advice to give to somebody.
Plenty of examples in this thread of the 690 hitting its VRAM limit early on.....
http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club
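For a back-of-envelope sense of why high MSAA sample counts eat into a 2GB card, here's a rough Python estimate of just one multisampled colour+depth render target. This is my own approximation, not anything from the linked thread; it ignores textures, extra render targets, and driver overhead, all of which push real usage higher:

```python
def msaa_target_mb(width, height, samples, color_bytes=4, depth_bytes=4):
    """Rough VRAM for one multisampled color+depth render target, in MiB.

    Assumes 32-bit color and 32-bit depth per sample; ignores textures,
    G-buffers, and driver overhead, so real-world usage is higher.
    """
    return width * height * samples * (color_bytes + depth_bytes) / 2**20

print(msaa_target_mb(1920, 1200, 4))   # → 70.3125 MiB
print(msaa_target_mb(2560, 1600, 8))   # → 250.0 MiB
```

SGSSAA shades the existing MSAA samples rather than allocating new ones, but deferred renderers multiply this kind of target by several G-buffers, so with game assets on top, 2GB disappears quickly at high settings.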
Gregster, why are you barking? Calm down, I'm giving good, truthful advice here.
Who on earth runs games with 8xMSAA+SGSSAA? It might push VRAM above 2GB, but it's a surefire way to gimp your FPS to unplayable levels, especially at super-high resolutions.