
Quick update from the 7 day 4870 thread

Status
Not open for further replies.
I gotta laugh at this - the mods closed the old thread as it was descending into a flame war, and so the OP just opened a new one and it's beginning to go exactly the same way. :D

He obviously likes to stir it up. You obviously can't use Crysis as a fair benchmark; COD4 or GRID would have been better IMO.
 
Update... after extensive testing my results are valid. The OSD only shows a lower res; taking a screenshot shows it is actually running at 1680x1050.

cry260.jpg


My only conclusion is the OP is getting a lower result due to running the core and shader clocks unlinked (when I do this I get a 12% performance reduction).

Unfortunately my BFG card won't OC too well - the core is good for 775 but the shader craps out after 1500, which limits me to about 680 or so on the core; running them unlinked results in slower performance than if I had the core linked to the slower shader speed. On the plus side tho, the shader seems to respond well to lower temps - with water cooling I think I can get some killer performance out of this card.
 
and here's running the same clocks as the OP:

clocknlocks.jpg


notice how close the numbers are... the extra SPs making all of 0.47fps difference :P

Also, running the clocks out of sync is definitely impacting performance... something I'm gonna have to look into in more detail, as the core will clock a long way higher than the shaders.
 
Lol 'Crysis doesn't count cause its NV sponsored/optimised'
Likewise 'Use GRID it's fair!'

Different games will inherently run better on different architectures. Same with CPUs: some apps favour one design over another. ATI's architecture means it can potentially yoyo between extreme performance and just average (or even poor) performance. NV's will generally be more stable. You can even see this frame to frame in the same game:

1222726393qhM1IEZJCb_4_2.gif


Discounting any game just because it doesn't give you the results you want is ludicrous - and that goes for both sides. Pick the games you like the most and get the card that performs the best with them. Who cares what other people think? It's what works for you and that's all that matters.
 
This is why I went for the 260 - and I don't care if it can be "fixed" in later drivers, because I want the stability now, not at some undetermined point in the future. It's the same story across a lot of games: if you graph the fps over time, the 260 just has a much more stable baseline fps. Even if they fix the fluctuations in the fps on the 4870, I still don't see the baseline being better.

The 280 is still hideously overpriced tho... realistically the 260 should be around £140 and the 280 around £180 for the performance they give.
 
This thread is amazing!

Oh and I once bought a 260, but it gave me both cancer and aids, then ran off with my wife. :(
 

That minimum of 4 fps could have been anything, and I don't think it was happening a lot, or the average fps on the 4870 would have been lower than the GTX's. To get a good average fps you need a steady flow of frames, not lots of dips. The max fps between the cards was only 3 fps apart, and the 4870 had around the same average fps; that wouldn't have been possible if the 4870 kept hitting low frame rates.
 

Look at the graph (and look at some other graphs): even if you take the dips out and clean up the fluctuation, as future drivers might do, the 260 still has a much tighter baseline.
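One rough way to put a number on that "baseline" idea, assuming you have per-second fps logs (all the numbers below are invented for illustration, not taken from anyone's benchmarks): compare a low percentile instead of the average, since the average hides how low the floor sits.

```python
# Invented per-second fps logs for two hypothetical cards.
card_a = [38, 55, 61, 12, 58, 60, 15, 57, 62, 59]  # high peaks, deep dips
card_b = [48, 50, 49, 47, 51, 50, 48, 49, 50, 48]  # flatter trace

def low_percentile(samples, pct=10):
    """Crude percentile: roughly the value below which pct% of samples fall."""
    ordered = sorted(samples)
    index = max(0, int(len(ordered) * pct / 100) - 1)
    return ordered[index]

# The averages are close...
print(sum(card_a) / len(card_a))  # 47.7
print(sum(card_b) / len(card_b))  # 49.0

# ...but the baselines are not.
print(low_percentile(card_a))  # 12
print(low_percentile(card_b))  # 47
```

With similar averages, the 10th-percentile fps is what separates the flat trace from the spiky one - which is the point being argued here.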

I'm not saying the 260 is the superior card... just saying the story is far from as simple as some people make out.
 

When using averages, the idea is to get the average framerate. This means extreme highs and extreme lows will have little effect on the average FPS given: a few highs will increase it only slightly, and a few lows will decrease it only slightly. The 4870 doesn't 'keep' hitting low FPS (or high FPS), but it does it from time to time. Some people don't even notice; others it can drive nuts.
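A quick sketch of that point (the fps trace below is made up purely for illustration): even a few severe dips to 4 fps barely move a 60-sample average.

```python
# Hypothetical fps trace: 60 one-second samples at a steady 45 fps.
steady = [45.0] * 60

# Same trace, but with three severe dips to 4 fps.
dippy = steady.copy()
for i in (10, 30, 50):
    dippy[i] = 4.0

avg_steady = sum(steady) / len(steady)
avg_dippy = sum(dippy) / len(dippy)

print(avg_steady)  # 45.0
print(avg_dippy)   # 42.95 -- three big dips cost only ~2 fps of average
print(min(dippy))  # 4.0   -- but the minimum tells a very different story
```

So a similar average on both cards doesn't rule out occasional deep dips on one of them.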

1222726393qhM1IEZJCb_4_1.gif


I think this graph shows it best: the framerate of the 4870 X2 is universally playable here, and the GTX 280's most certainly is not. However, the 280 offers almost the same FPS all the time, while the X2 is a little more... active. Just like screen flicker, this may or may not even be an issue at all. I find it difficult to aim if the FPS dips a lot, so I prefer NV cards in general (altho that X2 is a bit of a beast). Other people don't even notice; some can even play a game like Crysis at a framerate of only 20fps.

Actually come to think of it - it's probably just like the microstutter problem. A wild framerate will have an uneven time between frames. A steady one will have a fairly even time between frames. Some people see microstutter, some don't. I found 9800GTX SLI barely more playable than a single 9800GTX due to microstutter. Others love it.
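The microstutter point is easier to see in frame times than in fps: two setups can deliver the same number of frames per second while one spaces them very unevenly. A toy comparison (the millisecond values are invented for illustration):

```python
# Frame times in milliseconds over the same one-second window.
# Both sequences contain 10 frames, so both average 100 ms per frame,
# but the second alternates short/long gaps -- classic microstutter.
even    = [100.0] * 10        # a new frame every 100 ms
stutter = [40.0, 160.0] * 5   # pairs: 40 ms gap, then 160 ms gap

def mean(xs):
    return sum(xs) / len(xs)

def stddev(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

print(mean(even), mean(stutter))  # identical average frame time (100 ms each)
print(stddev(even))               # 0.0  -- perfectly smooth
print(stddev(stutter))            # 60.0 -- very uneven pacing
```

An fps counter reports both as the same framerate, but the stuttering trace spends half its time waiting through 160 ms gaps, which is roughly what a 6 fps card would feel like on those frames.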

As always - get the best card for yourself.
 
It's amazing how you can start a brand new thread on this subject and the exact same people will without fail join it and post the exact same things they posted in the last 10x zomg ATI vs NV threads.

They really are starting to get rather sad now. Just buy whichever card suits you best and live with it... why the need for all this pathetic and repetitive graph posting, showing the same numbers we've all seen 10000 times before, just to argue with some numbnuts who spouts off their opinions about ATI or NV based on their threadbare GPU knowledge?

People indulge clueless trolls like elpedro etc far too often here methinks when they should just turn the other cheek and ignore the tripe they post.
 
Sorry Boogle but you totally discounted your argument as soon as you pulled out benchmarks from [H].
 
My only conclusion is the OP is getting a lower result due to running the core and shader clocks unlinked (when I do this I get a 12% performance reduction).

I linked the clocks with Expertool. What would the shader be at 730 core (which I'm running atm)?

Thanks for that, I guess... But I'm not clueless; you don't know me, and the 'tripe' I posted was actually pretty unbiased opinions on both cards, which I used - frequently backing up ATI but preferring the 260 as I had no problems with it. Also, what is your definition of a troll? Mine in this scenario would be someone who purely posts crap to annoy, offend and get him/herself banned, no?
 