Not really, 2560x1600 with 4x AA could be pushing the VRAM on the ATI card & the 5xxx series & DX11 drivers have some maturing to do.
No, I mean it shows the 295 as 16 fps minimum at 1920x1200 and 21 at 2560x1600; reckon they are the other way around.
Who cares, it doesn't matter
The thing about minimum framerates, though, is that they can be affected by transient spikes. Say your computer hiccups randomly (accessing a file, computing something in the background, etc.); it can show up in the benchmark. Such things would be smoothed out in the average FPS, but not necessarily in the minimum value.
The ideal way to measure the minimum framerate is to run the test several times, and take the "maximum minimum", i.e. the largest of the recorded minimum values. If they just run the test once, or they take the absolute minimum of all the runs, it's entirely possible for a faster benchmark to have a lower minimum framerate for the reasons stated above.
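The "maximum minimum" idea above can be sketched in a few lines of Python (the numbers here are made up for illustration, not from any real benchmark):

```python
# Illustrative sketch: given per-frame FPS logs from several benchmark runs,
# take the largest of the per-run minimums, so a one-off background hiccup
# in a single run doesn't define the reported minimum.
runs = [
    [58, 61, 12, 60, 59],   # run 1: a transient spike drops one frame to 12 fps
    [57, 60, 44, 61, 58],   # run 2
    [59, 62, 45, 60, 57],   # run 3
]
per_run_min = [min(run) for run in runs]   # minimum of each run: [12, 44, 45]
robust_min = max(per_run_min)              # the "maximum minimum": 45
print(per_run_min, robust_min)
```

Taking the max of the per-run minimums throws away the run that got unlucky, which is exactly why a single-run (or absolute-minimum) figure can make the faster card look worse.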
What we need in addition to the minimum is a 95th-percentile minimum: sort the per-frame results in order and report the value 5% in from the bottom.
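A percentile minimum like that could look something like this (a minimal sketch with invented sample data, not from any actual benchmark tool):

```python
# Hypothetical sketch: a percentile "minimum" that ignores the worst 5% of
# frames, so a single transient dip can't dominate the reported number.
def percentile_min(fps_samples, pct=0.05):
    ordered = sorted(fps_samples)              # ascending: worst frames first
    idx = int(len(ordered) * pct)              # index 5% in from the bottom
    return ordered[min(idx, len(ordered) - 1)]

# 100 frames: a steady 60 fps with a handful of bad frames mixed in.
frames = [60] * 95 + [12, 30, 35, 40, 45]
print(min(frames), percentile_min(frames))     # absolute min 12 vs percentile min 60
```

The absolute minimum reports the single worst frame (12 fps), while the 5th-percentile figure skips the worst five frames and reports a much more representative 60 fps.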
I have not tried this game yet; I was going to install it tonight to see if there was any use of DX11, but are you saying that there is no visual difference between DX10 & DX11?
A game which just uses optimisations from both DX11 and DX10.1, showing that the 4890 is faster than the GTX 285 (at half the cost) and likewise the X2 beating the GTX 295 comfortably, as we saw similar numbers with Assassin's Creed in DX10.1 vs DX10.
If Microsoft hadn't buckled like a cheap whore being punched in the stomach, ATi would likely have held the performance lead in the majority of titles for some time.
Nvidia would probably have been forced to add DX10.1 much sooner (maybe with their refresh products) and everyone would have been better off.
Which is why they had DX10.1 taken out of Ass Creed: when it was in, the performance boost it gave ATi cards brought them right up to Nvidia's, which of course Nvidia didn't want, as their cards were much higher priced.