Ahh mantle included, will give this game a miss!
Why would that be?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Ahh mantle included, will give this game a miss!
Why would you avoid a game when it supports two APIs, giving users a choice?
It still supports DX11, you know...
Seeing those frames, I am so glad I have a G-Sync monitor. It will eat those low frames and turn them Lionel Richie smooth, easily!

Didn't G-Sync lose its effect at anything 40fps and below?
Below 30fps on my monitor it does, but anything over works a charm.
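For anyone wondering what "losing its effect" means: once the frame rate drops below the panel's minimum refresh rate, variable-refresh tech has to start duplicating refreshes to keep the panel inside its window. A rough Python sketch of that frame-doubling idea, with made-up window figures (the 30-144Hz range is an assumption for illustration, not any specific monitor's spec):

```python
# Rough sketch of variable-refresh frame doubling. The window limits
# are assumptions for illustration, not any vendor's published spec.

PANEL_MIN_HZ = 30   # assumed lower bound of the panel's VRR window
PANEL_MAX_HZ = 144  # assumed upper bound

def effective_refresh(fps: float) -> float:
    """Refresh rate the panel actually scans out at for a given game
    frame rate, duplicating refreshes when fps falls below the minimum."""
    refresh = fps
    while refresh < PANEL_MIN_HZ:
        refresh *= 2  # show each frame twice (or more) to stay in range
    return min(refresh, PANEL_MAX_HZ)

for fps in (25, 40, 60, 160):
    print(f"{fps} fps -> panel at {effective_refresh(fps):.0f} Hz")
```

Below the window the driver either has to duplicate frames like this or fall back to fixed refresh, which is why very low frame rates can still feel rough even with adaptive sync.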


Is it just me or does something not look right there?
Ignoring the Nvidia results and just looking at the AMD ones, specifically the DX vs Mantle minimums:
| Card | DX11 min fps | Mantle min fps |
| ---- | ------------ | -------------- |
| 260X | 15 | 16 |
| 270X | 26 | 27 |
| 280X | 31 | 33 |
| 290  | 41 | 44 |
| 290X | 46 | 52 |
I can only assume it is the way the benchmark was run to give a better showing to the high-end cards, because I thought Mantle was supposed to help the little guy more.
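A quick sanity check of those quoted minimums (plain Python; the numbers are copied straight from the table above, nothing else is measured or assumed):

```python
# Min-fps figures quoted above: card -> (DX11 min, Mantle min).
results = {
    "260X": (15, 16),
    "270X": (26, 27),
    "280X": (31, 33),
    "290":  (41, 44),
    "290X": (46, 52),
}

for card, (dx_min, mantle_min) in results.items():
    gain = mantle_min - dx_min           # absolute fps gain at the minimum
    pct = 100 * gain / dx_min            # relative gain over DX11
    print(f"{card}: +{gain} min fps ({pct:.1f}%)")
```

Both the absolute and the percentage gain grow with the card, so in this particular benchmark Mantle really is favouring the high end, whatever the usual story about it helping weaker systems says.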
Where does the idea that performance must be above a certain level come from? It's never been the case historically. A high-end GPU has NEVER given a guarantee of a specific performance level at a given resolution. Time moves on: there were games 5 years ago that pushed hardware and didn't give 60fps at 1080p, there were games that ran at 120fps easily, and the same was true the year after that, and every year until now.
Suddenly everyone believes a particular game not giving more than 60fps automatically means it's crap rather than just... you know, requiring more power.
I don't know if it does; it might be a crap engine, or it might simply look fantastic or have some insane effects enabled that, once disabled, improve performance drastically.
500fps isn't worthwhile. You get a generation of cards that increases performance, then game devs use that power for better effects and performance goes down... that is how the industry works and always has.