
Dragon Age Inquisition Performance Benchmarks

Seeing those frames, I am so glad I have a G-Sync monitor. It will eat those low frames and turn them into Lionel Richie smooth, easily :cool:
 
Why would you avoid a game when it supports two APIs, giving users a choice?

It still supports DX11 you know...

To be fair, there is no choice if you're an Nvidia user: there's DX11, or the other choice is don't buy it.

If the game runs like crap anyhow, even on wunder-Mantle, then why buy it unless you're a fan of the series or you buy anything related to Mantle?

A better optimised game for all would have been nice, but meh.
 
Like I said in another benchmark thread, it will be better once we all bench the game on OCUK and see what we really get.
I expect Mantle to run away with it though. :p
 
Where does the idea that performance must be above a certain level come from? It's never been the case historically. A high-end GPU has NEVER given a guarantee of a specific performance level at a given resolution. Time moves on: there were games five years ago that pushed hardware and didn't give 60fps at 1080p, there were games that ran at 120fps easily, and the same was true the year after that, and every year until now.

Suddenly everyone believes that a particular game not giving more than 60fps is automatically crap rather than just... you know, requiring more power.

I don't know if it does; it might be a crap engine, or it might simply look fantastic or have some insane effects enabled that, once disabled, improve performance drastically.

500fps isn't worthwhile; you get a generation of cards that increases performance, then game devs use that power for better effects and performance goes down... that is how the industry works and always has.
 
Is it just me or does something not look right there?
Ignoring the Nvidia results and just looking at the AMD ones, specifically the DX vs Mantle minimums:


260X: DX min 15 fps, Mantle min 16 fps
270X: DX min 26 fps, Mantle min 27 fps
280X: DX min 31 fps, Mantle min 33 fps
290: DX min 41 fps, Mantle min 44 fps
290X: DX min 46 fps, Mantle min 52 fps

I can only assume it is the way the benchmark has been run that gives a better showing to the high-end cards, because I thought Mantle was supposed to help the little guy more.
 
Is it just me or does something not look right there?
Ignoring the Nvidia results and just looking at the AMD ones, specifically the DX vs Mantle minimums:


260X: DX min 15 fps, Mantle min 16 fps
270X: DX min 26 fps, Mantle min 27 fps
280X: DX min 31 fps, Mantle min 33 fps
290: DX min 41 fps, Mantle min 44 fps
290X: DX min 46 fps, Mantle min 52 fps

I can only assume it is the way the benchmark has been run that gives a better showing to the high-end cards, because I thought Mantle was supposed to help the little guy more.

Mantle reduces the CPU bottleneck. It doesn't suddenly make a 270X perform like a 290X does under DX.
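
Out of curiosity I worked out the relative gains from the minimums quoted above. A quick sketch in Python, just arithmetic on the figures in that post, so take it as illustrative rather than new benchmark data:

# Minimum-fps figures quoted in the post above, as (DX11, Mantle) pairs
results = {
    "260X": (15, 16),
    "270X": (26, 27),
    "280X": (31, 33),
    "290":  (41, 44),
    "290X": (46, 52),
}

for card, (dx_min, mantle_min) in results.items():
    # Percentage uplift of the Mantle minimum over the DX11 minimum
    gain = (mantle_min - dx_min) / dx_min * 100
    print(f"{card}: {dx_min} -> {mantle_min} min fps ({gain:+.1f}%)")

That gives roughly +7% for the 260X through 290 and about +13% for the 290X, so the biggest card does see roughly double the relative uplift of the smaller ones. If Mantle's main benefit is easing the CPU side, that would fit: the faster the GPU, the more likely the DX11 path was CPU-limited in the first place.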
 
Where does the idea that performance must be above a certain level come from? It's never been the case historically. A high-end GPU has NEVER given a guarantee of a specific performance level at a given resolution. Time moves on: there were games five years ago that pushed hardware and didn't give 60fps at 1080p, there were games that ran at 120fps easily, and the same was true the year after that, and every year until now.

Suddenly everyone believes that a particular game not giving more than 60fps is automatically crap rather than just... you know, requiring more power.

I don't know if it does; it might be a crap engine, or it might simply look fantastic or have some insane effects enabled that, once disabled, improve performance drastically.

500fps isn't worthwhile; you get a generation of cards that increases performance, then game devs use that power for better effects and performance goes down... that is how the industry works and always has.

Well done, captain obvious.
 
Where does the idea that performance must be above a certain level come from? It's never been the case historically. A high-end GPU has NEVER given a guarantee of a specific performance level at a given resolution. Time moves on: there were games five years ago that pushed hardware and didn't give 60fps at 1080p, there were games that ran at 120fps easily, and the same was true the year after that, and every year until now.

Really? I never knew this :o

Suddenly everyone believes that a particular game not giving more than 60fps is automatically crap rather than just... you know, requiring more power.

I don't for one minute think that everyone believes that; you are massively exaggerating. :o

I don't know if it does; it might be a crap engine, or it might simply look fantastic or have some insane effects enabled that, once disabled, improve performance drastically.

Well, you have given out some rather damning statements, so it would probably be good if you backed up the info with something solid.

500fps isn't worthwhile; you get a generation of cards that increases performance, then game devs use that power for better effects and performance goes down... that is how the industry works and always has.

I don't normally read your posts as they just state the obvious, and it seems nothing has changed. This time though, you have gone off on a mass of exaggeration, stating the obvious in an obvious manner...
 