
Ashes of the Singularity Coming, with DX12 Benchmark in thread.

It's going to be interesting, but I can imagine that whichever team turns out to be slower, there will be cries of 'it's not a fair test because XYZ'.
Fun times ahead. :)

Nvidia are crying already:

"We believe there will be better examples of true DirectX 12 performance and we continue to work with Microsoft on their DX12 API, games and benchmarks. The GeForce architecture and drivers for DX12 performance is second to none - when accurate DX12 metrics arrive, the story will be the same as it was for DX11."
 
DX11 always seemed a bit "bunged in", as opposed to DX9 when that came out.
So Nvidia are saying this isn't accurate?
 
Stunning :cool:

DX12 is shaping up to be some jump in PC gaming performance! Great times ahead for us.

The most amazing thing is how much grunt Hawaii still has under the hood, considering it originally shipped as the Titan killer. Instead it's not just killing the original Titan, it's keeping up with a card a generation newer than it.

Someone could do with adding an original Titan benchmark too, since it could also show some major gains.
 
NVIDIA: We Don’t Believe AotS Benchmark To Be A Good Indicator Of DX12 Performance

As you all know, NVIDIA released the 355.60 driver specifically for Ashes of the Singularity’s Alpha, which is in itself a rare occurrence for a game still in development. Even so, we registered mixed results in our DX12 performance benchmarks with NVIDIA cards, and clearly the company noticed all of this on its own, as it reached out to the press to give its side of the story.
We were able to get a detailed statement from NVIDIA’s Brian Burke, Senior PR Manager. Here’s what he had to say on the matter:
This title is in an early Alpha stage according to the creator. It’s hard to say what is going on with alpha software. It is still being finished and optimized. It still has bugs, such as the one that Oxide found where there is an issue on their side which negatively affects DX12 performance when MSAA is used. They are hoping to have a fix on their side shortly.
We think the game looks intriguing, but an alpha benchmark has limited usefulness. It will tell you how your system runs a series of preselected scenes from the alpha version of Ashes of Singularity. We do not believe it is a good indicator of overall DirectX 12 gaming performance.
We’ve worked closely with Microsoft for years on DirectX 12 and have powered every major DirectX 12 public demo they have shown. We have the utmost confidence in DX12, our DX12 drivers and our architecture’s ability to perform in DX12.
When accurate DX12 metrics arrive, the story will be the same as it was for DX11.
It should be noted that NVIDIA’s mention of an MSAA performance bug while running on DX12 has been contested by developer Oxide Games, which published a blog post of its own addressing some “misinformation” being spread about the Ashes of the Singularity benchmark. They also dispute the claim that the test is not useful, of course:
It should not be assumed that because the game is not yet publicly out, it’s not a legitimate test. While there are still optimizations to be had, Ashes of the Singularity in its pre-beta stage is as optimized as – or more than – most released games. What’s the point of optimizing code 6 months after a title is released, after all? Certainly, things will change a bit until release. But PC games with digital updates are always changing; we certainly won’t hold back from making big changes post launch if we feel it makes the game better!
There’s also this cryptic but seemingly ominous tweet by Brad Wardell, CEO of Stardock, which is publishing Ashes of the Singularity.

Our take
NVIDIA and Oxide/Stardock are at odds right now, and it’s easy to understand why. This is the first publicly available benchmark of DX12 performance and obviously NVIDIA would have liked to get different results than these ones; on the other hand, Oxide and Stardock aren’t interested in taking the blame for all of this.
Obviously, it is impossible for us to say where the issue really lies, but there clearly is one right now. Of course, when it comes to consumers, there seems to be no cause for concern – whether there’s something wrong on Ashes of the Singularity’s Alpha or on NVIDIA’s 355.60 driver, it will probably be fixed way before the game’s release.
There’s no reason to think that NVIDIA cards won’t enjoy fairly similar DX12 performance boosts to what we have seen on AMD cards once the software is mature. As mentioned by Oxide in their blog post, DirectX 11 was quite terrible at first and that went on for a few years; DirectX 12, on the other hand, seems in much better shape already at least on the AMD side.
Let’s give it a bit more time and it will likely shine on NVIDIA cards as well.

Read more: http://wccftech.com/nvidia-we-dont-...-indicator-of-dx12-performance/#ixzz3j5fToQ2n
 

Oh.... it looks ok to me.
 
Something else I noticed with the PCPer benchmark.

Yes, it's still an alpha, so things might not be running on all cores. But DX12 still seems to love high IPC, as shown by the difference between the 8370 and the 6700K/5960X.

That's as opposed to the application preferring more cores, but it could just be that the rest of the engine (AI etc.) is running on up to four cores.
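The point about per-core speed still mattering even once work is spread across threads can be sketched with Amdahl's law. A minimal Python illustration; the 60/40 split between parallel work (draw-call submission) and serial work (AI, game logic) is a made-up assumption for illustration, not anything measured from the game:

```python
# Illustrative sketch, not measured data: if part of the engine's CPU
# frame (AI, game logic) stays serial, adding cores hits a ceiling and
# per-core speed (IPC * clock) still dominates. Amdahl's law makes the point.

def speedup(parallel_fraction, cores):
    """Upper bound on CPU-side speedup when only `parallel_fraction`
    of the frame's work scales across `cores`."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assume 60% of CPU frame time scales and 40% is effectively serial:
for cores in (2, 4, 8):
    print(cores, "cores ->", round(speedup(0.6, cores), 2), "x")
# 8 cores only manage ~2.11x here, so a faster core beats more cores.
```

Under that assumption, going from 4 to 8 cores gains far less than the doubling of threads suggests, which would match an engine whose simulation runs on only a handful of cores.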
 
Looks like a completely flawed benchmark, since increasing the resolution from 1080p to 1600p actually increases the average FPS. Completely useless results.
 
Looks like a completely flawed benchmark, since increasing the resolution from 1080p to 1600p actually increases the average FPS. Completely useless results.

Oh, you're talking about the FX 6300 result. That could be something to do with it being CPU bound; it is clearly CPU bound across all resolutions in that test.

But in the 5960X test, 1600p shows a decrease at the higher resolution.

It's not a benchmark.

It is a benchmark: this is running a full game simulation.
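To sketch why a CPU-bound run can show flat (or, with run-to-run noise, even slightly higher) average FPS at a higher resolution: each frame takes as long as whichever of the CPU or GPU finishes last. A toy Python model with hypothetical, made-up frame times:

```python
# Toy model with invented numbers, for illustration only: frame time is
# set by the slower of the CPU and GPU for that frame.

def bottleneck(cpu_ms, gpu_ms):
    """Frame time is whichever side finishes last."""
    return max(cpu_ms, gpu_ms)

# Suppose the CPU takes 20 ms per frame regardless of resolution, while
# GPU cost grows with pixel count and only overtakes the CPU at 4K:
gpu_cost = {"1080p": 12.0, "1600p": 17.0, "2160p": 28.0}
for res, gpu_ms in gpu_cost.items():
    fps = 1000.0 / bottleneck(20.0, gpu_ms)
    print(res, round(fps, 1))
# 1080p and 1600p both land at 50 FPS (CPU bound); only 2160p drops.
```

In the flat region any small measurement noise can make the higher resolution appear marginally faster, which is why one CPU-bound result isn't evidence the whole benchmark is broken.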
 
So the question is: is there an issue with AMD's DX11 performance or Nvidia's DX12 performance? Or is it just that AMD GPUs really are being bottlenecked more in DX11?

Don't bite my head off.
 
http://www.extremetech.com/gaming/2...-singularity-amd-and-nvidia-go-head-to-head/2

So the question is: is there an issue with AMD's DX11 performance or Nvidia's DX12 performance? Or is it just that AMD GPUs really are being bottlenecked more in DX11?

Don't bite my head off.

It looks like there is currently something "wrong" with Nvidia's 4xMSAA results; Nvidia says it's the devs' problem, the dev says it's Nvidia's... will it be fixed before the game releases? Probably.

Looking at the non-MSAA results, it puts the Fury X and 980 Ti in relatively the same place as DX11 games/reviews; it would be interesting to see some overclocked results.
But basically this is shaping up the same way as Mantle: next to no improvement for single GPU (ignoring the fact that AMD's DX11 performance seems so far behind).
 
Looks like a completely flawed benchmark, since increasing the resolution from 1080p to 1600p actually increases the average FPS. Completely useless results.

Yes, pick the one out of 10 sets of results where it's clearly CPU bound and use that to dismiss everything else. Great plan...
 