Ashes of the Singularity Coming, with DX12 Benchmark in thread.

What should you expect out of a non-synthetic benchmark?


But what exactly are you going to see in a benchmark that measures actual gameplay performance? If you run the Ashes of the Singularity benchmark, what you are seeing is not a synthetic benchmark. Synthetic benchmarks can be useful, but they do not give an end user an accurate picture of what to expect in real-world scenarios.
Our benchmark run dumps a huge amount of data, which we caution may take time and analysis to interpret correctly. For example, though we felt obligated to include an overall FPS average, we don't feel that it's a very useful number. As a practical matter, PC gamers tend to be more interested in the minimum performance they can expect.


People want a single number to point to, but the reality is that things just aren't that simple. Real-world tests and data are like that. Our benchmark mode in Ashes isn't actually a dedicated benchmark application; rather, it's simply a 3-minute game script executing with a few adjustments to increase consistency from run to run.
What makes it not a dedicated benchmark application? By that, we mean that every part of the game is running and executing: AI scripts, audio processing, physics, firing solutions, etc. It's what we use to measure the impact of gameplay changes so that we can better optimize our code.


Because games have different draw call needs, we've divided the benchmark into different subsections, trying to give equal weight to each one. Under the Normal scenario, the driver overhead differences between D3D11 and D3D12 will not be huge on a fast CPU. Under Medium and Heavy, however, the differences start to show up, until we see massive performance differences. Keep in mind that these are whole-application performance numbers, not just graphics.


http://oxidegames.com/2015/08/16/the-birth-of-a-new-api/
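
To illustrate the averages-versus-minimums point from the quote above, here's a minimal sketch of my own (the frame times are invented, and this is not Oxide's actual tooling) showing how a healthy-looking average can hide the stutter that a percentile-based minimum exposes:

```python
# Minimal sketch, not Oxide's tooling: per-frame times are invented
# purely to show how an average hides worst-case stutter.
frame_times_ms = [16.7, 17.1, 16.9, 45.2, 16.8, 17.0, 33.5, 16.6]

def avg_fps(times_ms):
    # Headline number: total frames divided by total seconds.
    return len(times_ms) / (sum(times_ms) / 1000.0)

def percentile_fps(times_ms, pct=99):
    # "Minimum" performance: FPS at the pct-th percentile frame time.
    ordered = sorted(times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return 1000.0 / ordered[idx]

print(f"average: {avg_fps(frame_times_ms):.1f} FPS")                        # ~44.5 FPS
print(f"99th percentile minimum: {percentile_fps(frame_times_ms):.1f} FPS")  # ~22.1 FPS
```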


Benchmarks.

PCPer

http://www.pcper.com/reviews/Graphi...ted-Ashes-Singularity-Benchmark/Results-Avera




There is a lot of information in each graph, so be sure you pay close attention to what is being showcased and which bars represent which data. This first graph shows all the GPUs, resolutions and APIs running on the Core i7-5960X, the highest-end processor benchmarked. The first thing that stands out to me is how little difference there is between the DX11 and DX12 scores on the NVIDIA GTX 980 configuration. In fact, only the 1080p / Low preset shows a performance advantage at all for DX12 over DX11 in this case; the other three results show better DX11 performance!

The AMD results are very different – the DX12 scores are as much as 80% faster than the DX11 scores, giving the R9 390X a significant FPS improvement. So does that mean AMD's results are automatically the better of the two? Not really. Note the DX11 scores for the GTX 980 and the R9 390X – at 1080p / Low the GTX 980 averages 71.4 FPS while the R9 390X averages only 43.1 FPS. That is a massive gap! Under DX12 that comparison changes to 78.3 FPS vs 78.0 FPS – a tie. AMD's DX12 implementation with Ashes of the Singularity has, in this case, made up the DX11 deficit and brought the R9 390X to a performance tie with the GTX 980.
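
A quick arithmetic check on those figures (my own back-of-envelope, using only the numbers quoted in the text above):

```python
# Arithmetic check on PCPer's 1080p / Low numbers quoted above.
cards = {
    "GTX 980": {"dx11": 71.4, "dx12": 78.3},
    "R9 390X": {"dx11": 43.1, "dx12": 78.0},
}
for name, fps in cards.items():
    uplift = (fps["dx12"] / fps["dx11"] - 1) * 100
    print(f"{name}: {fps['dx11']} -> {fps['dx12']} FPS ({uplift:+.1f}%)")
# GTX 980: 71.4 -> 78.3 FPS (+9.7%)
# R9 390X: 43.1 -> 78.0 FPS (+81.0%)
```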
Computerbase

http://www.computerbase.de/2015-08/...diagramm-ashes-of-the-singularity-3840-x-2160





Eurogamer / Digital Foundry

http://www.eurogamer.net/articles/digitalfoundry-2015-ashes-of-the-singularity-dx12-benchmark-tested




Wccftech


NVIDIA: We Don’t Believe AotS Benchmark To Be A Good Indicator Of DX12 Performance
http://wccftech.com/nvidia-we-dont-believe-aots-benchmark-a-good-indicator-of-dx12-performance/

'Nvidia mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode. More on that issue here. On top of that, the effect on their numbers is fairly inconsequential. As the HW vendor's DirectX 12 drivers mature, you will see DirectX 12 performance pull out ahead even further.'



We've offered to do the optimization for their DirectX 12 driver on the app side that is in line with what they had in their DirectX 11 driver. Though, it would be helpful if Nvidia quit shooting the messenger.

http://forums.oxidegames.com/470406
 
This is the important paragraph for me. :)

Being fair to all the graphics vendors

Often we get asked about fairness – that is, are we treating Nvidia and AMD equally? Are we working more closely with one vendor than another? The answer is that we have an open access policy. Our goal is to make our game run as fast as possible on everyone's machine, regardless of what hardware our players have.

To this end, we have made our source code available to Microsoft, Nvidia, AMD and Intel for over a year. We have received a huge amount of feedback. For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster, which we integrated into our code.

We only have two requirements for implementing vendor optimizations: We require that it not be a loss for other hardware implementations, and we require that it doesn’t move the engine architecture backward (that is, we are not jeopardizing the future for the present).

They felt a need to make a point of being unbiased. Did some optimizations cripple the competition? :D

On a serious note, I'm looking forward to some DX12 stuff...
 
As you all know, NVIDIA released the 355.60 driver specifically for Ashes of the Singularity's Alpha, which is in itself a rare occurrence for a game still in development. Even so, we registered mixed results in our DX12 performance benchmarks with NVIDIA cards, and clearly the company noticed all of this on its own, as it reached out to the press in order to give its side of the story.
We were able to get a detailed statement from NVIDIA's Brian Burke, Senior PR Manager. Here's what he had to say on the matter:
This title is in an early Alpha stage according to the creator. It's hard to say what is going on with alpha software. It is still being finished and optimized. It still has bugs, such as the one that Oxide found where there is an issue on their side which negatively affects DX12 performance when MSAA is used. They are hoping to have a fix on their side shortly.
We think the game looks intriguing, but an alpha benchmark has limited usefulness. It will tell you how your system runs a series of preselected scenes from the alpha version of Ashes of Singularity. We do not believe it is a good indicator of overall DirectX 12 gaming performance.
We've worked closely with Microsoft for years on DirectX 12 and have powered every major DirectX 12 public demo they have shown. We have the utmost confidence in DX12, our DX12 drivers and our architecture's ability to perform in DX12.
When accurate DX12 metrics arrive, the story will be the same as it was for DX11.
It should be noted that NVIDIA's mention of an MSAA performance bug while running on DX12 has been contested by developer Oxide Games, which published a blog post of its own addressing some "misinformation" being spread about the Ashes of the Singularity benchmark. They also dispute the claim that the test is not useful, of course:
It should not be assumed that because the game is not yet publicly out, it's not a legitimate test. While there are still optimizations to be had, Ashes of the Singularity in its pre-beta stage is as optimized as, or more optimized than, most released games. What's the point of optimizing code six months after a title is released, after all? Certainly, things will change a bit before release. But PC games with digital updates are always changing; we certainly won't hold back from making big changes post-launch if we feel it makes the game better!
There’s also this cryptic but seemingly ominous tweet by Brad Wardell, CEO of Stardock, which is publishing Ashes of the Singularity.

Our take
NVIDIA and Oxide/Stardock are at odds right now, and it's easy to understand why. This is the first publicly available benchmark of DX12 performance, and obviously NVIDIA would have liked different results from these; on the other hand, Oxide and Stardock aren't interested in taking the blame for all of this.
Obviously, it is impossible for us to say where the issue really lies, but there clearly is one right now. When it comes to consumers, though, there seems to be no cause for concern – whether the problem is in Ashes of the Singularity's Alpha or in NVIDIA's 355.60 driver, it will probably be fixed well before the game's release.
There's no reason to think that NVIDIA cards won't enjoy DX12 performance boosts fairly similar to what we have seen on AMD cards once the software is mature. As mentioned by Oxide in their blog post, DirectX 11 was quite terrible at first, and that went on for a few years; DirectX 12, on the other hand, seems in much better shape already, at least on the AMD side.
Let’s give it a bit more time and it will likely shine on NVIDIA cards as well.


Oh.... it looks ok to me.
 
Similarly, there is often an increase in FPS going from low settings to high settings.

Then there is the fact that in some of the Nvidia benches the DX12 result is lower than the DX11 result, which is a pretty obvious sign the game engine or driver has a flaw.

It's pretty black and white that the numbers coming out of that benchmark are complete junk, for both vendors. There are some glimpses that AMD will do well, but it is hard to trust any of it when the numbers have fundamental flaws. Increasing the resolution should never increase average FPS.

If AMD had this problem I would tend to agree it's an engine problem, but they don't.

I can see why Nvidia are being so defensive; it looks like a driver issue on Nvidia's part.
 
The results I have seen so far are total garbage. Any benchmark that does not scale with resolution is totally flawed. Any DX12 bench that does not scale with resolution is even more so, as the whole idea of the new API is to remove the CPU bottleneck.

I will only take any new DX12 bench seriously when it scales in the same way as Heaven 4 both with resolution and number of GPUs.

I expect there will be a new version of the Heaven bench that uses DX12 and I think that will be a far better guide.


You mean like higher FPS at lower res or lower IQ? It is doing that :p
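
For anyone wanting to check this themselves, here's a rough heuristic sketch (numbers invented for illustration): if average FPS barely drops as resolution rises, the run is CPU/driver-limited, which is exactly the regime a low-overhead API like DX12 is designed to stress.

```python
# Rough heuristic sketch with made-up numbers: flat FPS across
# resolutions suggests a CPU/driver bottleneck, not a GPU one.
runs = {"1080p": 78.0, "1440p": 77.5, "2160p": 61.0}  # avg FPS per resolution

steps = list(runs)
for lo, hi in zip(steps, steps[1:]):
    drop = (1 - runs[hi] / runs[lo]) * 100
    verdict = "CPU/driver-bound" if drop < 5 else "GPU-bound"
    print(f"{lo} -> {hi}: {drop:.1f}% drop ({verdict})")
```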
 
To be fair it was Nvidia's Drivers.

'Nvidia mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode. More on that issue here. On top of that, the effect on their numbers is fairly inconsequential. As the HW vendor's DirectX 12 drivers mature, you will see DirectX 12 performance pull out ahead even further.'



We've offered to do the optimization for their DirectX 12 driver on the app side that is in line with what they had in their DirectX 11 driver. Though, it would be helpful if Nvidia quit shooting the messenger.

http://forums.oxidegames.com/470406

I'm sure now that Nvidia know about it they will fix it. :)
 
I've noticed a few little niggles and bugs from my limited time playing around with it, but it seems like it's going to be a good benchmark. It's great to see all 16 threads getting fully utilised in a game.

Just think: consoles have been utilising multiple threads for a decade and a half. It's taken all that time, with Microsoft dragging their feet, before we can use more CPU power than the equivalent of a multi-core console CPU of the time.

High-powered GPUs can now stretch their legs.
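
To give a flavour of what "all threads utilised" means in practice, here's a toy Python sketch (nothing like real engine code, and all the names are invented) of fanning one frame's simulation work across every core:

```python
# Toy sketch only: split one frame's simulation work (AI, physics,
# firing solutions, etc.) across all available cores. Real engines
# use C++ job systems, not Python processes.
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_chunk(units):
    # Stand-in for one worker's share of the game-state update.
    return sum(u * u for u in units)

def run_frame(all_units, workers=os.cpu_count()):
    size = max(1, len(all_units) // workers)
    chunks = [all_units[i:i + size] for i in range(0, len(all_units), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    print(run_frame(list(range(100_000))))
```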
 
Just a decade – the Xbox 360 was the first 21st-century console to ditch the single-core style in favour of its tri-core design in 2005 (the Sega Saturn did have dual CPUs and multithreading in 1994, but Sega moved back to single core afterwards as it made things easier for developers).

Yeah, I don't care :p 10 years is 9 years too long.
 
This thread man. This thread.

Looking forward to seeing more benchmarks from other DX12 titles now!


Me too, so much so that I have taken to doing it myself. I spent the last year preparing in CryEngine, but I've just switched to Unreal Engine and started from scratch, with Crytek dragging their feet on getting a DX12 engine out.
 
Some real gains for AMD and Nvidia; great to see AMD's GPUs showing some nice leaps in performance.

Exactly, and they perform about equally. I don't get why Nvidia are so upset by it; what did they want to see, AMD falling way behind?
After all their DX12 PR, maybe.

It's all good really :)
 
I may be in the minority here, but I am really not looking forward to the DX12 future from the way it's shaping up.

If we get to the point where, as it looks like we will, £500 CPUs actually offer noticeable improvements over £250 ones, then that frees up the GPUs to run free and we're back to the Y2K era of having to replace our systems every year to keep running stuff at decent settings :(

People can complain all they like about how game improvements have slowed dramatically over the past 5+ years, but I kind of like the fact that you can still play new games at decent settings on a 3-year-old setup.

Maybe I'm just getting old lol, I remember the past and have no desire to relive it :P

This benchmark is an extreme example: an i5 can push 13m draw calls per second at 33ms latency (30 FPS). Trust me, nothing is going to call more than 5m instances a second for a while yet.

Like the Star Swarm benchmark, this has been tweaked to call vast numbers of instances where a game ordinarily wouldn't.
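
Back-of-envelope on those figures (my own arithmetic; the DX11 comparison at the end is a rough rule of thumb, not a measured number):

```python
# Back-of-envelope check on the draw call figures quoted above.
calls_per_second = 13_000_000  # quoted i5 throughput under DX12
frame_time_s = 0.033           # 33 ms per frame, i.e. ~30 FPS
print(f"{calls_per_second * frame_time_s:,.0f} draw calls per frame")  # ~429,000
# For contrast, DX11 titles were typically budgeted at a few thousand
# to a few tens of thousands of calls per frame.
```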
 
True. But although this game is doing more, it is not pushing draw calls to the extreme; the performance gains come from the reduced driver overhead. That does not mean the CPU is doing nothing, though. I was quite surprised by AMDMatt showing 12 threads on his 5960X being used at 100%.

He has 3 or 4 Fury Xs.
 
What they wanted to see was something representative of the DX12 performance they see in other game engines, not something that is obviously filled with bugs, which is exactly what Nvidia publicly state. Really not hard to figure out, is it?

When the first DX12 benchmark is a complete dud, it is very disappointing.


The bugs are in Nvidia's Drivers. :)

'Nvidia mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode. More on that issue here. On top of that, the effect on their numbers is fairly inconsequential. As the HW vendor's DirectX 12 drivers mature, you will see DirectX 12 performance pull out ahead even further.'



We've offered to do the optimization for their DirectX 12 driver on the app side that is in line with what they had in their DirectX 11 driver. Though, it would be helpful if Nvidia quit shooting the messenger.

http://forums.oxidegames.com/470406
 