Ashes of the Singularity Coming, with DX12 Benchmark in thread.

Stands to reason to me that AMD would have a lead in performance and drivers, given they have been optimising for Mantle since at least the HD 7xxx series, and aspects of Mantle made it into both Vulkan and DX12. Seems they would have a pretty decent lead in DX12 optimisations, be that hardware or software.

Except Mantle has nothing to do with DX12.
 
Stands to reason to me that AMD would have a lead in performance and drivers, given they have been optimising for Mantle since at least the HD 7xxx series, and aspects of Mantle made it into both Vulkan and DX12. Seems they would have a pretty decent lead in DX12 optimisations, be that hardware or software.

Nvidia's response seemed a bit panicky, which only gives credence to the speculation of some that AMD's hardware may be better at DX12. Perhaps, though, they just have a long memory and ATI's massive dominance in early DX9 gave them a few shivers.

Personally, I think it's a non-issue at the moment, but time will tell, I guess.

Think I'll just enjoy the show

But Nvidia have stated that they have been working with Microsoft on DirectX 12 for the past four years, regardless of what AMD may have been doing with their own low-level API.

In all honesty, it is just a case of the Nvidia side of things not seeing what they wanted, i.e. the AMD cards remaining behind the Nvidia ones.

In the GPU-bound situations I don't see things changing that drastically when a pure GPU benchmark comes out, considering Nvidia have already been working with Stardock to help implement faster shaders that were originally running slower on their own hardware.

But everyone is a winner overall: although not as drastic as with AMD, the Nvidia cards still saw a boost in performance.

And I had a bit more of a think on why the Nvidia cards may have lost a few FPS. What I came up with was the massive number of extra point lights the DirectX 12 version of the game now has compared to the DX11 version. So essentially the DX12 version of the game has higher IQ yet is running within a few FPS of the same performance. But then again, it could all be pure statistical error.
 
This +100


The only thing I would say about this benchmark is: if this is indicative of the final game, then you definitely don't want to be running it on an AMD card without DX12. Truly shocking performance.

I've just read that the shocking performance on AMD in DirectX 11 is deliberate, to make the DX12 result look better. Well, I suppose it takes all sorts to wear a tin foil hat. :p


LtMatt, I assume the Theoretical CPU Framerate result goes up and down with higher or lower CPU clocks?



Oh and one last thing. Called it back on post 19 ;)

If this is anything like DX12 then game developers simply won't touch it.

Luckily for gamers, the benchmark is hopelessly wrong.

The above is a good point about DX11. The 390X is a very good card and often beats out the GTX 980 at high resolutions, which raises the question:

If most users who buy the game will still be using DX11, why have the game devs made so little effort to look after them?

The above also calls into question how reliable this bench is for DX12 if the devs have given so little thought to DX11 users.
 
The above is a good point about DX11. The 390X is a very good card and often beats out the GTX 980 at high resolutions, which raises the question:

If most users who buy the game will still be using DX11, why have the game devs made so little effort to look after them?

The above also calls into question how reliable this bench is for DX12 if the devs have given so little thought to DX11 users.

Why would anyone with a DX12 capable card use DX11?

I'm sure there are still some VLIW and Fermi GPUs out there in rigs, but it can't be many; those cards belong in a museum.
 
Why would anyone with a DX12 capable card use DX11?

I'm sure there are still some on VLIW and Fermi GPUs, but it can't be many; those cards belong in a museum.

Because the majority of Windows users won't be upgrading to Win 10; there will be loads who stay on Win 7 and 8 for reasons of their own.
 
The above is a good point about DX11. The 390X is a very good card and often beats out the GTX 980 at high resolutions, which raises the question:

If most users who buy the game will still be using DX11, why have the game devs made so little effort to look after them?

The above also calls into question how reliable this bench is for DX12 if the devs have given so little thought to DX11 users.

What?
I don't know of any game that can handle that many units at those detail settings on DX11.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/

Oxide also is very proud of its DX11 engine. As a team, we were one of the first groups to use DX11 during Sid Meier’s Civilization V, so we’ve been using it longer than almost anyone and know exactly how to get the most performance out of it. However, it took 3 engines and 6 years to get to this point. We believe that Nitrous is one of the fastest, if not the fastest, DX11 engines ever made.

It would have been easy to engineer a game or benchmark that showed D3D12 simply destroying D3D11 in terms of performance, but the truth is that not all players will have access to D3D12, and this benchmark is about yielding real data so that the industry as a whole can learn. We’ve worked tirelessly over the last years with the IHVs and quite literally seen D3D11 performance more than double in just a few years time. If you happen to have an older driver laying around, you’ll see just that. Still, despite these huge gains in recent years, we’re just about out of runway.

Tbh, DX11 performance in that bench is awesome. AMD just suffers from high overhead (the same thing that has been a hot topic for the past couple of years now).
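The "high overhead" point is essentially about per-draw-call CPU cost. As a rough illustration (a toy model with made-up numbers, not real driver measurements), here is how the number of draw calls that fit in a 60 fps frame scales with the CPU cost of each submission:

```python
# Toy model of draw-call submission overhead. The per-call costs below are
# invented for illustration; only the ratio between them matters.

FRAME_BUDGET_US = 1_000_000 / 60  # ~16,667 microseconds per frame at 60 fps

def max_draw_calls(per_call_cost_us):
    """Draw calls that fit in one 60 fps frame if each costs this much CPU time."""
    return int(FRAME_BUDGET_US // per_call_cost_us)

# Hypothetical: a heavyweight DX11-style submission path vs a thin
# DX12-style one that skips most driver-side validation and state tracking.
dx11_like = max_draw_calls(25.0)  # assume 25 us of CPU work per draw
dx12_like = max_draw_calls(2.5)   # assume 2.5 us per draw

print(dx11_like, dx12_like)  # 666 6666
```

A tenfold drop in per-call cost means an order of magnitude more distinct objects per frame before the CPU, rather than the GPU, becomes the bottleneck, which is why the CPU-bound results in this benchmark move so much more than the GPU-bound ones.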
 
Because the majority of Windows users won't be upgrading to Win 10; there will be loads who stay on Win 7 and 8 for reasons of their own.

Point taken and agreed, but really it's their loss. Windows 10 with Classic Shell is just like Windows 8.1 with Classic Shell, which is a lot like Windows 7... and it's free right now.
 
What?
I don't know of any game that can handle that many units at those detail settings on DX11.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/



Tbh, DX11 performance in that bench is awesome. AMD just suffers from high overhead (the same thing that has been a hot topic for the past couple of years now).

The DX11 performance on the AMD cards in the bench is shocking, and it has nothing to do with the hardware; this is an epic and arrogant fail by the game devs.

I still use 290Xs and I fully expect them to best GTX 980s @2160p using DX11.
 
But Nvidia have stated that they have been working with Microsoft on DirectX 12 for the past four years, regardless of what AMD may have been doing with their own low-level API.

In all honesty, it is just a case of the Nvidia side of things not seeing what they wanted, i.e. the AMD cards remaining behind the Nvidia ones.

In the GPU-bound situations I don't see things changing that drastically when a pure GPU benchmark comes out, considering Nvidia have already been working with Stardock to help implement faster shaders that were originally running slower on their own hardware.

But everyone is a winner overall: although not as drastic as with AMD, the Nvidia cards still saw a boost in performance.

And I had a bit more of a think on why the Nvidia cards may have lost a few FPS. What I came up with was the massive number of extra point lights the DirectX 12 version of the game now has compared to the DX11 version. So essentially the DX12 version of the game has higher IQ yet is running within a few FPS of the same performance. But then again, it could all be pure statistical error.

Yeah, I'm only speculating myself. Primarily I see DX12 as a revelation in CPU performance, but the GPU tests will pit AMD and Nvidia against each other directly, and I don't really expect to see much difference between them. I do expect there to be a massive difference when compared to DX11, given AMD's CPU overheads in that API.

I'm just hoping that AMD go back and rework some of their DX11 code. Seems to me that they invested heavily in low-level APIs and that's clearly paid off for them, but legacy DX11 is going to be with us for a while no matter how fast DX12 is adopted.
 
The DX11 performance on the AMD cards in the bench is shocking, and it has nothing to do with the hardware; this is an epic and arrogant fail by the game devs.

I still use 290Xs and I fully expect them to best GTX 980s @2160p using DX11.

Kaap, mate :) even I accept that AMD's DX11 driver overheads still need work.

I also think AMD have largely ignored their own DX11 optimisation in order to put as much resource as they can into getting DX12 drivers right, which seems to have paid off... but somewhat compounded the DX11 problem.
 
The problem with AMD is they come off as a company with too few resources and too many fronts to fight.

It's only a matter of time until they get stretched again and fall behind in a certain area.

Either CPU, GPU drivers, game optimisations, CrossFire performance, etc.

They cannot fight both Intel and Nvidia; it's impossible.
 
Or 900 thousand strands of grass and 80 thousand trees and shrubs, to make one's world of vegetation look like something that actually resembles reality.

I'm not sure you understood what I meant by assets. I'm not referring to texture density, or the ability to draw the same thing a bazillion times, or to draw an endless tessellated water table unseen, but assets in the sense of what Oxide refers to as units: the ability to have thousands or hundreds of thousands of independent units. That's the crux of this whole thing that DX12 brings to this RTS game.
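One reason DX12 copes with unit counts like that is that command lists for independent units can be recorded on several CPU threads and then submitted in order, instead of everything funnelling through one immediate context as in DX11. A minimal sketch of that pattern (illustrative names only, no real graphics API involved):

```python
# Sketch of parallel command-list recording for many independent units,
# the pattern DX12-style APIs allow. record_commands() is a stand-in for
# per-unit draw recording; it does not call a real graphics API.
from concurrent.futures import ThreadPoolExecutor

def record_commands(units):
    """Record a 'command list' for one slice of the unit population."""
    return [("draw", u) for u in units]

def build_frame(units, workers=4):
    # Partition units across worker threads and record in parallel...
    chunks = [units[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        command_lists = list(pool.map(record_commands, chunks))
    # ...then submit the recorded lists in a single ordered pass,
    # as D3D12 does with ExecuteCommandLists.
    return [cmd for cl in command_lists for cmd in cl]

frame = build_frame(range(100_000))
print(len(frame))  # 100000
```

The recording step scales with cores while the final submission stays a cheap, ordered loop, which is why per-unit work stops being a single-thread bottleneck.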
 
Nvidia would've been better off just saying nothing rather than releasing this crappy PR statement, which does nothing other than make them look stupid.

In fact, it doesn't. It's only lowly forum members that say that.

If Nvidia have had that code for months like Oxide say, and have had updates to the code to compile on a daily basis like Oxide say, then I'm pretty sure that Oxide have something they need to look at. Oxide are a two-bit dev heavily involved with AMD, so like AMD they will say anything to have a go at Nvidia. In fact, Oxide have said themselves that there are issues with their MSAA code.
 
In fact, Oxide have said themselves that there are issues with their MSAA code.

They never said the problem was with their MSAA code; they said the MSAA path in the current DX12 drivers is broken/suboptimal. Hence Nvidia and everyone else need to fix it in their drivers. And they even offered to implement a workaround by using a DX11-style MSAA shader instead.
 
In fact, it doesn't. It's only lowly forum members that say that.

If Nvidia have had that code for months like Oxide say, and have had updates to the code to compile on a daily basis like Oxide say, then I'm pretty sure that Oxide have something they need to look at. Oxide are a two-bit dev heavily involved with AMD, so like AMD they will say anything to have a go at Nvidia. In fact, Oxide have said themselves that there are issues with their MSAA code.

Nvidia had the code long enough to create a driver for it... Also, Oxide said anything but that; their statement put the blame on Nvidia's drivers for the poor performance.
 
In fact, it doesn't. It's only lowly forum members that say that.

If Nvidia have had that code for months like Oxide say, and have had updates to the code to compile on a daily basis like Oxide say, then I'm pretty sure that Oxide have something they need to look at. Oxide are a two-bit dev heavily involved with AMD, so like AMD they will say anything to have a go at Nvidia. In fact, Oxide have said themselves that there are issues with their MSAA code.

By that logic, every dev involved with Nvidia is a two-bit dev too, right?

Nvidia cards are getting the same performance as AMD, so what's the problem?
 