
Ashes of the Singularity Coming, with DX12 Benchmark in thread.

Not so much when it's bad PR.

DX12 is great for everyone though; they should probably just focus on that. Shame they can't work on DX12 without releasing some crap PR statement alienating the developer - I mean, that looks so good for Nvidia. A large number of their users will probably believe whatever stuff Nvidia farts out in a PR statement.

Before anyone decides to flame me, I should add that AMD are no better with releasing crappy PR statements. At least Nvidia doesn't have Roy Taylor. :p

Maybe on the broader issue of developers, Nvidia are slightly more 'robust' and wary of cementing themselves alongside developers. Look at how Arkham Knight turned out.....

Nvidia may or may not have had reason to place blame on Oxide, but regardless, the finished DX12 product will undoubtedly see parity between AMD and Nvidia in some aspects, and gains and losses in others, and as far as I'm concerned that's a good thing. If AMD pull off a blinder and Titan X prices fall as a result, I can go SLI sooner than I'd hoped! The last ten pages of this thread have just turned into the standard faceless comments and insults over software that is in its infancy, which is pretty childish and reddit-esque.

I genuinely can't see why people are getting themselves so angry over it, insulting and shouting each other down. Just be grateful that an improvement is on its way and enjoy it when it's released, ffs (not you, just a generalisation of the comments I've read today :eek:).

Anyway, I stand by my original comment in this thread, I hope the game is actually worth playing as an RTS and doesn't ride on the back of DX12.
 
I would love to see an MMO get low-abstraction API support. I can see WoW getting it fairly quickly when Vulkan drops, seeing that Blizzard are backers of it.

Same here. IMO there is a very good chance that WoW will be bumped from OpenGL to Vulkan before it goes from DX11 to DX12; this is OFC dependent on how well Apple support Vulkan in OSX.
 
Same here. IMO there is a very good chance that WoW will be bumped from OpenGL to Vulkan before it goes from DX11 to DX12; this is OFC dependent on how well Apple support Vulkan in OSX.

I think Apple are only supporting their own Metal API from what I have read, which is a shame. But it has shown improvements, and I think the next OSX interface uses Metal for the majority of its rendering, and many of the native OSX applications it ships with are getting support.

As much as I hate Apple as a company, OSX itself is not that bad.
 
Do you know what a stress test is? It seems not.

Apparently not, so please do enlighten me :) I bow to your superior knowledge, although that clearly doesn't extend to how an RTS GAME will eat all CPU time in much the same way as a stress test.


Also, yes, there are CONSTANT issues with HT even now, and no, I don't.

For you maybe, but the rest of the world has next to no issue with HT.

Because the internet is still full of complaints about it, and has been for the 13 years since HT was introduced?



The results I mentioned re: FX8370 vs. 5960X have been reproduced by others ... and it is ameliorated by disabling HT.

Maybe so - doesn't mean that everyone has an issue. Glad it fixes it for you in your unique circumstance.

The stutter and freezing in desktop use (in my use), and the frame drops in games, disappear when HT is disabled, except under highly threaded workloads where it could benefit.

If you are getting stutter and freezing on your desktop at idle, then something else is causing it - HT doesn't make your desktop stutter - how can it?

I have numerous PCs I manage here at work - the majority are i3s and so have HT, but also Xeons with HT, and several iterations of i7 with HT. In the past we have had Pentium 4s with HT. Not once has anyone ever complained about desktop stuttering - it is unique to your setup, whether it's something you have installed or some setting you have changed somewhere.

This is something I have never encountered on the FX6xxxs and FX8xxxs I've built for others, ever, yet have encountered on every single HT-enabled Intel CPU I've ever used (I've never used a Skylake or Haswell-E, but at least the latter certainly still exhibits this behaviour).

The FX6xxxs and FX8xxxs had exactly the same issues as HT first did, until patches were made available to manage the use of the shared FPUs.
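For anyone wondering what "managing the shared execution resources" can look like from the software side, here's a minimal Windows-only sketch (my own illustration, not what the actual scheduler patches did): it assumes logical processors come in sibling pairs per physical core (0/1, 2/3, ...), which is typical on HT parts but should really be verified with GetLogicalProcessorInformation, and it simply restricts the process to one logical processor per pair - a crude software stand-in for switching HT off in the BIOS.

[CODE]
// Minimal Windows-only sketch: restrict the current process to one logical
// processor per sibling pair. Assumes siblings are numbered 0/1, 2/3, ...,
// which is typical on HT CPUs but should be verified in real code.
#include <windows.h>
#include <iostream>

int main() {
    SYSTEM_INFO si;
    GetSystemInfo(&si);                              // count of logical processors

    DWORD_PTR mask = 0;
    for (DWORD i = 0; i < si.dwNumberOfProcessors; i += 2)
        mask |= ((DWORD_PTR)1 << i);                 // take every other logical CPU

    if (SetProcessAffinityMask(GetCurrentProcess(), mask))
        std::cout << "Affinity mask set to 0x" << std::hex << mask << "\n";
    else
        std::cerr << "SetProcessAffinityMask failed: " << GetLastError() << "\n";
    return 0;
}
[/CODE]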
 
I wonder how this thread would have gone had Nvidia not made such a public fracas with Oxide Games.

Just looking at the results, it looks like a good thing for both, and there is no reason why that shouldn't have been the conclusion of the debate.

Yup.:D



100% agree with Joel Hruska's conclusion, opeth, thanks mate. :)

On another note:

Any mods able to step in here and help put a stop to the hating in here, please:



^
This. As Greg said, the discussions were better than they are now, because you can't have a discussion any more - only fights and snipes @ OcUK. :o

I've not been posting in here much due to the relentless arguments. Please sort it out; this used to be a great place to come :( and we really need it back. :)

The 3-strike rule launched, and it was great - everyone got on and it was good again. Is it still enforced, or could you reinstate it again?

+1 ^^^^
 
I would love to see an MMO get low-abstraction API support. I can see WoW getting it fairly quickly when Vulkan drops, seeing that Blizzard are backers of it.

You are a big, big optimist if you think that. I don't see WoW getting that sort of update any sooner than 2 years from now, at the very least.
 
I wonder how this thread would have gone had Nvidia not made such a public fracas with Oxide Games.

Just looking at the results, it looks like a good thing for both, and there is no reason why that shouldn't have been the conclusion of the debate.



+1 ^^^^

They didn't make a fracas, and most of the talk has come from here, accusing Nvidia of this and that. I read through this thread again earlier to see if I was right to get annoyed, and my annoyance was nothing to do with the thread, or Nvidia, or AMD, but with the way some people are overreacting and being frankly quite personal in their responses. I did some quick reading when I got in from work last night: Oxide Games is made up of some knowledgeable people from the industry, and they have sided with AMD, but that doesn't make them wrong. They and AMD have done well on their DX12 implementation, and Nvidia do need to do some work on their DX12 here, but neither is doing badly, so I don't see what all the fuss is about.
 
They didn't make a fracas, and most of the talk has come from here, accusing Nvidia of this and that. I read through this thread again earlier to see if I was right to get annoyed, and my annoyance was nothing to do with the thread, or Nvidia, or AMD, but with the way some people are overreacting and being frankly quite personal in their responses. I did some quick reading when I got in from work last night: Oxide Games is made up of some knowledgeable people from the industry, and they have sided with AMD, but that doesn't make them wrong. They and AMD have done well on their DX12 implementation, and Nvidia do need to do some work on their DX12 here, but neither is doing badly, so I don't see what all the fuss is about.

This whole debate, including you, is about Nvidia accusing Oxide Games of having code issues that put Nvidia at a disadvantage.

Now personally I don't see any disadvantage for Nvidia in this. AMD are a little ahead in DX12 efficiency, but not by much at all, and they have that same small advantage in Futuremark's API Overhead test.

As for the apparent MSAA bug, that was Nvidia's driver.

All of that has sparked arguments in this thread, and you were also involved, Greg.

If you just look at the results without Nvidia's whinging, they look like a good thing for both Nvidia and AMD.
That, ultimately, is what they are.
 
You have failed to read my post, Humbug. My annoyance is with the way people are treating each other and dumbass remarks like 'fanboy', etc. I used to enjoy a good debate, and me and you have been involved in many, but some of the comments in here are pathetic, and in the olden days me and you would have seen a 3-month suspension if we said those things.

That is my point and the reason I got annoyed.
 
You are a big, big optimist if you think that. I don't see WoW getting that sort of update any sooner than 2 years from now, at the very least.

WoW was updated to DX11 less than a year after it was released (DX11, not WoW ;P), so it all depends on how Vulkan/DX12 do.

If Apple support Vulkan well (or at least at all) it will be worth Blizzard porting the OSX version from OpenGL to Vulkan, which in turn makes it worth porting the DX11 version to Vulkan (as it was going to be ported to DX12 anyway and this means Blizzard can use one new API across both platforms).

If Apple fail to support Vulkan properly as they did OpenGL then Blizzard will probably just port DX11 to DX12 and leave OSX on OpenGL.



me and you

you and I ;)
 
You have failed to read my post, Humbug. My annoyance is with the way people are treating each other and dumbass remarks like 'fanboy', etc. I used to enjoy a good debate, and me and you have been involved in many, but some of the comments in here are pathetic, and in the olden days me and you would have seen a 3-month suspension if we said those things.

That is my point and the reason I got annoyed.

I'm not quite that far in the crap as to get a 3-month suspension - I have had 1 strike - but I take your point. Moderating seems very inconsistent in here; it's why I +1'd Tommy.
 
Problem is, WoW have lost nearly half their subscribers, so it will depend on what happens and whether they see any advantage to upgrading, given how old the whole thing is.

The next expansion is in development though, and that always causes a surge of re-subscribers (albeit for a short time), so maybe they'll get it in time for that.
 
Problem is, WoW have lost nearly half their subscribers, so it will depend on what happens and whether they see any advantage to upgrading, given how old the whole thing is.

Even on new hardware it still causes massive FPS drops in raids and built-up areas in cities. DX12 or the likes of Vulkan could make the game run so much better! I remember when I was running the game I was getting sub-50 FPS and thinking, on this old s.... no way. Turned on DX11 and it ran loads better, but I still thought I should be getting more FPS.
 
Indeed R7, it will also prevent the rendering thread from locking up the game thread, so it should further reduce latency and hitching, since the renderer no longer has to wait on driver callbacks etc.
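To show what that decoupling buys, here's a rough C++ sketch of the general idea (not engine or driver code; FramePacket and the frame count are made up for illustration): the game thread hands each frame off to a queue and carries on straight away, while a separate render thread pays the submission cost on its own time.

[CODE]
// Rough producer/consumer sketch: the game thread never blocks on submission;
// a dedicated render thread drains the queue and does the (simulated) submit.
#include <chrono>
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

struct FramePacket { int frameIndex; };   // stand-in for recorded command data

std::queue<FramePacket> g_queue;
std::mutex              g_mutex;
std::condition_variable g_cv;
bool                    g_done = false;

void renderThread() {
    for (;;) {
        std::unique_lock<std::mutex> lock(g_mutex);
        g_cv.wait(lock, [] { return !g_queue.empty() || g_done; });
        if (g_queue.empty() && g_done) break;
        FramePacket packet = g_queue.front();
        g_queue.pop();
        lock.unlock();                    // game thread can keep queuing from here on

        // Simulated submission/driver cost: only the render thread pays for it,
        // instead of it stalling the game thread as a blocking call would.
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
        std::cout << "submitted frame " << packet.frameIndex << "\n";
    }
}

int main() {
    std::thread renderer(renderThread);

    for (int i = 0; i < 10; ++i) {        // illustrative frame loop
        // Game/simulation work for frame i would happen here, never waiting on the GPU.
        {
            std::lock_guard<std::mutex> lock(g_mutex);
            g_queue.push(FramePacket{i}); // hand the frame off and move on immediately
        }
        g_cv.notify_one();
    }

    {
        std::lock_guard<std::mutex> lock(g_mutex);
        g_done = true;
    }
    g_cv.notify_one();
    renderer.join();
    return 0;
}
[/CODE]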
 
This is just straight-up bizarre. ArsTechnica's benchmarks indicate that the 980 Ti actually *loses* performance under DX12.

I don't know what to make of it. It was always clear that AMD were behind in their drivers when it came to DX11 performance, but was their GPU architecture always so far ahead of the game that, when 'unlocked' with DX12, their cards perform just as well as much more expensive Nvidia cards? Have Nvidia simply focused too much on DX11 serial functionality?

Interesting times ahead, it looks like.
 
This is just straight-up bizarre. ArsTechnica's benchmarks indicate that the 980 Ti actually *loses* performance under DX12.

I don't know what to make of it. It was always clear that AMD were behind in their drivers when it came to DX11 performance, but was their GPU architecture always so far ahead of the game that, when 'unlocked' with DX12, their cards perform just as well as much more expensive Nvidia cards? Have Nvidia simply focused too much on DX11 serial functionality?

Interesting times ahead, it looks like.

Most of the reviews show Nvidia losing performance in DX12, hence the discussions. But this is one game... in the other demos we've heard of, Nvidia gains performance under DX12, so what is happening with this one is up for debate... Nvidia say it's the devs' fault, the devs blame Nvidia, and round and round it goes on the forums.
 