Ashes of the Singularity Coming, with DX12 Benchmark in thread.

Just think, consoles have been utilizing multiple threads for a decade and a half

Just a decade; the Xbox 360 was the first 21st-century console to ditch the single-core style in favour of its tri-core CPU in 2005 (the Sega Saturn did have dual CPUs and multithreading in 1994, but SEGA moved back to single core afterwards as it made things easier for developers).
 

Yeah, I don't care :p 10 years is 9 years too long.
 
Indeed, I agree. My point is that some of the people on this forum try to "act" (in a very poor way) non-biased, yet always seem to dig at AMD for every little detail, regardless of how small, then "conveniently" ignore Nvidia's problems until they are huge.

Yet THIS time it's quite a small problem, most likely just needing a driver update or something, and yet the amount of damage control is hilarious!

Very true dude. Sadly it's as bad whichever side of the fence you sit on, despite what the other side would have you believe ;)
 

Well, the Nvidia crew would like to believe that their hardware is better, but they cannot take it when AMD's cards are finally on par with Nvidia's, all due to driver overhead reduction.

AMD's results are not even beating the Nvidia results, just matching them, yet the fanboys are getting worried. Imagine the uproar if AMD pulled ahead by a larger margin... :p
 

Haha yes, I imagine. As much as I try not to be, I am biased towards AMD, although I like to think I am not a fanboy. I think it's just because they generally offer better price to performance and I am a massive penny pincher. Everyone is biased though; it's kinda human nature...
 
Wow!
Some users on here go to great lengths to get others in trouble.

About higher resolutions and frame rates: since switching from 1080p to 1440p I have noticed a couple of games show no sign of a performance drop. In fact, GTA5 performance is better for me, and RO2 is another game where I have noticed better performance at the higher resolution. This is not always the case though; some games do indeed show a decrease in performance.

But what I am saying is that not all games drop in performance at higher resolutions. I wanted to showcase this with GTA5 but the servers are down.
 
This thread man. This thread.

Looking forward to seeing more benchmarks from other DX12 titles now!


Me too, so much so that I have taken to doing it myself. I spent the last year preparing in CryEngine but have just switched to Unreal Engine and started from scratch, as Crytek are dragging their feet on getting a DX12 engine out.
 
Remind me again how the increasingly thin-skinned and defensive AMD fans react to anything with an Nvidia logo on it.

 
'Nvidia mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode. More on that issue here. On top of that, the effect on their numbers is fairly inconsequential. As the HW vendor's DirectX 12 drivers mature, you will see DirectX 12 performance pull out ahead even further.'



We've offered to do the optimization for their DirectX 12 driver on the app side that is in line with what they had in their DirectX 11 driver. Though, it would be helpful if Nvidia quit shooting the messenger.

http://forums.oxidegames.com/470406

Owned? Looks like Oxide Games just made Nvidia look silly! And look how quick Nvidia are on the draw to shoot down someone else when it's them!

Quite annoyed with Nvidia for this tbh! It was their driver after all!
 

Are you kidding? It goes both ways. If something runs better on Nvidia, that is Nvidia's fault, or the developer's, or GameWorks', or, or, or...

But if it runs better on AMD, that is because of the awesome AMD hardware and DX12 performance, etc. We all want AMD and Nvidia competing, but it gets boring watching the same old reactions. This applies vice versa.
 
I did think it was funny that Nvidia blamed Oxide for buggy MSAA support. Given that MSAA is built into DX12 (well, it is in DX11 anyway) and you just have to enable it, it's all handled by the API. Unless they removed support for that in DX12 (which wouldn't be too surprising given what DX12's aim is), I don't see how it could be Oxide's fault at all.
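For anyone wondering what "enabling it through the API" roughly looks like, here is a purely illustrative C++ sketch (not from Ashes or either vendor's code; the helper name PickMsaa and its fallback loop are made up) of asking a D3D12 device which MSAA sample count it supports for a given format and picking one for the pipeline state:

```cpp
// Rough illustration only: a hypothetical helper (PickMsaa is a made-up name) that
// asks a D3D12 device which MSAA sample count it supports for a given format.
// The MSAA render target and resolve step, which D3D12 leaves to the application,
// are not shown here.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>

// Returns the highest supported sample count up to `wanted` (assumed power of two),
// falling back to 1 (MSAA off) if nothing higher is supported for this format.
DXGI_SAMPLE_DESC PickMsaa(ID3D12Device* device, DXGI_FORMAT format, UINT wanted)
{
    DXGI_SAMPLE_DESC desc = { 1, 0 };   // Count = 1 means no MSAA
    for (UINT count = wanted; count > 1; count /= 2)
    {
        D3D12_FEATURE_DATA_MULTISAMPLE_QUALITY_LEVELS levels = {};
        levels.Format      = format;
        levels.SampleCount = count;
        levels.Flags       = D3D12_MULTISAMPLE_QUALITY_LEVELS_FLAG_NONE;

        if (SUCCEEDED(device->CheckFeatureSupport(
                D3D12_FEATURE_MULTISAMPLE_QUALITY_LEVELS, &levels, sizeof(levels))) &&
            levels.NumQualityLevels > 0)
        {
            desc.Count   = count;
            desc.Quality = 0;           // standard sample pattern
            break;
        }
    }
    // The result would then go into D3D12_GRAPHICS_PIPELINE_STATE_DESC::SampleDesc
    // and the description of the MSAA render target resource.
    return desc;
}
```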
 

They all try to pass the blame, we have seen it numerous times from both vendors.
 
Some real gains for AMD and Nvidia; great to see AMD's GPUs showing some nice leaps in performance.
 
So apart from people being more focused on what number is displayed in the corner of their screen whilst playing the game, is the game actually going to be worth purchasing or is it going to ride on the crest of an initial DX12 release?

I like RTS games and like to see innovation in them but with everyone bickering over decimal points in scores I haven't seen anything to indicate the longevity, gameplay and immersion in the actual game!
 

Exactly, and they perform about equally. I don't get why Nvidia are so upset by it; what did they want to see, AMD falling way behind? After all their DX12 PR, maybe.

It's all good really :)
 
The problem with the above is that the scores are not reflecting the changes in resolution.

It could be argued that some of the CPUs will find running this bench difficult, but I am pretty sure that if a 5960X with 8 cores/16 threads is struggling to show the difference, something is very wrong with the bench. :)

Edit: looking at the 6700K scores, I would say the 5960X is only using 4 cores/8 threads.

I know you like synthetic benches, but for things like DX12 is it really best to use a bench that deliberately tries to avoid any kind of CPU bottleneck while testing the GPU, thus ignoring one of the main points of DX12?

Different benchmarks are suited to different things and scores won't necessarily change based on resolution if CPU-bound - just because it isn't relevant to your setup doesn't make it a useless bench. Given the bench shows GPU scaling just fine with a high-end CPU I'm not really sure what you're so worked up about anyway. Yes, being able to separately bench CPU & GPU is nice, but real games all use both so combined tests have relevance too. I'm not saying synthetic ones that try to avoid bottlenecks on one side or the other shouldn't exist, but to suggest anything that uses more of the system is pointless suggests you've forgotten that other people do more than bench their machines.

Of course, if we're not thinking about buying this game then it being 'real game performance' or a moderate approximation of this becomes far less relevant :p

Edit: Not saying this is a great bench, may be super-flawed for some reason or another, just saying that it being possible to be CPU or GPU bound is not inherently bad.
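To put the CPU-bound point above into a toy example (all the numbers here are invented and have nothing to do with the actual Ashes results), a small self-contained C++ sketch: frame time is simply whichever of the CPU or GPU is slower, and only the GPU cost scales with resolution, so the fps figure barely moves until the GPU finally becomes the bottleneck.

```cpp
// Toy model with invented numbers: frame time = max(CPU time, GPU time), and only
// the GPU side grows with resolution, so fps is flat while the CPU is the bottleneck.
#include <algorithm>
#include <cstdio>

int main()
{
    const double cpu_ms          = 12.0;  // assumed fixed per-frame CPU cost
    const double gpu_ms_per_mpix = 3.0;   // assumed GPU cost per megapixel rendered

    const struct { const char* name; double mpix; } resolutions[] = {
        { "1080p", 1920.0 * 1080.0 / 1e6 },
        { "1440p", 2560.0 * 1440.0 / 1e6 },
        { "2160p", 3840.0 * 2160.0 / 1e6 },
    };

    for (const auto& r : resolutions)
    {
        const double gpu_ms   = gpu_ms_per_mpix * r.mpix;
        const double frame_ms = std::max(cpu_ms, gpu_ms);  // slower side sets the pace
        std::printf("%s: GPU %5.1f ms, frame %5.1f ms -> %3.0f fps (%s-bound)\n",
                    r.name, gpu_ms, frame_ms, 1000.0 / frame_ms,
                    gpu_ms > cpu_ms ? "GPU" : "CPU");
    }
    return 0;
}
```

With these made-up costs, 1080p and 1440p land on the same fps because the modelled CPU time dominates, which is roughly the pattern described in the GTA5/RO2 post earlier in the thread.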
 
So apart from people being more focused on what number is displayed in the corner of their screen whilst playing the game, is the game actually going to be worth purchasing or is it going to ride on the crest of an initial DX12 release?

I like RTS games and like to see innovation in them but with everyone bickering over decimal points in scores I haven't seen anything to indicate the longevity, gameplay and immersion in the actual game!

If it's anything like the first Supreme Commander it will be a great game. We haven't had any recent RTS releases AFAIK, so I can't wait.
 

I may be in the minority here, but I am really not looking forward to the DX12 future from the way it's shaping up.

If we get to the point, as it looks like we will, where £500 CPUs actually offer noticeable improvements over £250 ones, then that frees up the GPUs to run wild and we're back to the Y2K era of having to replace our systems every year to keep running stuff at decent settings :(

People can complain all they like about how game improvements have slowed dramatically over the past 5+ years; however, I kind of like the fact that you can still play new games at decent settings using a 3-year-old setup.

Maybe I'm just getting old lol, I remember the past and have no desire to relive it :P
 