
AMD Confirms GCN Cards Don’t Feature Full DirectX 12 Support – Feature Level 11_1 on GCN 1.0, Featur

It's because all the Mantle games made until now were built within DX11's limitations (their engines target DX11), so that users who can't use Mantle can still play. So there was never really a need to push it. If you don't cross DX11's limits, Mantle can't show why it's better. Example: Mantle can handle many times more draw calls than DX11, but if the game only issues as many as DX11 can handle, you will see no difference. If it issues more, DX11 collapses while Mantle carries on.
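That draw-call argument can be put into a toy model (all numbers below are made up for illustration, not real API timings): frame time is bounded by whichever is larger, the CPU's submission cost or the GPU's render time, so a cheaper API only shows up once the draw-call count pushes the CPU past the GPU.

```python
# Toy model of the CPU-bound draw-call limit (hypothetical numbers only).
# Frame time is the larger of CPU submission time and GPU render time.

def frame_time_ms(n_draw_calls, cpu_us_per_call, gpu_ms=16.0):
    cpu_ms = n_draw_calls * cpu_us_per_call / 1000.0
    return max(cpu_ms, gpu_ms)

DX11_OVERHEAD_US = 10.0   # hypothetical per-call CPU cost under DX11
MANTLE_OVERHEAD_US = 1.0  # hypothetical, much cheaper submission

for calls in (1_000, 5_000, 50_000):
    dx11 = frame_time_ms(calls, DX11_OVERHEAD_US)
    mantle = frame_time_ms(calls, MANTLE_OVERHEAD_US)
    print(calls, dx11, mantle)
```

At 1,000 calls both paths are GPU-bound and identical; only past the CPU limit does the gap open up, which is exactly why a game tuned to DX11's budget shows no Mantle benefit.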

So when they start building genuinely low-level games on DX12 and Vulkan, they will use it.

The problem with your example is that, according to what AMD are saying, asynchronous shaders should give a performance boost any time the game uses shaders, which is all the time, and AMD are claiming very big performance gains.

What we've seen with the draw-call / CPU limit of DX11 is that, when going over to Mantle, the CPU works a lot less hard and more consistently across its cores. So even where there isn't a clear performance boost, there is at least evidence that we are no longer CPU-bottlenecked.

If Thief does use AS, it doesn't show any similar effect from using it.
 
Not sure which reviews you've been reading, but I see lots of reviews showing frame-latency comparisons.

I don't play BF4 though, so Mantle is pretty much irrelevant to me anyway. Plus I have G-Sync, which works in every game, not just those few games I don't even play.

This is not quite cricket.

You would ignore decent features and recognised performance, just dismissing the potential because you don't play that game! lol

Just because it's only 'a few games' doesn't mean it's worthless.
 

Mantle = DX12,
so they'll be using it soon as well, and will then make babies as it makes them so happy... if their card supports DX12, that is.
 

I didn't say it was worthless, I said it was irrelevant to me. Buying AMD products because of Mantle would have been a big disappointment for me, as it only ended up in a couple of games that I didn't even have cause to play.

It was touted as a big plus point and purchase justification, on which it would have thoroughly failed for me, and for AMD too, looking at their results.
 
I don't expect you to buy an AMD product Andy, my point was you dismissing BF4 and its benefits because you never played it. It was a big title (badly rushed by DICE, and I slated it too) but it showed multi-core and Mantle working. Even the games that were released (using Mantle) didn't have time to show the full potential.

Those of us who did try the same game in Mantle, comparing it to DX, saw improved frame rates or smoothness. Ignoring it because you couldn't use it, and then coming out with "plus I have Gsync that works in every game, not just those few games I don't even play", is petty.
 

I have bought many AMD products in the past, the last 2 being a 7950 and a 7970m. Both of which I ended up selling because they didn't really work in the games I wanted to play at the time.

I'm really getting lost on what point you are trying to make, other than seemingly attacking me personally for questioning AMD's PR machine.

I can appreciate that some people might have bought every mantle title just for the sake of mantle, and props to them if that is what they've enjoyed. For me that would have been wasted money.
So if someone quotes me and says that I in some way missed out on mantle, well, I'm going to say why I don't think that is the case.


Here is what someone who has tried it had to say:
In the 3 or so games that use it, which barely anyone plays anymore?

Tried both and it's no smoother than DX is on nVidia. If you have a really old CPU or an AMD CPU it might make it smoother.

But you've not replied to them in the same way.

If you go back to my original point: AMD made lots of noise about Mantle giving them a huge competitive advantage, and they are doing the same now with asynchronous shaders. I'm just asking if this is going to go the same way; why didn't AS give the big boosts in Mantle?
 

If you applied those rules to these forums, activity would probably drop by 70% within a few days. Most threads get dominated by a handful of the same hardcore members, which IMO puts a lot of newer or younger members off once it turns into some stupid argument. It's a shame, as some of those same members also do good around here, which is probably why the mods tolerate them.
 
I'm really getting lost on what point you are trying to make, other than seemingly attacking me personally for questioning AMD's PR machine.

I think someone is a little paranoid. Your evidence is one guy's post saying it was the same experience (DX11 and Mantle, which it isn't). AMD's PR machine? If you read back to the top of the page, it won't take long to filter out what people are highlighting; if anyone should be lost, it is me, with your defence against the 'attack'. I think I will leave it at that, as you fall into the category of person that cannot debate if you interpret posts as personal attacks.

If you cannot see the point and get lost easily then why waste time having to explain it further? :)
 
Thont mate, people are posting AMD videos and Twitter comments saying that asynchronous shaders are not supported by Nvidia and will give AMD a huge performance boost.

Mantle supported asynchronous shaders yet didn't give AMD a huge performance boost over Nvidia, even with Nvidia handicapped by DX11. So my question is: why? And should we take AMD's latest PR at face value when they said the same thing about Mantle over DX11?

You singled me out as "not being cricket" for questioning the actual tangible benefit Mantle gave, comparing Nvidia on DX11 to AMD on Mantle. What people typically do when talking up Mantle is compare AMD on DX11 to AMD on Mantle, which is great, but not really the comparison AMD were promising.

But instead of responding to that, you keep cherry-picking the least important section of my post just to argue a completely irrelevant point. I don't know what that is, but it isn't debating.
 
Mantle supported asynchronous shaders yet didn't give AMD a huge performance boost over Nvidia, even with Nvidia handicapped by DX11. So my question is: why? And should we take AMD's latest PR at face value when they said the same thing about Mantle over DX11?

Asynchronous compute support only exists in Thief with Mantle. No other Mantle game uses that feature.
 
Which is exactly my point. If it is already in Mantle, why didn't the devs, who already took the rather drastic step of supporting a tiny market with Mantle, go the extra step of supporting AS? And looking at Thief benchmarks, it didn't even give the performance benefit that AMD are claiming.

Something doesn't quite add up.

Didn't AMD at one point say they had something like 40 studios signed up for Mantle? And what is it, like 3 that actually released a game? So if AS is only supported on AMD hardware, how many games are even going to use it?
 
Is it though? I was under the impression Maxwell 2 would be able to support this fully.


From Anand.

On a side note, part of the reason for AMD's presentation is to explain their architectural advantages over NVIDIA, so we checked with NVIDIA on queues. Fermi/Kepler/Maxwell 1 can only use a single graphics queue or their complement of compute queues, but not both at once – early implementations of HyperQ cannot be used in conjunction with graphics. Meanwhile Maxwell 2 has 32 queues, composed of 1 graphics queue and 31 compute queues (or 32 compute queues total in pure compute mode). So pre-Maxwell 2 GPUs have to either execute in serial or pre-empt to move tasks ahead of each other, which would indeed give AMD an advantage.
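The serial-vs-concurrent queue point from that quote can be sketched with a toy scheduler (the task times below are made up; real GPUs overlap work at much finer granularity): with a single queue, graphics and compute run back to back, while separate queues let them overlap, so frame time drops from the sum towards the max.

```python
# Toy illustration of serial vs. concurrent graphics/compute queues.
# Times are hypothetical; real overlap depends on resource contention.

def serial_time(graphics_ms, compute_ms):
    # One queue: compute has to wait for graphics (or pre-empt it).
    return graphics_ms + compute_ms

def concurrent_time(graphics_ms, compute_ms):
    # Separate queues: graphics and compute run side by side,
    # assuming they don't fight over the same execution resources.
    return max(graphics_ms, compute_ms)

g, c = 12.0, 6.0
print(serial_time(g, c))      # 18.0
print(concurrent_time(g, c))  # 12.0
```

This is the claimed advantage in a nutshell: the win only exists if there is compute work that can actually be hidden behind graphics work.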
 
Twitter
Robert Hallock ‏@Thracks Jun 6
Not 100% accurate, but by far the most accurate article published thus far.

If we talk again about the hardware on sale, the situation regarding feature levels is the following:

Feature level 11.0: NVIDIA Fermi, Kepler, Maxwell 1.0
Feature level 11.1: AMD GCN 1.0, INTEL Haswell and Broadwell
Feature level 12.0: AMD GCN 1.1 and GCN 1.2
Feature level 12.1: NVIDIA Maxwell 2.0
The first two feature levels roughly coincide with the DirectX 11 levels of the same name (with some differences due to the new resource-binding model), while feature levels 12.0 and 12.1 are new to Direct3D 12.
Though it may be pleonastic, it is worth restating that feature level 12.1 does not amount to an imaginary "full/complete DirectX 12 support", since it does not cover many important or secondary features exposed by Direct3D 12.
In the end, as regards support for every single capability, it is currently neither possible nor appropriate to draw up a complete and well-defined table showing the support of hardware on sale.
Unless your name is AMD, Intel or NVIDIA, you cannot produce such a report with the drivers currently available through public channels, nor with non-NDA documentation, so everything else should only be considered pure rants.
http://www.bitsandchips.it/52-engli...out-tier-and-feature-levels-of-the-directx-12

Robert Hallock ‏@Thracks Jun 6
All the other articles I've read have been hysterical, ill-informed. Needless user panic.
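The feature-level list quoted above can be captured as a small lookup table (values copied straight from the quoted article; in real code you would query the driver, e.g. via `ID3D12Device::CheckFeatureSupport`, rather than hard-code this):

```python
# Feature levels per architecture, as listed in the quoted article.
FEATURE_LEVEL = {
    "NVIDIA Fermi": "11_0",
    "NVIDIA Kepler": "11_0",
    "NVIDIA Maxwell 1.0": "11_0",
    "AMD GCN 1.0": "11_1",
    "Intel Haswell": "11_1",
    "Intel Broadwell": "11_1",
    "AMD GCN 1.1": "12_0",
    "AMD GCN 1.2": "12_0",
    "NVIDIA Maxwell 2.0": "12_1",
}

def supports(arch, level):
    """True if `arch` meets feature level `level`. A plain string
    compare works here because all levels share the `MAJ_MIN` format."""
    return FEATURE_LEVEL[arch] >= level
```

As the article stresses, a feature level is a guaranteed baseline, not "full DX12 support"; individual optional capabilities sit outside this table entirely.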
 
PR minions doing their job.

They are in full damage-control mode now that their purposely misleading claims of full DX12 support have been shown to be nothing of the sort. The desperation on show is pretty amusing.
 
To be perfectly honest, he is quite right; trying to determine who can do what at this stage of DirectX 12's development is more guesswork than anything.
 
PR minions doing their job.

They are in full damage-control mode now that their purposely misleading claims of full DX12 support have been shown to be nothing of the sort. The desperation on show is pretty amusing.

Even the article talks about it...

Please do give your expert advice on why they're wrong, I would love to hear it.
 