
AMD Confirms GCN Cards Don’t Feature Full DirectX 12 Support – Feature Level 11_1 on GCN 1.0, Feature Level 12_0 on GCN 1.1/1.2

It's not full support at all; it's limited and missing features found in the current highest feature level of the API. AMD can spin it and try to downplay it as much as possible, but they cannot claim to have full support when their competitor supports even more features of the API.

We are still talking about two recently added optional features which the devs may or may not use to implement certain effects.
You are playing it up as something that makes all the GCN cards useless in DX12.
We still don't know if they can be emulated, and if so how effectively, just as NV cards emulate some features.

According to devs on B3D some can be emulated on GCN but the performance drops through the floor, making them unusable. These features have been known about for a while, yet AMD have been happy to mislead the public and their customers. I hate to play this card, but if this had been Nvidia the witch hunt would have been staggering (look at the 970 memory/ROP issue).
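
For anyone who'd rather check than argue: as far as I can tell from the article this thread is about, the two optional features in question are Conservative Rasterization and Rasterizer Ordered Views (the 12_1-level items). Below is a rough C++ sketch of how a game or tool would query them through the standard D3D12 caps API on Windows 10. Only the D3D12 types and calls are the real API; the little program around them is just a throwaway illustration.

Code:
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    // Create a device on the default adapter; 11_0 is the minimum D3D12 accepts.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The optional caps live in D3D12_OPTIONS; ROVs and conservative
    // rasterization are the two 12_1-level features being argued about here.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
    printf("ROVs supported: %d\n", opts.ROVsSupported);
    printf("Conservative rasterization tier: %d\n",
           (int)opts.ConservativeRasterizationTier);

    // Highest feature level the driver reports for this GPU.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
    fl.NumFeatureLevels = sizeof(levels) / sizeof(levels[0]);
    fl.pFeatureLevelsRequested = levels;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fl, sizeof(fl));
    printf("Max supported feature level: %#x\n",
           (unsigned)fl.MaxSupportedFeatureLevel);

    device->Release();
    return 0;
}

On a 12_1 card both caps come back supported; on a card that only reports 12_0 or 11_1, the engine has to take a fallback path or emulate, which is exactly the trade-off being argued about in this thread.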
 
This reminds me of when ATI's latest cards didn't support Shader Model 3.0 and instead went down the Shader Model 2.0b path. They basically said that 2.0b was the same as 3.0, just going about it a slightly different way to get the same end result. I bought into the ATI hype, as I had an X800 XT PE at the time, and was convinced there was no difference between the two. Anyway, fast forward to my first Shader Model 3.0 card, which was the ATI X1900XT, and the very first time I saw Shader Model 3.0 I couldn't believe the difference and what I was missing.

I hope for AMD's sake this is not the same thing, and that by not going all out on DX12 they don't miss out on some eye candy.
 
According to devs on B3D some can be emulated on GCN but the performance drops through the floor, making them unusable. These features have been known about for a while, yet AMD have been happy to mislead the public and their customers. I hate to play this card, but if this had been Nvidia the witch hunt would have been staggering (look at the 970 memory/ROP issue).

So you are saying GCN is not going to be able to run DX12 games?
 
It's not full support at all; it's limited and missing features found in the current highest feature level of the API. AMD can spin it and try to downplay it as much as possible, but they cannot claim to have full support when their competitor supports even more features of the API.



According to devs on B3D some can be emulated on GCN but the performance drops through the floor, making them unusable. These features have been known about for a while, yet AMD have been happy to mislead the public and their customers. I hate to play this card, but if this had been Nvidia the witch hunt would have been staggering (look at the 970 memory/ROP issue).

Dude, give it a rest. That must be the only other website you know of besides these, because I've seen you post that exact same link over and over. I swear you must sit there at your screen with a smug look, thinking to yourself "yeah, that'll show 'em good!"

Seriously, let it go man; your obsession with proving Nvidia's support of an unreleased API is borderline fanatical. Much like for everyone else in this thread, arguing over something that hasn't been released is pointless. Currently we have cards from both sides that may or may not support some features, async may or may not get used, so who cares right now, I mean seriously? Wait until the API is released and a few games have been made; then we might actually have a point to this currently pointless bickering back and forth.

Right now all you are doing is making yourself look a bit sad by vehemently defending something that has zero impact or meaning on current games as of right this minute.

Yes, some will say "but it affects our current upgrade plans", to which I again say: wait until games with DX12 are released. Buying hardware solely for DX12 and its upcoming features right now is pointless.
 
This reminds me of when ATI's latest cards didn't support Shader Model 3.0 and instead went down the Shader Model 2.0b path. They basically said that 2.0b was the same as 3.0, just going about it a slightly different way to get the same end result. I bought into the ATI hype, as I had an X800 XT PE at the time, and was convinced there was no difference between the two. Anyway, fast forward to my first Shader Model 3.0 card, which was the ATI X1900XT, and the very first time I saw Shader Model 3.0 I couldn't believe the difference and what I was missing.

I hope for AMD's sake this is not the same thing, and that by not going all out on DX12 they don't miss out on some eye candy.

That's the wrong comparison. The comparison you should be making is the X800 XT PE vs Nvidia's 6800. From the history books it made no difference, from what I remember. The X1900XT was a whole new ball game compared to the X800. I could be wrong, but that's how I remember it.
 
That's the wrong comparison. The comparison you should be making is the X800 XT PE vs Nvidia's 6800. From the history books it made no difference, from what I remember. The X1900XT was a whole new ball game compared to the X800. I could be wrong, but that's how I remember it.


My mate had a 6800 and used to wind me up that I was missing out on Shader Model 3.0. The first time I saw it was in 3DMark on the X1900XT, and boy did it look nice.
 
Well, I'm being told by Rob and Roy that these two features mean nothing for us. I respect their call over yours, Andy. Now, before this just drags into "I'm right and you're wrong", let's just leave it here and wait and see what the future holds.

lmao, so you believe AMD PR over and above actually reading the links you post, which say what the functions in question actually do... ok

I'm not saying anything shanks, you posted the information, which you are now ignoring in favour of Roy's twitter account, because we know how accurate that is :rolleyes:
 
My mate had a 6800 and used to wind me up that I was missing out on Shader Model 3.0. The first time I saw it was in 3DMark on the X1900XT, and boy did it look nice.

What's even funnier is Anand's review at the time said NV were faster in older games and the X800 had the upper hand in newer games, so it wasn't making any difference; they even said the ATI card had the upper hand. I looked into their later review of the X1800 XT, and the X850 had the upper hand over the 6800 Ultra, so it made no difference. When it did matter, ATI had the cards to match. The only thing Nvidia had back then was superior OpenGL performance, which showed up big in Doom 3.

http://www.anandtech.com/show/1314/22

I bet the same arguments were getting put forward back then as well. The big difference, of course, is that the market share was much different.
 
I just wasted 2:56 of my life that I will never get back. Sorry, but can't they just launch the cards already so we can see some performance numbers?

Only 10 days to go. Just think right now we have the fastest GPU in the world, we might not be able to say that after the Fiji launch. Enjoy the next 10 days with your Titan :D

10 days is nothing, all will be revealed soon.
 
Riddle me this: if asynchronous shaders are such a big performance boost, and GCN has them and Mantle has them, then either they were in Mantle all along and developers didn't bother to use them, or they did use them and we've already seen the results.

So either developers who went to the lengths of using an AMD-specific API, despite the low market share, still couldn't be bothered to use them to improve AMD's performance, or they don't actually give the huge boosts AMD are claiming, because from what we've seen Mantle itself didn't give as huge a boost as AMD were claiming.
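
For what it's worth, "async shaders"/"async compute" in D3D12 terms just means submitting compute work on a separate compute-type queue so it can overlap with graphics; whether the GPU actually runs the two concurrently is entirely down to the hardware and driver, the API only expresses the opportunity. A minimal sketch, assuming the standard Windows 10 D3D12 SDK; the queue-creation calls are the real API, everything else is just illustrative.

Code:
#include <windows.h>
#include <d3d12.h>
#pragma comment(lib, "d3d12.lib")

int main()
{
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The normal graphics ("direct") queue that rendering goes through.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ID3D12CommandQueue* gfxQueue = nullptr;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // A second, compute-only queue. Work submitted here may run
    // concurrently with the direct queue on hardware that supports it;
    // that overlap is what gets marketed as "asynchronous shaders".
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ID3D12CommandQueue* computeQueue = nullptr;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue ordering is handled explicitly with fences
    // (ID3D12CommandQueue::Signal / Wait), not shown here.

    computeQueue->Release();
    gfxQueue->Release();
    device->Release();
    return 0;
}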
 