
AMD Confirms GCN Cards Don’t Feature Full DirectX 12 Support – Feature Level 11_1 on GCN 1.0, Featur

Riddle me this:
If asynchronous shaders are such a big performance boost, and both GCN and Mantle support them, then either the feature was in Mantle all along and developers didn't bother to use it, or they did use it and we've already seen the results.

So either developers, even after going to the lengths of using an AMD-specific API despite AMD's low market share, couldn't be bothered to use it to improve AMD's performance, or it doesn't actually give the huge boosts AMD are claiming, because from what we've seen Mantle itself didn't deliver boosts as big as AMD claimed.

Ask yourself this: why don't reviewers test with FCAT and frame latency anymore?
Because Nvidia is bad at it, that's why.
Mantle is so smooth on AMD cards.
 

Not sure which reviews you've been reading, but I see lots of reviews showing frame-latency comparisons.

I don't play BF4 though, so Mantle is pretty much irrelevant to me anyway. Plus I have G-Sync, which works in every game, not just in a few games I don't even play.
 
Dude, give it a rest. That must be the only other website you know of besides these, because I've seen you post that exact same link over and over. I swear you must sit there at your screen with a smug look, thinking to yourself, "yeah, that'll show 'em good!"

Seriously, let it go, man; your obsession with proving Nvidia's support of an unreleased API is borderline fanatical. As with everyone else in this thread, arguing over something that hasn't been released is pointless. Currently we have cards from both sides that may or may not support some features that may or may not get used, so who cares right now, I mean seriously? Wait until the API is released and a few games have been made; then we might actually have a point to this currently pointless bickering back and forth.

Right now all you are doing is making yourself look a bit sad by vehemently defending something that, as of right this minute, has zero impact or meaning for current games.

Yes, some will say "but it affects our current upgrade plans", to which I again say: wait until games with DX12 are released. Buying hardware solely for DX12 and its upcoming features right now is pointless.

It's funny. Now that AMD PR's shenanigans have been found out once again, suddenly having full support for all DX12 features no longer matters.

I don't care if people such as yourself, who prefer to wallow in their ignorance as long as it fits their biases, think I am being obsessive about something. I'd rather be obsessive but argue from a platform backed by facts than be somebody who prefers to ignore them because they're not something he wants to hear.

Now, rather than trying to counter what I am saying with any facts, which obviously you can't, you attempt to undermine me personally. All in all a pretty sad situation; you really should not be so invested in a multinational corporation that you are prepared to go to such lengths to defend it against the truth.

Now, if you want to grow up and discuss the realities at hand, instead of personally attacking anyone who is off-message regarding AMD, I'll be waiting. If not, please do everyone a favour: step away from the computer and take a good long look at yourself, as it surely cannot be worth it.
 
What's even funnier is that Anand's review at the time said Nvidia was faster in older games and the X800 had the upper hand in newer games, so it was not making any difference; they even said the ATI card had the upper hand. I looked into their later review of the X1800 XT, and the X850 had the upper hand over the 6800 Ultra, so it made no difference, and when it did, ATI had the cards to match. The only thing Nvidia had back then was superior OpenGL performance, which showed clearly in Doom 3.

http://www.anandtech.com/show/1314/22

I bet the same arguments were being put forward back then as well. The big difference, of course, is that the market share was much different.


The X800 XT PE was a great card for its time, and the lack of Shader Model 3.0 was the only thing my mate could use to try to put it down. I think you are right that there was no real-world difference. If I remember correctly, only a few games and the odd tech demo could use Shader Model 3.0, and even then performance wasn't great until the next generation of cards.

As you say not any difference to what we see here today with this DX12 debate. :D
 
Yes, because I'm the one constantly posting rubbish stating AMD can do this and that, blah blah blah.

Quite frankly, I couldn't give a monkey's right now whether AMD do, don't, or never will support DX12 or DX12.1 or whatever flavour of DX12, because right now none of it is important.

But if it gives you a happy, gooey feeling to haunt internet forums posting the same thread over and over like it's your own personal moment of crowning glory, then more power to you, brother.

Me? I'd rather wait for the official facts once the API is released and we have a few games available, and probably to see whether these 12.1 features even matter, before I get all hot and sweaty and wage a campaign on the internet telling everyone they are wrong, like I am bringing the gospel straight from the messiah himself.
 
I'm the one who has posted nothing but verified facts and opinion backed by third-party evidence. If that rustles your jimmies, I'd suggest the internet is not for you.
 

Sorry, no: you're the one who is the winner of the "read it on an internet forum, therefore it must be true" award for 2015.

Well done, gold star and pat on the back for you.
 
http://forums.anandtech.com/showthread.php?t=2433809

If we see any more duplicate graphics card launch/rumour threads, they will be locked and moderator action will be taken against the member who created the thread. Most of the time these threads are created to bait other members when a negative rumour surfaces, and we want to keep the peace amongst everyone here. We don't need a new thread for every little piece of information that comes out; just post it in the respective thread that has already been made.
 
Doesn't matter who the messenger is, old chap. The guys over on B3D know their coconuts and don't take any fanboy nonsense like here and many other places, which is why it's always the best place to get the heads-up on things. Guess what: the developers over there were bang on the money once they started tearing down the PR smokescreen.

You should join up as well, educate yourself.

ps. I read it on an Internet forum and it was true. :)
 
Riddle me this:
If asynchronous shaders are such a big performance boost, and both GCN and Mantle support them, then either the feature was in Mantle all along and developers didn't bother to use it, or they did use it and we've already seen the results.

So either developers, even after going to the lengths of using an AMD-specific API despite AMD's low market share, couldn't be bothered to use it to improve AMD's performance, or it doesn't actually give the huge boosts AMD are claiming, because from what we've seen Mantle itself didn't deliver boosts as big as AMD claimed.

The only Mantle game that uses it is Thief, and I think I read somewhere that it works in CrossFire only. Outside of that, no other Mantle game uses it. However, it is used in a few games on the PS4, including BF4.

Why is that? Who knows, maybe the performance boost wasn't worth it. Maybe they were just too lazy.

But keep in mind that one feature that would have given a performance boost to a lot of people still wasn't used: the multi-adapter feature. Anyone with an iGPU would have benefited from it, yet not a single game used it despite it working under Mantle.

https://twitter.com/draginol/status/594312861167550464

https://twitter.com/draginol/status/594308070198681601
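To see why multi-adapter could help anyone with an iGPU, here is a toy model of the idea: split a frame's workload between a discrete GPU and an integrated GPU in proportion to their throughput, so both finish at the same time. The throughput numbers are made up for illustration, not measured from any real hardware.

```python
def frame_time_ms(work_units: float, throughput: float) -> float:
    """Time for one GPU to finish its share (work units / units-per-ms)."""
    return work_units / throughput

def split_frame_time_ms(work_units: float, dgpu_tp: float, igpu_tp: float) -> float:
    """Split work proportionally to throughput; the frame ends when both GPUs finish."""
    dgpu_share = work_units * dgpu_tp / (dgpu_tp + igpu_tp)
    igpu_share = work_units - dgpu_share
    return max(frame_time_ms(dgpu_share, dgpu_tp),
               frame_time_ms(igpu_share, igpu_tp))

# Hypothetical throughputs: the dGPU does 10 units/ms, the iGPU only 2 units/ms.
work = 200.0
alone = frame_time_ms(work, 10.0)                 # 20.0 ms on the dGPU alone
combined = split_frame_time_ms(work, 10.0, 2.0)   # ~16.7 ms with both working
print(alone, combined)
```

Even a slow iGPU shaves the frame time, which is the whole point of the complaint: free performance left on the table.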
 

It is because all the Mantle games made until now were built within DX11 limitations (their engines were designed for DX11), so users who cannot use Mantle can play too. So there was no real need to use it. If you don't cross those limits, Mantle can't show why it is better. For example, Mantle can handle many times more draw calls than DX11, but if the game only issues as many as DX11 can handle, you will see no difference; if it issues more, you would see DX11 collapse while Mantle carries on.

So when they start building truly low-level games on DX12 and Vulkan, they will use it.
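The draw-call argument above can be sketched with a toy model: assume a fixed CPU cost per draw call for each API and a 60 fps frame budget. The per-call costs here are hypothetical round numbers, not benchmarks; the point is only the shape of the argument.

```python
# Toy model of CPU-side draw-call submission overhead (illustrative numbers only).
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps

# Hypothetical per-draw-call CPU cost for each API, in milliseconds.
COST_PER_CALL_MS = {
    "dx11": 0.01,     # high-overhead submission path
    "mantle": 0.001,  # low-overhead submission path
}

def cpu_frame_time_ms(api: str, draw_calls: int) -> float:
    """CPU time spent submitting one frame's draw calls."""
    return draw_calls * COST_PER_CALL_MS[api]

def max_calls_at_60fps(api: str) -> int:
    """Most draw calls the CPU can submit while staying inside the frame budget."""
    return int(FRAME_BUDGET_MS / COST_PER_CALL_MS[api])

# A game tuned to DX11's limit shows no difference between the two APIs:
calls = max_calls_at_60fps("dx11")
print(cpu_frame_time_ms("dx11", calls) <= FRAME_BUDGET_MS)    # True
print(cpu_frame_time_ms("mantle", calls) <= FRAME_BUDGET_MS)  # True

# Push well past that limit and only the low-overhead API keeps up:
calls = 10 * max_calls_at_60fps("dx11")
print(cpu_frame_time_ms("dx11", calls) <= FRAME_BUDGET_MS)    # False
print(cpu_frame_time_ms("mantle", calls) <= FRAME_BUDGET_MS)  # True
```

That matches the point above: a game designed within DX11's limits shows no benefit from the faster API, because it never asks for more than DX11 can deliver.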
 