
AMD Confirms GCN Cards Don’t Feature Full DirectX 12 Support – Feature Level 11_1 on GCN 1.0, Featur

I think both camps could do better, but hats off to Nvidia for giving us awesome performance with the Titan X and 980 Ti; both are awesome. AMD deserve credit (along with PC Gamer) for giving PC gamers an E3 event, the first ever, and for being the first to use HBM, this year not next year. I doubt they will get any credit though :p
Do they ever? :p

TressFX=suck
Hairwork=Awesome

More vram on AMD card=pointless
More vram on Nvidia (for the first time in a long time) = omgz... my Titan X's 12GB GDDR5 vram makes games smoother due to caching etc etc, 4GB of HBM on AMD would be such a joke, but the 3.5GB GDDR5 + 512MB (of some slow memory) on the 970 is still awesome!! (:confused:)

Mantle sucks because it allows AMD to improve performance (mainly by reducing CPU bottlenecks) on AMD cards, even though it doesn't stop developers from optimising DX performance for Nvidia hardware in any way... but since it does nothing to benefit Nvidia hardware, it can die.

GameWorks is awesome because it will only improve performance and work nicely with Nvidia hardware, not AMD, while making it harder for developers to optimise games for AMD hardware, so we welcome it.

:D
 

A very good post, but the fact that you take shots at the 970, even though its memory configuration has never been proven to affect performance outside of a synthetic benchmark nobody uses, kinda shows that people on both sides of the fence pick and choose their bias.

I don't get why we can't all be happy. Those who bought the 980 Ti are happy, and those of us waiting on the new AMD card probably will be too.
 
Come on now Marie, I loved TressFX in Tomb Raider and I think HairWorks looks great in TW3. If I did have a gripe about TressFX, it would be the lack of seeing it in other games. AMD did a fantastic job with it but then didn't push it any further, so that was a little disappointing.

As for the VRAM debates, I can't say I have really paid any attention to that.

Mantle was great but it was proprietary, and we have been over that many times. A shame really, as again I thought it was a good thing to have, and with the imminent release of Windows 10 and DX12 we will have an API that works for both vendors (so long as you have a DX12-compliant card), so that is pretty much the future for both.

GameWorks is awesome and from Batman Arkham Origins to AC:U and to TW3, I really like the tech.
 
Let's be fair Greg, everything you can currently buy is rubbish. 2160p/60Hz/Ultra/12GB on one single chip is where it needs to be. The technology should be there and be cheap, but instead we're being slowly milked with sub-par, gimped workstation cards for a grand.
 
Well, I fully expect Nvidia to support (bribe) a game developer to utilise an unused 12_1 feature from DX12 (good for them).

Meanwhile, the consoles show that 12_0 is what engines and games will be designed around for the next year and beyond.
 
Let's be fair Greg, everything you can currently buy is rubbish. 2160p/60Hz/Ultra/12GB on one single chip is where it needs to be. The technology should be there and be cheap, but instead we're being slowly milked with sub-par, gimped workstation cards for a grand.

Not sure what that has to do with DX12, but whilst you are right, I don't think we are even close to half that. UHD at full ultra on, say, GTA V will probs be doable in 5 years' time, but by then graphics will have moved on and we will be back to square one. When there is a big breakthrough in chip design (germanium after silicon, perhaps?) it should catch up, but as 14/16nm gets closer, and then 10nm after that, I just see more die issues (as seen with 20nm) and fewer gains from both AMD and NVidia.
 
So wccftech is predicting that only Fiji+ will get the fullest DX12 support...

Perhaps the million-dollar question is: will DX12 support affect your/people's purchasing decisions?

Rightly or wrongly, on current awareness I will be buying a GPU based on its DX12 support, regardless of how long it takes for features to be used or games to arrive. AMD's dominance in consoles would not allay my concerns about a lower feature level.

If the report is true, AMD need to explain very publicly/clearly why they chose not to engineer in this direction - maybe there is a very valid (in terms of GPU value) reason?

Not everyone upgrades their GPU every 1-2 years...
 
Using the same logic being applied in this thread to AMD's two-year-old GPUs not supporting every DX12 feature:

Pascal will have more DX12 feature support, meaning that all current Maxwell cards are also redundant, yes? No? Of course not!! The exact same thing applies to AMD's hardware.

Think it's time to move on to another petty topic..

the company’s next-gen GPUs will support numerous new features introduced by DirectX 12,

http://www.kitguru.net/components/g...about-future-pascal-products-in-the-pipeline/
 
I've spoken to Robert Hallock and he confirms the article is true, but he also had this to say.

All the good DX12 features are in 11_1 and 12_0. Feature levels matter for devs, but they're meaningless for consumers.
It's why Microsoft doesn't really bother articulating the differences.

Brings me back to my point that all the "goodness" is in 12_0.
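For anyone wondering what actually separates the feature levels being argued over, here is a rough sketch of the headline requirements, based on Microsoft's D3D12 feature-level documentation as I understand it. Treat it as an approximation; the authoritative tier lists live in the D3D12 docs, and the 11_1 entries in particular simplify what is optional at that level.

```python
# Rough map of DirectX 12 feature levels to their headline hardware
# requirements, as discussed in this thread. Approximate; the
# authoritative lists are in Microsoft's D3D12 documentation.
FEATURE_LEVELS = {
    "11_1": {
        # Baseline for first-generation GCN cards, per the article.
        "resource_binding_tier": 1,
        "tiled_resources_tier": 0,   # optional at this level
        "typed_uav_loads": False,
        "conservative_rasterization": False,
        "rasterizer_ordered_views": False,
    },
    "12_0": {
        "resource_binding_tier": 2,
        "tiled_resources_tier": 2,
        "typed_uav_loads": True,
        "conservative_rasterization": False,
        "rasterizer_ordered_views": False,
    },
    "12_1": {
        # 12_1 adds the two features mentioned later in the thread:
        # conservative rasterization and ROVs.
        "resource_binding_tier": 2,
        "tiled_resources_tier": 2,
        "typed_uav_loads": True,
        "conservative_rasterization": True,
        "rasterizer_ordered_views": True,
    },
}

def extras_over(base: str, target: str) -> list:
    """List the capabilities where `target` differs from `base`."""
    b, t = FEATURE_LEVELS[base], FEATURE_LEVELS[target]
    return [k for k in t if t[k] != b[k]]

print(extras_over("12_0", "12_1"))
# -> ['conservative_rasterization', 'rasterizer_ordered_views']
```

Which rather supports Hallock's point: the 12_0-to-12_1 gap is two specific rendering features, not some wholesale chunk of the API.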
 

Thanks for the confirmation, but as a potential purchaser, I want more convincing.
 
Personally, I don't think any of this support matters until the die-shrink cards anyway.
Wouldn't surprise me if the cards we currently own which "support" whatever level of DX12 didn't actually work with DX12 :p
 
Thanks for the confirmation, but as a potential purchaser, I want more convincing.

I don't think you'll get a better answer. AMD will say the same; NV will say anything below 12_1 will give you cancer, take the money from your account, and burn down your house.
 
I don't think you'll get a better answer. AMD will say the same; NV will say anything below 12_1 will give you cancer, take the money from your account, and burn down your house.

You got the kind of humour I like :D

It is pretty much of a muchness really, and the thread was gold to me for showing all those who claimed that GCN has the best DX12 support, blah blah, and that NVidia are well behind, what is actually what. Call me baiting, trolling or whatever, but that wasn't the intention; it was clearing up some fallacies that were being posted.

Probs by the time DX12 games are actually running I will have moved on from my current GPUs as well, but it does once and for all clear up what I said about AMD not being able to do ROVs or Conservative Rasterization, and those are 2 important parts of DX12.

I am still very curious to know whether the 390 series, including Fiji, will support full DX12 though, and I am sure it will all become clear at E3, like Shankly said.
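On the "important parts of DX12" point, the usual mitigation is that engines query the GPU for these features at startup and fall back when they are missing, which is partly why Hallock can claim feature levels matter for devs but not consumers. A hypothetical sketch of that idea (the capability dicts and function names here are made up for illustration; on real hardware this information comes from D3D12's `CheckFeatureSupport`):

```python
# Hypothetical sketch: an engine picking an order-independent
# transparency (OIT) technique based on reported GPU capabilities.
# The capability dicts below are illustrative, not real driver data.

def choose_oit_path(caps: dict) -> str:
    """Prefer the ROV-based path (feature level 12_1 hardware);
    otherwise fall back to a per-pixel linked-list approach that
    DX11-class GPUs can also run."""
    if caps.get("rasterizer_ordered_views"):
        return "rov_oit"          # fast path, e.g. Maxwell 2
    return "linked_list_oit"      # fallback for 11_1 / 12_0 hardware

maxwell2 = {"rasterizer_ordered_views": True}
gcn_1_0 = {"rasterizer_ordered_views": False}

print(choose_oit_path(maxwell2))  # rov_oit
print(choose_oit_path(gcn_1_0))   # linked_list_oit
```

So a missing 12_1 feature typically means a game takes the slower or lower-quality path on that card, not that it fails to run.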
 
Tbh I don't think the 3xx series (including Fiji) will be more advanced in terms of DX12 features.
Also, with all the chatting about tiers and features, it all doesn't matter... the same will go on as usual, the two sides' cards clashing, some games running better on this one, some on that.
We're lucky they don't make the same cards... it could be very boring :)
 
A series that is not even out yet not fully supporting the current DX is pretty bad tbh.

But that aside, if it has taken AMD this long to release the 3 series, how long do you think it will take them to release another one? Maybe sometime in 2016, but it's looking more like 2017 at this rate, and by then it will definitely matter to consumers that AMD don't support DX12 fully and Nvidia do.

Plus I and others will probably be upgrading late summer to autumn, as the release of really great and very GPU-taxing games such as Star Citizen will be on our doorstep, and just like with Crysis you will want the best system you can afford for the best quality!!
 
Why would AMD not support DX12 fully by the time the next gen comes round?

Gawd, the 3 series is not out yet, but people are already talking about how the gen after next will fail to support new DX12 features... which are also not out... and which no game uses.
 