AMD Radeon GFX & DirectX 12

If this is the case I suspect game devs will want to work with it sooner rather than later, no? People say it will be a few years before we see proper DX12 games, but when they have more to work with they may make them sooner.
 
This is a completely uneducated guess, but I would suspect it would only apply in certain scenarios, would take a lot of effort to get working, and would not be a "default" feature as such.

Like I said just my poor guesswork!

In reality it'll never be 2x4GB being used the same as a single card with 8GB, as both GPUs will need access to very similar data and you're not going to be going across the PCIe bus every frame.

Memory bandwidth on a 290X is something like 320GB/s, whereas PCIe 3.0 is 16GB/s.
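Just to put that gap in perspective, here's a back-of-envelope sketch using the rough figures above (nothing measured, just arithmetic):

```cpp
#include <cstdio>

int main() {
    // rough figures from the post above: PCIe 3.0 x16 ~16GB/s, payload = one card's 4GB
    const double pcieGBps   = 16.0;
    const double payloadGB  = 4.0;
    const double seconds    = payloadGB / pcieGBps;   // ~0.25s to move it once
    const double framesAt60 = seconds * 60.0;         // frames that would take at 60fps
    std::printf("~%.0f ms to shuffle 4GB over the bus, i.e. ~%.0f frames at 60fps\n",
                seconds * 1000.0, framesAt60);
    return 0;
}
```

Which is roughly why, in practice, both cards end up holding near-identical copies of the working set instead of pooling their VRAM.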
 
Won't all this apply to Nvidia too?

Yes, DirectX 12's multithreaded command buffer recording, async shaders and explicit multiadapter features apply to Nvidia too. ;)

But it seems AMD misses tons of other very exciting DirectX 12 features that Nvidia has, like Voxelization, Conservative Rasterization, Raster Ordered Views, Tiled Resources, Multi-Frame Sampled AA (MFAA), Aggregate G-Buffer AA (AGAA), Accumulative AA (CAAA), Faster Geometry Shader Rendering, Ray Tracing Shadows, FTIZB etc. It's confirmed AMD will not support Conservative Rasterization, Raster Ordered Views and probably some other features, because these belong to the higher-level DirectX 12_1 feature set; AMD just supports all of the DirectX 12_0 features. Maxwell V2 will also support all of the DirectX 12_0 features. :)
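For what it's worth, devs don't have to take either side's PR at face value, since the optional caps are queryable per adapter. A minimal sketch (assuming `device` is an already-created ID3D12Device for the card being tested; not tied to any particular vendor's driver):

```cpp
#include <windows.h>
#include <d3d12.h>

// query the optional caps on an existing device and note the 12_1-class features
void ReportOptionalCaps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts))))
    {
        const bool rovs       = opts.ROVsSupported != FALSE;   // Raster Ordered Views
        const bool consRaster = opts.ConservativeRasterizationTier
                                != D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
        const bool tiledTier3 = opts.TiledResourcesTier
                                >= D3D12_TILED_RESOURCES_TIER_3;
        // a part that only reaches 12_0 will typically report the first two as unsupported
        (void)rovs; (void)consRaster; (void)tiledTier3;
    }
}
```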
 
Yes, DirectX 12's multithreaded command buffer recording, async shaders and explicit multiadapter features apply to Nvidia too. ;)

But it seems AMD misses tons of other very exciting DirectX 12 features that Nvidia has, like Voxelization, Conservative Rasterization, Raster Ordered Views, Tiled Resources, Multi-Frame Sampled AA (MFAA), Aggregate G-Buffer AA (AGAA), Accumulative AA (CAAA), Faster Geometry Shader Rendering, Ray Tracing Shadows, FTIZB etc. It's confirmed AMD will not support Conservative Rasterization, Raster Ordered Views and probably some other features, because these belong to the higher-level DirectX 12_1 feature set; AMD just supports all of the DirectX 12_0 features. Maxwell V2 will also support all of the DirectX 12_0 features. :)

Proof? From what I understand AMD GCN fully supports DX12.
Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture
http://www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx
 
Love how Matt just plants the seed then doesn't answer any questions. Pffffft, more splendid community work.

It's the AMD way. Annoy customers until they go to the competition. lol

Lucky for them their competition likes to milk and ***** their customers, otherwise I get the feeling their market share would be much worse than it is, the way they're running the show.
 
Developers having more control over how the hardware handles their game is something they've pretty much asked for for decades, and it's now possible thanks to Mantle and AMD.

DX12 is really good news for everyone, especially those who have a slightly weaker CPU, which is the majority.

I agree with this. Many view Mantle/Vulkan and DX12 as a way for AMD to move the responsibility for things such as CrossFire support onto the developers, and you could say to a certain degree that the responsibility is moved, but it's not a bad thing. It is actually quite a good thing. No one knows the game as well as the devs behind it, and once we have all these major game engines locked and tuned to support every major hardware vendor, and things like CrossFire and SLI, proper support for these features will be available at launch without the need for Nvidia or AMD to fiddle with drivers. At least I think this is what they are going for. It will take some time before these game engines are ready, but just look at Unreal Engine and how far along that already is, and I'm pretty sure it won't take DICE long before Frostbite is ready as well. CryEngine is, I think, also making the move, and all these devs are pretty talented.

I think it will be a good day the day you can buy a product and focus on features, performance and price instead of driver support.
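That lines up with how explicit multiadapter looks from the code side, as far as I understand it. Rough sketch only (the function name and flow are my own, the DXGI/D3D12 calls are the real ones; I'm not claiming any particular engine does it exactly like this): under DX12 the engine sees each GPU as its own adapter and decides itself what work and which copies go where, rather than the driver pretending two cards are one.

```cpp
#include <windows.h>
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// enumerate every hardware adapter and create an independent D3D12 device on each;
// from here on it's the engine's job to split the work between them (AFR, SFR, etc.)
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllGPUs()
{
    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return devices;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;                               // skip WARP/software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```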
 
Proof? From what I understand AMD GCN fully supports DX12.
Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture
http://www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx

The B3D thread I keep linking to, where people are pulling apart DX12 functionality levels on Nvidia and AMD hardware. The DX12 feature level they hit (and can claim to have full support of, handily misleading people) misses out a number of big-ticket DX12 features available in newer Maxwell hardware.
 
The B3D thread I keep linking to, where people are pulling apart DX12 functionality levels on Nvidia and AMD hardware. The DX12 feature level they hit (and can claim to have full support of, handily misleading people) misses out a number of big-ticket DX12 features available in newer Maxwell hardware.

I'll be sticking with what AMD is saying, tbh, until told otherwise. Full compatibility is full compatibility to me.

Unless you have proven facts to go against AMD's word?

AMD's DirectX 12 drivers won't even be finished yet; it's pointless pulling anything apart until the official release. What's to say the Nvidia driver is ahead? Or vice versa.
 
The B3D chaps have 100% facts to back up what they are saying. Feel free to ignore that if you want, in preference to an ambiguous year-old bit of AMD PR.

Current versions of GCN only support feature level 12_0, which I bet you a tenner will be the full (lol) DX12 support they are talking about. This misses out all the goodies that 12_1 and higher bring, many of which were mentioned in that post by AthlonXP1800, and only exists to allow older DX11 hardware to take advantage of the low-level optimisations of DX12 without it having to support any new features.
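You can see that split in code, too: the feature level a device reports is a separate question from whether the runtime accepts it at all. A rough sketch (assuming `device` is an existing ID3D12Device; the 12_0 vs 12_1 outcome is what the B3D digging suggests, not something I've verified on every card):

```cpp
#include <windows.h>
#include <d3d12.h>

// ask the runtime for the highest feature level this device supports
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
    fl.NumFeatureLevels        = static_cast<UINT>(sizeof(requested) / sizeof(requested[0]));
    fl.pFeatureLevelsRequested = requested;

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                           &fl, sizeof(fl))))
        return D3D_FEATURE_LEVEL_11_0;   // conservative fallback

    // per the B3D digging: current GCN reports 12_0, second-gen Maxwell reports 12_1
    return fl.MaxSupportedFeatureLevel;
}
```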
 
In reality it'll never be 2x4GB being used the same as a single card with 8GB, as both GPUs will need access to very similar data and you're not going to be going across the PCIe bus every frame.

Memory bandwidth on a 290X is something like 320GB/s, whereas PCIe 3.0 is 16GB/s.

Maybe they'll be able to state it uses a full 8GB, when in fact one card's VRAM is used as a slower, unadvertised "super cache", if you will.
 
Think I will wait to see actual real information from Microsoft and game devs before I believe anything Layte has to say; you only need to check his post history to see how much anti-AMD drivel he spews.

Maybe the current crop of AMD cards don't support all the feature levels, maybe they do; maybe the new AMD cards will or won't. One thing I know for sure is that none of us has any idea, so it's all pure speculation right now.

Like I say, sit tight and wait to see what MS and devs have to say on the matter.
 
Think I will wait to see actual real information from Microsoft and game devs before I believe anything Layte has to say; you only need to check his post history to see how much anti-AMD drivel he spews.

Everything I post I make sure can be backed up by facts, unlike many. The B3D chaps are developers by the way.

Feel free to read the links I have taken the time to post up, I'm only the messenger. :)
 
Am I right in thinking it is down to the game developers to optimise for each GPU, be it AMD or Nvidia? If so, I wonder if both will still be required to send out their own devs to get the best performance for their users?

Layte knows his stuff, so ease up guys.
 
Am I right in thinking it is down to the game developers to optimise for each GPU, be it AMD or Nvidia? If so, I wonder if both will still be required to send out their own devs to get the best performance for their users?

Would imagine they just implement features, and if your hardware supports them then all the better for you; if your hardware doesn't then they're toggled off or something, Greg.

Then again I ain't no developer so I'm just guessing at that ;)
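That guess is more or less how it tends to work in practice, as far as I can tell: check the cap, flip a switch. A tiny hypothetical sketch (the toggle struct and function name are made up for illustration; only the D3D12 caps struct and enum are real):

```cpp
#include <windows.h>
#include <d3d12.h>

struct RendererToggles {                 // hypothetical engine-side switches
    bool rovTransparency    = false;
    bool conservativeVoxels = false;
};

// flip effects on only where the queried caps say the hardware can do them;
// everything else stays on the common path that every DX12 card handles
RendererToggles PickToggles(const D3D12_FEATURE_DATA_D3D12_OPTIONS& opts)
{
    RendererToggles t;
    t.rovTransparency    = (opts.ROVsSupported != FALSE);
    t.conservativeVoxels = (opts.ConservativeRasterizationTier
                            != D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED);
    return t;
}
```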
 
I'll continue this discussion when DirectX 12.x is released and we have working games to back all this up. Until then I doubt anyone running GCN should be worrying.

Well, that's just head-in-the-sand silliness. People are out there right now who know their coconuts and are putting this information out there. Ignoring them because it doesn't fit your bias is not going to change anything.
 
The real upgrade comes from jumping on the DX12 train anyway. It doesn't matter in the big picture if you have the latest feature levels or not, even if it sounds good on paper. Games won't be made to require those levels at the start anyway.

The worst thing that'll happen is you miss some fancy effects or features, but the game will still run better than it did on DX11.
 
Well, that's just head-in-the-sand silliness. People are out there right now who know their coconuts and are putting this information out there. Ignoring them because it doesn't fit your bias is not going to change anything.

Mate, DX11.2 didn't even take off. We've not even got DX12 yet and you're already posting junk about 12.1. Let's see what games use it, be it DX12 or DX12.1.
Hell, did many games even use 11.1?

Like I posted above, wait for the games; that's what matters.
 