AMD Radeon GFX & DirectX 12

Mate, DX11.2 didn't even take off. We haven't even got DX12 yet and people are already posting about 12.1. Let's see what games actually use it, be it DX12 or DX12.1.
Hell, did many games even use 11.1?

Like I posted above, wait for the games. That's what matters.

12_0 is nothing more than DX11 feature-wise, but it lets cards make use of DX12's low-level optimisations. All the goodies are in 12_1 or higher. AMD are being disingenuous here.
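For anyone wondering how that's actually reported, here's a rough sketch (untested, just the public D3D12 API, nothing AMD-specific) of asking a device whether the hardware is 11_x, 12_0 or 12_1 class:

```cpp
// Minimal sketch: create a D3D12 device, then ask which feature level the
// hardware actually supports (11_0 is only the minimum needed for creation).
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1; // no DX12-capable adapter / runtime

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels))))
        std::printf("Max supported feature level: 0x%X\n",
                    static_cast<unsigned>(levels.MaxSupportedFeatureLevel));
    return 0;
}
```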
 
I agree with this. Many view Mantle/Vulkan and DX12 as a way for AMD to move the responsibility for things such as Crossfire support onto developers, and you could say that to a certain degree the responsibility is moved, but it's not a bad thing. It's actually quite a good thing. No one knows the game as well as the dev behind it, and once all these major game engines are locked down and tuned to support every major hardware vendor, proper support for features like Crossfire and SLI will be available at launch without the need for Nvidia or AMD to fiddle with drivers. At least I think that's what they are going for. It will take some time before these game engines are ready, but just look at Unreal Engine and how far along it is already, and I'm pretty sure it won't take DICE long before Frostbite is ready as well. CryEngine is, I think, also making the move, and all these devs are pretty talented.

I think it will be a good day when you can buy a product and focus on features, performance and price instead of driver support.

It's a nightmare to write drivers for games.
Developers had to deal with DX11 and lazy development from Microsoft. SLI/Crossfire will see a benefit with DX12.
You bought a 4-core or 8-core CPU and it just sits there mostly idle while you play games, when it could be utilised far more than a single core.
So now we'll see games get a bonus from DX12 on cards we've had for several generations, with a whole new level of performance unleashed thanks to the new API.
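That multi-core point is the big one: under DX12 an engine can record command lists on every core and only the final submission is serial. Rough sketch below (untested, bare-bones, no actual rendering set up, just the recording pattern):

```cpp
// Sketch of DX12-style multi-threaded command recording: one command
// allocator + command list per worker thread, recorded in parallel, then
// submitted together on a single queue. A real engine would record actual
// draw calls inside each worker.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <algorithm>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC qdesc = {};
    qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    const unsigned workers = std::max(2u, std::thread::hardware_concurrency());
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    for (unsigned i = 0; i < workers; ++i)
    {
        // Allocators are not thread-safe, so each thread gets its own.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel: this is where all those idle cores earn their keep.
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back([&, i] {
            // ... per-thread draw recording would go here ...
            lists[i]->Close();
        });
    for (auto& t : pool) t.join();

    // Serial part: hand everything to the GPU in one go, then wait on a fence.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());

    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```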

I found this interesting, and it's one reason AMD will have a good line-up for Windows 10 and gaming in the coming years.

[attached image: 33wpkyt.jpg]
 
Yeah, pretty sure people here have got far more than 10M on 980s.

Once again AMD are being disingenuous by using a test that requires the full resources of the system to do nothing more than draw simple polygons. Futuremark themselves have asked that it not be used as a benchmark. The only DX12 game-engine test we have seen shows NV having a huge performance advantage.
 
Such as?

The only thing we know for sure is 3DMark's API test, which shows AMD with the advantage.
 
I think this (DX12/Windows 10) and AMD's new cards are where the entire AMD driver team have been spending most of their time and money over the last several months, and as a result the current GPUs (more so for Crossfire users) and Windows 7/8/8.1 have suffered. Let's just hope it really pays off...

I am in no rush to get a new GPU, so I can hold off for a bit longer.

AMD have a driver team??? :confused::confused::confused:

Since when!?? :p
 
Meh, split frame rendering has been around since the dawn of multi-GPU gaming; it didn't work well because the workload in different parts of the screen can be vastly different. Nothing has changed; it's just a fundamental limitation. Sure, with DX12 developers probably have better control rather than relying on the drivers to split the screen, but that doesn't resolve the situation.

And the shared memory is nice and all, except it only has the advantage of preventing duplication of a resource that will only be used by one of the GPUs, which basically never happens. Textures, geometry, etc. are going to be needed on both GPUs because it would be far too slow exchanging resources between GPUs.
This works much better for compute applications.
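To put a number on the load-balance problem, here's a throwaway sketch (plain C++, made-up scene numbers, nothing to do with any real engine) of why a fixed top/bottom screen split leaves one GPU doing most of the work:

```cpp
// Toy illustration only: most of a typical frame's cost (terrain, characters,
// effects) clusters near the bottom of the screen, so a naive 50/50 split
// frame rendering scheme gives the two GPUs very uneven amounts of work.
#include <cstdio>
#include <random>

int main()
{
    std::mt19937 rng(42);
    std::normal_distribution<double> y_pos(810.0, 180.0);   // pixel row, 1080p frame
    std::uniform_real_distribution<double> cost(1.0, 10.0); // arbitrary cost units

    double top = 0.0, bottom = 0.0;
    for (int i = 0; i < 10000; ++i)
    {
        const double y = y_pos(rng);
        (y < 540.0 ? top : bottom) += cost(rng); // top half vs bottom half
    }
    std::printf("GPU A (top half):    %.0f units\n", top);
    std::printf("GPU B (bottom half): %.0f units\n", bottom);
    // The frame only finishes when the slower (bottom-half) GPU is done, so
    // the even-looking split buys far less than 2x in practice.
    return 0;
}
```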
 
+1

Also, layte, someone being a developer doesn't mean they are not talking BS, I should know! :p
 
I agree with this. Many view Mantle/Vulkan and DX12 as a way for AMD to move the responsibility for things such as Crossfire support onto developers, and you could say that to a certain degree the responsibility is moved, but it's not a bad thing. It's actually quite a good thing. No one knows the game as well as the dev behind it, and once all these major game engines are locked down and tuned to support every major hardware vendor, proper support for features like Crossfire and SLI will be available at launch without the need for Nvidia or AMD to fiddle with drivers. At least I think that's what they are going for. It will take some time before these game engines are ready, but just look at Unreal Engine and how far along it is already, and I'm pretty sure it won't take DICE long before Frostbite is ready as well. CryEngine is, I think, also making the move, and all these devs are pretty talented.

I think it will be a good day when you can buy a product and focus on features, performance and price instead of driver support.

You're looking at it with rose-tinted glasses. A lot of developers aren't going to bother fixing things that a small proportion of enthusiasts complain about; once they've got their money from the masses, most of whom are running Intel integrated graphics, what does it matter to them if a few AMD users are having a particularly bad experience in some extreme cases? Is the BF4/Mantle memory leak fixed yet? How long did that take? It obviously wasn't a priority for the developer, and those in whose interest it is (AMD) won't be able to fix it themselves via drivers. If this happens there's just going to be even more complaining from AMD about how all of the developers aren't doing their job properly. Nvidia will probably still be doing their thing with developers before release, so it shouldn't affect them much either way.
 
Confused as to why AMD are advertising any of it as an AMD-only thing. You can do all this with Nvidia and SLI, or an Nvidia card with onboard GPUs like the Intel ones.

If you look past the AMD logos on the pictured graphics cards, none of the text actually claims any of this as an AMD-only thing. It's simply AMD pushing the benefits of DX12 as a whole rather than pushing AMD (and adding pictures of their own GPUs, which makes sense if they've made the presentation).
 
Because they have no products to sell, just PR spin and hot air.
 
It seems AMD spend more time making slides and presentations than actually writing drivers. :p

They wrote Mantle, which large parts of DX12 are based on, and which the overwhelming bulk of Vulkan (plus some spicy new stuff) is based on.

Both should mean driver optimisation of titles is far less necessary. CF / SLI profiles should certainly be a thing of the past. Except all bets are off in NV GameBrakes titles ...
 
I know I know I was just baiting. :p
 