Probability over the next 4+ years that DX12 Feature Level 12_1 won't deliver tangible benefits?

I'm considering the long-term (4+ year) value of a cheaper card, as opposed to a Maxwell 2 purchase, in relation to DX12 feature level support. (Fury/GCN 1.2 is 12_0, it would seem?)

Is it reasonable to predict any of the following?
  • that 12_1 features won't be adopted in games in a significant way, because consoles don't* support this feature level (or for another reason)?
  • that if 12_1 features were adopted, they could be emulated to no significant detriment on a high-end CPU?
  • that supporting this feature level is of no benefit in general, or to the current class of hardware (e.g. it doesn't have the horsepower to use it)?

I'm wondering how much value to ascribe to Feature Level 12_1 support over the long term...

*(an assumption based on the age/class of the console GPUs).

thanks
 
One of the features actually reduces computation requirements for a specific case.

'Order-independent transparency' in hardware reduces the computation required for accurate transparency when dealing with multiple transparent layers (Rasterizer Ordered Views is the DX12 name for it).

But as you said, the majority of the other features would likely end up as ultra settings in some PC games. Otherwise, the main benefit of DX12 is the low abstraction, and hopefully a transition to some of the features DX11 brought which devs still don't use, such as better material formats that offer higher quality at the same file size.
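
For what it's worth, ROV support is exposed as an individual capability bit in D3D12 rather than only through the feature level, so an engine can probe for it once at device creation. A minimal C++ sketch, assuming an already-created ID3D12Device; the helper name is made up for illustration:

    #include <d3d12.h>

    // Hypothetical helper: returns true when the device exposes
    // Rasterizer Ordered Views. 'device' is an already-created device.
    bool SupportsROVs(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
        if (FAILED(device->CheckFeatureSupport(
                D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
            return false;
        // ROVsSupported is the individual cap behind 'Rasterizer
        // Ordered Views'; a title can fall back to a sorted-blend
        // transparency path when it is FALSE.
        return options.ROVsSupported != FALSE;
    }

A renderer would typically branch on something like this once at startup and pick its transparency path accordingly.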
 
One of the features actually reduces computation requirements for a specific case.
...

So if a game has its characters wearing sunglasses when sitting in their cars it could bring benefits so you can see their eyes better behind the multiple transparent surfaces... :)
 
...
that if 12_1 features were adopted, they could be emulated to no significant detriment on a high-end CPU?
...
They can also be done on the GPU, just not in dedicated hardware if they're not in the card's feature level (also note that not having a given feature level doesn't mean all of its constituent features are missing, just that at least one is).

Both the feature level and the resource binding tier are as good as totally irrelevant; all new (any GCN, Maxwell, Kepler) cards will be good for a few years yet. Remember, games aim for the installed GPU base, not the currently selling GPUs.
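
To make the "check the bits, not the level" point concrete, here is a rough C++ sketch of asking a D3D12 device for its highest feature level and then for the individual caps; the function name is invented, and 'device' is assumed to already exist:

    #include <d3d12.h>

    // Hypothetical helper: queries the feature level and per-feature caps.
    void QueryDeviceCaps(ID3D12Device* device)
    {
        // Highest feature level the device supports, of those we list.
        D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
            D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
        D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
        levels.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
        levels.pFeatureLevelsRequested = requested;
        device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                    &levels, sizeof(levels));
        // levels.MaxSupportedFeatureLevel now holds e.g. D3D_FEATURE_LEVEL_12_0.

        // The constituent features are also reported one by one, which is
        // why a card that misses the 12_1 level as a whole may still have
        // some of its pieces.
        D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                    &options, sizeof(options));
        // e.g. options.ResourceBindingTier (the binding tier mentioned
        // above), options.ConservativeRasterizationTier, options.ROVsSupported.
    }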
 
So if a game has its characters wearing sunglasses when sitting in their cars it could bring benefits so you can see their eyes better behind the multiple transparent surfaces... :)

It just means there is dedicated hardware for performing transparency sorting. It can be done on the GPU as it is now, but the current methods require more computation and time, which adds a latency penalty, so complex transparency tends to be avoided.
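
For a sense of what that "current method" looks like: without ROVs, engines typically sort transparent surfaces back to front by view depth every frame before blending, which is the extra work (and the source of artifacts with intersecting surfaces) being described. A toy C++ sketch, with the types invented for illustration:

    #include <algorithm>
    #include <vector>

    struct TransparentDraw {
        float viewDepth;  // distance from the camera along the view axis
        int   meshId;     // stand-in for the real draw data
    };

    // Classic painter's-algorithm pass: sort back to front, then blend
    // in that order. This per-frame sort is the extra computation that
    // hardware ROVs make unnecessary.
    void SortTransparents(std::vector<TransparentDraw>& draws)
    {
        std::sort(draws.begin(), draws.end(),
                  [](const TransparentDraw& a, const TransparentDraw& b) {
                      return a.viewDepth > b.viewDepth;  // farthest drawn first
                  });
    }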
 
Remember, games aim for the installed GPU base, not the currently selling GPUs.

This makes sense; it's just that next month my 2GB 5850 will be five years old.

In four to five years, a card I buy tomorrow will probably be mixed into the 'installed GPU base' along with the next-gen 16nm GPUs and the four years' worth that follow (all presumably supporting FL 12_1)...

In the past, I think point releases were not announced until after the initial release; here we have a point release already announced prior to the initial release...
 
GCN for the here and now, Maxwell for those who buy every 10 years. :)

These things move very slowly; look at how long UE4 is taking to transition.
 
It'll be like DX11, which still isn't de facto in PC games. Indeed, a game I am beta testing now, which isn't due out until next year at the earliest, falls back to DX9 without being explicitly prodded (ooh, err, missus).
 
Well, it depends: if MS upgrades the Xbone firmware to run a Win 10 based OS, then you will see MS pay devs to implement DX12 ASAP. Consoles hold the PC back; they always have.
 
DX9 is already dead in the majority of upcoming games and ports from what I can see; most want DX10 at minimum. But the uptake of DX12 should be more rapid than that of previous DX versions, simply because of the low overhead and the sharing of core rendering engine code between console APIs and DX12.
 
Well, it depends: if MS upgrades the Xbone firmware to run a Win 10 based OS, then you will see MS pay devs to implement DX12 ASAP. Consoles hold the PC back; they always have.

No 'depends' about it: Xbox One will have Windows 10, and they want it "synced" across all MS devices, XB1/PC/SP and phone. Personally, owning all bar the phone, it's something I'm looking forward to.
 
Today's GPU you buy for two years or so. The die shrink that happens next year will take about a year to mature; once it has matured, you'll have midrange cards offering HBM2 and small form factors with similar or better performance than today's enthusiast cards.

A game title takes around 18 months to 4 years to make on a new engine, and 4 years is a long way out.
 
I'm considering the long-term (4+ year) value of a cheaper card, as opposed to a Maxwell 2 purchase, in relation to DX12 feature level support.
...

If you're going for the best bang for the buck and have some knowledge of the matter, go for the second-hand market: buy a GPU that is fine for you now, sell it before it loses a lot of value, add some money to that, and upgrade again.
 
Today's GPU you buy for two years or so. The die shrink that happens next year will take about a year to mature.
...

I am sure that current-gen GPUs can last until the end of the current console era at 1080p, given the improved specifications of PC GPUs and the limited console resolution, so I am planning for a purchase to last another five years.
 