I'm talking about dirty tactics that bring no visible benefit but heavily impact one vendor. I'm sure there are more recent examples.
HairWorks.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
The extreme tessellation reminds me of Crysis 2 (or was it 3?), where they even had massive tessellation on concrete blocks, and were rendering water even though it wasn't in the scene. Stupid crap to make one company look good when it really wasn't valid.
It reminds me of when the Nvidia FX series couldn't run FP32 precision very fast, so they cheated by forcing FP16 instead. FP32 wasn't really necessary then either, but who wants to buy a card that achieves the same level of performance as its competitor by cutting corners?
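To make the FP32 vs FP16 point concrete, here's a rough standalone C++ sketch (nothing to do with any actual driver; truncating the mantissa to 10 bits is only an approximation of FP16, and the real format also has a narrower exponent range) showing how a long accumulation drifts once precision is cut:

```cpp
#include <cstdio>
#include <cstring>
#include <cstdint>

// Crudely emulate FP16 rounding by truncating an FP32 mantissa to 10 bits.
// (Real FP16 also has a narrower exponent range; ignored here for brevity.)
float truncate_to_half_precision(float value) {
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof(bits));
    bits &= 0xFFFFE000u;              // keep sign, exponent, top 10 mantissa bits
    std::memcpy(&value, &bits, sizeof(value));
    return value;
}

int main() {
    // A typical shader-style computation: accumulate many small contributions.
    float full = 0.0f, reduced = 0.0f;
    for (int i = 0; i < 10000; ++i) {
        full    += 0.0001f;
        reduced  = truncate_to_half_precision(reduced + 0.0001f);
    }
    std::printf("FP32 accumulation:  %f\n", full);    // ends up close to 1.0
    std::printf("~FP16 accumulation: %f\n", reduced); // stalls well short of 1.0
    return 0;
}
```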
Even if high levels of tessellation are overkill now, they won't always be.
The invisible water would be culled before it hit the GPU.
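For what it's worth, this is the kind of CPU-side visibility test being described; a minimal sketch with made-up structure names, not code from any particular engine. If a mesh's bounding box fails the test, the draw call is skipped and the geometry never reaches the GPU:

```cpp
#include <array>

struct Plane { float nx, ny, nz, d; };          // plane: n·p + d = 0, normal points inward
struct AABB  { float minX, minY, minZ, maxX, maxY, maxZ; };

// Returns false when the box is completely on the outside of any frustum plane.
bool IsVisible(const std::array<Plane, 6>& frustum, const AABB& box) {
    for (const Plane& p : frustum) {
        // Pick the box corner farthest along the plane normal ("positive vertex").
        float px = (p.nx >= 0.0f) ? box.maxX : box.minX;
        float py = (p.ny >= 0.0f) ? box.maxY : box.minY;
        float pz = (p.nz >= 0.0f) ? box.maxZ : box.minZ;
        if (p.nx * px + p.ny * py + p.nz * pz + p.d < 0.0f)
            return false;                       // fully outside this plane -> cull
    }
    return true;                                // potentially visible -> submit the draw
}
```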
It's ironic that people bring up a game from 2011, as it seems AMD have not moved their tessellation performance on since that time.
HairWorks.
That is where AMD should have learned and provided better tessellation support so as to not get caught out again. Seems they didn't. I hope the same doesn't happen for AMD or Nvidia, with one of them crying foul play when DX12 features are used that the other can't run.
They used such and such just to cripple performance on such and such hardware....I can see it now.
Actually, I played with the tessellation slider on the Fury X (CCC) and it really didn't make that much difference to performance. I was expecting huge gains but it didn't happen.
It would start doing real damage to developers if they go on releasing games that run like a bucket of rusty nails (on all hardware).
That's because tessellation performance on the Fury X is already fairly high; reducing the tessellation level is much more effective on older GCN hardware.
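A rough back-of-the-envelope sketch of why that slider matters (all numbers here are made up, not measured from any game): triangle output grows roughly with the square of the tessellation factor, so capping the factor cuts the geometry load drastically on GPUs with weak tessellation throughput:

```cpp
#include <cstdio>
#include <algorithm>

int main() {
    const int patches = 10000;                 // hypothetical patch count in a scene
    const int requestedFactor = 64;            // worst case a game might ask for
    const int caps[] = {64, 16, 8};            // no cap, then two slider-style settings
    for (int cap : caps) {
        int factor = std::min(requestedFactor, cap);
        // Very rough estimate: ~2 * factor^2 triangles per tessellated patch.
        long long tris = static_cast<long long>(patches) * factor * factor * 2;
        std::printf("cap %2d -> ~%lld triangles\n", cap, tris);
    }
    return 0;
}
```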
Conservative rasterization - I've got an image of David Cameron wearing a rasterfarian tea cosy style hat.
AMD could have doubled tessellation performance in hardware on the 390X simply by updating it to GCN 1.2 (still **** but better than 2011 levels). Their accepted driver cheats meant they had no incentive to actually improve the hardware, so you all just got a glorified rebadge.
Nvidia's GeForce 900 series is no longer the only hardware supporting DirectX 12 feature level 12_1. The biggest surprise is that Intel has now joined Nvidia: Skylake supports feature level 12_1, leaving AMD behind in the dust on feature level 12_0.
The interesting thing with Skylake is that it supports Conservative Rasterization Tier 3 while Nvidia only supports Tier 1, and Skylake supports other DirectX 12 features that neither Nvidia nor AMD do: Cross Adapter Row Major Texture, VP And RT Array Index From Any Shader FRSWGSE, PS Specified Stencil Ref, UMA and Cache Coherent UMA.
http://www.pcgameshardware.de/Core-...ylake-Test-Core-i7-6700K-i5-6600K-1166741/#a3
Skylake also supports DirectX 12 Resource Binding Tier 3, the same as AMD, while Nvidia's GeForce 900 series supports Tier 2.
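If anyone wants to see where these tier numbers come from, here's a minimal D3D12 sketch (Windows only, error handling trimmed) that queries the same caps on whatever GPU you run it on:

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device))))
        return 1;                                // no D3D12-capable adapter

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    D3D12_FEATURE_DATA_ARCHITECTURE arch = {};   // UMA / cache-coherent UMA live here
    device->CheckFeatureSupport(D3D12_FEATURE_ARCHITECTURE, &arch, sizeof(arch));

    std::printf("Conservative rasterization tier: %d\n", opts.ConservativeRasterizationTier);
    std::printf("Resource binding tier:           %d\n", opts.ResourceBindingTier);
    std::printf("Tiled resources tier:            %d\n", opts.TiledResourcesTier);
    std::printf("PS specified stencil ref:        %d\n", opts.PSSpecifiedStencilRefSupported);
    std::printf("Cross-adapter row-major texture: %d\n", opts.CrossAdapterRowMajorTextureSupported);
    std::printf("UMA / cache-coherent UMA:        %d / %d\n", arch.UMA, arch.CacheCoherentUMA);

    device->Release();
    return 0;
}
```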
By the looks of that chart Intel have AMD and Nvidia completely slammed.
I'll sell you one, 4790k chucked in, all on the same package. Hope you got very, very good cooling.
All hail Intel, I'm off to buy one of their GPUs....
People are still point scoring about DX12 feature level support. Wow. Because as we all know, games are so quick at supporting all the features of a new API... oh wait.
I'm looking to go Intel soon ^^^^^
The whole DX12.1 thing is a joke; there is actually no such thing as far as MS are concerned. Nvidia support Conservative Rasterization while AMD don't, so Nvidia calls that DX12.1.
It's a bit like AMD calling their cards DX12.2 because they have Tiled Resources Tier 3 and Specified Stencil support while Nvidia don't. Ridiculous DX12 one-upmanship.
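For anyone curious, this is how an application asks the D3D12 runtime which feature level a card reports, which is where the 12_0 / 12_1 labels in these charts come from; a minimal sketch (error handling trimmed):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device))))
        return 1;

    // Ask the runtime for the highest of these levels the device supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));

    // 0xc000 is 12_0, 0xc100 is 12_1.
    std::printf("Max supported feature level: 0x%x\n",
                static_cast<unsigned>(levels.MaxSupportedFeatureLevel));
    device->Release();
    return 0;
}
```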
Wonder how long it will be before we even see any DX12 games. I'd hazard a guess that we will all be on newer cards than we have now by the time any sizeable number are out.