
Oxide Developer: “NVIDIA Was Putting Pressure On Us To Disable Certain Settings In The Benchmark”

Kind of surprised that Nvidia wanted Async Compute disabled. Interesting quote though.

Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware.

Maybe AMD's superior Async Compute performance will finally start to help them out in DX12. :D
 
I'm waiting to see a broader spread of DX12 benchmarks - this whole hysteria over 1 benchmark is getting silly - especially given the whole history of Oxide.
 
So basically nVidia wanted to tailor and shape the benchmark to run better on their hardware?? lol. The lengths they will go to. Disabling certain settings and stuff? Just wow.
 
So basically nVidia wanted to tailor and shape the benchmark to run better on their hardware?? lol. The lengths they will go to. Disabling certain settings and stuff? Just wow.

Depends what the goal is - vendor-specific paths have existed almost as long as games have. If they want to disable that feature wholesale at an image quality or performance penalty, that is one thing, but it might be that they can provide better performance at the same quality level with the feature disabled on their hardware, while on other hardware having it on might be a performance advantage. There's a rough sketch of what such a path looks like just below.
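For anyone wondering what a "vendor-specific path" actually looks like in practice, here's a minimal C++ sketch of the usual approach: query the adapter through DXGI and branch on its PCI vendor ID. This is purely illustrative and not Oxide's code - the ShouldUseAsyncCompute name and the policy inside it are made up for the example; only the vendor ID is the standard PCI one.

// Minimal illustrative sketch (not Oxide's code): branch an engine feature
// on the GPU vendor reported by DXGI. The function name and the policy are
// hypothetical; 0x10DE is NVIDIA's real PCI vendor ID (AMD's is 0x1002).
#include <dxgi.h>

bool ShouldUseAsyncCompute(IDXGIAdapter1* adapter)
{
    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    constexpr UINT kVendorNvidia = 0x10DE; // NVIDIA PCI vendor ID

    // Example policy mirroring what the thread describes: keep the async
    // compute path on everything except NVIDIA hardware, where it hurt
    // performance in this particular benchmark.
    return desc.VendorId != kVendorNvidia;
}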
 
So basically nVidia wanted to tailor and shape the benchmark to run better on their hardware?? lol. The lengths they will go to. Disabling certain settings and stuff? Just wow.

I wonder what AMD would have said if the benchmark had needed 5GB of VRAM to run!!

:D
 
So basically nVidia wanted to tailor and shape the benchmark to run better on their hardware?? lol. The lengths they will go to. Disabling certain settings and stuff? Just wow.

It would be interesting to see if anything more than Async Compute was actually disabled on their hardware. I also wonder whether AMD asked for anything to be disabled. Sounds to me like Nvidia are going the PR route instead of dealing with the problem.
 
It would be interesting to see if anything more than Async Compute was actually disabled on their hardware. I also wonder whether AMD asked for anything to be disabled.

Yeah, quite true mate! Well, Oxide have owned up to the fact that the only vendor-specific code in the benchmark is nVidia's, and that was only to disable Async Compute because nVidia requested it. However their drivers still reported the feature as working, which caused chaos, so Oxide just disabled it on their hardware. Still find it funny how, when Oxide refused to disable all the settings nVidia requested, they threw their dummy out of the pram and blamed the benchmark for having buggy code or something lol.
 
Apparently they do not make heavy use of async compute, but some games on the consoles are starting to. Most of their async work is to do with lighting. So essentially the AMD cards came up on par once the DX11 overhead was gone.

And if you read their game blog, they do say that their code is generic and that they would not implement code changes that hinder one vendor over another.
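Worth spelling out what "async compute" means at the API level too. In D3D12 an engine creates a second command queue of type COMPUTE and submits work like that lighting pass to it, so it can overlap with graphics work on hardware that supports it; with the path disabled, the same command lists just get executed on the normal direct (graphics) queue. Rough sketch below - generic D3D12, not anything from Oxide's engine, and the CreateAsyncComputeQueue name is made up for the example.

// Generic D3D12 illustration (not Oxide's code): a dedicated COMPUTE queue
// lets compute command lists run concurrently with the graphics queue.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateAsyncComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type  = D3D12_COMMAND_LIST_TYPE_COMPUTE; // separate queue for async work
    desc.Flags = D3D12_COMMAND_QUEUE_FLAG_NONE;

    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}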
 
Yeah, quite true mate! Well, Oxide have owned up to the fact that the only vendor-specific code in the benchmark is nVidia's, and that was only to disable Async Compute because nVidia requested it. However their drivers still reported the feature as working, which caused chaos, so Oxide just disabled it on their hardware. Still find it funny how, when Oxide refused to disable all the settings nVidia requested, they threw their dummy out of the pram and blamed the benchmark for having buggy code or something lol.

I think Nvidia releasing a petty PR statement like that says it all really, but again, I don't think this one game should be the decider. I'd be interested to see what games like ARK and Gears of War Ultimate Edition make of DX12.
 
I think Nvidia releasing a petty PR statement like that says it all really, but again, I don't think this one game should be the decider. I'd be interested to see what games like ARK and Gears of War Ultimate Edition make of DX12.

Yeah, quite right mate. I'm not judging any DX12 performance solely off this benchmark; I just like the indication of great performance gains on the AMD side, and the same for nVidia, but something is obviously wrong here for nVidia. Shows DX12 is the right direction for gamers.

But we do need more games. I'd be sceptical of basing DX12 off ARK too, as it's in alpha and quite buggy, but it's still going to be a DX12 game :)
 
Yeah, quite right mate. I'm not judging any DX12 performance solely off this benchmark; I just like the indication of great performance gains on the AMD side, and the same for nVidia, but something is obviously wrong here for nVidia. Shows DX12 is the right direction for gamers.

But we do need more games. I'd be sceptical of basing DX12 off ARK too, as it's in alpha and quite buggy, but it's still going to be a DX12 game :)

Well it's hard to say what exactly is causing Nvidia's relatively low DX12 performance, whether it be an architectural flaw with Maxwell or a driver problem.
 