Why shouldn't they? It's their hardware; they can do whatever they like.
I think you missed the point being made entirely.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Why shouldn't they? It's their hardware; they can do whatever they like.
Who cares - not me!
If you're of the hippy tree hugging variety I could see this might bother you.
WB blocking AMD support.
I'm getting substandard performance on my AMD GPU with this game. I have been waiting for AMD to make some driver optimizations for it, and I found it very strange that after all this time there has been nothing from them.
After some digging around, it seems that's never going to happen. Under Nvidia's instruction, it seems, Warner Brothers are blocking AMD from making any performance optimizations.
"because AMD can’t examine or optimize the shader code, there’s no way of knowing what performance could look like. In a situation where neither the developer nor AMD ever has access to the shader code to start with, this is a valid point. Arkham Origins offers an equal performance hit to the GTX 770 and the R9 290X, but control of AMD’s performance in these features no longer rests with AMD’s driver team — it’s sitting with Nvidia.
There’s a second reason to be dubious of Arkham Origins: it pulls the same tricks with tessellation that Nvidia has been playing since Fermi launched. One of the differences between AMD and Nvidia hardware is that Nvidia has a stronger tessellation engine. In most games, this doesn’t matter, but Nvidia has periodically backed games and benchmarks that include huge amounts of tessellation to no discernible purpose. Arkham Origins is one such title."
I'm not happy about this. I paid Warner Brothers my hard-earned money for a product; that product is substandard, and they are blocking my hardware vendor from fixing it.
I want to know what Warner Brothers have to say about this.
In Nvidia’s GameWorks program, though, all the libraries are closed. You can see the files in games like Arkham City or Assassin’s Creed IV — the file names start with the GFSDK prefix. However, developers can’t see into those libraries to analyse or optimize the shader code. Since developers can’t see into the libraries, AMD can’t see into them either — and that makes it nearly impossible to optimize driver code.
AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins performance in tessellation, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it’s a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.
"And while we acknowledge that current GameWorks titles implement no overt AMD penalties..."
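For anyone curious whether a game they own ships GameWorks libraries, the article's point about the GFSDK file prefix is easy to check yourself. A minimal sketch (the function name and the example install path are just illustrations, not anything official):

```python
import glob
import os

def find_gameworks_libs(game_dir):
    """Return any GameWorks library files (GFSDK prefix) found
    anywhere under a game's install folder."""
    pattern = os.path.join(game_dir, "**", "GFSDK*.dll")
    return sorted(glob.glob(pattern, recursive=True))

# Example usage with a hypothetical install path:
# find_gameworks_libs(r"C:\Games\Batman Arkham Origins")
```

If that returns anything, the game is using closed GameWorks libraries of the kind described above.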
So does all this make it fair? No, not really. Is it the way the industry works? Yes, unfortunately; a lot of the major players do this sort of thing. It's the way the world goes round.
The thing is, though, you should care; in fact, you should be sending letters of complaint to Nvidia and the guys behind the game.
Nvidia are adding tessellation to pointless degrees, which reduces YOUR performance, in an attempt to reduce AMD users' performance by an even greater degree.
Are you happy that the money Nvidia pays the devs, money that comes from you (and everyone who buys Nvidia products), isn't being used to add tessellation in a way that genuinely improves the IQ, rather than in a deliberately harmful way? Because that is what is happening.
When will people realise that Nvidia are hurting their OWN performance simply because they know they can hurt AMD's performance by a larger percentage? All that power in Crysis 3 could have been used for something that actually improved the IQ, and at a smaller performance cost, because beyond a certain point more tessellation provides no IQ improvement, just a performance hit. Instead, Nvidia users say they are happy with this worthless use of performance-hogging tessellation that hurts their OWN performance.
I can turn down tessellation in the AMD control panel to simply bypass Nvidia's attempts; however, I'd be happier if the devs actually added tessellation where it improves IQ, but maybe that's just me.
Maybe Nvidia users really are happy about having flat surfaces tessellated to reduce their own FPS for no reason at all.
No - that's why you care. I shouldn't care just because you say I should care.
I don't care because I don't care. If I was getting naff performance I might be bothered but my FPS is good in everything I play so...
Troll thread by AMD fanboy.
Not much point hanging around here.
Furry muff, but if you don't care, it's time to leave the thread and let the people who do care discuss it.
Was just replying to the wall of text assuming I should care and send a letter of protestation (lol) because somebody else is bothered.
The people who do care will be the same people who posted similar opinions in the Mantle thread and have probably already been over the same points. As it's Christmas and all, I guess...
Thank god Mantle won't stop Nvidia optimizing for DirectX 11, though. Can you imagine the uproar if the tables were turned and a developer using Mantle suddenly meant that Nvidia could not provide any performance improvements or optimizations in their drivers?