Balkanized Gaming - Mantle & GameWorks

Well worth a read. :)

At the end of the day, GameWorks libraries are still proprietary Nvidia implementations of DirectX 11 functions. We can assume that those libraries will always maintain basic compatibility with Intel and AMD hardware -- but Nvidia will never go out of its way to make GameWorks libraries perform well on competing hardware. With those libraries baked into Unreal Engine 4 by default, the burden is now on AMD and Intel to either fund the creation of third-party replacements or accept whatever level of performance Nvidia allows.

It’s worth noting, however, that AMD isn’t exactly the flawless white knight in this situation. While Mantle offers developers more freedom, many readers have argued that the API either is, or soon will be, open source. It isn’t, as Robert Hallock of AMD confirmed last fall. AMD has discussed offering an SDK at some point in the future, but an open media SDK and code samples do not make the API open source.

AMD has announced no plans to open source Mantle at this time. Until it does, claims that Nvidia could implement Mantle are spurious — even if Nvidia wanted to use Mantle, it can’t implement the standard without permission from AMD.

Why GameWorks worries me

Many readers have raised the question of why I see GameWorks differently than Mantle, given that both are proprietary implementations of either an API or of key facets of one. Developer freedom to collaborate with every IHV (independent hardware vendor) is only part of it. The reason I’m cautious on this front is that we’ve seen what happens when a vendor -- any vendor -- has too much sway over a supposedly neutral standard. This is one area where Nvidia’s decision to target Direct3D, as opposed to its own proprietary API, is actually quite smart. If Mantle were hypothetically compatible with Nvidia cards but ran terribly on them, NV users would either blame AMD for sabotaging the standard or conclude that Mantle offered no performance advantages for non-GCN cards.

But what if users fire up Daylight, Fable Legends, and Fortnite -- each made by a different studio, but all based on the Unreal 4 engine -- and see AMD and Intel lagging Nvidia in what they identify as three different games, each utilizing the supposedly neutral DirectX API? At that point, the user is far more likely to conclude that the problem lies with AMD and Intel’s graphics hardware or their miserable driver teams. We’ve seen precisely this kind of manipulation play out in the past, thanks to the Intel compiler’s “cripple AMD” function.
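For readers who don't remember that episode: the compiler's runtime dispatcher selected optimized code paths based on the CPU's vendor string rather than its actual feature flags, so AMD chips with identical instruction support got the slow path. A minimal sketch of the two approaches, using GCC/Clang's <cpuid.h> (the function names are illustrative, not Intel's actual dispatcher code):

```cpp
// Sketch: vendor-string dispatch vs. feature-flag dispatch (illustrative only).
// Build with GCC or Clang on x86; <cpuid.h> provides the CPUID intrinsics.
#include <cpuid.h>
#include <cstdio>
#include <cstring>

// The discredited approach: enable the fast path only on "GenuineIntel",
// so an AMD chip with identical SIMD support silently takes the slow path.
bool fastPathByVendor() {
    unsigned eax, ebx, ecx, edx;
    char vendor[13] = {0};
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    std::memcpy(vendor + 0, &ebx, 4);   // vendor string is packed EBX,
    std::memcpy(vendor + 4, &edx, 4);   // then EDX,
    std::memcpy(vendor + 8, &ecx, 4);   // then ECX
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

// The vendor-neutral approach: ask the CPU what it can actually do.
// CPUID leaf 1, ECX bit 19 reports SSE4.1 support on any x86 vendor.
bool fastPathByFeature() {
    unsigned eax, ebx, ecx, edx;
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    return (ecx & (1u << 19)) != 0;
}

int main() {
    std::printf("vendor gate: %d, feature gate: %d\n",
                fastPathByVendor(), fastPathByFeature());
}
```

On an Intel CPU both gates agree; on an AMD CPU with SSE4.1, only the feature gate says yes, which is precisely how the "cripple AMD" behaviour arose.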

Are gamers best served by bifurcated markets?

I don’t normally link ten-year-old articles to reinforce a point, but I think it’s worth noting that I’ve never been convinced that vendor-centric optimizations are a great thing for the gaming market as a whole. When Doom 3 and Half Life 2 launched, with Doom 3 running better on NV hardware and HL2 running far better on ATI, I raised many of the same concerns I’ve expressed over GameWorks. I see Mantle as more akin to PhysX, in that it’s a special operating mode offered by just one vendor, but were AMD to try to lock people into the standard, I’d have exactly the same problem.

Ultimately, what I care about most is that gamers -- all of them -- are able to enjoy the games they purchase on hardware that should be capable of running them. By entrenching itself at the heart of one of the most popular engines on Earth, Nvidia has a great opportunity to create novel gaming experiences -- but it’s going to come under a great deal of scrutiny. If the GameWorks libraries are principally tapped to offer effects like TXAA or PhysX support, that’s one thing. If they’re used to create an unfair performance advantage, that’s something else entirely. It’ll be up to Nvidia which way that situation plays out.

Full Article
http://www.extremetech.com/gaming/1...counters-with-mantle-integration-in-cryengine
 
Nice article; it's good to see both sides taking some stick from a journalist, rather than an article being completely one-sided.
 
I like the way that, in order to stress his point, he cites a load of waffle he wrote a decade ago, on the basis that because it's now old it's somehow less wrong :P (Valve did work with ATi on HL2, but they worked with Nvidia too, and id worked with Nvidia on Doom 3 but also with ATi. Doom 3 didn't run better on Nvidia hardware because it was designed with a bias; it ran better because it used OpenGL, and in those days Nvidia were simply better at OpenGL than ATi.)
 
Hi Ubersonic, I'm the author of the story. I'm afraid you've forgotten the story of why Doom 3 and HL2 ran so well on NV and ATI cards respectively.

In Doom 3's case, the GF6 family was far more efficient than the Radeon cards of the day at calculating and discarding pixels that didn't need rendering. This advantage was specific to D3 -- the Radeon cards were still considered to have more efficient early Z-cull in general, but the GF6 had some specific capabilities relevant to Carmack's new engine. It also had multiple advantages in shadow-rendering techniques. ATI's driver support for OGL certainly played a part, but analysis demonstrated that the game was simply a better fit for the GF6 architecture.

A full discussion of this is available here:

http://alt.3dcenter.org/artikel/2004/07-30_english.php
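For anyone who wants the mechanics: Doom 3 laid down a depth-only pre-pass, then relied on the GPU's early-Z rejection to skip occluded pixels during its many per-light shading passes. A rough sketch of that pattern in classic OpenGL -- drawSceneGeometry() is a stand-in for the application's real draw calls, not code from the engine:

```cpp
// Sketch of a depth pre-pass, the pattern Doom 3's renderer was built around.
// Assumes an active OpenGL context; drawSceneGeometry() is a placeholder.
#include <GL/gl.h>

void drawSceneGeometry() { /* issue draw calls for all opaque geometry */ }

void renderFrame() {
    // Pass 1: write depth only. No colour writes, so no shading cost.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_TRUE);
    glDepthFunc(GL_LESS);
    drawSceneGeometry();

    // Pass 2+: the expensive lighting passes. With the depth buffer already
    // final, early-Z can reject every hidden fragment before its fragment
    // shading runs -- and how efficiently a GPU performs that rejection is
    // exactly the hardware difference described above.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_FALSE);   // depth is already correct; don't rewrite it
    glDepthFunc(GL_EQUAL);   // shade only the surviving, visible surfaces
    drawSceneGeometry();
}
```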

ATI dominated Half Life 2's *first* benchmark releases in 2003; the FX family suffered from a lack of parallelism in the core and some poor forecasting from Nvidia regarding where the market was headed. The problem with HL2 was that ATI had launched a huge promotion around the game. Valve had released benchmark results in 2003 showing how the Radeon 9800 family destroyed Nvidia in benchmarks -- and then the game was pushed back an entire year.

Fast forward to 2004 and the X800 and GF6 families were far more competitive with each other in HL2, but plenty of customers had already bought ATI cards in 2003 because they expected HL2 to ship within a matter of months and didn't see Nvidia as a credible option given the FX family's poor performance. I was profoundly angry about this at the time, because Gaben later admitted that he'd lied to journalists about the shape of the code. Many websites ran the HL2 benchmark data because Gaben told us that the game was about to launch and that the results we saw in August or September 2003 would be indicative of final performance. By the time the game was actually ready, an entire new family of cards had shipped from each vendor.

Nonetheless, my position then and now was not waffling. In the wake of Nvidia's "brilinear" filtering debacle and what I saw as misleading statements regarding HL2's ship date, we saw two of the most prominent games released that year being used to create marketing camps -- one that favored Nvidia, one that favored ATI. I remain unconvinced that such camps are in the best interest of gamers as a whole.
 
Sorry, but I have only skim-read this, and while I can see Joel's point, don't forget that on max settings, GameWorks runs better on a 290X than it does on a 780 Ti. In the Batman: Arkham Origins bench thread, it took a massively overclocked KPE 780 Ti to beat Kaapstad's 290X.
 
And that's because AA is not something Nvidia can lock down. That optimization can be done driver-side, and as GCN is superior to Kepler when it comes to AA, AMD can overpower the GameWorks advantage and take the lead. Remember, folks: without 8xAA, and with the GameWorks effects in full swing, a 770 (a mid-range part) is actually 5fps faster than a 290X (the fastest GPU out there bar a 780 Ti; both high-end parts) at 1080p. How that is possible remains unclear.
 
I will use Martini as an example, as he is the only one I know from AMD who has actually played Batman: Arkham Origins, and his words were: "It runs better than the previous games and runs so much smoother". That, to me, is a plus, and not the complete tosh that I am reading from people who have not even played it.

Do you think nVidia have crippled GameWorks on AMD hardware, then? Joel went on to say they overused tessellation in his previous dig at GameWorks, but it has clearly been proven that tessellation runs better on AMD hardware. You make out that you are being hard done by because GameWorks is controlled by nVidia, but this clearly isn't the case.
 
"but it has clearly been proven that tessellation runs better on AMD hardware."

Where? What I saw from performance modeling (including examination of driver calls at various forced tessellation levels and at the default, application-set level) indicated that tessellation consumed much more time on the R9 290X than on the GTX 780 Ti.

However, this wasn't because of a GameWorks-specific library. The tessellation issue was caused by pumping huge amounts of tessellation into a scene, in a manner that arguably slowed the card without meaningfully improving the image quality the end user actually perceived. It wasn't as bad as Crysis 2, which poured millions of triangles into flat surfaces and rendered entire oceanscapes underneath solid terrain, but the impact was there.
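To make the waste concrete: a screen-space-adaptive scheme scales the tessellation factor with a patch's projected size, while a fixed maximum factor spends the same triangle budget on a patch covering a dozen pixels as on one filling the screen. A small sketch of that difference, with plain C++ standing in for the hull-shader math (the one-subdivision-per-8-pixels target is an illustrative tuning value, not a figure from any shipped title):

```cpp
// Sketch: fixed vs. screen-space-adaptive tessellation factors.
#include <algorithm>
#include <cstdio>

// Over-tessellation: one large factor everywhere, even for distant
// patches that project to a handful of pixels.
float fixedFactor() { return 64.0f; }

// Adaptive: aim for roughly one subdivision per 8 pixels of projected
// edge length (illustrative target), clamped to the hardware's [1, 64]
// tessellation-factor range.
float adaptiveFactor(float edgeLengthPixels) {
    return std::clamp(edgeLengthPixels / 8.0f, 1.0f, 64.0f);
}

int main() {
    // A distant patch edge spanning 16 pixels: factor 2 is plenty, yet a
    // fixed factor of 64 spends ~32x the geometry for no visible gain.
    std::printf("fixed: %.0f, adaptive: %.0f\n",
                fixedFactor(), adaptiveFactor(16.0f));
}
```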
 
My concern with GameWorks is that the libraries could be used to implement standard features, and that third-party developers could be pressured into developing games in that manner.

As long as they are used to implement extended features, especially features that might be beyond the scope of a smaller studio to implement from scratch in the first place, I don't really see them as any different from other libraries like Enlighten.
 