Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

Wasn't Mantle already stated to be multi-vendor compatible?

Edit: Yeah, they did, see here:
http://www.dsogaming.com/news/amds-mantle-does-not-require-gpus-with-gcn-architecture/

In theory it COULD work with "any modern GPU", but it is not currently multi-vendor compatible. Making it so would require other vendors to work directly with AMD to add support for their GPUs to Mantle, just like those other vendors currently work with Microsoft to get their pet features added to DX.

Currently it only supports GCN (well, "currently" in the sense that no games actually support it yet).
 
Captain Obvious for the first part of your post :rolleyes:, which is why I wrote that someone needs to explain to me how this is any different :D. Also, you haven't really explained anything.

Currently Mantle locks Nvidia out at a hardware level (requiring GCN) for any functionality.

TWIMTBP titles lock AMD out at a software level for the Nvidia-optimized code.

Someone explain why Nvidia is the bad guy? Nvidia is bad for PhysX and TWIMTBP, but AMD is a savior for Mantle... When coming from a neutral viewpoint, it just looks like two companies trying to screw each other with proprietary features that require proprietary hardware/software to function. Both as bad as each other.

Yep, you still don't get it, so it clearly is not obvious to you.



TWIMTBP titles lock AMD out at a software level for the Nvidia-optimized code, no problem, but that does not lock out DX11 optimization for AMD.

Currently Mantle locks Nvidia out at a hardware level (requiring GCN) for any functionality, no problem, but that does not lock out DX11 optimization for Nvidia.

This GameWorks title locks AMD out at a software level for the Nvidia-optimized code AND locks out DX11 optimization for AMD. The difference is clear, and it has been said by others as well, but you can't grasp it. If it was anything like the other two cases above, there would be no article with this premise.
 
Might force AMD to make better cards :p

In all seriousness, Batman Arkham Origins runs much smoother than City ever did, so maybe Warner Brothers are just crap, because as of right now this is a non-story to me. I'm getting better IQ and better performance than in City, and while, yeah sure, I think the frame rates aren't as high as they could be, it's a damn sight better than City (which wasn't GameWorks).

I would love AMD to buy their way into making a highly anticipated AAA game Mantle-only :p. Not because I agree with it, but because I think it's the only way to make Nvidia invest in Mantle.
 
lol, 4K reviews have the 290X just as fast as, if not faster than, the GTX Titan.

I get the GPGPU performance, though one does wonder why Nvidia stripped so much compute power out of their gaming cards to start with; now retail GK104 Kepler cards are weaker than retail Fermi cards.

I know you like to return the favour to those who said 2GB is not enough on the GTX 680 (it's still not enough now that they have re-badged that GPU as the GTX 770), but I have yet to see where 4GB is not enough, and it's more than 3GB :D

You mean those reviews where they run 4K with reduced settings so those 4GB cards can cope? And then there are next year's upcoming games (just in time for cheap 4K monitors) that will really push a 4GB card (the 290X is obsolete for 4K even before it got on the shelves).

As a 290X owner I can say I really like the cards, but I won't pretend they can do stuff they cannot.

On the other hand, I will also defend them, for example last night by pointing out that they run the latest Batman game faster than a Titan does at 1600p.

I really wish you were using Nvidia as well as AMD; then your viewpoint would not be so one-sided.

Back on AMD cards: your Valley thread is dominated by Nvidia cards, but this may change in the next few weeks. ;)
 
Might force AMD to make better cards :p

In all seriousness, Batman Arkham Origins runs much smoother than City ever did, so maybe Warner Brothers are just crap, because as of right now this is a non-story to me. I'm getting better IQ and better performance than in City, and while, yeah sure, I think the frame rates aren't as high as they could be, it's a damn sight better than City (which wasn't GameWorks).

I would love AMD to buy their way into making a highly anticipated AAA game Mantle-only :p. Not because I agree with it, but because I think it's the only way to make Nvidia invest in Mantle.

I am not going to pretend I know all the ins and outs of Mantle, or anything similar that Nvidia may come up with, but what I was thinking is that all these new things may put pressure on Microsoft to come up with something that runs a lot more efficiently than DirectX 11 and that anyone can use.
 
Next-gen consoles are bound to push textures. So where 3 to 4GB may be OK even at 1440p for now, it's going to be a close call come Q3 in some titles. It's not a bad thing, but as said, it's pretty obvious there will be walls being hit at 4K very soon on current-gen cards.

Even 3GB for 1440p is nail-biting, looking occasionally at the usage. Like Kaap, I'm going by how I like to play games: I pay for a flagship card so I can knowingly max out all the settings. I don't particularly want to be turning down detail in order to play at higher resolutions, as it defeats the object. At least IMO.

Which is why I don't really pay any attention to a lot of the 4K reviews currently. It's a non-issue until the panels become more affordable, but case in point.
 
http://www.hardwarepal.com/amd-release-batman-arkham-origins-fix/

So AMD had a fix back in October to get the frames up from 11 on a 7970 at 1440p, but now they are locked out? I confess to not knowing the ins and outs of coding, but if they could do that, can someone explain to me how they can't get more optimizations?

The great news is that AMD has released their new 13.11 Beta v6 driver that fixes most of these issues and you should be good to go with these drivers. The HardwarePal team was quick to get their benchmark troubleshoots out and AMD responded in less than 24 hours, nice one!!! We have uploaded our new Batman Arkham Origins benchmarks.

Genuine question, and I must be missing something. Another point that is bugging me in the original article...

Here’s what the ground looks like in Arkham Origins: First, the actual model, and second, the wireframe in DX11. Pouring triangles into these surfaces can make them look subtly more realistic, but it’s also a cheap way to disadvantage a competitor. Deep performance inspection reveals that the R9 290X takes 30-40% more time in tessellation per frame than an equivalent Nvidia card.

[Image: Arkham Origins ground, actual model vs. DX11 wireframe]

I can't see how making the game look good is disadvantaging the competitors. That is a daft comment right there. We may as well have 1997-style GFX so iGPU users can have decent performance. Sorry, but for me, I want the candy, and the candy in B:AO looks fantastic. Walking and scrapping in the snow has a wow effect and I love to see it. I would hate to see that dumbed down.

The first three scenes of the benchmark in Arkham Origins hammer tessellation. AMD’s driver allows us to manually define the tessellation level — changing that setting to x4 improves performance in the first three scenes of the test by 11%, from 134fps to 150fps. Total test performance improves by 7%, from 148fps to 158fps. AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins performance in tessellation, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it’s a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.

If a two-gen-old card like a 6970 has to tone down some tessellation, then I am stunned. I expect even a 570 would have issues and need to turn settings down.

My 2 cents' worth anyway, and I pray WB Montreal issue a statement, right or wrong. I feel that after the ExtremeTech blog and the others who have jumped on the bandwagon, they will need to say something.
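
As a rough sketch of what the driver-side tessellation override quoted above is doing (the function and numbers below are my own illustration, not AMD driver or game code), a cap like the x4 setting simply clamps whatever factor the game requests before the hardware subdivides the patch:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative only: a hull-shader patch-constant function normally outputs a
// tessellation factor per patch edge; a driver override like AMD's
// tessellation slider effectively clamps whatever the application requested
// to a user-chosen cap (e.g. x4).
float ClampTessFactor(float requestedFactor, float userCap)
{
    const float apiMax = 64.0f; // D3D11 allows factors up to 64
    return std::min(std::min(requestedFactor, apiMax), userCap);
}

int main()
{
    // If the game asks for x16 on a near-flat road surface, capping at x4
    // cuts the triangle count with little visible change on planar geometry.
    std::printf("requested x16 -> applied x%.0f\n", ClampTessFactor(16.0f, 4.0f));
    return 0;
}
```

That is presumably why the quoted article sees a performance gain with no obvious loss of detail: flat surfaces simply get fewer, larger triangles.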
 
They're making the game look "better" very inefficiently, that's a bad thing, no spin will change that.

Batman AO would look the same with the tessellation turned down, there's no IQ gain, just performance loss.

It's a very bad move, like Crysis 2 and the naff tessellation.
This doesn't change the fact that I have no problem with running the game, but that comment of yours just seems somewhat ignorant.

But I'd love to see a rebuttal to the driver thing :p
 
I could be wrong, but the extra triangles could be referring to tessellation, where Nvidia like to add in far more than is needed, as there is no benefit to the visuals.
 
They're making the game look "better" very inefficiently, that's a bad thing, no spin will change that.

Batman AO would look the same with the tessellation turned down, there's no IQ gain, just performance loss.

It's a very bad move, like Crysis 2 and the naff tessellation.
This doesn't change the fact that I have no problem with running the game, but that comment of yours just seems somewhat ignorant.

But I'd love to see a rebuttal to the driver thing :p

I am just going by all three games and, by far, B:AO looks massively better than the previous incarnations to me. Maybe the tessellation is overkill, but notching down AA makes it look bland IMO, so I don't believe it is purposefully put in to cripple AMD hardware.

I doubt anyone will rebut the driver optimizations but maybe someone can shed some light on it.
 
I am just going by all three games and, by far, B:AO looks massively better than the previous incarnations to me. Maybe the tessellation is overkill, but notching down AA makes it look bland IMO, so I don't believe it is purposefully put in to cripple AMD hardware.

I doubt anyone will rebut the driver optimizations but maybe someone can shed some light on it.

Oh come on, lowering the tessellation wouldn't reduce the IQ, but it'd gain performance.

There's no reason for it to be so intensive other than to cripple, just like Crysis 2 was: horribly inefficient at a time when AMD's tessellation performance wasn't as good as Nvidia's (I can't remember when it launched, so I'm guessing it was before the 7970?).

I know AMD get away with being the people's champion, and people display a lot of ignorance, but it's no different to this.

People claim to be unbiased, this, that and the other, but that's rarely the case.

For the AA thing? There are subtle differences; if you use 8x MSAA it's like using ubersampling in The Witcher 2. Calling less AA bland is a bit sensationalist.
 
IMO the tessellation in Arkham City made Batman look much more impressive lighting-wise.

Better lighting is what DX11 can do; the tessellation is probably moot in that.

Like I say, Arkham City had naff performance really compared to Origins (which is why I find this article a non-starter).

For your AA comment, I haven't mentioned anything to the contrary: 8x MSAA Batman Arkham Origins had better IQ than 4x MSAA. Like I said, it's like ubersampling in The Witcher 2, but that has little to do with the insane tessellation.
 
Oh come on, lowering the tessellation wouldn't reduce the IQ, but it'd gain performance.

There's no reason for it to be so intensive other than to cripple, just like Crysis 2 was: horribly inefficient at a time when AMD's tessellation performance wasn't as good as Nvidia's (I can't remember when it launched, so I'm guessing it was before the 7970?).

I know AMD get away with being the people's champion, and people display a lot of ignorance, but it's no different to this.

People claim to be unbiased, this, that and the other, but that's rarely the case.

For the AA thing? There are subtle differences; if you use 8x MSAA it's like using ubersampling in The Witcher 2. Calling less AA bland is a bit sensationalist.

I will do a couple of screenies tomorrow (far too tired now) and show what I mean.
 
Tessellation is fantastic where correctly applied. I found it made a huge difference to IQ in Crysis 2. I never played the game on AMD hardware, though. However, I'd have simply turned it off if it was crippling performance. I don't see how you can say it was implemented poorly if it worked fine on Nvidia cards. More favouritism is what you're saying?
 
Tessellation is fantastic where correctly applied. I found it made a huge difference to IQ in Crysis 2. I never played the game on AMD hardware, though. However, I'd have simply turned it off if it was crippling performance. I don't see how you can say it was implemented poorly if it worked fine on Nvidia cards. More favouritism is what you're saying?

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/5

Need I say more?

AMD's tessellation performance wasn't up to Nvidia's standards, so ridiculous levels of pointless tessellation (which anyone can see) hampered the performance of the AMD GPUs more than the Nvidia ones.

That's not to mention all the water you're rendering that you can't see.

It's an inefficient implementation: you could have less tessellation, the same IQ and better performance. That's the problem.

Look at Heaven 4.0: you can blatantly see the IQ improvement with tessellation. That's it done properly.
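
As a rough sketch of the difference between that and the brute-force approach (again, the function and numbers below are my own illustration, not code from Heaven, Crysis 2 or Arkham Origins), engines that use tessellation efficiently usually scale the factor by the on-screen size of each patch, so flat or distant geometry is not subdivided to the maximum level:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative sketch: compute a tessellation factor from how large a patch
// edge appears on screen, instead of using a fixed maximum level everywhere
// regardless of distance or visibility.
float AdaptiveTessFactor(float edgeWorldLength,   // patch edge length in world units
                         float distanceToCamera,  // distance from the viewer
                         float pixelsPerSegment)  // target on-screen triangle density
{
    // Very rough projected size of the edge in pixels, assuming roughly a
    // 1000-pixel-tall viewport (purely illustrative).
    float projectedPixels = (edgeWorldLength / distanceToCamera) * 1000.0f;

    // One tessellation segment per N pixels, clamped to the D3D11 range [1, 64].
    return std::clamp(projectedPixels / pixelsPerSegment, 1.0f, 64.0f);
}

int main()
{
    // A nearby edge keeps its detail; a distant edge collapses to almost no
    // subdivision, saving the triangles a flat "x16 everywhere" scheme wastes.
    std::printf("near edge:    x%.1f\n", AdaptiveTessFactor(1.0f, 2.0f, 16.0f));
    std::printf("distant edge: x%.1f\n", AdaptiveTessFactor(1.0f, 50.0f, 16.0f));
    return 0;
}
```

The complaint about Crysis 2 and this Batman title is essentially that they skip this kind of scaling and tessellate everything heavily, whether it changes the image or not.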
 
I've read that article. So you'd call depth effects like this pointless then?

I didn't have any adverse performance effects from running it, so I'm not particularly fussed if they added it to inane objects like wood etc. Granted, I wouldn't say applying it to every little thing is a brilliant use of the tech, though.

[Screenshots: Crysis 2 tessellated surfaces showing the depth effect]
 