
GameWorks, Mantle and a pot calling a kettle black

Who cares? It's an Nvidia-sponsored game; of course it's going to run faster on Nvidia hardware...
 
Well, the issue is that the impact is NOT the same on both vendors. If you haven't seen some of the benchmarks already, cards like the 7850 and 6950 got their arses handed to them by the much slower GTX 560 Ti and even the lowly 128-bit, 16-ROP 650 Ti (not the faster 192-bit Boost version).

You're missing the point again: those games show bias with all GameWorks features OFF, so it is not GameWorks causing the disparity, it is the core game / AMD drivers combo.

Blaming GameWorks when it is off is the same as setting the game to use FXAA and then blaming low frame rates on 8x MSAA, when you aren't even using it.
 
It is hard not to point fingers when this doesn't happen in ANY non-GW games. Yes, there's no proof of whether it's Nvidia or the developers messing up, intentionally or unintentionally, but the fact is the results are there.

Which is why this all started, right? Because overall game performance is pants and does need to be looked into. Disabling GW doesn't resolve that issue though, so while the idea of a witch-hunt to find answers is fine, it's got the wrong target.
 
Actually no. People just pretend they didn't even see it, or brush it off as a non-issue (since they are not using AMD cards themselves).

They keep on insisting AMD users are making a fuss over nothing. Had the positions of Nvidia and AMD been swapped, I doubt they would stay quiet themselves. Just look at how some of those same people criticise Mantle, despite the fact that it doesn't hurt DX performance on Nvidia hardware in any way. A very double standard.
 
I'm using AMD cards!

I imagine if the positions were reversed there would still be just as much complaining and defending, just the Nvidia people would be complaining and the AMD people would be defending.
Same with Mantle: if it were an Nvidia tech, the AMD users would complain that it's fragmenting the gaming market and that it'd be pointless with DX12 on the way. Nvidia users would be defending it and doomsaying about DX12 being a multiplatform API that will be rubbish on every platform when it finally arrives in a few years' time.
AMD and AMD users aren't much better or worse than Nvidia and Nvidia users.
 
Funnily enough, I didn't even blame GameWorks directly; I was saying that games that don't use GameWorks have never had a situation where AMD cards underperform so badly. I was merely pointing out the coincidence.
 
Maybe some of the game devs that use GameWorks don't have the time/skills to code nicely, so they take the easy approach, and that's why some of the releases perform poorly?
 
Interestingly, if AMD did pull something like GameWorks, I wouldn't even bother to defend them (especially with results already showing the issues), as I am not naive enough to believe they wouldn't hurt their competitor if they were in control :p But to be honest, the biggest driving factor behind most people (myself included) not trusting Nvidia to do the right thing is still probably the fact that PhysX is locked out at the software level when another vendor's GPU is detected, despite the fact that people may already own an Nvidia card from a previous upgrade.

I don't for a second think AMD is the saint that Nvidia users keep insisting AMD users are trying to make them out to be; I just honestly think they are the lesser of the two evils...for now at least. So many times I keep hearing "both vendors are as bad as each other", but the irony is that AMD simply isn't as good at being bad as Nvidia, even if they try :D
 
It's easy to say in a hypothetical situation. I bet a lot of Nvidia people could say that if the roles were reversed they wouldn't have a problem with AMD-'sponsored' (or whatever) games running better on AMD machines.

The one thing I'll say about the PhysX thing (and I do own some older Nvidia cards) is that I think it's wrong to lock out PhysX just because there is an AMD card in the system. I do think it's fine for them to stop PhysX running if the Nvidia card isn't the primary card. If they wanted to sell dedicated PhysX cards, they would; Nvidia don't. As I recall, it was Ageia that did that (and it didn't seem to work out too well for them, either).
 
Regardless of the point of view, the fact remains that the customer DID get the short end of the stick and didn't get a feature they were supposed to, purely because Nvidia want to take away people's freedom in choosing graphics cards.
 
I appreciate that there are technical reasons behind it, but I would expect to be able to have a 780Ti and a 7850 in my machine, play a game on the 780Ti, and still be able to use Mantle.
Yet my 290s did advertise Mantle as a feature, but I doubt I'd get to use it if my primary card were a 780Ti.
 
To be fair, you cannot make that comparison. Nvidia cards can be used as dedicated PhysX cards (and PhysX is an extension, not what a game is running on). The fact that Nvidia block it if an AMD card is the primary card is foul play - but we're getting off topic here :)
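The two policies being contrasted here can be made concrete. A toy sketch in Python (purely hypothetical logic, not Nvidia's actual driver code, assuming the behaviour is as the thread describes it):

```python
# Toy model of the two PhysX-gating policies discussed above.
# Purely illustrative - not Nvidia's actual driver logic.

def physx_allowed_described(gpus, primary):
    """Behaviour the thread describes: GPU PhysX is blocked
    whenever the primary (rendering) card is not Nvidia."""
    return gpus[primary] == "nvidia"

def physx_allowed_dedicated(gpus, primary):
    """The 'dedicated PhysX card' reading: any Nvidia card in
    the system may run PhysX, whatever card renders the game."""
    return "nvidia" in gpus.values()

# Hypothetical rig: AMD card renders, older Nvidia card sits idle.
rig = {"slot0": "amd", "slot1": "nvidia"}
print(physx_allowed_described(rig, "slot0"))  # False: locked out
print(physx_allowed_dedicated(rig, "slot0"))  # True: would still work
```

Under the first rule the leftover Nvidia card from a previous upgrade is dead weight; under the second it earns its keep as a physics co-processor.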
 
Maybe TrueAudio would've been a better example than Mantle. I bet putting a card that supports TrueAudio in an Nvidia system doesn't allow you to use TrueAudio (although I haven't tried, so maybe you can).
But as you say, this is starting to get a bit off topic.
 
There was a bit of a scandal back when TA was announced, because of some DRM they were allegedly doing. That got swept under the rug and now I can't find it again.
 
Yeah. And let's not forget that Hybrid PhysX actually worked once someone simply made a crack for it (well, it worked until Nvidia sent the guy a threatening letter :p).

I just don't like the idea of "being trapped" when they are making/have already made money from us. PhysX being disabled at the software level when an AMD card is the primary card is almost as ridiculous as if, hypothetically, you bought an Asus soundcard with features such as Dolby Headphone and Dolby Pro Logic IIx, but they got disabled if it detected a non-Asus graphics card or motherboard :rolleyes:
 
Oh really? Games have never shown a big vendor bias before GameWorks came along? You really want to go down that road?

http://www.anandtech.com/show/6985/choosing-a-gaming-cpu-at-1440p-adding-in-haswell-/7

Civilisation 5 uses DX command lists. Here it is: two GTX 580s getting over 120fps on max settings... two 7970s (which should absolutely trounce 580s) getting 110.
And this review is from 2013, so you can't even blame the 7970 launch drivers.

No vendor bias there then :rolleyes:

And what caused this bias? AMD CHOSE not to support a basic DirectX optimisation technique. Is that Nvidia's fault?
BF3 also used command lists, and that game also showed a 10-20% vendor bias.

There have also been AMD Gaming Evolved titles that have, on launch, shown a big vendor bias.

So are we also now saying that Microsoft are deliberately sabotaging AMD performance because AMD chose not to support parts of DX?
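For anyone unfamiliar with the technique being argued about: D3D11 command lists let worker threads record rendering work on deferred contexts, which the immediate context then plays back serially. A toy sketch of that pattern in Python (purely illustrative of the idea; not the actual Direct3D API):

```python
from threading import Thread

class CommandList:
    """A recorded sequence of 'commands' (here, just closures),
    standing in for a D3D11-style deferred command list."""
    def __init__(self):
        self.commands = []
    def record(self, fn):
        self.commands.append(fn)
    def execute(self, target):
        # Playback is serial, like ExecuteCommandList on the
        # immediate context.
        for fn in self.commands:
            fn(target)

def record_scene_chunk(cmd_list, items):
    # Worker threads only record; they never touch the 'device'.
    for item in items:
        cmd_list.record(lambda frame, item=item: frame.append(item))

# Record two command lists on parallel "deferred contexts"...
lists = [CommandList(), CommandList()]
threads = [Thread(target=record_scene_chunk, args=(cl, chunk))
           for cl, chunk in zip(lists, [["a1", "a2"], ["b1"]])]
for t in threads:
    t.start()
for t in threads:
    t.join()

# ...then replay them in order on the single "immediate context".
frame = []
for cl in lists:
    cl.execute(frame)
print(frame)  # ['a1', 'a2', 'b1']
```

The win is that the expensive recording work is spread across cores; whether the driver actually honours that (the point of contention above) is another matter.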
 
When Crossfire and SLI come into play, they introduce other variables, such as how well the particular game scales across multiple GPUs with Crossfire vs SLI, so I wouldn't consider it an exactly like-for-like comparison with single-GPU results (especially given how much more driver-dependent SLI and Crossfire are than a single GPU).
 
Don't command lists boost performance for about five minutes, after which it tails off and actually ends up hurting performance? It's good for inflating benchmark scores, but that's about it. That's what Dan Baker says; he was the graphics lead on the game you mention.
 
You mean Dan Baker who now works for Oxide? Because that sounds like he would still be impartial, given that he was extolling the virtues of command lists a year or so ago.

If anything, given how much he talked up command lists and how, now that he's in with Mantle, he's rubbishing them, it doesn't do much for his credibility; it just makes it look like he's willing to say whatever someone pays him to say.

I've played Civ5 on both AMD and Nvidia hardware and I don't get a fall-off in performance on Nvidia, so I would say no. IIRC Kaap is a big Civ5 fan, so he might be able to chip in as well.

Which is still tangential to the point Marine made, which I was pointing out was a fallacy: games have ALWAYS shown vendor bias; it is not something that has suddenly appeared with GameWorks.
Also, you are now saying that benchmark results are not important, when targeted, specific benchmark results are exactly the basis of the articles complaining about GameWorks - using one specific set of settings and results to make a complaint, while other settings in the same game that show a different result are ignored.

The single-GPU scores are also much higher for the 580 than they should be compared with the 7970, e.g. roughly the same as the 7970, when the 7970 should be a good 30%+ faster.
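Taking the round figures quoted in this thread purely for illustration (580s at ~120fps, 7970s at ~110fps, and a 7970 that "should" be ~30% faster), the size of the shortfall is easy to put a number on with a back-of-the-envelope sketch:

```python
# Back-of-the-envelope check using the round numbers quoted in
# the thread (illustrative figures, not a new benchmark).
gtx580_fps = 120          # "over 120fps on max settings"
hd7970_fps = 110          # the 7970 result quoted above
expected_advantage = 1.30 # "should be a good 30%+ faster"

# Where the 7970 "should" land if it kept its on-paper advantage.
expected_7970 = gtx580_fps * expected_advantage

# How far below that expectation the quoted result sits.
shortfall = 1 - hd7970_fps / expected_7970

print(f"expected ~{expected_7970:.0f} fps, got {hd7970_fps} fps, "
      f"roughly {shortfall:.0%} below expectation")
```

So a card merely matching one it should beat by 30% is really running nearly 30% below where it ought to be, which is the size of gap the whole thread is arguing about.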
 