Nvidia’s GameWorks program usurps power from developers, end-users, and AMD

That's odd; given that PhysX is designed to run on a GPU, I find it strange that a CPU handles it better in your experience. Is this only the case with the extreme Intel CPUs (hex-cores), or is it similar with slightly lower-performance ones like the i7-4770K, i7-3770K, etc.?

I am guessing it is because of the overclocked hex-cores. I have just been checking old benchmarks: @1080p the stock Titan plus a dedicated PhysX card is slightly faster than the 290X, but @1600p it is the other way round, with the stock 290X beating the Titan plus dedicated PhysX card. I think this is because of the lower fps @1600p, which gives the CPU less work to do.

http://forums.overclockers.co.uk/showpost.php?p=25193535&postcount=50

In the link above there is a stock Titan plus another Titan doing the PhysX.

@1080p
Min = 71
Max = 132
Ave = 94

@1600p
Min = 42
Max = 74
Ave = 53

http://forums.overclockers.co.uk/showpost.php?p=25232838&postcount=101

In the link above is a stock 290X @1600p using the same settings as the Titan above.

@1600p
Min = 29
Max = 79
Ave = 56

http://forums.overclockers.co.uk/showpost.php?p=25232639&postcount=99

In the link above is a stock 290X @1080p using the same settings as the Titan above.

@1080p
Min = 30
Max = 135
Ave = 88

You can see several things from the numbers above:

@1080p the Titan is faster.

@1600p the 290X is faster.

The minimums for the 290X are lower at both 1080p and 1600p; I have put this down to the CPU also doing the PhysX.

I think if the 290X had a dedicated PhysX card, the same as the Titan above, the 290X's minimums would come up and it would win at both 1080p and 1600p by a wide margin.
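
For anyone who wants to sanity-check that reading, here is a quick sketch that just collates the four runs quoted above and works out the relative differences; the numbers are copied straight from the linked posts, nothing independently measured:

```python
# Collating the four runs quoted above so the comparison is explicit.
# Figures are copied from the linked posts, not measured independently.
runs = {
    ("Titan + PhysX card", "1080p"): {"min": 71, "max": 132, "avg": 94},
    ("Titan + PhysX card", "1600p"): {"min": 42, "max": 74,  "avg": 53},
    ("290X, CPU PhysX",    "1080p"): {"min": 30, "max": 135, "avg": 88},
    ("290X, CPU PhysX",    "1600p"): {"min": 29, "max": 79,  "avg": 56},
}

for res in ("1080p", "1600p"):
    titan = runs[("Titan + PhysX card", res)]
    r290x = runs[("290X, CPU PhysX", res)]
    avg_delta = 100 * (r290x["avg"] - titan["avg"]) / titan["avg"]
    min_delta = 100 * (r290x["min"] - titan["min"]) / titan["min"]
    print(f"{res}: 290X avg {avg_delta:+.1f}% vs Titan, min {min_delta:+.1f}%")

# -> 1080p: 290X avg -6.4% vs Titan, min -57.7%
# -> 1600p: 290X avg +5.7% vs Titan, min -31.0%
```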
 
AMD have already said they submitted optimizations before the deadline.

Whether this is Nvidia's fault is debatable; it depends on what, if anything, they have to do with it.

Right now WB are not accepting anything from AMD; they are shut out, and game support from WB does not exist for AMD owners, all while the game is in a sub-par state.

They have taken my money and then denied my hardware vendor the service they set out to provide.

It's unacceptable. If it was DICE denying Nvidia optimizations for BF4, I would feel exactly the same way. Wouldn't you?

I suppose you do have a point; however, I'd need more information regarding this matter before I reach a conclusion on it. AMD have likely been busy lately with Mantle, so for all I know they could've submitted substandard code to WB which was, in turn, declined. That's just speculation on my part, though.

We need to know why WB declined the AMD optimisations and the terms under which AMD are shut out from the game code. If the situation is that Nvidia has instructed WB to reject AMD optimisations in return for some payoff, then that is a serious issue indeed. However, as I say, this is all speculation, and the facts are what's needed.
 
I am guessing it is because of the overclocked hex-cores. I have just been checking old benchmarks: @1080p the stock Titan plus a dedicated PhysX card is slightly faster than the 290X, but @1600p it is the other way round, with the stock 290X beating the Titan plus dedicated PhysX card. I think this is because of the lower fps @1600p, which gives the CPU less work to do.

...

The minimums for the 290X are lower at both 1080p and 1600p; I have put this down to the CPU also doing the PhysX.

I think if the 290X had a dedicated PhysX card, the same as the Titan above, the 290X's minimums would come up and it would win at both 1080p and 1600p by a wide margin.

Interesting, thanks for the information. It seems that AMD graphics card users should consider investing in an Intel enthusiast CPU, or a dedicated Nvidia PhysX GPU with the money they have saved by not buying a top-of-the-range Nvidia card, if they plan on playing lots of these PhysX-enabled titles.

Although, I'd have thought that there would probably be teething issues using an AMD card with an Nvidia card for dedicated PhysX calculations. I don't really know; I'd try it out if I had a decent AMD card, but alas the only AMD card I have is an old HD 5450 lying around somewhere.
 
I suppose you do have a point; however, I'd need more information regarding this matter before I reach a conclusion on it. AMD have likely been busy lately with Mantle, so for all I know they could've submitted substandard code to WB which was, in turn, declined. That's just speculation on my part, though.

We need to know why WB declined the AMD optimisations and the terms under which AMD are shut out from the game code. If the situation is that Nvidia has instructed WB to reject AMD optimisations in return for some payoff, then that is a serious issue indeed. However, as I say, this is all speculation, and the facts are what's needed.

+1
 
I am guessing it is because of the overclocked hex-cores. I have just been checking old benchmarks: @1080p the stock Titan plus a dedicated PhysX card is slightly faster than the 290X, but @1600p it is the other way round, with the stock 290X beating the Titan plus dedicated PhysX card. I think this is because of the lower fps @1600p, which gives the CPU less work to do.

Are you sure CPU PhysX in AO is doing all the same stuff as in GPU mode? In most titles with CPU PhysX, it drops some effects and/or reduces particle counts, etc. on the CPU.
 
Interesting, thanks for the information. It seems that AMD graphics card users should consider investing in an Intel enthusiast CPU, or a dedicated Nvidia PhysX GPU with the money they have saved by not buying a top-of-the-range Nvidia card, if they plan on playing lots of these PhysX-enabled titles.

Although, I'd have thought that there would probably be teething issues using an AMD card with an Nvidia card for dedicated PhysX calculations. I don't really know; I'd try it out if I had a decent AMD card, but alas the only AMD card I have is an old HD 5450 lying around somewhere.

Are you sure CPU PhysX in AO is doing all the same stuff as in GPU mode? In most titles with CPU PhysX, it drops some effects and/or reduces particle counts, etc. on the CPU.

I just noticed, when I went to do some runs on the Titan with PhysX on the CPU, that the highest setting the game will allow for CPU PhysX is Normal. If I use a Titan as a PhysX card, I get the option to use the High setting.

Below are some revised figures I have just run for the Titan, plus the 290X ones (which are not going to change, as it has to use the CPU), all on the Normal setting for PhysX.



1080p, CPU used for PhysX, in-game setting: Normal

Titan
Min = 27
Max = 164
Ave = 93

290X
Min = 30
Max = 135
Ave = 88


1600p, CPU used for PhysX, in-game setting: Normal

Titan
Min = 26
Max = 75
Ave = 55

290X
Min = 29
Max = 79
Ave = 56


1080p, dedicated card used for PhysX, in-game setting: Normal

Titan
Min = 74
Max = 205
Ave = 104


1080p, dedicated card used for PhysX, in-game setting: High

Titan
Min = 71
Max = 132
Ave = 94


1600p, dedicated card used for PhysX, in-game setting: Normal

Titan
Min = 42
Max = 75
Ave = 57


1600p, dedicated card used for PhysX, in-game setting: High

Titan
Min = 40
Max = 72
Ave = 53


Doing PhysX on the CPU does not allow you to select the High in-game setting.

Apples to apples @1080p the Titan is faster.

Apples to apples @1600p the 290X is a tiny bit faster.

Using the CPU for PhysX only allows you to select the Normal in-game setting, and it also really hits the minimum fps.
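
Putting a rough number on that minimum-fps hit, using only the Titan figures posted above (same Normal in-game setting, CPU PhysX vs a dedicated PhysX card):

```python
# Same Titan, same Normal PhysX setting: CPU PhysX vs. dedicated card.
# Figures are exactly the ones posted above, nothing measured independently.
titan_normal = {
    "1080p": {"cpu": {"min": 27, "avg": 93}, "card": {"min": 74, "avg": 104}},
    "1600p": {"cpu": {"min": 26, "avg": 55}, "card": {"min": 42, "avg": 57}},
}

for res, r in titan_normal.items():
    min_gain = r["card"]["min"] - r["cpu"]["min"]
    avg_gain = r["card"]["avg"] - r["cpu"]["avg"]
    print(f"{res}: dedicated card raises min fps by {min_gain} "
          f"({r['cpu']['min']} -> {r['card']['min']}), avg by only {avg_gain}")
```

The minimums jump far more than the averages do, which is why average fps alone hides the problem.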
 
Was with AMD a long time... but I like my 3D, and there's just something about an Nvidia setup... Not entirely sure I see Nvidia at fault here, though.

Don't particularly like them as a company though.

Why? They bend over backwards to give their customers the optimum gaming experience... all AMD do is sit on the sidelines crying about the big bad wolf. AMD customers should be asking why it is that AMD don't invest money and go out of their way to do the same. Exclusivity deals are normal practice in the console space; at least games still work on AMD hardware.
 
Why? They bend over backwards to give their customers the optimum gaming experience... all AMD do is sit on the sidelines crying about the big bad wolf. AMD customers should be asking why it is that AMD don't invest money and go out of their way to do the same. Exclusivity deals are normal practice in the console space; at least games still work on AMD hardware.

They have been responsible for a fair few dodgy practices, and as much as they own/buy/create great stuff, sometimes they take the whole proprietary thing too far, as with PhysX and, imo, G-Sync. Also, to a lesser extent, 3D.

If they had been more open with PhysX, I'm sure uptake would have been far greater. Honestly, I don't think many people have bought a dedicated card for PhysX. And then why block PhysX running on an Nvidia card when an AMD card is the main one? Just not necessary.
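
To spell out what that blocking complaint means in practice, here is an illustrative sketch of the kind of policy check being described; this is hypothetical pseudocode, not Nvidia's actual driver logic:

```python
# Hypothetical illustration of the policy being complained about above,
# not Nvidia's actual driver code.
def gpu_physx_allowed(installed_gpus: list[str]) -> bool:
    has_nvidia = any(v == "Nvidia" for v in installed_gpus)
    has_other = any(v != "Nvidia" for v in installed_gpus)
    # GPU PhysX is refused as soon as a non-Nvidia display adapter is
    # present, even though an Nvidia card is available to do the work.
    return has_nvidia and not has_other

print(gpu_physx_allowed(["Nvidia"]))         # True
print(gpu_physx_allowed(["AMD", "Nvidia"]))  # False - the case above
```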
 
Why? They bend over backwards to give their customers the optimum gaming experience... all AMD do is sit on the sidelines crying about the big bad wolf. AMD customers should be asking why it is that AMD don't invest money and go out of their way to do the same. Exclusivity deals are normal practice in the console space; at least games still work on AMD hardware.

The only bending over is by Nvidia's customers. :D
 
Someone needs to explain to me why it's bad that Nvidia have exclusive, locked-out features with TWIMTBP titles but not AMD with Mantle?

Both play dirty. You can't say with a straight face that Mantle is open, as right now it only works on GCN, which is AMD hardware; it's about as open as Nvidia's TWIMTBP titles :p

All this bashing of one side and the other gets so boring. If Nvidia do get negative PR for this, I just hope they bring it with Maxwell: no medium card pretending to be high-end this next round. Nvidia need to come out on top and remind us why they are worth the extra money for gaming. If they don't step up the hardware with Maxwell, then AMD may for the first time regain lost market share. We need both companies being competitive, else it's just high prices and stagnation.

Come on Nvidia, it's time to justify the high prices.
 
Someone needs to explain to me why it's bad that Nvidia have exclusive, locked-out features with TWIMTBP titles but not AMD with Mantle?

Both play dirty. You can't say with a straight face that Mantle is open, as right now it only works on GCN, which is AMD hardware; it's about as open as Nvidia's TWIMTBP titles :p

All this bashing of one side and the other gets so boring. If Nvidia do get negative PR for this, I just hope they bring it with Maxwell: no medium card pretending to be high-end this next round. Nvidia need to come out on top and remind us why they are worth the extra money for gaming. If they don't step up the hardware with Maxwell, then AMD may for the first time regain lost market share. We need both companies being competitive, else it's just high prices and stagnation.

Come on Nvidia, it's time to justify the high prices.

It's not about exclusive features, and it's clear you have neither read nor understood.

When a game uses Mantle, it does not prevent the standard DX11 path from being optimized by NV, while in this case Nvidia's GameWorks stops the standard DX11 path from being optimized by AMD.

This may be a one-off situation; we shall see.
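
To make the structural difference concrete, a tiny illustrative sketch (all names hypothetical, not any engine's real API). A Mantle title keeps the vendor-neutral DX11 renderer alongside the Mantle one:

```python
# Illustrative only: a Mantle title still ships a common DX11 renderer,
# so either vendor's driver team can keep optimising the shared path.
def pick_renderer(gpu_vendor: str, mantle_available: bool) -> str:
    if gpu_vendor == "AMD" and mantle_available:
        return "mantle"  # optional fast path, currently GCN/AMD-only
    return "dx11"        # common path, open to both vendors' drivers

print(pick_renderer("AMD", True))     # -> mantle
print(pick_renderer("Nvidia", True))  # -> dx11, nothing taken away

# In the GameWorks case described above, the closed effects library sits
# inside the shared DX11 path itself, so there is no vendor-neutral
# fallback left for AMD to optimise against.
```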
 
It's not about exclusive features, and it's clear you have neither read nor understood.

When a game uses Mantle, it does not prevent the standard DX11 path from being optimized by NV, while in this case Nvidia's GameWorks stops the standard DX11 path from being optimized by AMD.

To be fair, I don't think the majority quite understand.
 
Nvidia Playing Dirty With GameWorks – Semi-Propriety Nature May Pose Serious Danger to AMD, Dev’s Power Diluted.

GameWorks – Graphical Feats Performed by Hidden Code Optimized by NVIDIA for both Developers and AMD.

However, the thing about this particular library is that its code is closed. Developers cannot see the code at all, and therefore they cannot modify it. Only Nvidia has the power to modify this particular set of code. Though it will only be a slight inconvenience and a mild insult to developers, it can be a much more serious threat for AMD.
Basically, AMD will be relying on NVIDIA to optimize the GameWorks library for AMD GPUs, because they cannot edit the code themselves.
You probably see the subtle attack now: with the AMD driver team unable to optimize their GPUs for games with GameWorks, AMD will be quite literally at Nvidia's mercy.

Full Article
http://wccftech.com/nvidia-playing-...-pose-serious-danger-amd-dev-s-power-diluted/
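
To illustrate the article's "hidden code" point, here is a minimal sketch of what integrating closed middleware looks like from the developer's side; the library path and function names are invented stand-ins, not real GameWorks APIs:

```python
# Hypothetical sketch: a studio links against a prebuilt vendor binary and
# can only call the documented entry points; the internals stay invisible.
import ctypes
import os

LIB = "./libvendor_effects.so"  # hypothetical vendor-supplied binary

if os.path.exists(LIB):
    fx = ctypes.CDLL(LIB)
    fx.fx_simulate_smoke.argtypes = [ctypes.c_float]
    fx.fx_simulate_smoke(ctypes.c_float(0.016))  # step one 16 ms frame
    # The developer can tune the arguments exposed in the header, but only
    # the vendor can change, or optimise, what happens inside the call.
```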
 
Nvidia Playing Dirty With GameWorks – Semi-Propriety Nature May Pose Serious Danger to AMD, Dev’s Power Diluted.

GameWorks – Graphical Feats Performed by Hidden Code Optimized by NVIDIA for both Developers and AMD.

Full Article
http://wccftech.com/nvidia-playing-...-pose-serious-danger-amd-dev-s-power-diluted/

Not sure I can take any article seriously which muddles up propriety and proprietary :D. And as you didn't use a "(sp)" when reposting, you're just as bad!!!! :p

What I don't understand is: nobody is forcing game developers into relationships with AMD or nVidia. Sure, the monetary incentive is large, but it is their choice as studios to accept or decline. So why is this being spun as an nVidia problem? Same with Mantle really: the talk of it being GCN-only at first (i.e. AMD-only) and then opened up later on... why is this AMD's fault? I'm sure the game developer is happier that they've only got to optimise it for a subset of GPUs at first, so it's not really an altruistic decision. It's all just nonsense really.

People in life in general (not just here!) get up in arms about the smallest things as an excuse to get the pitchforks out.

Read my post above. That's bad news. :(

Only if you've got an AMD GPU.

[fake troll mode]Could be a good time to get a nVidia GPU - you get the benefits of all the AMD "open" tech as well as the "propriety" tech from nVidia[/fake troll mode]

:D :D
 
Read my post above. That's bad news. :(


Ok, I've glanced through this thread and I am at a total loss as to what you're trying to prove.

Performance is pretty similar in Arkham Origins if not for PhysX. The last link you reposted is against a 770 at base settings (maximums only reported), for which there is probably a CPU bottleneck at those frame rates.

So the real question is: why are you posting all these articles when it's a non-story? Nvidia worked alongside WB Studios. In the same way, Nvidia lost out when Tomb Raider launched because Crystal Dynamics hadn't given them the latest game build for their drivers.

As Greg tried to point out in the opening post (four pages ago), you're picking up whatever you can find on the internet (which is obviously 100% truth wherever you look, according to some) and making a non-story.
 