• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

PCGH asks Nvidia about GameWorks

Overuse of tessellation = Crysis 2, Batman's cape, COD's wolves and the dog.

We have two Nvidia employees stating polar opposites: one states AMD have access to the game code, and the other states AMD don't have access to the game code because they are a competitor.




Hairworks is a valid example in line with GW performance.

I'm using both Nvidia and AMD, and I've never seen any issue in WD so far, as I've stated a few times already. COD I don't have, but I do have BAO, and the performance delta is way, way off until you add MSAA, which is a brute-force approach at the driver level.

Asking a silly question: do you mean the game's source code or the GameWorks source code? They are really two different things, and which one was the Nvidia employee talking about?
 
@MJ Frosty, sorry for the OT, but we have been trying GTX 780 SLI at 4K (AOC U2868). 60Hz runs fine, but another issue came up: no SLI.

We have tried about five SLI bridges, but we couldn't get SLI to run. Any advice, please?

I've had no issues with SLI. What bridges are you using? I'm using an ASUS ROG one and it's working fine. Will check the revision when I can. AFAIK it's just EVGA bridges with an issue, and even then you will still be able to enable SLI; you'll just experience tearing and rubber-banding. I would consider reseating the cards and running DDU.
 
Again, nothing to do with GW though.


C'mon rusty, you know full well Hairworks is part of GameWorks. COD Ghosts uses it, and part of Huddy's argument is the overuse of tessellation:

As part of our wider NVIDIA GameWorks toolset, NVIDIA Hairworks is just one of the many ways we’re working to enhance and improve PC games and game development. In 2014 you’ll see further uses of NVIDIA Hairworks, along with new uses of PhysX

http://www.geforce.co.uk/whats-new/articles/call-of-duty-ghosts-nvidia-hairworks

 
@triss,

AMD aren't allowed near GW's code: no compilers, no interpreters. A competitor's code HAS to run on your competitor's hardware. If the shoe was on the other foot there would be uproar; GPU PhysX/Mantle is proof of this, even though GPU PhysX doesn't run on AMD and Mantle doesn't run on Nvidia.
 
@Tommy, so you're talking about GW source then, correct?
Nvidia did indeed state clearly they don't give that to competitors, and tbh why should they? Why should they offer it to the competition to see how they optimised for their cards?
Yes, in an ideal world everything is free and available to all, but most businesses don't work like that. At the end of the day, to me as a layman, GW is a tool that is up to the dev to use or not use, just like they can choose to include Mantle or not. What's the difference?
Let's take Hairworks, since it's been mentioned: Ghosts used it, and either camp can have the choice of using it. Yes, it may well run better on Nvidia cards, but since they designed it they should know how to tweak it for their hardware. If there was no Hairworks, would it even be an option in the game? Surely that would be a bigger loss for us all?
 
Who cares.

Nvidia aren't able to optimise for Gaming Evolved titles until after they're released anyway. If you're that bothered then just buy Nvidia unless you can give me a specific reason why that in itself is an issue for you.

GameWorks isn't in itself a bad thing. Ghosts' performance is pretty poor on NV hardware with fur enabled, and it is down to the developer to choose just how much tessellation they want to use for specific objects. Watch Dogs is the biggest GW title recently, and it performs so much better on AMD than it does on NV hardware at the moment. Why that is remains unclear.

I don't agree with some of Nvidia's policies; they've very much lost a lot of face within the community. AMD's outcry approach, however, solves nothing and IMHO comes across as partially desperate. They're both at each other's throats, and although that makes sense in the corporate world, what some of you are doing is ridiculous. You can't see the forest for the trees, and I don't believe for a second that some of you even give different hardware a second glance.

If GW was of such grave importance then I'm sure some of you would have no problems using NV instead...

That's the last time I'm mentioning GameWorks in a thread regarding article spin
 
Hairworks is licensed and available separately from the main GameWorks binary, though. Putting it under the GameWorks banner seems like a marketing rather than a technical decision.

Anyway, realistic hair and fur are always going to require higher levels of tessellation than many other types of object. I was led to believe that AMD had resolved their weakness in that area?
 
Hairworks is licensed and available separately from the main GameWorks binary, though. Putting it under the GameWorks banner seems like a marketing rather than a technical decision.

Anyway, realistic hair and fur are always going to require higher levels of tessellation than many other types of object. I was led to believe that AMD had resolved their weakness in that area?

AMD are not weak on tessellation, but Nvidia cards are still superior at very high levels. At these higher levels there's no benefit to the end user, but because it creates a bigger hit on AMD hardware, Nvidia have allegedly used some dirty tricks in the past, the main one being over-tessellating in Crysis 2. Crytek are also at fault, as it's their game, but it being an Nvidia-sponsored title is very suspicious given both sides' strengths and weaknesses. Tessellating flat surfaces and water that isn't even visible on the screen is just stupid.

The thing with Nvidia-sponsored titles is that these kinds of things keep popping up, whether it's GameWorks, over-tessellation, anti-aliasing missing on AMD, and I'm sure there are a good few others. It's hard not to look in Nvidia's direction, as it does not happen in neutral games and AMD-sponsored games.
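To put a rough number on the over-tessellation point: for a uniformly tessellated triangular patch, the triangle count grows with the square of the tessellation factor, so doubling the factor quadruples the geometry while the visible difference keeps shrinking. A toy back-of-envelope sketch (a simplified model, not actual pipeline behaviour; real hardware uses fractional factors and per-edge values):

```python
# Toy model: a triangle patch uniformly tessellated at integer factor f
# is split into f^2 small triangles, so cost grows quadratically.
def approx_triangles(factor: int) -> int:
    """Approximate triangle count for one patch (uniform integer factor)."""
    return factor * factor

for f in (8, 16, 32, 64):
    print(f"factor {f:3d} -> ~{approx_triangles(f):5d} triangles per patch")
```

So going from factor 16 to factor 64 costs 16x the geometry, which is why factors beyond what the eye can resolve are pure overhead.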
 
No argument about the C2 stuff (which was years ago), but realistic hair and fur are going to require very high levels of tessellation, actually even higher than what cards are capable of right now. You say it is at a level that provides no end-user benefit, but without being able to play around with the values, that is nothing more than conjecture.

Both companies are not above this type of thing, though. Remember, a number of GE titles made very heavy use of compute at a time they had a clear advantage; swings and roundabouts.
 
No argument about the C2 stuff (which was years ago), but realistic hair and fur are going to require very high levels of tessellation, actually even higher than what cards are capable of right now. You say it is at a level that provides no end-user benefit, but without being able to play around with the values, that is nothing more than conjecture.

Both companies are not above this type of thing, though. Remember, a number of GE titles made very heavy use of compute at a time they had a clear advantage; swings and roundabouts.

With AMD you can play around with tessellation levels in the CCC, which I think was brought in to counter what Nvidia was doing. Back then the GTX 480 was so far ahead in tessellation performance it was not even close. AMD did take advantage of their superior compute, but you were gaining from TressFX hair in Tomb Raider. Nvidia did the same back with the GTX 580, and tbh I had no problem with that, as it forced AMD to bring in decent compute capabilities, which we will all benefit from. Even the new consoles have compute capabilities, and that's mainly due to Nvidia, I would say.
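For what it's worth, the CCC override mentioned above is conceptually just a cap applied to whatever tessellation factor the game requests. A toy sketch of the idea (illustrative only, not actual driver code; the function and parameter names are made up):

```python
from typing import Optional

def clamped_factor(requested: int, user_cap: Optional[int]) -> int:
    """Toy model of a driver-side tessellation override:
    if the user sets a cap in the control panel, the game's requested
    factor is clamped to it; otherwise the request passes through."""
    if user_cap is None:  # override disabled / application-controlled
        return requested
    return min(requested, user_cap)

# A game asking for factor 64 with the user cap set to 16:
print(clamped_factor(64, 16))   # the over-tessellated request is capped
print(clamped_factor(8, 16))    # requests below the cap are untouched
```

This is why the setting can claw back performance in over-tessellated titles without visibly changing much on screen.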
 
Feel free to correct me on this, as I am working from memory, but I believe the 290X has 4 tessellation units, whilst nVidia have 12 on the 780? nVidia are working to their strengths if they are fully utilising tessellation. Overkill? Perhaps, but I don't see that as part of GameWorks gimping performance on AMD hardware. Tessellation has been around for a number of years now, but AMD have not really picked up on it and have stuck with a low count of tessellation units since the single unit that was chucked onto the 5 series.

Again though, feel free to correct me if I am wrong :)
 
AMD's cards are tuned differently; they have much higher pixel fill rates than Nvidia ^^^ but less tessellation performance.

*Snip* :)

The thing with Nvidia-sponsored titles is that these kinds of things keep popping up, whether it's GameWorks, over-tessellation, anti-aliasing missing on AMD, and I'm sure there are a good few others. It's hard not to look in Nvidia's direction, as it does not happen in neutral games and AMD-sponsored games.

That's the thing: these issues keep cropping up where Nvidia are involved.

IMO they are also stuck in the late 1990s / early 2000s, back when there were 20 or so vendors, all with their own proprietary tech, trying to lock each other out and get their tech to be dominant.

Almost none of them could get developers to use any of that tech without paying them, and almost all of them went to the wall pretty quickly.

It's part of the reason we are now down to two.
And for the industry as a whole, none of those technologies are widely used.

Nvidia hoovered up a couple of them and then continued to deny the rest of the industry the use of them, hence a very limited number of games using that tech: stagnation. Nvidia just don't get it. They are still very combative; they are still all about themselves, with enemies on every corner... when what they should be doing to move the industry forward is working with it as a whole.
 
@triss,

I'm talking about GW's in any shape or form. In direct comparison, TressFX isn't closed off to AMD's competitor - The Way It's Always Been Done.

@layte,

Whether HW's is or isn't part of GW's doesn't matter; the end implementation is unquestionable.

The performance difference figures are known; it isn't conjecture.

Compute purposefully stripped out of a GPU series is obviously advantageous to a competitor. When they released GPUs with compute, performance was similar; it wasn't massively superior to AMD's, unlike Nvidia's tessellation techniques...

@Gregster,

That doesn't matter if my 670 could throw out faster fps at lower tessellation levels where I can't physically notice a difference.

Purchasing a new GPU because I'm forced to, rather than because I need to, isn't an answer; it's an approach that's harming/milking your customer base.
 
Feel free to correct me on this, as I am working from memory, but I believe the 290X has 4 tessellation units, whilst nVidia have 12 on the 780? nVidia are working to their strengths if they are fully utilising tessellation. Overkill? Perhaps, but I don't see that as part of GameWorks gimping performance on AMD hardware. Tessellation has been around for a number of years now, but AMD have not really picked up on it and have stuck with a low count of tessellation units since the single unit that was chucked onto the 5 series.

Again though, feel free to correct me if I am wrong :)

I don't know how it all works, but Rroff explained it way back. What I do know is that it's not a straight 12 v 4, as their units work totally differently within the architecture, or at least they did. Nobody wins from overkill, as performance is down on both sides. If there is no benefit to your eyes, then it's wasted performance that perhaps could have gone elsewhere in the game. It is a tactic, though, but I would hope hurting your own users to hurt your competitor more is not something most people would agree is good.
 
AMD's cards are tuned differently; they have much higher pixel fill rates than Nvidia ^^^ but less tessellation performance.

Nice ninja edit :D

you still get four independent tessellation units

http://www.techpowerup.com/reviews/AMD/R9_290/

Anyway, I am still failing to see what nVidia are doing wrong with GameWorks. They are giving devs easier tools to use with their games, and this frees up time to work on other aspects.

Ohhh, and just to add, Ubersampling in The Witcher 2 kills my performance to an unplayable level, so I turn it off.
 
Nvidia are gimping overall performance on your Titans/my 670 in order to make the competition look weak. Deniability is strong with most Nvidia users though, which is a shame really, as we're robbed of better performance.
 
Nvidia are gimping overall performance on your Titans/my 670 in order to make the competition look weak. Deniability is strong with most Nvidia users though, which is a shame really, as we're robbed of better performance.

Are they? I thought my Titans coped quite well in heavily tessellated games, to be honest. MSAA can hurt me, and Ubersampling can as well, but I choose to lower those to get frames back up. Can't you do the same with tessellation?
 
Anyway, I am still failing to see what nVidia are doing wrong with GameWorks. They are giving devs easier tools to use with their games, and this frees up time to work on other aspects.

Well, the complaint is that they are adding in crazy amounts of tessellation that the gamer never even sees, just to gimp AMD users,

and then this is forced by DLLs that can't be changed.

You understand that part, right? Or do you just not agree there's anything wrong with it?
 