• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

CDPR on Witcher 3 HairWorks (Nvidia GameWorks): 'the code of this feature cannot be optimized for AMD'

Errm, that's exactly what AMD are doing with their tessellation override; the fact that users can't notice it with their eyes is irrelevant. I can't tell the difference between high and ultra textures in games, but do I want a driver setting that ignores my request for ultra textures and forces high instead? Hell no.

If a game calls for level 32 tessellation and people are using AMD drivers to force 8 or 16, they are not rendering the same scene as Nvidia users and not seeing the game as the developer intended, simple as. AMD can make as many excuses as they want, but the fact is that if their tessellation performance was good enough, they wouldn't have gone anywhere near such a thing.

Another very fundamental thing you're completely overlooking is that with a black-box API like GameWorks, where devs are offered it cheaply without source code or expensively with source code, you can't make the claim that AMD and Nvidia are producing the same IQ. Nvidia control the code, and there is absolutely no IQ improvement going from 8x or 16x tessellation to 64x. Inside that black-box code, all they would have to do is have code that says: if an AMD card is present, use 64x tessellation; if an Nvidia card is present, use 8x.
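To be clear about what's being alleged: the point isn't that this code exists, it's that it trivially *could*, and nobody outside Nvidia can check. A purely hypothetical sketch of that scenario (this is NOT real GameWorks code; `pick_tessellation_factor` and both values are invented for illustration):

```python
# Purely hypothetical illustration of the "black box" scenario described
# above - not real GameWorks code. It shows how trivially a closed
# library could pick a different tessellation factor per GPU vendor,
# which is why outsiders cannot verify both vendors get the same workload.

def pick_tessellation_factor(gpu_vendor: str) -> int:
    """Hypothetical vendor-dependent factor inside a closed library."""
    if gpu_vendor == "AMD":
        return 64   # heavy factor with no visible IQ gain
    return 8        # cheaper factor for the favoured vendor

print(pick_tessellation_factor("AMD"))     # 64
print(pick_tessellation_factor("NVIDIA"))  # 8
```

Two lines of code, invisible from outside the library; that is the whole trust problem with shipping effects as a closed binary.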

Again, very simple: draw a triangle and keep subdividing it into two smaller triangles... can you do this infinitely, or does it effectively become a line? If you start with a very small triangle to begin with, do you think there is much benefit in subdividing it more than 4 or 8 times?
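To put rough numbers on that, here's a quick back-of-the-envelope sketch. It assumes each subdivision roughly halves the edge length, and the 1 cm starting edge and ~0.03 cm pixel size are made-up illustrative figures, not measurements:

```python
# Halving a triangle edge at every subdivision step: after k steps a
# 1 cm on-screen edge is 1 / 2**k cm. At a rough desktop pixel size of
# ~0.03 cm, the sub-triangles drop below one pixel after only a few
# steps, so further subdivision cannot change what you see.

EDGE_CM = 1.0     # assumed starting edge length of the on-screen triangle
PIXEL_CM = 0.03   # assumed size of one pixel on a desktop monitor

edge = EDGE_CM
for step in range(1, 9):
    edge /= 2.0
    marker = "  <- below one pixel" if edge < PIXEL_CM else ""
    print(f"step {step}: edge = {edge:.4f} cm{marker}")
```

With these assumed numbers, the triangles are sub-pixel after six halvings; everything past that is pure wasted work.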

You claim Nvidia are running a higher tessellation level and AMD are overriding it; for one thing, the override is NOT enabled by default, meaning they aren't cheating at all. They are running whatever absurd level Nvidia's code tells AMD hardware to run. You don't know what level Nvidia is running because it's their code. Fundamentally though, 64x tessellation gives ZERO benefit, only a performance hit, so you have two options: either Nvidia are hurting Nvidia users' performance on purpose to get them to upgrade hardware, or in their own code or in the drivers they optimise for the best compromise between performance and IQ.
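The "only a performance hit" part is easy to see with order-of-magnitude arithmetic. A tessellation factor of N splits a patch into very roughly N*N triangles (the exact count depends on the partitioning mode, so treat these as ballpark figures):

```python
# Rough triangle-count growth with tessellation factor N: workload
# scales roughly quadratically, while the on-screen result stops
# changing once the triangles shrink below a pixel.

for factor in (8, 16, 32, 64):
    tris = factor * factor   # triangles per patch, order of magnitude
    print(f"factor {factor:2d}: ~{tris:4d} triangles per patch")

# Going from 16x to 64x is roughly 16x the triangle work.
```

So 64x over 16x means on the order of sixteen times the geometry work for an image the eye cannot tell apart.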

Can you say for certain that Nvidia are running a higher level of tessellation than AMD? Nvidia could have a different code path for their own hardware, or could be doing precisely what AMD do in drivers, but baked in as standard, without an option the user can see or change.

If Nvidia aren't doing one of those two things, they are simply incompetent. There are no two ways about this, no "but" or "if": over-tessellating beyond the point of visual difference hurts performance with not a single benefit to the end user. So Nvidia either aren't doing it, are doing it on purpose, or are so incompetent they are doing it by accident... and it's 100% not the latter.
 
Well, what can one say. If you're happy repeating fail-safe corporate responses instead of discussing the issue properly, there's no real reply to that comment, Tommy. Strange how that's seemingly been a trend since the new rules kicked in. Perhaps you have nothing to add, then.

Since the new rules kicked in?

It's called cba (can't be arsed) getting into pointless discussions these days, but if you want an opinion, here you go:



You linked a corporate response a few posts ago and had zero issues with it; nothing to 'add' outwith AMD Roy's public **** up, and PCars apparently contains royalty-free in-game advertising. :D

GW's bottom line = a shady-as-**** artificial performance advantage over your competitor, with a huge carrot encouraging their customers to switch vendors and, at the same time, a subtle push to ensure your own customers upgrade. Yet another absolute genius tactic from Nvidia, the company that doesn't do apologies to its customers, simply explanations when they get caught out. :p

All we need now is for AMD to shake itself down, claw some business back with competition, lose the good-guy image and lock Nvidia out of optimisations on their features. Sorted: the console ecosystem on PC. Happy days. :D
 
Again, very simple: draw a triangle and keep subdividing it into two smaller triangles... can you do this infinitely, or does it effectively become a line? If you start with a very small triangle to begin with, do you think there is much benefit in subdividing it more than 4 or 8 times?

Don't get hung up on triangles; at the tessellator stage they can also be points and lines.
 
You claim Nvidia are running a higher tessellation level and AMD are overriding it; for one thing, the override is NOT enabled by default, meaning they aren't cheating at all.

You're wrong: "AMD Optimized" is enabled by default in AMD's drivers; you have to manually deselect it and tick "Use application settings" to disable any cheats. AMD just haven't got around to 'optimizing' it yet, as their driver team is pathetic.

Make up all the excuses you want; the point still stands: tessellation will be used more and more, at greater and greater levels, going forward, and AMD hardware is just not up to it.
 
That is a pre- and post-Nvidia-involvement shot of the game: the awesome-looking shot is the dev making a game on their own, the latter is what happens after Nvidia get involved.

Great, another pointless, daft opening. You know full well (one would hope) that a pre-alpha shot is never representative of the final product, not to mention that NVIDIA would have no involvement in actively reducing the game's art style and graphical fidelity. This is a measure taken by the developer, and by them alone. In fact, if they wanted to harm AMD performance, leaving all the environment tessellation on would have been a great place to start too.

The GW libraries were added only a few months before the game launched, because they are vendor-specific. Detrimental performance can be avoided by disabling them. So your only real argument is "why can't you use them well?", to which the answer is that NVIDIA don't really want to hand out their intellectual property for free, and why should they? That isn't common business practice, so why should it be now?

The whole reason GameWorks is allowed to exist is that developers cannot set aside the time to develop these things in-house. Project Cars has taken this head-on and is still getting flak for the performance because - and this is a fact, not a dig - AMD's performance is trailing behind, hanging off the horse.

If there was a financially viable solution that would benefit all parties without vendor lock-in, we would have seen it. PC-centric, catered effects will elude games until someone other than NVIDIA actually puts the backing behind them. Evidently, creating one shader technology (TressFX) and saying "well, you can use ours" - "Right, and who's going to help us implement this?" - isn't an option. NVIDIA's libraries can be disabled, so disable them.
 
I didn't think AMD could optimise for GameWorks? If they are optimising at driver level, that kinda kills that then.

I don't quite understand what you mean by that. I'm pretty sure there are tons of things AMD can optimise other than just the GameWorks features.

And the fact that AMD can't optimise the code itself doesn't mean they can't improve performance. They're just doing it blindfolded, hoping that the new values are better than the old ones. Sure, just by testing different settings they can get better performance, but finding the optimal settings blindfolded is about as likely as winning the lottery.
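That kind of blind tuning can be sketched as black-box trial and error. In this toy model (everything here is invented for illustration: `score` stands in for a benchmark run against closed code whose internals the tuner can't see, and the hidden optimum is arbitrary):

```python
import random

# Toy model of tuning a driver knob against a black box: the tuner can
# observe a benchmark score but not the code behind it, so all it can
# do is trial values and keep the best one seen so far.

def score(tess_level: int) -> float:
    """Pretend benchmark: peaks at a level the tuner cannot see."""
    hidden_optimum = 13
    return 100.0 - abs(tess_level - hidden_optimum)

def blind_tune(trials: int, seed: int = 0) -> tuple[int, float]:
    """Randomly trial settings, keeping the best result seen so far."""
    rng = random.Random(seed)
    best_level, best_score = 0, float("-inf")
    for _ in range(trials):
        level = rng.randint(1, 64)
        s = score(level)
        if s > best_score:
            best_level, best_score = level, s
    return best_level, best_score

# A handful of trials usually beats a bad default, but rarely lands on
# the hidden optimum exactly - the "lottery" described above.
print(blind_tune(trials=5))
```

More trials monotonically improve the best score found, but without visibility into the code you never know whether you've actually hit the optimum or just something close.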
 
The Witcher 3 Will Get Graphical Improvements in Upcoming Patch

The Witcher 3: Wild Hunt developer CD Projekt Red has spoken to Eurogamer to tackle the issue about the game's graphical downgrades head-on.

In the interview, the developer revealed to the site that a new, large patch with over 600 changes, bringing improvements to the game's graphics and graphical options, is in the works and will be out within five to seven days of this publication. It will come with patch notes.

Additionally, the studio intends to patch the game to allow the editing of .ini files on the PC to push the graphical settings well beyond the maximum offered by the game's in-game settings. It will be possible to tweak grass and vegetation density, post-processing effects like sharpening, as well as draw distances. This patch will be separate from the aforementioned update as the developers are considering a few other tricks they would like to implement.

http://www.gameranx.com/updates/id/...get-graphical-improvements-in-upcoming-patch/
 
Going to be interesting to see if AMD fix the performance and Nvidia just ignore Kepler! That would really annoy me, not least because I have a Kepler card!
 
Numerous people have said performance is fine on their Kepler cards.

PhysicsMan94 said he hasn't seen any sign of a performance drop-off in the last year; surely if it was intentional gimping, every Kepler user would see it?
 
I saw this gif on the Nvidia forums showing the difference between AMD and Nvidia lol (the trolling is strong in those forums :D):
[image: wVHJusH.gif]

No mate, the second pic just doesn't have the sharpening filter on. :p
 
Numerous people have said performance is fine on their Kepler cards.

PhysicsMan94 said he hasn't seen any sign of a performance drop-off in the last year; surely if it was intentional gimping, every Kepler user would see it?


Although, TBH, I think Kepler has definitely taken a back seat. NVIDIA are focused on Maxwell, like they have been with Tegra recently. All this talk about them intentionally gimping their own drivers is hilarious though; people are idiots. If these 'enthusiasts' cared to roll back far enough, they could quite easily disprove that theory in 5 minutes.
 