
PCGH asks Nvidia about GameWorks

Nice ninja edit :D



http://www.techpowerup.com/reviews/AMD/R9_290/

Anyway, I am still failing to see what nVidia are doing wrong with GameWorks. They are giving devs easier tools to use with their games, and this frees up time to work on other aspects.

It can be explained a thousand times in a thousand different ways and not everyone will understand or grasp it, which ironically keeps these arguments going on and on and on...

Which is not a bad thing; it needs to be front and centre of attention. Nvidia and Ubi hope it will go away soon; it's better for the industry that it doesn't.
 
Are they? I thought my Titans coped quite well in heavily tessellated games, to be honest. MSAA can hurt me and ubersampling can as well, but I choose to lower those to get frames back up. Can't you do the same with tessellation?

Yes they are, I suspect you know this too, we've seen this and discussed it here for years now.

MSAA and ubersampling (which isn't under scrutiny) are optional or have levels of implementation; tessellation in the likes of BAO is simply enabled/disabled.
 
Well, the complaint is they are adding in crazy amounts that the gamer never even sees, just to gimp AMD users.

And then this is forced by DLLs that can't be changed.

You understand that part, right? Or do you just not agree there's anything wrong with it?

I am a massive fan of Batman and the cape/water looks very good. I don't know about game coding at all. It runs well, it looks very good, and if tessellation was lowered, would it look as good? I will have to pass on that, but I am sure there are some game coders here who will tell us what's what when it comes to triangles.

Yes they are, I suspect you know this too, we've seen this and discussed it here for years now.

MSAA and ubersampling (which isn't under scrutiny) are optional or have levels of implementation; tessellation in the likes of BAO is simply enabled/disabled.

So if it hurts your performance, turn it down/off surely?

End of discussion from me, as I have more interesting things to do :)
 
I am a massive fan of Batman and the cape/water looks very good. I don't know about game coding at all. It runs well, it looks very good, and if tessellation was lowered, would it look as good? I will have to pass on that, but I am sure there are some game coders here who will tell us what's what when it comes to triangles.



So if it hurts your performance, turn it down/off surely?

End of discussion from me, as I have more interesting things to do :)

Hasn't this been tested? I think I remember that it has, and it looks identical at a lower tessellation setting.

If it hasn't been tested, it's easy enough to do.
 
I am a massive fan of Batman and the cape/water looks very good. I don't know about game coding at all. It runs well, it looks very good, and if tessellation was lowered, would it look as good? I will have to pass on that, but I am sure there are some game coders here who will tell us what's what when it comes to triangles.

The cape remains the same when you reduce tessellation in BAO via CCC, much the same as the concrete blocks remained the same in C2; the rest of the IQ reduces though, so it's pointless.

So if it hurts your performance, turn it down/off surely?

You know you can't turn it down in BAO; I shouldn't be forced by Nvidia to turn it off.
 
You don't need to be a game coder to understand what happened with tessellation.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

The funny thing is a lot of the stuff that has absolutely assloads of tessellation thrown at it looks no different with it turned off. It's obvious it was just thrown at everything and anything in obscene amounts just to gimp performance. A flat concrete barrier needs to be tessellated to look no different? Tree stumps tessellated to look no different? Water that's not even visible still being tessellated? It's obvious what was going on here.
 
This business of Nvidia deliberately adding things to a game that hurt their own performance, because they hurt AMD more, has been alleged for years by a lot of different people in the industry; there is a consensus that it does happen.

Again, please provide proof

How about when Nvidia reduced the texture quality in their drivers to gain speed over AMD at the cost of quality?

They made "very high" in the new drivers equal to "high" in the old drivers, shifted every option on the slider down one step in the same way until they got to the end, then added a new "extra low" option where the lowest previously sat (the slider looked the same afterwards; each setting was just lower quality than before).

This hurt Nvidia but it hurt AMD more as it gave Nvidia a lead in benchmarks when using the highest settings.
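The relabelling described above amounts to shifting the whole quality slider down one step. A minimal sketch (the tier names come from the post; the exact old/new lists are my assumption, purely for illustration):

```python
# Quality tiers as described in the post; "extra low" is the newly added bottom option.
old_slider = ["low", "medium", "high", "very high"]
new_slider = ["extra low", "low", "medium", "high", "very high"]

# Per the post, each label shifted down one quality step:
# new "very high" renders at old "high", new "high" at old "medium", and so on.
renders_as = {new: old for new, old in zip(new_slider[2:], old_slider[:-1])}

# The new "low" and "extra low" sit below anything the old driver offered,
# so they have no old-driver equivalent in the mapping.
```

The point is that a reviewer selecting "very high" on both driver versions is no longer comparing like with like.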
 
Yes, the tessellation levels in Crysis 2 were high enough to drop performance on nVidia hardware by 20%, but closer to 40% on AMD hardware at the time. The nVidia cards were geared for massively over-the-top tessellation; AMD cards weren't, and Crytek/nVidia used that to their advantage. That had to be the explanation, because I can't believe either of them was oblivious to the pointless levels of tess. They couldn't have been; they're not that dumb.
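To see why a hit to your own performance can still work as a competitive weapon, take the figures above at face value (an equal baseline frame rate for both vendors is my assumption, purely for illustration):

```python
baseline = 100.0  # assume both vendors start at the same frame rate (illustrative)

nvidia_after = baseline * (1 - 0.20)  # ~20% drop quoted for nVidia
amd_after = baseline * (1 - 0.40)     # ~40% drop quoted for AMD

# Both sides lose performance, but the relative gap widens considerably.
relative_lead = nvidia_after / amd_after  # 80 / 60 ≈ 1.33x in nVidia's favour
```

So even though everyone gets slower, benchmark charts end up showing one vendor roughly a third ahead.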
 
Just read this on extremetech from the first article Joel did on GameWorks.

The first three scenes of the benchmark in Arkham Origins hammer tessellation. AMD’s driver allows us to manually define the tessellation level — changing that setting to x4 improves performance in the first three scenes of the test by 11%, from 134fps to 150fps.
http://www.extremetech.com/extreme/...rps-power-from-developers-end-users-and-amd/2

So if you can't lower tessellation, Tommy, something is broken in your system.
 
I don't think that's what he said ^^^

How about when Nvidia reduced the texture quality in their drivers to gain speed over AMD at the cost of quality?

They made "very high" in the new drivers equal to "high" in the old drivers, shifted every option on the slider down one step in the same way until they got to the end, then added a new "extra low" option where the lowest previously sat (the slider looked the same afterwards; each setting was just lower quality than before).

This hurt Nvidia but it hurt AMD more as it gave Nvidia a lead in benchmarks when using the highest settings.


Image quality really needs to become a part of reviews now; it's far too easy to add a driver profile that reduces IQ in favour of performance and then release a new profile once the review is done.
 
What frames do you get on your 670? Can you post them in the Batman Bench thread for the charts? We could do with a few more entries in that.
 
I think they have this backwards anyway!

It's the game companies at fault mostly.
I mean, they are basically being paid lots of dollah so it runs better on an Nvidia card,
so their job is to show off the hardware.
Putting insane amounts of triangles somewhere we can't even see, just to make the competition look bad, is a super lazy way of doing that.
They should be doing more clever stuff with PhysX, or at least using those extra triangles where it makes us go "wow" a bit!

Maybe after all this bitching they might switch it up a bit, I dunno.
They're just lazy e_e
 
What frames do you get on your 670? Can you post them in the Batman Bench thread for the charts? We could do with a few more entries in that.

Can't be arsed with benchmarks in general any more. Besides, BAO doesn't really warrant downloading again, between not really rating it and one bad taste or another.
 
C'mon rusty, you know full well HairWorks is part of GameWorks. COD Ghosts uses it, and part of Huddy's argument is over the use of tessellation.

As layte said, it's marketed as such and part of the overall package, but it's not a GW library. And anyway, you were waffling on about Crysis 2 and over-tessellation, which definitely isn't GW.

This ain't a Huddy thread; it's a thread about what the nVidia guy said about GW :).
 
Waffling?

HairWorks is part of GameWorks, the same as the other libraries that are part of it.

[image: TWDEDYq.png]


C2 is the track-record example, but you knew that.

The core of the thread is the AMD/Nvidia GameWorks argument, in part a response to Huddy's words, so it's on-topic, valid discussion.
 
I've had no issues with SLI. What bridges are you using? I'm using an ASUS ROG one and it's working fine. Will check the revision when I can. AFAIK it's just EVGA bridges with an issue, and even then you will still be able to enable SLI; you'll just experience tearing issues and rubber banding. I would consider reseating the cards and running DDU.

After reseating the second card it works like a charm!
A big thank you!! :D
 