• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Nvidia Shenanigans again?

Well AMD found a solution that would benefit all GPUs, so it sounds like some money has changed hands.

Article said:
AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark.
 
Rroff/Raven: Pure speculation, guys. If you have some information to back up what you're saying, I'd appreciate you posting it :-)

I did say it's not something I can substantiate...

However, at a technical level tessellation is fairly restrictive in what you can do with it - you need a predictable output for the same input data and parameters, otherwise you can't rely on it to work as intended in your game. That means AMD's changes are most likely either to cause issues if they're implemented at one end of the pipeline, or to artificially restrain the output range by optimising the input if they're at the other. Both have potential pitfalls.

Judging by the graph of tessellation performance AMD released (or had leaked), it looks like they are optimising things to fall within the range AMD works best at - which incidentally helps other GPUs too, though not to the same extent. That would have nVidia kicking up a fuss, as it doesn't highlight their raw performance advantage as much, and it also means things could potentially fall apart in future should there be a scenario where AMD can't optimise the input data to fit their best-effort profile.

Yes, at the end of the day it's speculation, but there are some things inherent to tessellation that you can't change, so you can guess fairly reliably at the rest.
 
Well AMD found a solution that would benefit all GPUs, so it sounds like some money has changed hands.

This is starting to remind me of the DX10.1 patch that improved performance on ATI cards in Assassin's Creed. Did Ubi ever offer a reason for withdrawing it that didn't stink of BS?
 
Bear in mind that:

1) It's a preview benchmark
2) Just because AMD demonstrated a technique doesn't mean Ubisoft had time to implement it - or perhaps the technique adversely affected Nvidia's performance?

It's probably gone through marketing clearance first on the 'official line' before getting to HardOCP.
 
Ubisoft are a massive company and IMO not so stupid as to implement dodgy code so the game runs like crap on AMD cards; it's not in their interest, considering AMD currently have the majority of the DX11 market. I think it's down to AMD to sort their drivers out.

You mean like how Ubisoft removed DX10.1 support from Assassin's Creed, which had performance improvements on ATI cards, when Nvidia didn't support DX10.1 at the time?
 
This is starting to remind me of the DX10.1 patch that improved performance on ATI cards in Assassin's Creed. Did Ubi ever offer a reason for withdrawing it that didn't stink of BS?

I don't think anyone outside of nVidia has ever pretended that was anything other than nVidia playing dirty... The thing is, porting the game to DX10.1 didn't itself enhance performance for everyone - only for ATI, because of the AA implementation they'd run with on the assumption it would be the future - which it turned out it wasn't. So while I don't agree with what nVidia did, I can see from a business POV why they would do it.


You mean like how Ubisoft removed DX10.1 support from Assassin's Creed, which had performance improvements on ATI cards, when Nvidia didn't support DX10.1 at the time?

It wouldn't have increased performance on nVidia cards even if they had supported DX10.1 - the gain on ATI cards was due to the way ATI had implemented certain features, which nVidia wouldn't have done even with DX10.1 support.
 
It wouldn't have increased performance on nVidia cards even if they had supported DX10.1 - the gain on ATI cards was due to the way ATI had implemented certain features, which nVidia wouldn't have done even with DX10.1 support.

So? Surely improving performance for a sizeable minority of people isn't a bad thing?
 
My instinct is that AMD is optimising the data in some way before handing it off to tessellation processing, so that the output is consistently within the range their hardware works best at. That would probably give reasonable gains on nVidia cards as well, but without showing the raw hardware performance difference so vividly. I'm in two minds on it: it potentially increases performance without much, if any, visual hit, but it may also hide future performance issues.

wow, Rroff. that's amazingly unbiased. well done!


(I'm being serious)
 
There comes a point where greater tessellation != better image quality. What I think AMD are saying is that a certain level of tessellation is enough and anything beyond it is a waste.

I totally agree with that, as over a certain level it only adds complexity with very little image quality gain. But as the 4** series from Nvidia can do a lot of tessellation - one of the few areas where they can beat ATI/AMD - they have to keep pushing this one thing. It's getting boring tbh.

This is the same company that removed DX10.1 support from Assassin's Creed, remember, and is therefore totally in the Nvidia camp.
 
He's not trying to shift the £120 768MB 460s in this thread.

That can mean only one thing... HAWX 2 will use more than 768MB and the cheap 460s will perform worse than ATI!
 
There comes a point where greater tessellation != better image quality. What I think AMD are saying is that a certain level of tessellation is enough and anything beyond it is a waste.

I totally agree with that, as over a certain level it only adds complexity with very little image quality gain. But as the 4** series from Nvidia can do a lot of tessellation - one of the few areas where they can beat ATI/AMD - they have to keep pushing this one thing. It's getting boring tbh.

This is the same company that removed DX10.1 support from Assassin's Creed, remember, and is therefore totally in the Nvidia camp.

I think this is something a lot of people struggle to understand. Tessellation, where it's currently used, is being overused. They could knock down the amount of tessellation without actually reducing image quality.

Like that HAWX 2 mountain line - you're not going to be able to tell the difference between 1,000 polys and 10,000 polys. There's no point tessellating it past a certain level, and that's apparently AMD's point as well.

There is absolutely no need to go beyond what's necessary. For example, once you can no longer see the segments on an object that's supposed to be circular, you've got the right amount of tessellation. Increasing the segments results in a more "detailed" and complex shape technically speaking, but it's not something you'd actually see a difference with.
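To put rough numbers on the "circular object" point: a circle approximated by n straight segments deviates from the true curve by at most r·(1 − cos(π/n)) (the sagitta of one segment). A back-of-the-envelope sketch - the half-pixel error threshold is an assumption for illustration, not anything from the post:

```python
import math

def min_segments(radius_px, max_error_px=0.5):
    """Smallest segment count for which the polygon's maximum
    deviation (sagitta) from the true circle stays below
    max_error_px - i.e. below what the eye can resolve."""
    if max_error_px >= radius_px:
        return 3  # any triangle is already "round enough" at this size
    # sagitta of one segment: s = r * (1 - cos(pi / n)); solve s <= max_error_px
    n = math.pi / math.acos(1.0 - max_error_px / radius_px)
    return max(3, math.ceil(n))

print(min_segments(50))   # a ~50 px knee pad -> 23 segments
print(min_segments(500))  # 10x the radius -> 71, nowhere near 10x the segments
```

The point the sketch makes is the sub-linear growth: segment count needed scales roughly with the square root of on-screen radius, so cranking polygon counts linearly past that threshold buys nothing visible.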
 
Let's all be honest: until ATI catch up with Nvidia on tessellation (they may have already), it is not going to matter to pro-ATI people, and of course vice versa!
 
Let's all be honest: until ATI catch up with Nvidia on tessellation (they may have already), it is not going to matter to pro-ATI people, and of course vice versa!

It doesn't really work like that though. Currently it's only being used as a gimmick to enable such things as "we're faster and better than you" and "you're doing it wrong".

Tessellation isn't being used well though; currently it's just a checkbox feature. You could increase image quality quite a bit in games with even moderate tessellation usage.

For example take the Bioshock Big Daddies, they have a fair amount of "segmentation" visible, more so in the areas that are rounded or curved.

[Image: bioshock1.png]


For example, in that image you can see a lot of segments in the rounded areas: the drill arm, knee pads, brass fixings, and the metal guard around the helmet. With mild tessellation you could turn up the segments enough that you'd no longer see the segmentation, but without insane polygon counts; anything beyond that is totally unnecessary. That's basically what AMD is saying: use what's necessary, and have the tessellation level decrease as you get further away from an object.

Currently, tessellation seems to be all or nothing: insane polygon counts turned up to level 11 that would look no different to tessellation at, say, "level 2". It's just a waste of time; it needs to be done right to look its best. Heaven, for example, is one of those cases - the level of tessellation at its highest setting isn't necessary, and a lot of the detail would remain at a lower level.
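The "decrease tessellation with distance" idea above can be sketched as a simple LOD falloff. All the parameter values here (near/far distances, max level 16) are hypothetical illustrations, not from any real engine:

```python
def tess_factor(distance, near=5.0, far=100.0, max_level=16, min_level=1):
    """Hypothetical distance-based tessellation LOD: full detail up
    close, falling linearly to the minimum level by 'far'.
    Real engines would typically use screen-space edge length instead."""
    # clamp distance into [near, far], then interpolate between levels
    t = (min(max(distance, near), far) - near) / (far - near)
    return round(max_level + t * (min_level - max_level))

print(tess_factor(0))     # right in front of the camera: full detail, 16
print(tess_factor(1000))  # distant mountain line: minimum detail, 1
```

This is the opposite of the "all or nothing" behaviour complained about above: nearby curved objects get enough subdivision to hide segmentation, while far-away geometry - where the extra polygons are invisible anyway - stops wasting GPU time.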
 
What I find very concerning in this whole episode is that Nvidia is going out and telling sites what tests to run.

Nvidia should worry more about creating better video cards that won't melt your case.

Tessellation is all Nvidia have over the upcoming 6 series. Extreme tessellation at that.
 