Both are from the same custom scenario.
If you do enough testing you'll see that runs using that custom scenario on the same system are within a couple of frames of each other.
Please don't come back at all.
I'm not getting drawn back into the GameWorks debate again, Greg; I've said my piece on that for now. I can put you in touch with Joel via trust if you want, though, and you can ask him yourself then.
Well at least Matt outright states that he's not going to get back on topic in the thread. Wonder how happy he'd be if we refused to get back on topic in one of his threads?
Also avoided the question nicely.

Not getting into another argument no matter how much you or others try to bait, but have fun.
Yeah, people often bow out of arguments when they don't have an answer.
So Batman: AO performs just as well on AMD as Nvidia?
And Watch Dogs performs just as well on AMD as Nvidia?
And we're saying that GameWorks is sabotaging performance on AMD cards?
They're both GameWorks games right?
So is the problem that AMD users think they should be getting more performance than the respective Nvidia cards and are unhappy because they're getting fairly even performance?
Obviously I don't expect Matt to answer as I think he's made it clear he doesn't have one. But if someone could explain it to me?
AO performs just as well at max settings, as 8x AA favours AMD's architecture. However, using FXAA, budget-range Nvidia cards beat mid-tier AMD cards, and mid-range Nvidia cards beat high-end AMD cards. That's the difference. It doesn't affect the high end, but it does affect low to mid-end card users.
I've also said in this very thread that I don't believe Nvidia designed GameWorks to gimp AMD performance, so really there is no need to keep baiting me.
Weren't you out?
Also didn't you start a whole thread about GameWorks intentionally gimping AMD performance?
How do we know it's not Nvidia cards running much faster than they should be, rather than the AMD cards running slow?
Hence Nvidia's dramatic drop in performance with MSAA (yes, I know AMD loses less, but we're talking a much higher drop than is the norm for Nvidia).
Lol. Yes it does. You can't see it because it's not there. Looks like we're going to have to agree to disagree, again. I'm out, to save the thread from a derail.
lmao
absolutely clueless.
Let me tell you a secret about game development. Assuming an effect or a mesh is on screen and being rendered, it doesn't matter how well the player sees it. That exact same effect/mesh still uses the exact same draw call/CPU time budget, because you as a developer have told it to be rendered in the scene.
Imagine I want a million 2D butterfly meshes on my screen at the same time, all moving constantly. Now that I have those added, I have a crapton of draw calls to deal with. In my engine editor I can set the player camera very close to the butterflies or very far away from them. Being closer to the meshes, the player can see them better. However, this does not change the amount of draw calls that have to be handled.
Just because the gamer can't see X as well as before due to distance, camera movement etc. does not mean it's not being rendered exactly the same way as it was before. (Usually this causes GPU time issues, which is why most of the time you want different LOD meshes for your models.)
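Here's a rough sketch of that idea in C++ (purely illustrative: the struct, the LOD vertex counts and the million-butterfly setup are made up, not Star Swarm's actual code). Every mesh in the scene gets its own draw call whether the camera is near or far; distance only changes which LOD gets submitted, i.e. GPU-side cost, not the CPU-side draw call count:

```cpp
#include <cstdio>
#include <vector>

struct Mesh { int vertsLod0; int vertsLod1; };  // hypothetical per-LOD vertex counts

int main() {
    // a million 2D butterfly meshes, all placed in the scene
    std::vector<Mesh> butterflies(1'000'000, Mesh{64, 8});

    for (float camDistance : {5.0f, 500.0f}) {   // camera close, then far away
        long drawCalls = 0, vertsSubmitted = 0;
        for (const Mesh& m : butterflies) {
            ++drawCalls;  // one draw call per mesh, however well the player can see it
            // distance picks a cheaper LOD: this only reduces GPU time, not draw calls
            vertsSubmitted += (camDistance > 100.0f) ? m.vertsLod1 : m.vertsLod0;
        }
        std::printf("camera %.0f units away: %ld draw calls, %ld verts submitted\n",
                    camDistance, drawCalls, vertsSubmitted);
    }
}
```

Run it and both camera distances report the same million draw calls; only the submitted vertex count changes.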
If you wanted to remove a ton of draw calls from this Star Swarm demo, you'd have to change the motion blur implementation (which doesn't change unless you change settings; the motion blur implementation is exactly the same regardless of scenario, otherwise you'd see the RTS scenario being the easiest to run and Follow being the hardest). Or alternatively you could group a ton of individual meshes together into single big meshes, which obviously isn't happening either.
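And a quick sketch of that second option, again with made-up names and numbers rather than the demo's real code: merging lots of individual meshes into one combined vertex buffer means the engine issues a single draw call instead of one per mesh.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

struct Vertex { float x, y; };
using Mesh = std::vector<Vertex>;

int main() {
    // 100k small quads, hypothetical stand-ins for individual ship/butterfly meshes
    std::vector<Mesh> meshes(100'000, Mesh{{0, 0}, {1, 0}, {0, 1}, {1, 1}});

    // Unbatched: the engine issues one draw call per mesh.
    std::size_t unbatchedDrawCalls = meshes.size();

    // Batched: copy every mesh into one big vertex buffer and draw it once.
    Mesh combined;
    for (const Mesh& m : meshes)
        combined.insert(combined.end(), m.begin(), m.end());
    std::size_t batchedDrawCalls = 1;

    std::printf("unbatched: %zu draw calls; batched: %zu draw call (%zu verts)\n",
                unbatchedDrawCalls, batchedDrawCalls, combined.size());
}
```

The trade-off is that a single combined mesh can no longer be moved or culled per object, which is exactly why a demo full of independently moving ships doesn't do this.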
Matt, you're just plain wrong. Your rudimentary understanding of how game engines render something is completely wrong.
Let's prove that.
Follow [Extreme settings]: 74 fps
Follow [Extreme settings without motion blur]: 120 fps
Custom Scenario [Extreme settings]: 51 fps
Custom Scenario [Extreme settings without motion blur]: 80 fps
Follow blur to no blur fps increase: 62%
Custom blur to no blur fps increase: 57%
It's essentially the same fps hit in both modes. The same motion blur is being rendered every time. And as you can see, the custom scenario is actually the more stressful of the two. The small variation in the percentages can be attributed to the inconsistency of the Follow preset.
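For reference, those percentages fall straight out of the fps figures above: 120 / 74 ≈ 1.62 (a 62% increase) and 80 / 51 ≈ 1.57 (a 57% increase).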
AMD's architecture handles AA better than Kepler. That shows in more games than just Batman. As to why that specifically is, I have no idea.
When AA is enabled, do you see a 7970 beating a Titan or a 780 Ti? Or a 7770 beating a 760?
Yeah, that doesn't actually address or answer my question.
AO performs just as well at max settings, as 8x AA favours AMD's architecture. However, using FXAA, budget-range Nvidia cards beat mid-tier AMD cards, and mid-range Nvidia cards beat high-end AMD cards. That's the difference. It doesn't affect the high end, but it does affect low to mid-end card users.
I've also said in this very thread that I don't believe Nvidia designed GameWorks to gimp AMD performance, so really there is no need to keep baiting me.
It was you who started numerous threads on GameWorks and how it gimps performance on AMD cards, but as has been shown and proven in our own OcUK Batman: Arkham Origins bench thread, this isn't the case. Batman was the original talking point; I believe Batman was released in April, and ExtremeTech did the first article some time after. Now clearly, if GameWorks (which is what the whole argument is about) is gimping performance on AMD (intentional or not), then at max settings Nvidia cards would still have the same massive advantage, purely based on GameWorks being the accused.
So why does it do it at lower settings but not at max settings? Surely GameWorks would cripple AMD cards at max details as well, as I doubt Nvidia can pick and choose which settings cripple AMD cards.