Nvidia GameWorks teams up with Ubisoft for Assassin’s Creed Unity, Far Cry 4, Tom Clancy’s The Division

The top cards run so close to each other that it shouldn't be an issue getting either, but Nvidia are ahead on tech. ShadowPlay now records smoothly at 1600p, and PhysX looks good when it's done well. SLI is much faster since the last drivers, with less work on the CPU in DX11 games, and that applies across the DX11 range, not just two games. Those are just a couple of reasons to consider spending the extra cash on Nvidia cards.

GameWorks is also very impressive, and I can see big things from it in the long term; hats off to Nvidia for bringing all this to us gamers. People are quick to criticise but not so quick to praise.
 

If SLI was that good now, wouldn't people be getting two 770s instead of a 780 Ti? I'd rather wait for some benchmarks, at least in games (not demos) that use the new engines, before I buy. Buying a card now, other than for 1-2 games (Wolfenstein and the upcoming Doom), would be a bit silly I think; better to wait and see how the cards handle the new engines. Looks like the next cards are a while away yet though, end of this year or early next.
 

Not really, as the 770 is OK but still only 2GB (the 4GB version is massively overpriced) and only a 256-bit bus, so if people want to move up to bigger screen resolutions it would be a silly buy (unless they already had a 770).
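
Just to put rough numbers on the VRAM side (my own back-of-envelope figures, not benchmarks, and they ignore textures, geometry and shadow maps, which take far more), here's a quick sketch of how the render targets alone grow with resolution and MSAA:

```python
# Back-of-envelope render-target memory estimate. Toy numbers only:
# real games keep textures, geometry and shadow maps in VRAM too,
# so actual usage is much higher than this.

def render_target_mib(width, height, msaa_samples=1, bytes_per_pixel=4):
    """Colour + depth/stencil buffers for one frame buffer, in MiB."""
    colour = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples  # 24-bit depth + 8-bit stencil
    return (colour + depth) / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1600), (3840, 2160)]:
    for samples in (1, 4):
        mib = render_target_mib(w, h, samples)
        print(f"{w}x{h} at {samples}xMSAA: ~{mib:.0f} MiB of render targets")
```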

The point of the newer drivers is how much less work the CPU has to do. A 2500K, for instance, would struggle pushing a pair of 780s and would bottleneck them even at 4.6GHz, but the newer drivers help massively to relieve that bottleneck (very much like Mantle).
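
A toy model of what "less work on the CPU" means here (all figures invented purely for illustration, not measurements of any real driver or game): if the CPU can't submit draw calls as fast as the GPU can render them, the GPU sits idle and the CPU caps the frame rate, so cutting per-draw-call overhead lifts that cap:

```python
# Toy model of a CPU-side draw-call bottleneck. Every number below is
# made up for illustration; none of it measures a real driver or game.

def fps(draw_calls, cpu_us_per_call, gpu_frame_ms):
    """Frame rate limited by the slower of CPU submission and GPU rendering."""
    cpu_frame_ms = draw_calls * cpu_us_per_call / 1000.0
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

DRAW_CALLS = 5000      # hypothetical draw calls per frame
GPU_FRAME_MS = 10.0    # GPU alone could finish the frame in 10 ms (100 fps)

for label, overhead_us in [("older driver", 4.0), ("lower-overhead driver", 1.5)]:
    print(f"{label}: {fps(DRAW_CALLS, overhead_us, GPU_FRAME_MS):.0f} fps")

# At 4 us per call the CPU needs 20 ms per frame (50 fps) and the GPU waits;
# at 1.5 us the CPU takes 7.5 ms and the GPU's 10 ms becomes the limit again.
```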
 
I can't blame anyone for not pre-ordering any of the Ubi games, in fairness. They do release some good games, mind.

I bought FC3 on pre-release but never got around to actually playing it until it had spent a year sitting on my Uplay account doing nothing. By the time I did get around to it I had zero problems and really enjoyed it.

That's the best way to do it: when a big title releases, if you wait long enough it's all been fixed up for you. I'm not saying buy pre-release; I mean you're best off waiting and catching them in the thrice-yearly big Steam sales. That way you get some sort of discount and it's all patched. That makes it an enjoyable experience.
 
Why, just because it doesn't affect you / doesn't conform to an Nvidia user's opinion?

I might as well play it on my 670 and throw my 290 in the bin.

[image: Uq4MoWy.png]

If it was a GW gimping AMD issue then it wouldn't look like this:

[image: 1920_MSAA.png]


With MSAA instead of FXAA, would it? (And who wouldn't use MSAA over FXAA? FXAA is garbage; it looks like the N64's full-screen blur filter.)

The results you posted are clearly an FXAA issue with that game, not a GW issue.
 
Couldn't you just turn FXAA off, use no AA at all, then just force MLAA in drivers :p?

That'd make for some interesting comparisons: Nvidia FXAA versus AMD MLAA.
 

Surprisingly my 7770 does not usually like high amounts of AA, can't think why. :p
 

FXAA can have that much of a performance hit? I always thought FXAA was less demanding because it's an injector/overlay on the image, meaning it doesn't get rendered by the GPU, so the performance cost is minimal.

Could be wrong.
 
I thought that, other than Mantle at the moment, everything on AMD was supposedly open standard?

MLAA is similar to FXAA; they're both poor tbh.
Wasn't FXAA Nvidia's? Before MLAA, which then saw the resurgence of FXAA.

Or Nvidia had something similar to MLAA well before MLAA launched.

Either way, can an AMD owner just disable FXAA and use MLAA and gain like super performance :p?
 

Ah Thanks :)

So was it ever proved that Nvidia was harming AMD in these older games, or was GameWorks simply, as it says, optimising for Nvidia hardware above the level it would normally be at?
 
Nope, it doesn't work for Nvidia users. FXAA, which is an Nvidia tech, does work for AMD users, though apparently not very well.

Used to use it a lot before I found SMAA, and I had zero problems with it performance-wise.
I'm not sure how FXAA can affect performance? It's a blur filter to help reduce edges. It's not rendered by the GPU like MSAA is.
So how can it affect performance?

Edit
SMAA, not MSAA
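
For what it's worth, FXAA does run on the GPU: it's a single full-screen shader pass over the finished image, which is why its cost is small but not zero, whereas MSAA multiplies the samples stored and resolved while the scene is being rendered. A very rough NumPy sketch of the post-process idea (this is not the real FXAA algorithm, just luminance edge detection plus a neighbour blend, to show it's one pass over already-rendered pixels):

```python
import numpy as np

# Rough sketch of FXAA-style post-processing: one cheap pass over the
# finished frame, blending pixels where luminance contrast suggests an
# aliased edge. NOT the real FXAA algorithm, just the general idea.

def fake_fxaa(image, edge_threshold=0.1):
    """image: HxWx3 float array in [0, 1]. Returns an edge-smoothed copy."""
    luma = image @ np.array([0.299, 0.587, 0.114])           # per-pixel luminance
    pad = np.pad(luma, 1, mode="edge")
    contrast = np.max(np.stack([                             # contrast vs neighbours
        np.abs(pad[1:-1, 1:-1] - pad[:-2, 1:-1]),            # up
        np.abs(pad[1:-1, 1:-1] - pad[2:, 1:-1]),             # down
        np.abs(pad[1:-1, 1:-1] - pad[1:-1, :-2]),            # left
        np.abs(pad[1:-1, 1:-1] - pad[1:-1, 2:]),             # right
    ]), axis=0)
    padded = np.pad(image, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0   # 3x3 box blur
    edge = (contrast > edge_threshold)[..., None]             # only blend on edges
    return np.where(edge, blurred, image)

frame = np.random.rand(720, 1280, 3).astype(np.float32)      # stand-in for a rendered frame
smoothed = fake_fxaa(frame)
```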
 