
Nvidia’s GameWorks program usurps power from developers, end-users, and AMD




A few points.

This is an Unreal Engine 3 game, and Unreal Engine 3 likes NV hardware - at least until you close the gap by exploiting the huge memory bandwidth on the 290s.

Here is an example of UE3.0 being NV dominant in Thief:


Add 8xAA to that, and the gap will swallow itself. Matt, just quit hurting yourself.

1. MjFrosty-780ti@1315/1965, [email protected]

2. gregster-Titan@1267/3767, [email protected]

3. whyscotty-Titan@1241/1802, [email protected]

4. zpaf-AMD 290@1200/1600, [email protected]

5. uksoldierboy- AMD 290X@1150/1400, [email protected]

6. tommybhoy-290X@1175/1475, [email protected]

7. hyperseven-290X@1130/1450, [email protected]
 
So in Thief should we expect to see a 770 faster than a 290X, and a 660 beating out a 7950 Boost? I bet we won't. Your example is poor and doesn't work, because the GameWorks libraries are not part of Thief. The engine may be the same, and the engine may favour Nvidia, but that does not explain the results in question at all.
 
UrrrGHHh. The 290 had barely been on the market when Arkham Origins launched, Matthew, and they fixed most of the performance gripes the day after release. What more do you want? Unreal Engine 3's performance scaling applies across the board.

It's the same game engine. It COULDN'T BE MORE relevant. It's only not relevant to you because you've convinced yourself that GameWorks is the sole reason for it, when I'm showing you another Unreal Engine 3 title that NV cards are dominant in. If I installed Arkham City, I bet I could reproduce almost exactly the same scenario using 8xAA.

Enough with the corporate conspiracies.
 

The numbers don't lie though, Matt. Look at all the other game bench threads - nVidia own in all of them, except Batman...

You have a pair of 290s, but if you want to play with FXAA, be my guest - I think MSAA looks great :p
 

No they didn't. Joel found exactly the same results when he tested a 770 vs a 290X long after that Techspot review. It's not normal for a 770 to beat out a 290X, the same as it's not normal for a 660 to beat out a 7950 Boost. That scales down to the low-end cards as well.


What about Sleeping Dogs? :D

The 290X competes well with the Titan in most things; it's just the 780 Ti that is normally a bit ahead, but then that was released after the 290 cards.

In most of the game bench threads there are very few water-cooled 290 entrants, whereas all the top Nvidia cards are generally water-cooled. If Ranger (or other WC 290 users) bothered to post some game benchmarks, I'm sure he'd give some of you a run for your money, judging by the 3DMark scores he just posted - beating out the fastest 780 Tis in GPU score by a hefty margin. :p

That has nothing to do with this though and does not explain what is going on. :)

EDIT

And on this note my new motherboard has arrived so off to install it, wish me luck. :D
 
1. Score 89.9, GPU nvTitan @1320/1901, CPU 2500k @5.0 khemist Link
2. Score 89.3, GPU 780ti @1250/1962, CPU 4770k @4.5 zia Link
3. Score 88.8, GPU 780ti @1295/1950, CPU 4960X @4.7 MjFrosty Link 322.21 drivers
4. Score 86.5, GPU nvTitan @1280/1802, CPU 3930k @4.8 whyscotty Link
5. Score 86.1, GPU 780ti @1208/1800, CPU 4770k @4.6 Dicehunter Link 331.82 drivers
6. Score 85.3, GPU nvTitan @1280/1879, CPU 3930k @4.625 Gregster Link 332.21 drivers
7. Score 84.8, GPU 780 @1400/1850, CPU 3770k @5.0 Geeman1979 Link
8. Score 84.5, GPU 290X @1200/1625, CPU 3970X @4.9 Kaapstad Link 13.11 beta 8 drivers
9. Score 83.5, GPU 780 @1424/1877, CPU 3930k @5.0 whyscotty Link 332.21 drivers
10. Score 82.9, GPU 780ti @1150/1750, CPU 4770k @4.4 zia Link

http://forums.overclockers.co.uk/showthread.php?t=18536130&highlight=sleeping+dogs+bench+thread

:p

And that is me out for the day as well. Sadly I will be drinking heavily all day and betting money on nags :D
 
Regardless though, it changes nothing. The fact is they confessed to the thing we all argued about for ages. :D

They didn't confess to anything. We always knew they were closed libraries and that they don't give the source code to AMD. The argument wasn't about that - the argument was that this totally disables AMD from making ANY changes to their drivers to alter the performance, which is the bit that is laughably false.
 
I accept it and I don't see the issue. Why would you run a game with only FXAA on a 290X? The arguments have always been that when you turn the dials up, then "X" GPU really starts to shine.... Not when you turn the dials down lol

3D gaming can't use MSAA, so basically I shouldn't touch any more GW titles to play in 3D, as my 290X is going to be castrated.

Thanks Nvidia, thanks a lot.:(

Nvidia have fessed up, and that's cool because AMD got off their **** and created Mantle, which doesn't gimp Nvidia in direct competition with AMD in DX in any way?
 


Basically this ^^
Most of the posters in this thread agreed that PART of it was closed, but that doesn't mean the whole thing is closed.
Nvidia hasn't come out anywhere and stated that AMD is unable to optimise at all; IIRC one of their driver releases did actually list a % gain.
 
Like PhysX, it's only going to be the couple of games that Nvidia can persuade to use GameWorks.

Proprietary lockouts are a no for developers on their own anyway, but when you also remove the developers' ability to do their own optimisations, it gets snubbed even more.
 

That is your choice, but I don't see how it is castrated: even with 8xMSAA you get 100fps. Halve that for 3D = 50fps; take away the MSAA and you should be getting around 65-80fps (which you know).

I have put 78 hours into this game and it is stunning in 3D.
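That fps arithmetic is easy to sanity-check. A minimal sketch, assuming (as the thread does) that stereo 3D halves throughput and that 8xMSAA costs roughly the 35% mentioned a few posts down - these are the thread's figures, not measurements:

```python
# Sketch of the fps arithmetic quoted above. The halving for stereo 3D and
# the ~35% MSAA cost are the thread's assumed figures, not measurements.

def fps_in_3d(fps_2d):
    """Stereo 3D renders one frame per eye, so throughput roughly halves."""
    return fps_2d / 2.0

def fps_without_msaa(fps_with_msaa, msaa_cost=0.35):
    """Estimate fps after disabling MSAA, assuming MSAA cost that fraction of it."""
    return fps_with_msaa / (1.0 - msaa_cost)

base = 100.0                              # 8xMSAA in 2D, as quoted
per_eye = fps_in_3d(base)                 # 50fps per eye in 3D
no_msaa_3d = fps_without_msaa(per_eye)    # ~77fps, inside the quoted 65-80 range
print(per_eye, round(no_msaa_3d, 1))
```

So the quoted 65-80fps range for 3D without MSAA is at least internally consistent with a 100fps 2D baseline and a ~35% MSAA cost.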
 
~45fps minimum per eye shouldn't be happening in an undemanding 3D title; it should be pegged at 60fps.

Non-290-series AMD cards are getting hammered in 3D, where that same 35% performance penalty is going to collapse the fps.

I'll wait for performance figures on future GW titles; if they are going to gimp them, they can keep them.

I'm sure you wouldn't be bothered in the slightest if you got a 35% performance hit with a GE title in 3D while AMD stormed ahead...
:rolleyes:
 
Like I said, "That is your choice". It makes no odds to me whether you buy a GameWorks title or not. All I have done is point out how the 290X is faster than my Titan in Batman: AO, and yet in every other game's benchmarks I can beat the 290X.

I have never heard people moan before about getting more fps than me :D
 

Well done - you deflected everything I said about an enforced performance penalty in 3D. :o


Most AMD users won't get to enjoy BAO in 3D; if that's the way to promote PC gaming, well, Nvidia certainly knows how to stifle PC gaming. :rolleyes:

Please, no point replying, btw, unless it's to directly address locked-out AMD driver optimisation and the resulting reduced performance in 3D. :)
 

What are you on about? You accuse me of deflecting, and yet I have shown how the 290X beats me in this game (which incidentally was made by Warner Bros. Montreal, not nVidia), and you are still not happy? Whinge at WB Montreal if you are not happy, not at me.

Now let's put this bench that Matt posted into perspective.

[Batman: Arkham Origins benchmark chart]


A 6870 (not a typo) is getting 46fps average, and a 7870 is getting 64fps average. Those frame rates for such old and slow hardware are very good, IMO.

When we look at the frames for a 7970 in this bench, we can see that it gets 106fps

Now when we compare this to Batman Arkham City, we get these fps

[Batman: Arkham City benchmark chart]


You can see this much older game is getting 55fps at the same resolution. But when you compare max quality (8xMSAA) to max quality, the 7970 grabs 72fps in Origins, yet in City it only gets 55fps average - BUT you guys are still not happy?

What am I missing? The game is playable at max quality all the way down to a 7870 (46fps). This just seems like an excuse to have a whinge at nVidia for giving you AMD users great performance, and frankly, it's pathetic.
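For what it's worth, the gap described between the two charts is easy to put in percentage terms. A quick sketch using only the 7970 figures quoted in the post:

```python
# Relative comparison of the 7970 figures quoted above (both at 8xMSAA, same
# resolution). Figures are the ones quoted in the post, not fresh benchmarks.
origins_fps = 72.0   # Batman: Arkham Origins, max quality
city_fps = 55.0      # Batman: Arkham City, max quality

gain_pct = (origins_fps - city_fps) / city_fps * 100.0
print(f"Origins runs ~{gain_pct:.0f}% faster than City at max quality")
```

That works out to roughly a 31% advantage for the newer GameWorks title on the same card, which is the point being argued.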
 