nVidia - Why We’re Investing Heavily in GameWorks

Beyond the flagship cards and the 8x MSAA results, the lower-end AMD cards all under-delivered big time at lower AA settings. The 7870 got its rear kicked from here to the moon by a much slower GTX 560 Ti.

The 290X being able to keep up with the GTX 780 at 8x MSAA wasn't helping any AMD users who don't own a flagship card to overcome the limitation (suffer less, to be precise) with brute force :rolleyes:

So how did nVidia cripple lower AA settings on AMD? What did they do?
 
Beyond the flagship cards and the 8x MSAA results, the lower-end AMD cards all under-delivered big time at lower AA settings. The 7870 got its rear kicked from here to the moon by a much slower GTX 560 Ti.

The 290X being able to keep up with the GTX 780 at 8x MSAA wasn't helping any AMD users who don't own a flagship card to overcome the limitation (suffer less, to be precise) with brute force :rolleyes:

If the 780/Ti and 290/X are all reasonably close grunt-wise in most games, and they seem to be similar in this one, then surely if the lower-tier cards from one side are worse than the other's, that's the fault of that side?
 
Assassin’s Creed: Unity – HBAO+, TXAA, PCSS, Tessellation
Batman: Arkham Knight – Turbulence, Environmental PhysX, Volumetric Lights, FaceWorks, Rain Effects
Borderlands: The Pre-Sequel – PhysX Particles
Far Cry 4 – HBAO+, PCSS, TXAA, God Rays, Fur, Enhanced 4K Support
Project CARS – DX11, Turbulence, PhysX Particles, Enhanced 4K Support
Strife – PhysX Particles, HairWorks
The Crew – HBAO+, TXAA
The Witcher 3: Wild Hunt – HairWorks, HBAO+, PhysX, Destruction, Clothing
Warface – PhysX Particles, Turbulence, Enhanced 4K Support
War Thunder – WaveWorks, Destruction

I have got to say, even if you take out a couple of titles I have no interest in, it still leaves a mighty impressive list. :)

Beyond the flagship cards and the 8x MSAA results, the lower-end AMD cards all under-delivered big time at lower AA settings. The 7870 got its rear kicked from here to the moon by a much slower GTX 560 Ti.

The 290X being able to keep up with the GTX 780 at 8x MSAA wasn't helping any AMD users who don't own a flagship card to overcome the limitation (suffer less, to be precise) with brute force :rolleyes:

Yeah, the brute-force approach of AMD's high-end GPUs to MSAA overcame any 'gimping', as it was superior to the competition. pgi and I did a 290X v 780 comparison to prove it, with performance swing numbers; it wasn't pretty for AMD until brute-force MSAA was added.

Playing in 3D (as MSAA is disabled by default), I ended up with performance comparable to a 770 IIRC when both were using FXAA. I was OK, I still had playable performance, but...

Playing 3D on a 7870, performance tanked to 40/40 fps, whereas a comparable Nvidia card pumped out a steady 60/60 fps.

Overlooked in this forum, as everyone's supposedly playing on uber top-of-the-range GPUs. :D

However, I never found anything amiss with WD's performance.

It's been done to death. Bottom line, something iffy went on IMO, be it GW's or the tessellation use; if BF4/SEv3 (whatever) tanked on Nvidia, it would explode in the same manner as BAO. :)
 
So brute force worked in Batman: AO but not in many of the other games that we bench here, where nVidia are on top. Underhand tactics, AMD not optimising too well, or something else?

I have a sneaking suspicion that AMD just didn't bother with the lower AA setting (FXAA) and, therefore, the performance was so poor compared to nVidia, who catered for all of their customers.
 
Dev co-operation is the key factor. CD worked hand in hand with Nvidia to get TFX working as it should on Nvidia hardware; if they hadn't, AMD would have better performance instead of similar.

AMD were denied any co-operation and went the brute-force approach with MSAA at driver level, where MSAA bypasses the game code (which amounts to downscaling). You know how it works, Greg. :)
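
For a rough idea of why driver-level MSAA is such a brute-force approach, here's a back-of-the-envelope sketch (my own arithmetic, not from any benchmark): colour and depth storage scale linearly with the sample count, which is why 8x MSAA leans so hard on raw memory bandwidth.

Code:
// Back-of-the-envelope only: a 1920x1080 render target with 32-bit colour
// (RGBA8) plus 32-bit depth/stencil (D24S8), multiplied by the MSAA sample
// count. Storage and bandwidth both scale linearly with the samples.
#include <cstdio>

int main() {
    const double pixels = 1920.0 * 1080.0;
    const double bytesPerSample = 4 + 4;   // colour + depth/stencil per sample
    for (int samples = 1; samples <= 8; samples *= 2)
        printf("%dx MSAA: ~%6.1f MB of colour+depth\n",
               samples, pixels * bytesPerSample * samples / (1024.0 * 1024.0));
    return 0;
}

At 8x that's over 130 MB touched per frame before textures even enter the picture, hence "brute force".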
 
Just for the record, I have grabbed a couple of bench results from our threads.

Bioshock Infinite.

1. Score 178.55 GPU 780TI @1346/1900, CPU 4930k @ 4.77, MjFrosty Link
2. Score 170.09 GPU nvTitan @1306/1881, CPU 3930k @ 5.0, Gregster Link
3. Score 159.72 GPU 780TI @1136/1948, CPU 4930k @ 4.7, systemerror Link
4. Score 150.48 GPU nvTitan @972/1789, CPU 3930k @5.0, Kaapstad Link
5. Score 148.93 GPU 290X @1230/1625, CPU 4930k @4.7, Kaapstad Link
6. Score 143.68 GPU 290X @1281/1560, CPU 4770k @4.5, Avenged7Fold Link

Alien Isolation

1. Score 167.36 GPU 980 @1590/2000, CPU 3930k @5.1, Besty Link DX11 344.16 Drivers
2. Score 164.68 GPU 980 @1557/2000, CPU 4790k @4.8, Dicehunter Link DX11 344.16 Drivers
3. Score 163.34 GPU 980 @1590/1970, CPU 5960X @4.7, stulid Link DX11 344.16 Drivers
4. Score 162.49 GPU 980 @1580/2100, CPU 5960X @4.35, Silent_Scone Link DX11 344.16 Drivers
5. Score 161.52 GPU 980 @1516/2001, CPU 4930k @4.5, Spudley Link DX11 344.16 Drivers
6. Score 150.65 GPU 780ti @1385/1850, CPU 4930k @4.5, FMTopfan Link DX11 340.52 Drivers
7. Score 146.41 GPU 290X @1340/1700, CPU 4930k @4.75, Pulse88 Link DX11 14.9 Drivers

Hitman Absolution

1. Score 74.00 GPU 290X @1340/1725, CPU 4930k @4.75, Pulse88 Link
2. Score 71.73 GPU 290X @1260/1665, CPU 3930k @4.8, Rangerjr1 Link
3. Score 70.79 GPU 290X @1260/1625, CPU 4930k @4.7, Kaapstad Link
4. Score 67.89 GPU 290P @1260/1625, CPU 2600k @4.6, Uncle Petey Link
5. Score 67.55 GPU 980 @1411/2128, CPU 3930k @5.0, whyscotty Link

Thief

#1, _Alatar_: Titan @ 1346/1925, 4770K @ 5Ghz - DirectX FPS: Min 65.6 / Avr 95.8. Link
#2, MjFrosty: Titan Black @ 1350/2100, 4960X @ 4.7Ghz - DirectX FPS: Min 49.8 / Avr 94. Link
#3, whyscotty: 980 @ 1358/2103, 3930K @ 4.4Ghz - DirectX FPS: Min 56.9 / Avr 91.1. Link
#4, Dicehunter: 780TI @ 1300/1900, 4770K @ 4.8Ghz - DirectX FPS: Min 62.2 / Avr 90.5. Link
#5, Gregster: GTXTitan @ 1333/3848, 3930K @ 5Ghz - DirectX FPS: Min 67.6 / Avr 89.9. Link
#6, Avenged7Fold: 290X @ 1300/1604, 4770K @ 4.5Ghz - Mantle FPS: Min 71.8 / Avr 88.5. Link

Tomb Raider

1. vsg28 - 780Ti KPE @1392/1928, 334.27. 91.6
2. Besty - 980 @1542/2041, 344.16. 90.2
3. MjFrosty - 780Ti @1316/3920, 334.89. 87.6
4. whyscotty - 980 @1493/2078, ???. 87.1
5. zia - 780Ti @1293/1965, 331.82. 86.9
6. Gregster - Titan @1372/3703, 331.93. 84.7 R.I.P
7. Pulse88 - 290X @1340/1700, 14.9. 84.1

Batman Arkham Origins

1. 135 - R9 290X - Pulse88
2. 133 - GTX 780Ti - Vsg28
3. 127 - R9 290X - Kaapstad
4. 126 - GTX Titan - Whyscotty
5. 125 - GTX 780 - Geeman1979

Sniper Elite V3

Score 75.4 GPU 290X @1344/1740, CPU 4930k @4.79, Pulse88 Link Mantle 14.9 Drivers
Score 75.3 GPU 980 @1610/2120, CPU 5960X @4.86, Silent_Scone Link DX11 344.16 Drivers
Score 74.3 GPU 980 @1603/2115, CPU 3930k @5.0, Besty Link DX11 344.16 Drivers
Score 73.6 GPU 290X @1356/1740, CPU 4930k @4.79, Pulse88 Link DX11 14.9 Drivers
Score 73.5 GPU 290X @1255/1625, CPU 3970X @4.9, AMDMatt Link Mantle 14.9 Drivers
Score 72.5 GPU 290X @1344/1725, CPU 4930k @4.75, Pulse88 Link DX11 14.9 Drivers
Score 71.3 GPU 290X @1270/1625, CPU 4930k @4.8, Kaapstad Link Mantle 14.9 Drivers
Score 71.1 GPU 290P @1285/1625, CPU 2600k @5.0, Uncle Petey Link Mantle 14.9 Drivers
Score 70.0 GPU 290P @1225/1650, CPU 3770k @4.6, win8.1 Link Mantle 14.9.1 Drivers
Score 69.9 GPU 290X @1250/1625, CPU 3970X @4.9, AMDMatt Link DX11 14.9 Drivers
Score 69.9 GPU 290P @1240/1625, CPU 3970X @4.9, AMDMatt Link Mantle 14.9 Drivers
Score 68.6 GPU 290X @1280/1625, CPU 4930k @4.8, Kaapstad Link DX11 14.9 Drivers
Score 64.1 GPU 290P @1180/1600, CPU 4790k @4.8, MadMatty Link Mantle 14.9 Drivers
Score 63.2 GPU 290P @1200/1500, CPU FX-9590 @4.7, Sgt Bilko Link Mantle 14.9 Drivers
Score 62.5 GPU 290P @1200/1450, CPU FX-8350 @4.5, humbug Link Mantle 14.9.2 Drivers
Score 61.9 GPU 290X @1200/1500, CPU 3770k @4.5, tommybhoy Link Mantle 14.9 Drivers
Score 59.7 GPU 290P @1200/1450, CPU FX-8350 @4.5, humbug Link DX11 14.9.2 Drivers
Score 58.8 GPU 980 @1241/1753, CPU 3930k @4.4, whyscotty Link DX11 344.16 Drivers
Score 58.6 GPU 980 @1241/1753, CPU 4770k @3.5, andrewohare Link DX11 344.16 Drivers
Score 55.0 GPU 290P @1050/1400, CPU 4770k @4.5, Robzere31 Link DX11 14.9 Drivers
Score 51.4 GPU 970 @1190/1342, CPU 4790k @4.7, byron_hinson Link DX11 344.16 Drivers
Score 44.1 GPU 280X @1110/1575, CPU FX-8320 @4.5, thebennyboy Link Mantle 14.9.2 Drivers

Now then, when you take a look at those bench results, why is one brand better than another? When you take Sniper Elite V3, for example, why is nobody screaming that AMD are playing foul and purposefully gimping the game on nVidia to make AMD look better? Because that's what it looks like to me. You can say what you like, and if you say it enough you can believe it yourself, but I like to deal in facts, and even AMD and the devs said they don't think for a minute that nVidia had gimped performance, though they could if they wanted to; again, it was shown that the libraries were not a black box and were readily available for devs to use and optimise. Nothing in any of the games GameWorks is in looks suspicious to me.

If people still want to believe that nVidia are gimping, that's fine, but if you are going to post that they are, you had better post some facts or give it a rest with the tin-hat crap.
 
As I said, I didn't agree that WD was gimped; game devs, not just AMD, said BAO had gimping going on.

All those benchmarks have MSAA involved, and Mantle is used to improve performance in SEv3.

Why no mention of the TR TFX dev co-operation?

Nvidia laughed at AMD for asking for game code/dev co-operation, yet they publicly stated they needed to work hand in hand with Nixxes to get performance on par.

GW's code is locked DLLs, whereas the TFX code is open. Download both and compare them; anyone can, and Nixxes did, to make it run properly on Nvidia hardware. Is anyone allowed to do that with Nvidia's GW code without paying X hundred thousand dollars?

If you don't want to directly answer anything in this or previous posts, and instead counter-argue with different scenarios, then further discussion is pointless, but refrain from tin-hat pot shots without backing anything up with substance. :)
 
If you don't want to directly answer anything in this or previous posts, and instead counter-argue with different scenarios, then further discussion is pointless, but refrain from tin-hat pot shots without backing anything up with substance. :)

I asked the question of how nVidia crippled lower AA settings on AMD hardware. That was the question that needed answering, and it is very much in line with the claims of AMD being gimped in GameWorks, but neither you nor anyone else has so far given a reason, so telling me to answer rather than counter questions is a little scurrilous. If you don't have a reasoned response to my question, it is probably good that you deem further discussion pointless.

Ohhh, and you had better get your facts correct, as GW's libraries are not locked DLLs; devs have full access to them but are not allowed to share them.
 
Can the same rules applied to the Mantle thread be used here?
The thread looks like it's a few posts away from going the usual route. Besides, did we ever get to see those damning contracts?
 
Can the same rules applied to the Mantle thread be used here?
The thread looks like it's a few posts away from going the usual route. Besides, did we ever get to see those damning contracts?

nVidia have zero access to Mantle, and anyone using Mantle has to be vetted before acceptance. It is a closed beta that only a select few have access to. As for the contracts, there was some backtracking from AMD and the contracts never materialised. That is for another discussion, though, and I don't feel the need to bring things like Mantle and TressFX into this one. What I would love to see is some form of evidence that GameWorks is crippled on AMD hardware, but every single discussion (and there have been a few) brings no such evidence, only conspiracies.

It is actually a discussion I enjoy (so long as people remain polite), and if nVidia are guilty of gimping performance on AMD hardware, that is something I would like to see. I do remember Joel Hruska (ExtremeTech) pointing out that tessellation was overused in Batman and that this was the reason AMD were getting their butts kicked. However, when you actually look at the bench threads of real users benching the game, the results don't tally with what was said: AMD are the better optimised in this game, or at least have the better performance, and that isn't down to brute force, as if it were, AMD would win in every bench.

As for the tessellation debate, I just checked the Unigine Heaven bench, and it seems nVidia can deal with tessellation in a much better way than AMD.

Score 2105, GPU 980 @1565/2200, CPU 5960X @4.5, Gibbo
Score 2102, GPU 780ti @1450/2080, CPU 3960X @5.47, newhit
Score 2023, GPU 780ti @1450/1915, CPU 4770k @4.63, vsg28
Score 1998, GPU 980 @1542/2021, CPU 3930k @5.0, Besty
Score 1992, GPU 980 @1562/2031, CPU 5820k @4.5, uksoldierboy
Score 1991, GPU 780ti @1405/2015, CPU 4930k @4.63, Nickolp1974
Score 1975, GPU 980 @1536/2010, CPU 4770k @4.4, khemist
Score 1967, GPU 780ti @1385/1900, CPU 4770k @4.5, khemist
Score 1949, GPU 980 @1516/2002, CPU 4790k @4.5, Huggie86
Score 1942, GPU nvTitan @1406/1902, CPU 2600k @3.4, Akula
Score 1940, GPU 780ti @1385/1900, CPU 3930k @5.0, tayto_0
Score 1935, GPU nvTitan @1361/1927, CPU 3960X @5.4, newhit
Score 1913, GPU 980 @1499/1955, CPU 3770k @4.5, Jono8
Score 1905, GPU 780ti @1319/1973, CPU 4770k @4.6, zia
Score 1897, GPU 780ti @1328/2000, CPU 3770k @4.5, darket
Score 1893, GPU 980 @1371/2078, CPU 3930k @5.0, whyscotty
Score= 1875, GPU 980 @1450/1752, CPU 4770k @4.5, MOOGLEYS
Score= 1875, GPU 780ti @1316/1920, CPU 4960X @4.8, MjFrosty
Score 1872, GPU 780 @1443/1972, CPU 3960X @5.2, newhit
Score 1870, GPU nvTitan @1346/1876, CPU 3930k @5.0, Gregster
Score 1868, GPU 780ti @1300/1950, CPU 2600k @4.3, Parabellum
Score 1864, GPU 780ti @1300/2025, CPU 4670k @4.4, boonstick
Score 1858, GPU nvTitan @1293/1906, CPU 2500k @5.0, khemist
Score 1851, GPU nvTitan @1293/1877, CPU 4770k @4.7, whyscotty
Score 1844, GPU 290X @1361/1725, CPU 4930k @4.7, Pulse88
Score 1825, GPU nvTitan @1280/1852, CPU 3930k @4.8, whyscotty
Score 1822, GPU 780ti @1265/1950, CPU 4770k @4.3, thefogo
Score 1812, GPU 780 @1411/1852, CPU 3770k @5.0, Geeman1979
Score 1806, GPU 780ti @1241/2001, CPU 3770k @4.5, ohara59
Score 1801, GPU 780ti @1250/2009, CPU 4670k @4.5, NeoStuey
Score 1798, GPU 780 @1411/1872, CPU 3930k @5.0, Kaktus69
Score 1793, GPU 780ti @1126/1975, CPU 3770k @3.9, waspy88
Score 1792, GPU 780ti @1270/1908, CPU 2600k @4.6, Sin_Chase
Score 1791, GPU 780ti @1267/1814, CPU 4770k @3.9, Ghia
Score 1790, GPU 780ti @1123/1921, CPU 4930k @4.75, systemerror
Score 1789, GPU 780 @1410/1800, CPU 3820 @4.5, Geeman1979
Score 1787, GPU 780ti @1305/1795, CPU 3770k @4.5, Castiel
Score 1780, GPU 780ti @1101/1925, CPU 2500k @4.55, sejm
Score 1774, GPU 780ti @1250/2000, CPU 2700k @4.5, _corTEC
Score 1769, GPU 780 @1372/1902, CPU 4770k @4.7, whyscotty
Score 1746, GPU nvTitan @1189/1801, CPU 3770k @4.5, csaris
Score 1745, GPU 780 @1381/1827, CPU 2500k @4.5, NoNameNoNumber
Score 1744, GPU 780ti @1208/1800, CPU 4770k @4.6, Dicehunter
Score 1737, GPU 780 @1399/1677, CPU 4820k @4.5, ToxicTBag
Score 1734, GPU 780 @1372/1625, CPU FX8320 @5.0, alex_123_fra
Score= 1732, GPU 780 @1320/1802, CPU 4770k @4.4, DLD
Score= 1732, GPU 780ti @1241/1750, CPU 4790k @4.8, smsmasters
Score 1731, GPU 780ti @1220/1975, CPU i5 750 @4.2, Tommy_Here
Score 1724, GPU 780 @1306/1893, CPU 4770k @4.5, Pudgey
Score= 1721, GPU 780 @1359/1678, CPU 4770k @4.9, Tonester0011
Score= 1721, GPU 780 @1301/1852, CPU 4770k @4.3, Noxia
Score 1704, GPU 290X @1280/1625, CPU 4930k @4.8, Kaapstad
http://forums.overclockers.co.uk/showpost.php?p=25214819&postcount=1741

Maybe AMD need to sort out their handling of tessellation?
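
On that note, a rough illustration of why tessellation factors are such a battleground (my own numbers, not from any driver or game): the triangle output of a patch grows roughly with the square of the tessellation factor, which is also why the driver-side tessellation cap in Catalyst Control Center can claw performance back.

Code:
// Rough sketch: for a triangle patch, the generated triangles grow roughly
// with the square of the edge tessellation factor, so doubling the factor
// roughly quadruples the geometry work.
#include <cstdio>

int main() {
    for (int factor = 8; factor <= 64; factor *= 2)
        printf("tess factor %2d -> ~%4d triangles per patch\n",
               factor, factor * factor);
    return 0;
}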
 
Tommy is still ignoring the fact that all GameWorks options can be turned off - if a library isn't being called, then it can't affect performance.

Whatever is affecting performance isn't GameWorks; it is core game code and/or drivers.
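
To put that in concrete terms, here's a minimal sketch of how optional middleware is typically wired in (hypothetical "FxLib.dll" and "Simulate" names, not actual GameWorks code): with the option off, the library is never loaded, let alone called, so it cannot cost a single cycle.

Code:
// Minimal sketch with hypothetical names: the DLL is only loaded and called
// when the user enables the effect, so a disabled option contributes nothing
// to frame time.
#include <windows.h>
#include <cstdio>

typedef void (*SimulateFn)(float dt);

int main() {
    bool effectEnabled = false;            // read from the graphics options
    SimulateFn simulate = nullptr;

    if (effectEnabled) {
        HMODULE lib = LoadLibraryA("FxLib.dll");        // hypothetical DLL
        if (lib)
            simulate = (SimulateFn)GetProcAddress(lib, "Simulate");
    }

    for (int frame = 0; frame < 3; ++frame) {
        if (simulate)
            simulate(1.0f / 60.0f);        // middleware runs only when loaded
        printf("frame %d done\n", frame);
    }
    return 0;
}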
 
I thought it was that developers were not allowed to share the GW source code with anyone. Apart from that, anything goes, including sending their game to AMD to be profiled and optimised.

But do we really need to do this again? Perhaps somebody could quote the AMD PR minion who ran his mouth off, yet whose GW claims all ended up being rubbished.
 
When you take Sniper Elite V3, for example, why is nobody screaming that AMD are playing foul and purposefully gimping the game on nVidia to make AMD look better? Because that's what it looks like to me.

It's because AMD are the good guys and NVidia are the devil incarnate, on Earth to ruin PC gaming. I thought everybody knew that by now.

Did NVidia users complain about an AMD conspiracy when Tomb Raider (an AMD-sponsored game) with TressFX ran poorly at launch on NVidia hardware? Nope.

Did NVidia themselves launch a massive media campaign trying to make out AMD were the bad guys and absolve themselves of all responsibility for it running poorly? Nope.

NVidia went quietly to the developers, worked with them to patch the game, and then released a new set of drivers, bringing the game up to par on NVidia hardware.

All you ever get from AMD is "it's NVidia's fault", "it's the developers' fault"; it's always everyone else's fault, and they don't have to lift a finger because their army of fanboys always backs them up and criticises everybody else.
 
I thought it was that developers were not allowed to share the GW source code with anyone. Apart from that, anything goes, including sending their game to AMD to be profiled and optimised.

But do we really need to do this again? Perhaps somebody could quote the AMD PR minion who ran his mouth off, yet whose GW claims all ended up being rubbished.

Yes, the devs can send the game for profiling; however, AMD claim that it is impossible to optimise without the source code.

Working with developers does not equal source code access, but that is what AMD would have you believe.

At the end of the day, any code being run through DX into YOUR hardware via YOUR drivers is code you can profile to see what it is doing; to say it is impossible to do any driver optimisation without source code is nonsense.
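
To show what I mean, here's a minimal sketch using standard D3D11 timestamp queries (the bracketed section is left empty, as the point is the mechanism): you can measure the GPU cost of any work hitting your hardware without ever seeing its source code.

Code:
// Minimal sketch: bracket any GPU work with timestamp queries to measure its
// cost from the outside. Only standard D3D11 API calls are used.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &dev, nullptr, &ctx)))
        return 1;

    D3D11_QUERY_DESC ts = { D3D11_QUERY_TIMESTAMP, 0 };
    D3D11_QUERY_DESC dj = { D3D11_QUERY_TIMESTAMP_DISJOINT, 0 };
    ID3D11Query *qStart, *qEnd, *qDisjoint;
    dev->CreateQuery(&ts, &qStart);
    dev->CreateQuery(&ts, &qEnd);
    dev->CreateQuery(&dj, &qDisjoint);

    ctx->Begin(qDisjoint);
    ctx->End(qStart);
    // ... the draw calls you want to time go here; you need only the
    // commands they submit to your driver, never their source code ...
    ctx->End(qEnd);
    ctx->End(qDisjoint);
    ctx->Flush();

    // Spin until the results are ready, then convert ticks to milliseconds.
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT freq;
    while (ctx->GetData(qDisjoint, &freq, sizeof(freq), 0) != S_OK) {}
    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(qStart, &t0, sizeof(t0), 0) != S_OK) {}
    while (ctx->GetData(qEnd, &t1, sizeof(t1), 0) != S_OK) {}
    if (!freq.Disjoint)
        printf("GPU time: %.4f ms\n", (t1 - t0) * 1000.0 / (double)freq.Frequency);

    qDisjoint->Release(); qEnd->Release(); qStart->Release();
    ctx->Release(); dev->Release();
    return 0;
}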
 
It's because AMD are the good guys and NVidia are the devil incarnate, on Earth to ruin PC gaming. I thought everybody knew that by now.

Did NVidia users complain about an AMD conspiracy when Tomb Raider (an AMD-sponsored game) with TressFX ran poorly at launch on NVidia hardware? Nope.

Did NVidia themselves launch a massive media campaign trying to make out AMD were the bad guys and absolve themselves of all responsibility for it running poorly? Nope.

NVidia went quietly to the developers, worked with them to patch the game, and then released a new set of drivers, bringing the game up to par on NVidia hardware.

All you ever get from AMD is "it's NVidia's fault", "it's the developers' fault"; it's always everyone else's fault, and they don't have to lift a finger because their army of fanboys always backs them up and criticises everybody else.

Whilst what you say seems extreme, it isn't far wrong. It does seem for the most part that nVidia just get on with it, as seen with GameWorks, and try to make the job easier for devs along the way. AMD do seem to blame everyone openly, and when things go wrong, it is everyone's fault bar their own. The result of GameWorks is a good one, and nVidia are clearly trying to push the effects in games and allow our high-end systems to excel with jaw-dropping visuals.
 
AMD do seem to blame everyone openly, and when things go wrong, it is everyone's fault bar their own.

Just to be fair to AMD, half the time it starts as a comment on one of the tech sites that gets blown well out of proportion by the army of supporters, sometimes being taken up by AMD themselves with their fairly large and very vocal PR team.

Pretty much like this current discussion: one article was posted that is in fact about Intel manipulating benchmarks 14 years ago, but it happens to mention Nvidia and GameWorks, and here we are. ;)
 
True, Bru, and Orangey needs perma-banning for bringing it up. The thread was going so nicely :D

It is a valid discussion, mind, and I would actually like to see some evidence of GameWorks gimping, but even the devs interviewed by various sites said there was none.
 
At the end of the day, any code being run through DX into YOUR hardware via YOUR drivers is code you can profile to see what it is doing; to say it is impossible to do any driver optimisation without source code is nonsense.

Indeed; in fact it's only relatively recently that AMD or nVidia have had any source code access at all when working with or optimising for a game. In the past it was very rare.

I'm guessing this is somewhat down to many games being based on Unreal Engine, Unity, etc., with fewer cases where they've built the entire IP from scratch.
 