** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

Soldato
Joined
28 May 2007
Posts
10,114
Jesus, I didn't realise how badly the Fury X gets tonked by a decent aftermarket card @ 1440p! Especially considering how much more there is in the tank when it comes to tweaking :eek:

In some cases it's like 40% plus...

Check 4k where the real men play lol, or so according to some. The 980ti is impressive when unlocked and the clocks are boosted. Faster than a 295 at stock and embarrassing the stock Titan X/Fury X in price/performance.
 
Soldato
Joined
7 Feb 2015
Posts
2,864
Location
South West
You need the Visual Studio debug runtimes installed. That's a bit too much effort given they are not redistributable and you can only get them by installing Visual Studio.

I have VS Community 2015 installed, but not the Win10 SDK. I just transplanted the DLL into the executable's folder and it worked. Well, that is a pain if you also need the debug runtime. Must be, since the guy built it in debug.
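If anyone else hits the same missing-DLL problem, here's a rough sketch (Windows only, plain Python, nothing specific to this tool) for checking whether the usual VS2015 debug runtime DLLs can actually be resolved. The DLL names are the standard v140 debug CRT ones, which I'm assuming is the toolset the build used; adjust if not.

# Windows-only sanity check: can the usual VS2015 *debug* runtime DLLs be loaded?
# The names below are the standard v140 debug CRT files a debug build typically
# wants; if one is missing, copy it next to the .exe or install Visual Studio.
import ctypes

for name in ("vcruntime140d.dll", "msvcp140d.dll", "ucrtbased.dll"):
    try:
        ctypes.WinDLL(name)
        print(name, "-> loads fine")
    except OSError:
        print(name, "-> not found")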
 
Caporegime
Joined
18 Oct 2002
Posts
30,340
Check 4k where the real men play lol, or so according to some. The 980ti is impressive when unlocked and the clocks are boosted. Faster than a 295 at stock and embarrassing the stock Titan X/Fury X in price/performance.

I've had 4k and SLi Ti's and tbh I prefer 1440p and higher fps! 4k is a *bitch* of a resolution to run everything smoothly :eek:
 
Soldato
Joined
28 May 2007
Posts
10,114
I've had 4k and SLi Ti's and tbh I prefer 1440p and higher fps! 4k is a *bitch* of a resolution to run everything smoothly :eek:

I watch a lot of media so my next monitor will probably be 4k, but for gaming I reckon 1440p would be more to my liking. Will wait a while, as the prices on monitors seem to be getting pretty good.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
The Fury X up against a TX @2160p is an absolute non-contest. You should try using them both before you make statements.

Try using a Fury X running XCOM 2 maxed @2160p; it does not run. On a single TX it is very playable.

I also like what AMD did with Shadow of Mordor and Fury X to get over quadfire memory problems: they disabled quadfire lol.


XCOM 2: haven't played it, won't, but from what I've seen, if it plays at sub-60fps on any high-end card, even at 4k, it's embarrassing. Basic game, extremely visually unimpressive, and 17fps (if theRealDeal is right) doesn't denote playable on a TX over a Fury X.

From what I've seen people complain it runs incredibly badly; that isn't the basis of a good comparison, and generalising all gaming performance based on one stand-out game from the last couple of years is intentionally misleading. No, I don't need to own the cards or play specific games to know the difference. There is more than enough information around the web. 4k at max settings runs great on a 4GB card, and 6/12GB cards don't show any noticeable performance benefit in the majority of games. That you can name a single game (that otherwise runs poorly across all cards and all settings/resolutions) or two for every 50-100 that run fine isn't enough to suggest 4GB isn't enough. Hundreds of reviews, dozens of reviews specifically highlighting the question of memory for 4k, and thousands of posts of users' experience all diametrically oppose your view that 6 or 12GB is required, Kaap.

I've seen little to no evidence that anything but an extremely small number of games have trouble with 4GB of memory at 4k. From what I recall Shadow of Mordor is also a woeful example. It was an Nvidia-sponsored game at a time when they were trying to sell expensive cards with more memory than AMD. They offered a super-dooper ultra texture setting which was, as far as anyone can tell, identical textures offered completely uncompressed for absolutely no reason whatsoever. So you could run high or ultra and have exactly the same game, but if you happened to have a card with less memory it would start running like crap. Is that actually a case of a game needing more memory, or is that marketing crap where, if you go in and purposefully de-optimise something with zero IQ benefit, you can make it perform worse somehow?

So XCOM 2, which has insanely poor performance for the quality of graphics provided... and a marketing ploy by Nvidia are your examples of why more than 4GB is required?
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Jesus, I didn't realise how badly the Fury X gets tonked by a decent aftermarket card @ 1440p! Especially considering how much more there is in the tank when it comes to tweaking :eek:

In some cases it's like 40% plus...

That review has it down as 17% faster at 4k and 31% faster at 1440p, while it costs £190 more than a Fury X (38% more), uses more power and is noticeably noisier. Most of that holds when comparing to a stock 980ti as well, except that the stock card is also noisier than a Fury X.
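Quick back-of-the-envelope on those figures, purely illustrative: the ~£500 Fury X base price below is just inferred from the quoted £190 / 38% gap, not taken from the review.

# Rough price/performance check using the review figures quoted above.
# Assumption: Fury X at ~£500, since £190 more is described as 38% more.
fury_price, aftermarket_price = 500, 690
speedup_4k, speedup_1440 = 1.17, 1.31          # 17% and 31% faster
price_ratio = aftermarket_price / fury_price   # 1.38x the cost
print("perf per pound vs Fury X at 4k:    %.2fx" % (speedup_4k / price_ratio))
print("perf per pound vs Fury X at 1440p: %.2fx" % (speedup_1440 / price_ratio))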
 
Soldato
Joined
28 May 2007
Posts
10,114
XCOM 2: haven't played it, won't, but from what I've seen, if it plays at sub-60fps on any high-end card, even at 4k, it's embarrassing. Basic game, extremely visually unimpressive, and 17fps (if theRealDeal is right) doesn't denote playable on a TX over a Fury X.

Here is the performance of FuryX/980ti

http://www.overclock3d.net/reviews/gpu_displays/xcom_2_pc_performance_review_-_amd_vs_nvidia/9

Same link; Kaap, in the comments, is asked what he gets and says this:

"About 17fps maxed @2160p"
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
You really need to start backing this up with some hard evidence, as so far all I have seen is you getting 17fps compared to 6fps on the other cards in XCOM 2. If that is playable to you then so be it. The original Titan has no chance against the Fury X as it's around 70-80% slower at 4k from what I have seen in some recent games.

I get that Titans are expensive in comparison, but let's not go over the top with the defence, as it seems like you are backing your Titan purchases too much. DM linked you to the TechPowerUp review above, and although his link was wrong (it was probably meant for the whole summary and not Warcraft), the results show the original Titan is not even in the ballpark of the Fury X. Performance summary below.

https://www.techpowerup.com/reviews/ASUS/GTX_980_Ti_Matrix/23.html

Edit: Ohh and what is even funnier is how much faster a 295 is compared to all the cards on the list with its 4GB, apart from a really fast GTX 980 Ti. Damn that's fast lol.

XCOM 2: haven't played it, won't, but from what I've seen, if it plays at sub-60fps on any high-end card, even at 4k, it's embarrassing. Basic game, extremely visually unimpressive, and 17fps (if theRealDeal is right) doesn't denote playable on a TX over a Fury X.

From what I've seen people complain it runs incredibly badly; that isn't the basis of a good comparison, and generalising all gaming performance based on one stand-out game from the last couple of years is intentionally misleading. No, I don't need to own the cards or play specific games to know the difference. There is more than enough information around the web. 4k at max settings runs great on a 4GB card, and 6/12GB cards don't show any noticeable performance benefit in the majority of games. That you can name a single game (that otherwise runs poorly across all cards and all settings/resolutions) or two for every 50-100 that run fine isn't enough to suggest 4GB isn't enough. Hundreds of reviews, dozens of reviews specifically highlighting the question of memory for 4k, and thousands of posts of users' experience all diametrically oppose your view that 6 or 12GB is required, Kaap.

I've seen little to no evidence that anything but an extremely small number of games have trouble with 4GB of memory at 4k. From what I recall Shadow of Mordor is also a woeful example. It was an Nvidia-sponsored game at a time when they were trying to sell expensive cards with more memory than AMD. They offered a super-dooper ultra texture setting which was, as far as anyone can tell, identical textures offered completely uncompressed for absolutely no reason whatsoever. So you could run high or ultra and have exactly the same game, but if you happened to have a card with less memory it would start running like crap. Is that actually a case of a game needing more memory, or is that marketing crap where, if you go in and purposefully de-optimise something with zero IQ benefit, you can make it perform worse somehow?

So XCOM 2, which has insanely poor performance for the quality of graphics provided... and a marketing ploy by Nvidia are your examples of why more than 4GB is required?

DM, did I forget to mention that quad SLI now works in XCOM 2? :D

XCOM 2 maxed @2160p including 8 X MSAA
4 TitanXs
FPS in top left corner

[two screenshots attached]

I prefer playing the game on a single TX as the SLI profile needs a bit of work but I thought I would post the 4 way SLI screenshots to prove that 12gb cards are capable of getting good fps.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
65fps for an isometric view of an incredibly basic-looking game with, what, £3200-ish of graphics cards.... that really should give you an idea of how badly the game is made. Regardless of the settings, the game shouldn't be demanding.

As said, if your two go-to examples are a brand-new, horribly unoptimised game and a purposefully unoptimised game from a couple of years ago, then 4GB really isn't an issue.
 
Soldato
Joined
19 Feb 2007
Posts
14,529
Location
ArcCorp
DM, did I forget to mention that quad SLI now works in XCOM 2? :D

XCOM 2 maxed @2160p including 8 X MSAA
4 TitanXs
FPS in top left corner

I prefer playing the game on a single TX as the SLI profile needs a bit of work but I thought I would post the 4 way SLI screenshots to prove that 12gb cards are capable of getting good fps.

Why 8xMSAA, if you don't mind me asking? I've tried the game at 4K with and without MSAA just to see and I cannot tell the difference. Even at 1440p it makes little to no difference, yet it is such a resource hog.
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
65fps for an isometric view of an incredibly basic-looking game with, what, £3200-ish of graphics cards.... that really should give you an idea of how badly the game is made. Regardless of the settings, the game shouldn't be demanding.

As said, if your two go-to examples are a brand-new, horribly unoptimised game and a purposefully unoptimised game from a couple of years ago, then 4GB really isn't an issue.

You asked for proof and you got proof, I have totally blown all your arguments out of the water.

What you think of the game is totally unimportant, it does not have to be pretty.

The argument was why a Fury X and a GTX 980 Ti were stuck at 4fps and 6fps respectively at max settings, and whether TXs could do any better.

The one thing I will say about XCOM 2 is that it can use 10.5GB of memory; it may be the first game, but it certainly won't be the last, and there will be plenty more in the works.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,359
Location
kent
As a completely unrelated thought (a hypothetical one at that), it would be interesting to see the last couple of pages if the two cards in question were the 390 and the 970. I bet opinions would differ somewhat, with lots of calls that 3.5GB (+512MB) isn't enough. :)
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
DM, did I forget to mention that quad SLI now works in XCOM 2? :D

XCOM 2 maxed @2160p including 8 X MSAA
4 TitanXs
FPS in top left corner

[two screenshots attached]

I prefer playing the game on a single TX as the SLI profile needs a bit of work but I thought I would post the 4 way SLI screenshots to prove that 12gb cards are capable of getting good fps.

Just ran the above game save and settings on my Fury Xs

Single Fury X was 4 fps

Quadfired Fury Xs was 2 fps

Fiji cards don't have the memory to cope with these settings.

I also don't think the drivers were supporting quadfire either.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
You asked for proof and you got proof, I have totally blown all your arguments out of the water.

Nope, I didn't.

What you think of the game is totally unimportant, it does not have to be pretty.

To you, to me and 99.999999% of gamers it isn't. If it's not pretty it shouldn't need £3200 of graphics power to hit 65fps; it points to the horrifically bad optimisation of the game. When it looks massively worse than other games that have come out in the past two years, and runs massively worse as well, it is NOT a good representation of 4k gaming. In fact it's exactly the opposite: it's a single game that stands out as the worst possible representation. The massive majority of gamers don't use 4x Titan Xs, and the massive majority of gamers would disable poor-quality effects, and certainly settings that kill performance with absolutely no IQ improvement, for the sake of smoother gaming. How pretty a game is, for most gamers, is directly linked to what performance they will put up with.

The argument was why a Fury X and a GTX 980 Ti were stuck at 4fps and 6fps respectively at max settings, and whether TXs could do any better.

No, that wasn't the argument; the discussion was you insisting the Fury X sucks at 4k due to memory. I provided a list of about a dozen widely played games, most of which look FAR better than this, all of which run great at 4k, and in which the 6/12GB options provide zero benefit, yet you insist that this single game is representative. I never once asked if Titans could or couldn't do better in this game; I asked what that has to do with 4k gaming in general when every other game doesn't share this result.

The one thing I will say about XCOM 2 is that it can use 10.5GB of memory; it may be the first game, but it certainly won't be the last, and there will be plenty more in the works.

A severely unoptimised game, probably like SoM, using uncompressed textures for absolutely no reason except to mess with AMD, is not representative of how much memory 4k gaming needs.

I didn't ask for proof, so I didn't get proof. You're running 8xMSAA at 4k, which is pretty much pointless. Are you running 'max settings', i.e. stuff that reduces IQ and produces unrealistic effects like DOF?

The game runs like absolute filth at 4k on 4x Titans, and the percentage of gamers who run 4x Titans..... 0.000000001%. £3200 or so for 4 cards to achieve 65fps in a game that looks this bad and should be runnable at 4k with max settings on a GTX 680.... what a truly epically great example of how the Fury X can't run 4k.

Again, I DIDN'T ASK FOR PROOF; I just said that one game means nothing in terms of whether 4GB is enough. If 99% of games run great at 4k but one game that requires 4 Titans to produce 65fps doesn't run great, I don't care.

If running max settings is so important, why aren't you running 8k with DSR, or 32xSSAA? Why 8xMSAA... is that the setting you found that broke the Fury X but only embarrassed £3200 of graphics cards with 65fps in a game that has all the graphical quality of a 5-6 year old game?

EDIT:- Having looked at some videos and some performance charts, the memory usage goes from minimal, with zero impact on performance at high, to completely crippling a 6GB 980ti at max as badly as it hurts the Fury X.... this is absolutely another Shadow of Mordor. High to max gives zero visual gain but a sudden massive jump in memory use from under 4GB to over 6GB. The only other game I've seen display this behaviour is Shadow of Mordor, where they randomly, pointlessly uncompressed textures for no reason at all. There is no benefit to doing this; it doesn't improve graphics, it is simply less efficient for the sake of using memory. You spent £3200 to get better performance than cards with 4GB of memory, and the only way Nvidia achieves the difference is by paying a couple of devs to throw in a 'max' option with zero IQ improvement, with uncompressed textures, to pretend more than 4GB matters.
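To put some rough, purely illustrative numbers on that (my own example, not the game's actual asset sizes, and ignoring mipmaps), this is what dropping block compression does to VRAM use:

# Why shipping the same textures uncompressed balloons memory use with zero IQ gain:
# BC3/BC7 block compression stores ~1 byte per texel, uncompressed RGBA8 stores 4.
texels = 4096 * 4096                       # one 4096x4096 texture
compressed_mb = texels * 1 / 1024**2       # ~16 MB
uncompressed_mb = texels * 4 / 1024**2     # ~64 MB
for n in (100, 200):                       # hypothetical number of resident textures
    print(n, "textures:",
          round(n * compressed_mb / 1024, 1), "GB compressed vs",
          round(n * uncompressed_mb / 1024, 1), "GB uncompressed")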

If max textures vs high gave a genuine IQ improvement that would be one thing, but zero improvement, with the sole purpose of adding a setting that uses more memory on purpose, is more Nvidia sabotage... though in this case you could call it a therapy setting to make people who bought their 12GB cards feel less bad about it.

You repeatedly and consistently use extreme rare-case games with pointless settings and known massive performance problems to generalise 4k gaming. If 1, or now 2, games run poorly on sub-12GB cards out of hundreds, when the rest all run fine, the only sensible generalisation to be made is that the 2 games have problems and 4GB is absolutely fine currently for 4k gaming.
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
I didn't ask for proof, so I didn't get proof. You're running 8xMSAA at 4k, which is pretty much pointless. Are you running 'max settings', i.e. stuff that reduces IQ and produces unrealistic effects like DOF?

The game runs like absolute filth at 4k on 4x Titans, and the percentage of gamers who run 4x Titans..... 0.000000001%. £3200 or so for 4 cards to achieve 65fps in a game that looks this bad and should be runnable at 4k with max settings on a GTX 680.... what a truly epically great example of how the Fury X can't run 4k.

Again, I DIDN'T ASK FOR PROOF; I just said that one game means nothing in terms of whether 4GB is enough. If 99% of games run great at 4k but one game that requires 4 Titans to produce 65fps doesn't run great, I don't care.

If running max settings is so important, why aren't you running 8k with DSR, or 32xSSAA? Why 8xMSAA... is that the setting you found that broke the Fury X but only embarrassed £3200 of graphics cards with 65fps in a game that has all the graphical quality of a 5-6 year old game?

EDIT:- Having looked at some videos and some performance charts, the memory usage goes from minimal, with zero impact on performance at high, to completely crippling a 6GB 980ti at max as badly as it hurts the Fury X.... this is absolutely another Shadow of Mordor. High to max gives zero visual gain but a sudden massive jump in memory use from under 4GB to over 6GB. The only other game I've seen display this behaviour is Shadow of Mordor, where they randomly, pointlessly uncompressed textures for no reason at all. There is no benefit to doing this; it doesn't improve graphics, it is simply less efficient for the sake of using memory. You spent £3200 to get better performance than cards with 4GB of memory, and the only way Nvidia achieves the difference is by paying a couple of devs to throw in a 'max' option with zero IQ improvement, with uncompressed textures, to pretend more than 4GB matters.

If max textures vs high gave a genuine IQ improvement that would be one thing, but zero improvement, with the sole purpose of adding a setting that uses more memory on purpose, is more Nvidia sabotage... though in this case you could call it a therapy setting to make people who bought their 12GB cards feel less bad about it.

You repeatedly and consistently use extreme rare-case games with pointless settings and known massive performance problems to generalise 4k gaming. If 1, or now 2, games run poorly on sub-12GB cards out of hundreds, when the rest all run fine, the only sensible generalisation to be made is that the 2 games have problems and 4GB is absolutely fine currently for 4k gaming.

Wow, what a rant, but the bottom line is you got it wrong again. :D
 
Associate
Joined
17 Nov 2013
Posts
423
Also, I thought that Nvidia stripped out a lot of the features from Maxwell to save power. Surely that wasn't just DP, because the mid-range cards shouldn't have had that anyway, so does that mean Maxwell has less compute? Presumably it still has some, or it would make it useless to some users.

Doesn't that mean that Fury X is always going to be better for async compute, because it has more compute features overall?
 
Man of Honour
Joined
13 Oct 2006
Posts
92,168
Also, I thought that Nvidia stripped out a lot of the features from Maxwell to save power. Surely that wasn't just DP, because the mid-range cards shouldn't have had that anyway, so does that mean Maxwell has less compute? Presumably it still has some, or it would make it useless to some users.

Doesn't that mean that Fury X is always going to be better for async compute, because it has more compute features overall?

Yes and no. While in some cases* the advantage would go to the Fury X depending on the nature of the compute load, on 28nm there just isn't enough space on the core for those features to pull ahead of Maxwell's ability to "brute force" it in most scenarios. Think of it a bit like tessellation: while AMD had a dedicated tessellation block, they just couldn't spare enough space for it to pull ahead of nVidia's ability to brute-force tessellation on the shader hardware.

If we were talking scaled up Fiji v scaled up Maxwell on 16nm it would be a very different story.


EDIT: Brute force isn't quite the right word but my brain isn't coming up with quite the right phrase for it :S

* There are some types of compute load where the way nVidia handles it isn't efficient enough to beat the better granularity of the Fiji architecture, but generally, where performance is really an issue, you don't tend to be dealing with those kinds of loads.
 