Fury X vs 980 Ti Longevity

The OP says he's looking at 2560x1080 anyway, and they aren't 144Hz either.

They're due pretty soon; I recently read a review of a 144Hz 21:9 panel with a working FreeSync range of between 30 and 95.
Actually, I have a feeling that might have been a 3440x1440 model; I'd have to go find it to confirm that bit though. The thing is, 144Hz is due and there are working models out in the wild. It also said a G-Sync version was coming.
 
That's what I thought... I don't understand what people are complaining about. The cards they bought still have the same performance that they previously did.

The people that make a point of this are AMD owners, and that tells you all you need to know.
 
One of the things I would be looking at to maximise longevity is VRAM. DX12 will allow more draw calls, rendering more unique objects which will all have their own textures, normal maps and meshes. As gaming moves to 4K, asset resolution will increase.
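
To put a rough shape on that, here's a toy sketch (the types and numbers are mine, purely illustrative, not any real engine's API): if every object in a scene is unique, the draw call count and the VRAM footprint both grow with the object count, and 4K-era assets make each object's share much bigger (an uncompressed 4096x4096 RGBA8 texture is already ~64MB before mipmaps).

[code]
// Toy sketch: every unique object carries its own mesh and texture set,
// so the renderer issues one draw call per object and VRAM use grows
// with the number of unique assets.
#include <cstddef>
#include <vector>

struct UniqueObject {
    std::size_t meshBytes;    // vertex + index data
    std::size_t textureBytes; // albedo + normal map + etc.
};

// Total VRAM needed if nothing is shared between objects.
std::size_t vramRequired(const std::vector<UniqueObject>& scene) {
    std::size_t total = 0;
    for (const auto& obj : scene)
        total += obj.meshBytes + obj.textureBytes;
    return total;
}

// One draw call per unique object: the count scales with scene size,
// which is exactly the load DX12's cheaper draw calls make affordable.
std::size_t drawCallsRequired(const std::vector<UniqueObject>& scene) {
    return scene.size();
}
[/code]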

Really, the Titan X is the most future-proof card by far, but I really wouldn't bother, because by the time DX12 games are around, full-fat Pascal will be out with potentially 16GB of HBM2.0, and AMD's 16nm offerings will be out as well on HBM2.0.
 
I have a 980 Ti and 1440p@144Hz, and I'm seeing around 100fps either at max settings or with very minor tweaks in most games.

The OP says he's looking at 2560x1080 anyway, and they aren't 144Hz either.

Just some corrections: that is in the future. I am looking at whichever card can push a 1080p 144Hz monitor to the max, but a 21:9 2560x1080 144Hz monitor would be welcome right now. The current 21:9 one is an Asus ROG monitor, which costs a lot and is 3440x1440. It would be good to have a "fallback monitor" such as the FreeSync ones, since we have no way to predict the future, but with 1080p on a 980 Ti I think there is a lot of life.

It always depends where the bottleneck is.
DX12 removes the CPU overhead, which means value will be great even when those games and engines come out. A card like the 7970 still runs 1080p well, and that shows games don't scale that much over the years.
Yeah, but to be honest I have doubts about the prediction that AMD will be a better DX12 card than Nvidia.

" however there will be the Oculus rift with OLED displays, but that does not have Free-Sync/G-sync"

Oculus doesn't need adaptive sync from what I can tell, or it's possible they have their own implementation. What they use is low-persistence frames at strict intervals that the graphics card must keep up with, plus very low frame times / input latency.
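
A rough toy illustration of that constraint (the refresh rate and frame times are made-up numbers): the headset scans out at a fixed interval, so a late frame can't be delayed the way adaptive sync allows on a desktop monitor; it simply misses that refresh.

[code]
// Made-up numbers, purely to illustrate the fixed-deadline model: each
// frame either beats the panel's refresh interval or misses that scanout.
#include <cstdio>

int main() {
    const double refreshIntervalMs = 1000.0 / 75.0; // e.g. a 75Hz panel
    const double frameTimesMs[] = {11.0, 12.5, 14.2, 12.9}; // hypothetical

    for (double ft : frameTimesMs) {
        bool madeIt = ft <= refreshIntervalMs; // ~13.3ms budget at 75Hz
        std::printf("%.1f ms -> %s\n", ft,
                    madeIt ? "displayed" : "missed refresh");
    }
    return 0;
}
[/code]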

I really want OLED as my eyes are sensitive to standard LED backlighting,
but from what I've read there are no ventures currently to create a product like that; the best you can get is a TV which costs around 5 grand for a decent resolution. Oculus bypasses that by using a very small screen, which is cheap.

I think you might find that the Fury X is enough for 1080p, but a single 980 Ti might not be enough for 4K or 144Hz 1440p, so you may end up going SLI anyway; the performance difference with the 980 Ti won't make a great difference in that scenario.
Yes, the Fury X is enough for 1080p, but I am speaking about longevity, not now :D I don't plan on going 4K as it's too expensive in the long term, but if the game ecosystem stagnates, e.g. graphical settings won't increase, I might go for 1440p in the future.

Also, yeah, agreed on your OLED thoughts; OLED really needs to come into the picture, and I hope it will within two years at least. But I'll be content with the Oculus by then.

One of the things I would be looking at to maximise longevity is VRAM. DX12 will allow more draw calls, rendering more unique objects which will all have their own textures, normal maps and meshes. As gaming moves to 4K, asset resolution will increase.

Really, the Titan X is the most future-proof card by far, but I really wouldn't bother, because by the time DX12 games are around, full-fat Pascal will be out with potentially 16GB of HBM2.0, and AMD's 16nm offerings will be out as well on HBM2.0.

I saw in the Fury X owners thread that the drivers had been optimised to use less VRAM in VRAM-intensive games, so maybe it's not so bad for the Fury X and might even be an advantage compared to the 6GB of 980 Ti VRAM. Also, seeing as you like Nvidia a lot, what are your thoughts on the predictions that AMD cards are designed to benefit more from DX12?
 
Why people talk about lifespan when talking about high-end computer parts, I do not know.

You buy an i7? There will be a 'better one' next year.

But a 980Ti/FuryX? Again, there will be a new flagship next year. But then again, if you buy a 980Ti/FuryX so that you don't have to upgrade as often, you probably couldn't afford/didn't really need the 980Ti/FuryX in the first place.

Just my thoughts.
 
A new Nvidia top-end card each year, or a new top-end AMD every two years with a rebrand in between. It might not continue that way, but it has for the last couple of years.
Also don't forget that the Fury is new this year; does that mean it will be rebranded for next year? We will have to wait and see.
 
Why people talk about lifespan when talking about high-end computer parts, I do not know.

You buy an i7? There will be a 'better one' next year.

But a 980Ti/FuryX? Again, there will be a new flagship next year. But then again, if you buy a 980Ti/FuryX so that you don't have to upgrade as often, you probably couldn't afford/didn't really need the 980Ti/FuryX in the first place.

Just my thoughts.

Indeed it is stupid; it is always better to buy things for now. However, it is still a valid question in my opinion. My upgrade cycle is 3 years flat and I would like the best bang for buck, if you get what I mean, and currently I favour the Nvidia 980 Ti to last over the Fury X, which is more clouded in details. Also, I already know the lifespan of the two cards in question will be long, especially with a 1080p 144Hz monitor, but what I want to know is which is longer: the Fury X or the GTX 980 Ti?
 
I saw in the Fury X owners thread that the drivers had been optimised to use less VRAM in VRAM-intensive games, so maybe it's not so bad for the Fury X and might even be an advantage compared to the 6GB of 980 Ti VRAM. Also, seeing as you like Nvidia a lot, what are your thoughts on the predictions that AMD cards are designed to benefit more from DX12?


I don't like Nvidia a lot, I like technology a lot and don't trust AMD (or Nvidia) marketing material.

All that AMD are doing is pulling resources out of VRAM that they believe aren't needed and that the developer left in VRAM as a cache. The risk is they make a mistake, or fail to pull enough resources out of VRAM. Lots of people have reported stuttering with the Fury X in Shadow of Mordor and GTA V at ultra settings, consistent with having to pull resources from system RAM. That is something that will only get worse. AMD's software tricks won't always work; for sure, some game engines will just be relatively lazy in caching resources, but other games with a large draw distance, like GTA V maxed out, will simply require more than 4GB of VRAM for smooth play. It is basic maths, 1+1=2. If something needs more than 4GB of VRAM the Fury X will suffer; that is just a plain fact, the same as if something needs more than 6GB the Ti will suffer, but there is an extra 50% buffer there.
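
For illustration only, here is my sketch of the general idea (not AMD's actual driver code): treat VRAM as a cache over the full resource set, evict whatever looks idle, and accept that a miss means a slow trip to system RAM, which is exactly what shows up as a stutter.

[code]
// Illustrative LRU cache standing in for the scheme described above; all
// names are invented. A hit is a cheap VRAM access; a miss models the
// slow PCIe fetch from system RAM that players perceive as stutter.
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

class VramCache {
public:
    explicit VramCache(std::uint64_t capacityBytes)
        : capacity_(capacityBytes) {}

    // Returns true on a VRAM hit; false means the resource had to be
    // fetched from system RAM and inserted (evicting idle resources).
    // Assumes sizeBytes <= capacity.
    bool request(const std::string& id, std::uint64_t sizeBytes) {
        auto it = index_.find(id);
        if (it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second); // most recent
            return true;
        }
        while (used_ + sizeBytes > capacity_ && !lru_.empty()) {
            auto& victim = lru_.back(); // least recently used
            used_ -= victim.second;
            index_.erase(victim.first);
            lru_.pop_back();
        }
        lru_.emplace_front(id, sizeBytes);
        index_[id] = lru_.begin();
        used_ += sizeBytes;
        return false; // the "stutter" path
    }

private:
    std::uint64_t capacity_;
    std::uint64_t used_ = 0;
    std::list<std::pair<std::string, std::uint64_t>> lru_;
    std::unordered_map<std::string, decltype(lru_)::iterator> index_;
};
[/code]

The failure mode described above is the eviction heuristic guessing wrong: evict something the game needs next frame, and the miss cost lands in the middle of gameplay.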

There are no valid predictions that AMD cards will benefit more from DX12; it is mostly just PR hot air and AMD fanboys spreading FUD. Nvidia's cards going all the way back to Fermi support DX12, and the newer GPUs have features like order-independent transparency, which is a great performance boost for future game engines using deferred rendering (CryEngine 2/3, Unity, Unreal 4). We currently only have a single DX12 benchmark, Star Swarm, and it shows Maxwell handily beating Hawaii. In general, Maxwell v2 supports more DX12 features than Fiji.


Nvidia is a DX12 partner and has contributed to the DX12 design since the beginning, and Microsoft has chosen Nvidia to demonstrate DX12 demos. Nvidia has a long history of working closely with developers to support the latest DX versions and of having polished drivers. Anyone who really thinks that AMD cards have some kind of magical, as-yet-untapped power is deluding themselves.

Also, most people don't really seem to understand what DX12 will do. Games can have a higher draw call rate and are less likely to be CPU-bound, but that doesn't make the game render any faster on the GPU. If there is a flaw in the GPU, e.g. poor ability to scale to the number of shaders, then that weakness will be there regardless of the API. Similarly, a card that is weak in tessellation will be just as weak in DX12. Tessellating a mesh and rendering it with the same pixel shader will have the exact same cost in DX12 as in DX11 once those commands are executed on the GPU.
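
A toy way to see it (numbers entirely made up): if CPU submission and GPU execution overlap, frame time is roughly whichever side is slower, so removing CPU overhead only helps while the CPU is the bottleneck.

[code]
// Back-of-envelope model with invented numbers: frame time is bounded by
// the slower of CPU submission and GPU execution. A lower-overhead API
// shrinks only the CPU side; the GPU's tessellation/shading cost stays.
#include <algorithm>
#include <cstdio>

double frameTimeMs(double cpuSubmitMs, double gpuExecuteMs) {
    return std::max(cpuSubmitMs, gpuExecuteMs);
}

int main() {
    const double gpuMs = 12.0;     // same GPU workload under either API
    const double cpuDx11Ms = 16.0; // hypothetical: CPU-bound under DX11
    const double cpuDx12Ms = 4.0;  // hypothetical: overhead removed

    std::printf("DX11-like: %.1f ms\n", frameTimeMs(cpuDx11Ms, gpuMs)); // 16.0
    std::printf("DX12-like: %.1f ms\n", frameTimeMs(cpuDx12Ms, gpuMs)); // 12.0
    // Once the GPU is the bottleneck (12ms here), the API change buys
    // nothing more: a card weak in tessellation stays weak in DX12.
    return 0;
}
[/code]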
 
Longevity and high end PC parts in general just don't go together IMO.

Look at my sig as an example: all bought fairly recently second hand. The CPU, RAM and board cost me £110, and I paid £75 for the card back at Christmas.
Somebody mentioned five years earlier in the thread, and yes, my current setup was certainly high end five years ago; it's now - to me - a huge upgrade on my old Q8300 system that I've just retired.

I can run GTA V in a mixture of Very High and a few Ultra settings @ 1080p; same goes for BF4 (everything @ Ultra, 1080p). I'm very pleased with my upgrade for relative peanuts.

I'm not so sure how I'd feel had I bought it all at full whack five years ago, mind you; ballpark full system cost, new, would have been way over £1500...

Five years in - from its era, if you follow me - my system still cuts it very nicely, and actually betters a friend's recent £900 build by some margin.

I'm unsure how I'd feel having paid full price for it, given what the components are worth now (roughly 15% of new retail). The last time I bought a high-end card at full retail (GeForce 4600 Ultra), I very quickly found it obsolete and rather worthless after a DirectX revision, IIRC; the same hasn't happened with my GTX 580 (until DX12, of course). I suppose longevity - especially with a GFX card - depends entirely on when in the DirectX cycle you buy. I bought my 4600 at the wrong time; had I bought the 580 at launch, I'd have clearly got it just right and enjoyed relatively long longevity, as it still - just - cuts the mustard with today's top titles...

I think the above makes sense! :o
 
Longevity may not always be guaranteed when buying new hardware, but that doesn't mean you should make a bad decision from the off. When you have a choice of cards retailing at more than £500 with 4GB and 6GB respectively, you've got to be a bit of a gambler to take the 4GB option. The rolls of honour speak for themselves; most people have made the sensible choice.
 
If what we are led to believe is true - the next lot of cards being on a die shrink of nearly 50% - then these cards are going to be one of the shortest-lifespan cards in recent generations, IMO.
 
Something the AMD fans who think DX12 will be AMD's savior should take note of:
If you believe DX12 is very similar to Mantle (hint: it's not, but I'm trying to frame this in the best possible light), then Mantle should give a good insight into DX12 performance (it will to a certain extent, but Mantle is the best possible performance case, having been engineered for GCN exclusively).

Furthermore, BF4 is the showcase Mantle game, with Mantle co-developed by DICE and the game engine specially designed with Mantle in mind.

Moreover, if DX API overhead is killing Hawaii's performance then it should be having an even bigger impact on Fiji, so Fiji should show even bigger gains.

So what do the Fury X BF4 Mantle results show? A 20% DROP in performance compared to DX11, in a scenario that should maximize the API advantage. In the best possible scenario the Fury X is 20% slower than in DX11!! Think about it for a minute.

That doesn't mean the Fury X will be 20% slower in DX12. However, it does mean that drivers and game-specific optimizations completely dominate over the API differences, to such an extent that AMD's unoptimized Fiji Mantle drivers are way slower than DX11.

So AMD's DX12 performance will largely be dictated by their drivers, exactly the same as with DX11.


##############

Something else to consider: part of the approach of DX12 was to remove a lot of intermediate layers and expose only a lower level, closer to the hardware. For people who program, it is a bit like going from a high-level language like JavaScript/Python/C# that manages memory and provides high-level functionality to making developers program in plain vanilla C, with manual memory management, no classes and no extensive library. Performance now becomes more dependent on what the developer does, with the high-level API and drivers doing less work and having less room to optimize. It is going to be very common for developers to write abstraction layers to regain much of the high-level functionality that was lost. We might also see developers write slightly different code paths for this functionality based on different GPU architectures. Nvidia is in a good position, having 76% market share, to incentivize developers to optimize for Nvidia hardware.
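
As a rough sketch of what such an abstraction layer might look like (every name here is invented; it is not any real engine's or vendor's API): the wrapper restores a "create it and forget about it" call, while internally owning the lifetime tracking, and leaves room for the kind of per-vendor code path mentioned above.

[code]
// Hypothetical engine-side wrapper; all names invented. On a DX12-style
// API the application owns resource lifetime, so this layer does the
// bookkeeping a DX11 driver used to do, with a per-vendor tuning hook.
#include <cstddef>
#include <memory>
#include <vector>

enum class GpuVendor { Nvidia, Amd, Other };

struct RawBuffer {        // stands in for a raw low-level resource handle
    std::size_t size = 0; // heap choice, residency etc. would live here
};

class BufferAllocator {
public:
    explicit BufferAllocator(GpuVendor vendor) : vendor_(vendor) {}

    // High-level-feeling call: the caller never frees anything manually.
    std::shared_ptr<RawBuffer> createBuffer(std::size_t bytes) {
        auto buf = std::make_shared<RawBuffer>();
        buf->size = alignFor(bytes);
        buffers_.push_back(buf); // the layer, not the driver, tracks it
        return buf;
    }

private:
    // Example of a vendor-specific code path (values illustrative only).
    std::size_t alignFor(std::size_t bytes) const {
        const std::size_t align = (vendor_ == GpuVendor::Nvidia) ? 256 : 64;
        return (bytes + align - 1) / align * align;
    }

    GpuVendor vendor_;
    std::vector<std::shared_ptr<RawBuffer>> buffers_;
};
[/code]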

How much this will affect DX12 is yet to be seen.
 
To those who just say Nvidia without a thought:
AMD cards very often age better performance-wise. I think it's because Nvidia seem to forget about their older cards very quickly. I also wouldn't be surprised at all if the Fury X is ahead of the Titan X in a couple of years, but by then everyone will have forgotten and moved on to the next thing. A few examples: the 7900 GTX back in the day was faster than the X1900 XT; soon after, it couldn't even run games at 20fps while the X1900 XT was still getting good frames. It even played Crysis 1 at medium settings in DX9 at 40-60fps, and it still runs games at low/medium.
The 7970 is another example, though that's more due to late driver enhancements; it should have been quicker than the GTX 680 all along.
The 9800 Pro vs its competition was so good that it even competed against Nvidia's next-gen midrange for years.
The 290X was barely faster than the original Titan at the start; now it's often a lot faster.


Exceptions that I can think of: the 8800 GTX aged gracefully. I'm sure there are others.
 
If what we are led to believe is true - the next lot of cards being on a die shrink of nearly 50% - then these cards are going to be one of the shortest-lifespan cards in recent generations, IMO.

You'd say that every time there's a manufacturing process improvement, if you believe the hype. The first Pascal will not be at the level that Maxwell is (relative to its lifespan). And the performance improvement will be there, but it won't be all that.

Even if it was, if you're the kind of guy that drops £600 on a card, you're not going to be the kind of guy that is happy to wait a year without having the best stuff.
 
Yes, but this gen - the 900 series - is the last on 28nm before the die shrink, and it's a bigger die shrink than has previously been seen in percentage terms.

If you want cards to last, you have to buy a card that isn't going to be the last on that die.

I'm not one for believing in hype. I have not been around this scene for very long, so I've not seen a die shrink or anything yet, but the way people go on about it, it's like the first EVER Easter.
 
Yes, but this gen - the 900 series - is the last on 28nm before the die shrink, and it's a bigger die shrink than has previously been seen in percentage terms.

If you want cards to last, you have to buy a card that isn't going to be the last on that die.

I'm not one for believing in hype. I have not been around this scene for very long, so I've not seen a die shrink or anything yet, but the way people go on about it, it's like the first EVER Easter.
I'm not talking about that. So which is which: will the Fury X last longer than the 980 Ti? I would like some substantial counter-arguments for the 980 Ti, as I am set on getting it in place of the Fury X, since D.P seems to be the only one heavily arguing for it. It doesn't help either that the Fury X is really low on stock, and at other retailers it's more expensive than the 980 Ti while offering less performance.
 
I'm not talking about that. So which is which: will the Fury X last longer than the 980 Ti? I would like some substantial counter-arguments for the 980 Ti, as I am set on getting it in place of the Fury X, since D.P seems to be the only one heavily arguing for it. It doesn't help either that the Fury X is really low on stock, and at other retailers it's more expensive than the 980 Ti while offering less performance.

How do you mean "which will last longer" then?
 
get the 980ti
Yeah, I'll be getting it.

How do you mean "which will last longer" then?
I don't think it is hard to understand. Read the first sentence of the opening post: "Question is difficult, which card provides a better lifespan?" The topic is also called "Fury X vs 980 Ti".

And then I gave out some information, such as VR, drivers, etc., which you are free to debunk like D.P has done; that is what this topic is also intended to be about: clarification of the rumours. So far I have concluded that the 980 Ti will last longer. Also, apparently no one argues for the Fury X, which tells the tale really.
 
Has there been a driver since the Fury's release? Did it offer great performance gains? I'm sure if it had, we would all have heard about it. So along those lines the 980 Ti sounds like it would be good for now. But if what people say is correct about AMD cards getting a second wind a lot later in life and topping the Nvidia cards that were topping them, then it could be AMD to go for. But as we all know, Nvidia all but forget about their older cards once something newer and better is released.

I can't really say which one will last longer.

If I was to buy a card now it would be a stop-gap card while I wait on the die shrink. Buying the first one of those should last almost to the end of that generation; it would stay a lot closer to the most current card than the 980 Ti will.

EDIT - I can't go into the tech stuff because I don't understand it lol.
 