980 Ti vs 1080 - WHAT???

I wouldn't have thought that 980 Ti owners would be especially interested in GP104 cards. Not so much a sidegrade as a minor upgrade, especially at current prices.
 

I'm not so sure. I spent just under £1,200 on my 980 Tis with some AIO coolers, and they clock to roughly 1430MHz stable. I would have to spend upwards of £1,400 on 1080s for not a lot of extra speed.

I only bought the 980 Tis to max out The Witcher 3 at 1440p because my non-Ti 780s couldn't cut it. At the current rate I'll probably hold out for Volta/Vega, as nothing coming out in the next 9+ months looks like it will stress a 980 Ti, especially if SLI is supported.
 
Something's a bit off there. Max OC to max OC, my 1080 vs my 980 Ti was 20-30% faster - about 2101MHz vs 1501MHz. Not a huge jump, no. Not up to the hype anyway.

TXP is a good 30% faster still.

Pretty much the same for me coming from a Titan X Maxwell to a 1080. Nice gains but nothing massive.
 
I wouldn't have thought that 980 Ti owners would be especially interested in GP104 cards. Not so much a sidegrade as a minor upgrade, especially at current prices.

Correct, for me at least. At this moment in time I'm content to wait for Volta. Even if a 1080 Ti does rock up, what's it going to cost? £850, £900? My 980 Ti isn't being pushed hard enough to justify it.
 
Because you have a Pascal card :p

No, but if you add up the results it does average out to 23%, lol.

I think the 1080 is a good upgrade from the 980 Ti (at 3440x1440). I would say it is a hybrid of "high end" and "midrange", because it has more cores (2560), better memory (GDDR5X) and more performance than usual for that tier.

For example, 55fps > 70fps is a bigger difference than it sounds, because a 55fps average doesn't look great whereas a 70fps average looks good.
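
As a rough illustration of that maths (a quick Python sketch; the frame rates below are just the example figures from this post, not benchmark results):

# sketch: percentage speedup between two average frame rates
old_fps = 55.0   # e.g. the slower card's average in some game
new_fps = 70.0   # e.g. the faster card's average in the same game
speedup = (new_fps - old_fps) / old_fps * 100
print(f"{speedup:.0f}% faster")   # ~27%, so a jump like that is a little above the ~23% average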
 
Correct, for me at least. At this moment in time I'm content to wait for Volta. Even if a 1080 Ti does rock up, what's it going to cost? £850, £900? My 980 Ti isn't being pushed hard enough to justify it.


Same.

I could buy Titan XP SLI tomorrow if I wished, but A) I don't really need it and B) I'd rather get full use out of my 980 Ti and the added cost of its water block.
 
I think the best thing to do is either keep your 980 Ti/TX, or get a 1080, which doesn't cost much if you already have a 980 Ti or TX and eBay it, and then wait for the 11xx Volta cards... 980 Ti or TX > 1080 is not a big outlay and the performance is better; 980 Ti or TX > TXP is a lot of money, and the TXP will then be matched by the 11xx cards at half the MSRP in about 12 months. It's a bad idea to get the Pascal Titan if you care at all about value or depreciation.

I guess if you still have your 980 Ti now, they have tanked in value on eBay and the 1080 has gone up since launch, so overall it's not as worthwhile; but if you have a TX they still have good resale on eBay. You could eBay your TX for £500 and get a 1080 Jetstream or GameRock for £600... eBaying a 980 Ti for £300, though, is not as good.
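
To put rough numbers on that (the prices below are just the ballpark figures quoted in this post, not quotes from anywhere):

# sketch: net cost of moving to a 1080 from each card
price_1080   = 600   # rough price of a 1080 Jetstream/GameRock (GBP)
resale_tx    = 500   # rough eBay value of a Maxwell Titan X (GBP)
resale_980ti = 300   # rough eBay value of a 980 Ti (GBP)
print(f"Net cost from a Titan X: £{price_1080 - resale_tx}")     # £100
print(f"Net cost from a 980 Ti:  £{price_1080 - resale_980ti}")  # £300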
 
The 980 Ti is only going to drop in price - even if there are cases where the 1080 looks less than spectacular against it now, the gulf will widen down the line in future games.
 
eBaying a 980 Ti for £300, though, is not as good.

Depends. I paid £450 for my Strix (new), so £300 doesn't seem too bad, but I'd rather keep it for now. It still performs exactly as it did before the 1080 came out; it didn't suddenly stop playing AAA games at 1440p at 90-144fps just because a new card got released.
 
http://www.3dmark.com/fs/9086285


GTX 1080 positives:

- less heat, runs cooler (on water)
- less power draw
- more support, as Maxwell has now ended
- a few extra features, like better memory compression

That said, a Maxwell 980 Ti or Titan X is still a badazz card, and the only reason I went 1080 was because my Titan X ran too hot when overvolted on a custom BIOS.
 
Maxwell support ending doesn't suddenly render the card a useless paperweight.

I don't think anyone is saying the 1080 isn't a great card; it's just a pointless upgrade for most 980 Ti/Titan X owners.
 
http://www.3dmark.com/fs/9086285


GTX 1080 positives:

- less heat, runs cooler (on water)
- less power draw
- more support, as Maxwell has now ended
- a few extra features, like better memory compression

That said, a Maxwell 980 Ti or Titan X is still a badazz card, and the only reason I went 1080 was because my Titan X ran too hot when overvolted on a custom BIOS.

Sorry if off topic, but just out of interest, what are you using to cool your CPU and GPU? I have virtually the same system as you:

http://www.3dmark.com/fs/9709774
 
I think if you have a Maxwell TX, the best thing is to eBay the TX and get a 1080, because the TX can still go for £500... and you can get a 1080 with a good cooler for £600 (Jetstream or GameRock). For £100 that is definitely worth it...

I think the TXP is a really bad card if you are bothered about value or depreciation. If you do not care about that, then just get a TXP, but don't be surprised when it is equalled by the 1170/1180 in about a year at half the MSRP and TDP.
 
Maxwell support ending doesn't suddenly render the card a useless paperweight.

I don't think anyone is saying the 1080 isn't a great card; it's just a pointless upgrade for most 980 Ti/Titan X owners.

At 3440x1440 it isn't pointless... That is like saying the TXP is a pointless upgrade from the 1080... The 1080 is 23% faster on average than a 980 Ti... the TXP is 28% faster than a 1080.

23% faster on average is quite a lot; in some games you go from a 55fps average (jerky) to a 70fps average (not jerky). For 1080p or 1440p the 980 Ti is still OK, for 3440x1440 the 1080 is worth it, and for 4K the only card that is good enough is the TXP.
 
http://www.3dmark.com/fs/9086285


GTX 1080 positives:

- less heat, runs cooler (on water)
- less power draw
- more support, as Maxwell has now ended
- a few extra features, like better memory compression

That said, a Maxwell 980 Ti or Titan X is still a badazz card, and the only reason I went 1080 was because my Titan X ran too hot when overvolted on a custom BIOS.

All true, but if I already had a 980 Ti I'd barely give the 1080 a glance. It's not sufficiently faster to justify the cost. Don't get me wrong, the 1080 is a cracker - I got one myself, but I replaced a 780. Now that's an upgrade!
 
If you aren't happy, return it, then skip a generation.

Also... comparing OC to non-OC is a little bit silly.

Talking about the benefits of a better process/design: my 1070 takes less power to run than my R9 380. That is nuts. Now if I can only bring myself to stop running my CPU at full speed all the time, I'd save a fair bit, lol.
 
My metric for high end is pushing the limits of what is possible - between the 1080 using barely half its available thermal/electrical budget, GP100 showing that much bigger cores are possible, and the fact that aside from the VRAM modules most of the parts on the PCB are more commonly found on mid-range GPUs, I struggle to take it seriously as "high end".

Your definition of 'high end' is ridiculous. Sure, NVIDIA could have released a big-die Pascal product first, but it would have been prohibitively expensive and in very short supply. The P100 has not yet been released, is itself not a fully enabled die, and is expected to cost upwards of $5,899. The fact that 1070s and 1080s were in short supply for some time after their respective releases, and that some people are still waiting for some SKUs after the release of a successor flagship product (the Pascal Titan X), shows that NVIDIA DID PUSH THE LIMITS FOR A VOLUME HIGH-END PRODUCT.

It's quite simple: yields improve as a process runs its course, and the bigger dies can be stockpiled for release later in the process run, when they could not have been supplied in any volume at the earlier stages. It is therefore entirely sensible and consistent to follow the pattern we have seen of late.

Pretty much no consumer product, especially one meant to be a volume product, pushes the limit of what is possible, because it would be cost prohibitive.

People consistently refer to the days of the GTX 580 and before as some sort of proof that NVIDIA used to release big-die GPUs as flagship products fairly early in their respective product cycles, which is of course true to a large extent (though we all know the GTX 580 was just a refined die from the GTX 480). They then use this to infer that NVIDIA is now having us all over by releasing smaller-die GPUs first and larger dies down the line, and so they like to troll and bait on computer forums by calling products like the GTX 680, GTX 980 and GTX 1080 'mid range'. Of course this ignores the massive, escalating costs that consecutive process shrinks have placed on the semiconductor world, which now make it far more cost prohibitive to launch a bigger mass-production high-end product early in a process life cycle...

Gotta love NVIDIA's marketing. Don't tell me you believe that R&D line about Pascal costing billions in research? It's plainly just Maxwell spiced up, and with Volta coming in early 2017 according to the latest rumours, Pascal is just a stopgap. Look at how quickly they are releasing the Pascal parts, getting them all out before more solid info about Volta appears and cuts into their sales.

Rroff is right, but nobody wants to hear about their £600 cards being mid range.

NVIDIA's revenue for the quarter ending 31 January 2016 was $1.4 billion, and their R&D spend was $344 million in the same period. So not far off 25% of the money NVIDIA took in that quarter - revenue, NOT PROFIT - was spent on R&D.

NVIDIA's profit for that quarter (from the previous link) was $207 million, some 60% of their R&D spend.

NVIDIA spends a massive amount of money on R&D, and although Pascal has a lot of architectural similarities to Maxwell, it is on a new, smaller node that cost $$$ to develop and was heavily delayed... So yes, I do believe that Pascal cost an awful lot of money to develop. Their R&D spend for the year January 2015 - January 2016 was over $1.3 billion. That would of course have included products other than Pascal, but it would be entirely expected for the R&D spend on Pascal to be in the hundreds of millions of dollars.
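
For what it's worth, the working behind those percentages (a quick Python sketch using only the figures quoted above) is simply:

# sketch: ratios from the quarterly figures mentioned in this post
revenue = 1_400_000_000   # quarter ending 31 January 2016
rnd     =   344_000_000   # R&D spend, same quarter
profit  =   207_000_000   # profit, same quarter
print(f"R&D as a share of revenue: {rnd / revenue:.0%}")      # ~25%
print(f"Profit relative to R&D spend: {profit / rnd:.0%}")    # ~60%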
 
For the high end, "pushing the limits of what is possible"

That would be a 5000 CUDA core, 375W card (that is the limit of the power spec).

So no, the top-end cards are built to be "good enough", e.g. 30% faster - to be a "worthwhile upgrade"... not "pushing the limit of what is possible".
 

Pretty much what I was saying

'Pretty much no consumer product, especially one meant to be a volume product, pushes the limit of what is possible, because it would be cost prohibitive.'

Rroff's version of 'high end' doesn't actually exist in reality.... (certainly not at anything other than an exorbitant price)

The 1080 was just about 'good enough' for a volume-released high-end product; any attempt to make it bigger/better would have made it more expensive to develop and make, and would likely have reduced the already constrained supply to boot.

Just because the 1080 doesn't use 300 watts of power doesn't mean it wasn't pretty close to the optimum GPU NVIDIA could have released at the time, because other parameters (e.g. die yields) could easily have been the constraining factor. If massive power consumption is your main parameter for 'high end', then there are always AMD cards to fill that role currently...
 
Your definition of 'high end' is ridiculous. Sure, NVIDIA could have released a big-die Pascal product first, but it would have been prohibitively expensive and in very short supply. The P100 has not yet been released, is itself not a fully enabled die, and is expected to cost upwards of $5,899. The fact that 1070s and 1080s were in short supply for some time after their respective releases, and that some people are still waiting for some SKUs after the release of a successor flagship product (the Pascal Titan X), shows that NVIDIA DID PUSH THE LIMITS FOR A VOLUME HIGH-END PRODUCT.

Until Kepler, that is exactly what happened: the high-end GPUs were near the upper end of what was possible without going completely silly, and the cores that didn't make the grade were used in the rest of the lineup. High end and volume don't generally go together - as soon as you are talking volume, you know you are talking about a mid-range product and not high end when it comes to GPUs. A "volume high-end product" is not the definition of a high-end GPU.

There is a huge gulf between GP100 and the 1080 - easily enough space to make something more like high end, in the region of the TX's stats. 16nm is more expensive than some previous processes, but not to the extent you are portraying, and the process itself quickly hit a good level of maturity once it was out of risk - nVidia is pushing the clock speeds a bit harder than the process was really envisioned for, which has had some impact on availability.

If you look at the ~300mm2 cores that delivered around the same sort of performance uplift over the previous node's high end - we are talking 8800 GT, GTS 250, GTX 460, etc. - you have to completely rewrite the rules to make the 1080 high end. Meanwhile nVidia is laughing all the way to the bank at those who buy into it.

Rroff's version of 'high end' doesn't actually exist in reality.... (certainly not at anything other than an exorbitant price)

The 1080 was just about 'good enough' for a volume-released high-end product; any attempt to make it bigger/better would have made it more expensive to develop and make, and would likely have reduced the already constrained supply to boot.

Just because the 1080 doesn't use 300 watts of power doesn't mean it wasn't pretty close to the optimum GPU NVIDIA could have released at the time, because other parameters (e.g. die yields) could easily have been the constraining factor. If massive power consumption is your main parameter for 'high end', then there are always AMD cards to fill that role currently...

Did you miss the GTX 280, GTX 480, etc.?

Obviously power use has to be proportional to performance - a card that used a ton of power but was slow wouldn't qualify.
 