GTX 1070/1080 cost the same as 680/670 (after FX & inflation)

Comparing now to the GK104 release (GTX 680/670) in May/June 2012:

- VAT is identical at 20%
- The pound is worth 90% of what it was (in US dollars)
- Inflation increased prices by c. 4% (US inflation, as otherwise we'd be double counting via the exchange rate)
- Thus a pound now is worth roughly 90% * 96% = 86% of what it was in 2012
(EDIT: to be clear this is assuming no one buys FE)

Therefore we can deflate the £365 cost of the cheapest AIB 1070 (on OCUK) to £315 in 2012 pounds. As the cheapest 670 was £300 on release, 1070 prices have increased by about 5% over the 670. The same logic can be applied to AIB 1080s, which are currently £525, or £455 deflated. The cheapest 680 was £430 on release, so 1080 prices have increased by about 5.8% over the 680.
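
For anyone who wants to check the arithmetic, here's a rough Python sketch of the deflation above. It only uses the prices and the 90% / 96% factors already quoted in this post; nothing else is assumed.

```python
FX_FACTOR = 0.90           # the pound buys ~90% of the dollars it did in May/June 2012
INFLATION_DEFLATOR = 0.96  # ~4% cumulative US inflation, approximated as in the post

deflator = FX_FACTOR * INFLATION_DEFLATOR   # ~0.86

def to_2012_pounds(price_now: float) -> float:
    """Express a current UK retail price in mid-2012 pounds."""
    return price_now * deflator

comparisons = [
    ("1070 vs 670", 365, 300),   # cheapest AIB 1070 now vs cheapest 670 at launch
    ("1080 vs 680", 525, 430),   # cheapest AIB 1080 now vs cheapest 680 at launch
]

for label, price_now, launch_price_2012 in comparisons:
    deflated = to_2012_pounds(price_now)
    rise = deflated / launch_price_2012 - 1
    print(f"{label}: £{price_now} now is ~£{deflated:.0f} in 2012 money "
          f"({rise:+.1%} vs the 2012 launch price)")
```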

This suggests that the feeling that the 10 series is worse value than we've had historically might not be entirely justified, once we look at the true value of a pound by controlling for exchange rates and inflation. To be fair, the Founders Editions represent considerably worse value, but aside from that smoke-and-perspex act, a 10 series card appears to cost about the same as 2012's 600 series did.

Note that the 1070/1080 gives roughly the same amount of GPU silicon (314 vs 294mm^2), about the same amount of memory silicon and the same quality of manufacture as the 600 series. I've drawn a comparison to the 600 series rather than the 700 or 900 series because, like Pascal, Kepler was the first release on a new (28nm) manufacturing process; otherwise we'd need to factor yields etc. into die size comparisons.

Thoughts?
 
I'm glad you don't calculate my mortgage payments.
Actually I do have some (okay a tiny bit of) input on mortgage payments ;)

Whats your take on brexit? Noooope nvm that could get messy.

The majority on here agree the 10 series prices so far are absurd. Roughly £75-100 too much.
In!! The EU is a grand human experiment that we should nurture. Re: GPUs, I agree a majority of people think the 10 series is a rip-off; that's why I posted this comparison. What I'm saying is that regardless of how we feel, the maths says they aren't any more expensive.

Wow, I found my old Leadtek GeForce 5950 Ultra order: £387 back in 2003. That included 17.5% VAT, and the pound in Oct 2003 was $1.68, not that much different from the current $1.41 rate. The GeForce 5950 Ultra launched at $499 MSRP in 2003 and the GTX 1070 launched at $379 MSRP on 10 June 2016.

If 20% VAT had been added in 2003, I would have paid £394 for the GeForce 5950 Ultra, which is around the GTX 1070 Founders Edition price.
Taking your 20% VAT price of £394 at an FX rate of 1.68, you'd have to pay £470 at the current 1.41 rate for that 5950 Ultra, so the different exchange rate does have a fair bit of effect. Inflation over that period was bang on 30%, so that takes your 2003 card all the way to £613 in today's prices.
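
Here's the same sort of sketch applied to your 5950 Ultra numbers (the £394, the 1.68 and 1.41 rates, and the 30% inflation figure are all the ones quoted above; the couple of pounds' difference from my £613 is just rounding in that 30% figure):

```python
price_2003_inc_20pct_vat = 394   # GBP, the 5950 Ultra price restated with 20% VAT
fx_2003 = 1.68                   # USD per GBP, Oct 2003
fx_now = 1.41                    # USD per GBP, June 2016
inflation_since_2003 = 0.30      # ~30% inflation over the period

fx_adjusted = price_2003_inc_20pct_vat * fx_2003 / fx_now      # ~£470
in_todays_money = fx_adjusted * (1 + inflation_since_2003)     # ~£610

print(f"FX-adjusted: £{fx_adjusted:.0f}, in today's money: £{in_todays_money:.0f}")
```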

... The trouble is we are being fleeced and aren't seeing prices anywhere near that in GBP due to the FE nonsense and short supply. The 1070 should start at £320 and the 1080 should start at £500, which I think would be fairly reasonable. I think prices will start to settle there (where they should be) once supply is readily available and stock is sitting on shelves not being sold.
Agreed, prices will definitely go down over the next 3-6 weeks, and that's when we should buy. The same thing happened with the 600 series though, so it's not like (FE aside) the starting price for the 10 series is that unusual.
 
Yeah, but the 5950 was a top-end card, while the 1070 is mid/mid-high range.

Should compare the price to the top cards, but the top cards these days are £1000 Titans... and an £800 1080 Ti isn't unrealistic looking at 1080 prices.

As above though, that 5950 Ultra would be over £600 in today's money.
 
Yeah for the absolute top end card. If the Titan was £600 it wouldn't be too bad.

TBH that is straying into territory that kind of muddies the waters, namely the waters of 'was the GF100/Big Fermi chip a new kind of market segment?'. Leaving aside that question (as it's been repeated endlessly on other threads, with no resolution in sight) and making this just a comparison of cards from the last 4 years, I think the point that the 1070 and 1080 are the same price as the 670/680 stands, and (although I haven't done the maths) I suspect the same would broadly hold for the 770/780 and 970/980.
 
I have made the same argument time and time again on the CPU forum to rebut the hordes of people clamouring to blame evil Intel for their pricing when (excluding extreme outliers like the 6950X) it's been pretty steady for years (their pricing, NOT the UK retail pricing).

People always forget how poor the pound/dollar exchange rate currently is and need reminding to add VAT when comparing US MSRPs to eventual UK retail prices.
It can be frustratingly like convincing your grandparents that part of the reason cars don't cost £15 like in 1950 is that people aren't paid 5p an hour any longer.

If your bank manager told you that your mortgage would rise from £1000 per month to £1480 in the space of eighteen months, well.... :rolleyes:
Yeah, but if you took out a mortgage on a US property and the exchange rate changed, would you find the connection more intuitive?
 
Come on, stick to the subject and the facts. I gave the example that everyone is basing their opinion on, and the 48% increase that has been deployed. The exchange rate when the 970 was bought was roughly 1.6. Today, the rate is 1.42.

970=289-20%=£240 @ 1.6 = $384
1070=429-20%=£357 @ 1.42 = $506

Not quite right; to reduce prices by 20% you have to multiply by 80% not divide by 120%. Using your pound prices that would make for:

970 = £289 * 0.8 = £231 @ 1.6 = $369
1070 = £429 * 0.8 = £343 @ 1.42 = $487

Moreover, the example you are basing your opinion on arbitrarily chooses a single model (the Gigabyte G1 Gaming) whose price has increased dramatically; most 1070 models have not increased by anywhere near that much.
 
The observation is that the question (judgement is probably a better term) is biased in favour of self-interest.

I think I get this: you're saying the question of why prices have risen is driven by the rational self-interest of a consumer?


The error with claiming 'you forget it's a business' is that it forgets what a consumer is.

That's not really a claim I've made, although I guess I don't generally expect companies to cover exchange rate fluctuations.

Rather, my thinking is that inflation and exchange rates explain all of the current (non-FE) prices. There's no question about why prices have risen; there's a clear explanation. As there has been no price rise pushed by Nvidia itself, there's also no question of whether we're being ripped off by a profit-mad company. Hence the only question we have left - self-interest or no - is whether we are willing to pay the number of pounds a 1070 costs, given currency fluctuations and inflation.
 
The 670 and 680 were the initial high-performance cards of their generation, just like the 1080 and 1070 are for theirs, so sorry, trying to compare the 1080 and 1070 to the 660 Ti and 660 doesn't wash.

I've tried to avoid this topic as the main thrust (giggidy) is that when people compare cards they don't take into account FX rates or inflation. That having been said, I have to agree with what you're saying. The Big Fermi chip and its successors are obviously a new market segment based on die size and memory bandwidths.


I know, it's a highly repetitive claim made on the forum, thought I would slip it in. Lots of decent papers online on the subject. That would place some of the observations you have made in a wider context.

I'm vaguely aware of the wider contexts given I'm an economist :) Again, though, the main point is that there's little to no Nvidia-driven price rise; it's all factors external to the Green team.
 
If something is £120 inc VAT you divide by 1.2 to see the price is £100 without VAT.
Ach, thanks for the correction; I always forget how much VAT is here.
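
For anyone following the VAT arithmetic, a quick sketch of the correct method (the £120 example is from the quote above; the £429 is the 1070 price quoted a few posts back):

```python
def ex_vat(price_inc_vat: float, vat_rate: float = 0.20) -> float:
    """Return the pre-VAT price for a VAT-inclusive UK price."""
    return price_inc_vat / (1 + vat_rate)

print(f"{ex_vat(120):.2f}")   # 100.00 - the example from the quote above
print(f"{ex_vat(429):.2f}")   # 357.50 - ex-VAT price of the £429 1070 quoted earlier
print(f"{429 * 0.8:.2f}")     # 343.20 - the shortcut I used earlier, which understates the ex-VAT price
```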

**very comprehensive list**

Now, as you can see from the info above, the 560 Ti, 560, the 680, 670 and the 1080, 1070 are all on the same tier, made from the second-tier chip etc. So going by that, your price comparison is valid. But the only problem I have with your comparison is that you should have started at the 560 Ti and 560.

Why? Because it was from the 560 Ti to the 680 that the big price increase happened. The 560 Ti was $249 on release, the 680 was $499 on release.

But, since the release of the 1080 and 1070, the chip used is not important, and is not a valid method of deciding where a card should sit in Nvidia's line-up. I have argued otherwise, but have been told that it didn't matter.

Dead interesting reply, melmac :) I think, though, that we absolutely should consider die sizes when comparing cards. At its release the GK-110 (780 Ti) was one of the biggest chips ever stuffed into a graphics card, so large that its smaller GK-104 cousins were still bigger than many previous 'top-tier' chips. That makes it hard to equate the modern Ti or Fury X cards with previous top cards like the 480 or 7800 GTX; rather, they're kind of 'supercards'. This pattern has continued all the way to now, with the 1080's die size being similar to previous top cards like the GTX 680 or 7800 GTX. The 600 series comparison is valid IMO, but it would be good to see others.

Looking back at your list, it's very true that the 200, 400 and 500 series muddied the waters; they were gigantic compared to their predecessors. Nvidia wanted to start their 'big die' strategy with the 200 series cards IIRC, but when their first unified architecture failed to perform they released their big chip as a mainstream product. Nvidia did release 500mm^2 cards as '70 and '80 series, albeit because they had no other choice.

So the point you make is an excellent one, and we would ideally look at all previous chips as there isn't quite a one-to-one mapping. But again the main point for me (sorry to be a broken record) is that we do need to incorporate inflation and exchange rates into such comparisons.
 
I am not sure what you are trying to say about the large chip? Every generation aside from first-generation Kepler has had a large chip. The 580, 480 and 280 all had much bigger dies than the 680. The 780 Ti was no more a super chip than the 580 GTX. The 780 Ti was no bigger than the Titan chip; they were both GK110. You know the 780 was also based on the GK110 chip too.

Obviously every generation has its largest chip. But the GK-110 was about 8% larger than even the 500 series chips by surface area (EDIT: though it was not the largest piece of silicon ever used in a GPU). The GK-110 and its successors also represented a fracturing of the enthusiast market, from what had been a two-chip to a three-chip strategy. On those bases I think it's fairer to treat the GK-110 and the cards it powered as being meaningfully different.

Yes of course, but you don't need to go back any further than the 500 series cards. It was in the move from the 500 series to the 600 series that the name change occurred. The x60 Ti and x60 were renamed the x80 and x70.
As above, neither the GK104 nor the GK110 exactly matched its predecessor in size (one was smaller and the other bigger), so it's a little more complicated than saying it was a renaming of the '60 and '70 models. As mentioned, Nvidia had wanted to create a 'really ****ing huge' chip for the 200 series, but ended up releasing it for the mainstream. Thus in some ways the 200, 400 and 500 series were unusual.

So just do your price analysis between the 560ti/560 and the 1080 and 1070.
The thread title is 'GTX 1070/1080 cost the same as 680/670 (after FX & inflation)'. This was chosen because the 600 series was the most recent point where we had a new architecture and a new manufacturing process arriving at the same time. If you'd like to perform another comparison, go ahead and do it; it'd be interesting.

Visual Pun
Maybe 2deer 4U... But not any more dear than the 600 series.
 
What's this two-chip to three-chip strategy? There have always been numerous chips each generation. If it's a strategy, it didn't last long; it was two cards per chip for Maxwell and it's looking the same for Pascal.
This strategy is a well-known one. Nvidia previously used a two-chip strategy for the enthusiast segment (the 104 and 106 chips, among other names); post GK-110 there have been three. Enthusiast Maxwell definitely had three: GM-206, GM-204 and GM-200.

I don't need to work anything out; I know the price has increased massively. We are getting charged high-end prices for second-tier cards.
Actually you do need to work things out, otherwise all you have is an intuition. I've gone to the trouble of making a comparison to the 600 series; is it too much to ask that you do the same?

I guess we will agree to disagree. You will pick the 680/670 to 1080/1070 price comparison as it suits your argument, to show that there has been no price increase. I will pick the 560 Ti/560 to either the 680/670 or the 1080/1070 comparison as it suits my argument.
No, I'm not doing that; as I've said half a dozen times now, my main point is that there has been no significant price increase since the 600 series and that we need to include inflation and FX in our calculations. I think this is interesting and may inform people's understanding of the value offered by the 10 series.

You are getting increasingly aggravated because you wish the analysis to be done using the 500 series; in contrast, I accept this would be interesting and complementary to my brief analysis. But again, I'd prefer you do this work. You also appear to wish us to accept wholesale your suppositions about there having been a numbering change between the 500 series and 600 series; however, the situation appears much more complex than that, for the reasons outlined above.
 
They structure their lineup based on profit and certain other strategic goals. I'm sure the marketing/sales depts have ideas about how they would like to sell the stuff but it all depends on how the chips fall, to coin a phrase.
That's pretty much how it seems to work, eh.
 
Quantitatively it matters little, but to handle 4% inflation you don't multiply by 0.96, you divide by 1.04 - the same mistake as with calculating the ex-VAT prices.
I was using a deflationary table for the inflation rate, and the reciprocal of 1.042 rounds out to 0.96. For GST... I, uh, just *cough* got the wrong number for the UK, my bad.
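
To show how little it matters quantitatively, here's the difference on the £365 1070 price from the opening post:

```python
price = 365                    # the cheapest AIB 1070 price from the opening post

approx = price * 0.96          # the shortcut I used
exact = price / 1.04           # the proper deflator for 4% inflation

print(f"x0.96: £{approx:.2f}, /1.04: £{exact:.2f}, gap: £{exact - approx:.2f}")
# About 56p on a £365 card - hence 'quantitatively it matters little'.
```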
 
Small correction: the GT200 was the biggest GPU die created (GTX 260/280) at 576mm² until the GM200, which was 601mm², whereas the GK110 was 561mm².
Fair point. Actually (and even more importantly) at 561mm^2 that also leaves the GK-110 only about 8% larger than GF110, not the 20% I figured. Edited accordingly.
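
For reference, a quick sketch of the ratios behind that (the GT200/GK110/GM200 figures are the ones quoted above; the ~520mm^2 for GF110 is the commonly quoted figure, so treat that one as my assumption):

```python
die_sizes_mm2 = {
    "GT200": 576,   # from the correction above
    "GK110": 561,   # from the correction above
    "GM200": 601,   # from the correction above
    "GF110": 520,   # commonly quoted figure - my assumption, not from this thread
}

gk110_vs_gf110 = die_sizes_mm2["GK110"] / die_sizes_mm2["GF110"] - 1
print(f"GK110 vs GF110: {gk110_vs_gf110:+.1%}")   # ~+7.9%, i.e. the '8%' above
print(f"GT200 bigger than GK110: {die_sizes_mm2['GT200'] > die_sizes_mm2['GK110']}")  # True
```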
 
What 3 enthusiast segments did Maxwell have? The GM206 cards were the 960 and 950? Are they enthusiast cards now? Sub £200?? They are mainstream cards. The GM200 was the Titan and 980 Ti and the GM204 was the 980 and 970? For Pascal, the GP102 seems to be the two high-end cards, the GP104 is the 1080 and 1070, and the GP106 is powering the 1060, which is a mainstream card.
The '60 is defined as an enthusiast card. Just admit where you're incorrect: Maxwell had three enthusiast chips, contrary to what you said.

You are making it more complicated than it is. The simple explanation is normally the right one. In the previous two generations the Gxxx4 chips made the x60 cards. Then Kepler arrived, and the highest-end silicon Nvidia had available was the GK104. Normally this chip would have been used for the x60 and x60 Ti cards as seen in previous years. But with no high-end chip, Nvidia produced the 680, 670, 660 Ti and 660 from this chip. They had no choice. You are asking me to work it out; as I said, I don't have to. I said it already.
That's not the best invocation of Occam's Razor, because both alternatives are quite simple. I have come around somewhat to seeing the 500 series as offering a lot more chip than the 600/700 series. But this was because of someone else who bothered to highlight a very interesting fact. Maybe you could do more than shout your conclusions? Maybe start a thread with a 500 series comparison?

I am becoming increasingly aggravated? Lol, news to me. I am having a discussion while watching the Euros; why would I be aggravated?
Stay frosty.
 
Yeah no thanks, the drugs you've been taking are clearly effecting your judgement. You said "In!! The EU is a grand human experiment that we should nurture" that is the most retarded statement I have heard for a pro for the EU. It an unsustainable model of fleecing the English to pay for the rest of Europe.

The rest of your economic drivel is just that also.
I think you mean "affecting": don't worry it's a common mistake. Moving on from English, numbers don't become drivel just because you don't understand them; this applies both to the economics of the European Union, inflation and so on.

If you'd like, Ladybird do a fantastic book series on some of these issues, specially aimed at Britain First supporters.
 
This is all due to a deliberate short supply so people spend more; give it another 2 months and they will be sub-£600 for all versions (bar exotic!)
Yeah very true. Prices are going to fall a fair bit I think.

We'll done, you just proved the point the regressive in morons who think they are intellectually and morally superior by calling people who disagree with their opinions racist.
I think you mean "well done".

Well as a staunch support of UKIP all I can say is this, 13 % of the electorate voted for UKIP and 34 for tory. That swing has and will move massively and you are in the vast minority. People are leaving.
Yeah and I guess you really hate minorities eh.

Also, if you think you can put a bunch of numbers on a screen and claim to be an economic deity they you are fooling yourself as well as looking like a ****.
Doing basic maths to account for Forex and inflation is not claiming to be an economic deity. All of which brings me to...

Source: work for hedge funds so I probably know more about the economy than your pony little mortgage brokerage. Voting in is economic suicide in the long term. But your trade is an bubble held up by a demand for housing caused by influx of migrants so of course you want to remain in. To keep your bubble inflated.
It doesn't seem likely you work for a hedge fund, based on your English and maths skills, in spite of your barely checked aggression. Perhaps you clean the tables there?


p.s. I don't work for a mortgage brokerage, girl
 
LOL, you get very angry when someone points out how wrong you are. You do not like it at all, do you?
Nope, but I do get angry when people refuse to be reasonable and/or misquote. I actually derive a slightly wincing pleasure from being proven wrong by smart people wielding facts. But back to you...

The 960 is not an enthusiast card now. Based on a GM206? And all the sites you listed, well, you forget one important word that they all use, and that is "mainstream"… Techspot, Techporn, Guru3D, Bjorn3D.

I didn't suggest (all but one of) those sites; you misquoted me. Here are some quotes from the sites I did suggest.

Eurogamer: "[GTX 960] Does Nvidia's sub-£200 enthusiast-level Maxwell card deliver?"

Anandtech: "Launching today is the GeForce GTX 960 and the GM206 GPU. Following in the established traditions of the x60 video cards, NVIDIA is looking to reestablish their place in the enthusiast video card market with their latest offering."

I could go on, but I don't think anyone on these forums, or any other forums, would call the 960 anything other than a mainstream card.
It's certainly at the lower end of enthusiast, melmac, but the majority of people who call themselves enthusiasts don't buy >£200 GPUs. I'll side with those debutantes over at Anandtech and call it an enthusiast card.

You rattled on about the 200 series having no large chip, and then you went on about the GK110 being a wonder chip that couldn't be compared to previous generations.
Actually I said:
Looking back at your list, it's very true that the 200, 400 and 500 series muddied the waters; they were gigantic compared to their predecessors. Nvidia wanted to start their 'big die' strategy with the 200 series cards IIRC, but when their first unified architecture failed to perform they released their big chip as a mainstream product. Nvidia did release 500mm^2 cards as '70 and '80 series, albeit because they had no other choice.

Which seems pretty reasonable. You have decided to misquote me and engage in a straw man fallacy, which could be effective if not called out.

Point out your simple reasoning then? I hope it's not the same reasoning that you used about the 200 series having no large chip or the GK110 being special.

I didn't say that, but here's what I did say, which encapsulates my reasoning:

As above, neither the GK104 nor the GK110 exactly matched its predecessor in size (one was smaller and the other bigger), so it's a little more complicated than saying it was a renaming of the '60 and '70 models. As mentioned, Nvidia had wanted to create a 'really ****ing huge' chip for the 200 series, but ended up releasing it for the mainstream. Thus in some ways the 200, 400 and 500 series were unusual.
 