
** PRE-ORDER SPECIAL: GIGABYTE GTX 1080Ti OC BLACK ONLY £639 !! **

The biggest difference is the resolutions people play at. My first PC, in 2004 or so, had a 17" monitor running at 1280 x 1024.

Now I am gaming on a pimp 1440p 165Hz G-Sync monitor.

If resolutions hadn't gone up, I'm sure we would all have a card that cost less than £200 in our rigs now.
 
...and now, because of some competition, Nvidia have released drivers that put the Titan back into semi-pro card territory (which is what the Titan brand was when it first came out).


I didn't realise that was what got cut out after Kepler. When you consider that all it needs is a driver update to reinstate that capability, it makes me feel like they've been screwing the pooch by switching it off in the first place.
 
But everybody else is using the naming convention to say that cards haven't got more expensive.

Listen, all I know is that a GP106 (3rd tier chip) costs £50 more than a 2nd tier chip (GF104), adjusted for inflation, VAT and the new exchange rate.

And that the xx80 cards are no longer the top tier as they once were. There is now a tier above them. Consequently all the lower tier cards (xx70, xx60) are devalued by being shunted one tier downwards. Yet the pricing does not reflect this.

If we're concentrating on performance, well, a 460, as said, chewed through all the games of its time. Nobody is going to say that the 1060 is all that great, even for 1080p. It's a GP106, and it performs as well as a 3rd tier chip might. It's well below what you might expect if your last xx60 was the 460/560. The 660 was the first xx60 card they squeezed out on their 3rd tier chips, and the xx60 cards have been (excuse my French) bull crap ever since.
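The "£50 more, adjusted" claim above is straightforward arithmetic to sketch. All the figures below (launch prices, exchange rates, VAT rates, the inflation factor) are illustrative assumptions of mine, not the poster's exact inputs:

```python
# Hypothetical sketch of the "adjusted for inflation, VAT and exchange rate"
# comparison. All rates and prices here are illustrative assumptions.

def usd_to_gbp_retail(usd_price, usd_per_gbp, vat_rate):
    """Convert a pre-tax USD launch price to a UK retail price in GBP."""
    return usd_price / usd_per_gbp * (1 + vat_rate)

# GTX 460 (GF104): assumed $199 launch, ~$1.50/£ and 17.5% VAT in mid-2010.
gtx460_gbp_2010 = usd_to_gbp_retail(199, 1.50, 0.175)

# Inflate 2010 pounds to roughly today's pounds (assumed ~1.2x cumulative).
gtx460_gbp_today = gtx460_gbp_2010 * 1.2

# GTX 1060 (GP106): assumed $249 launch, ~$1.30/£ and 20% VAT in 2016.
gtx1060_gbp = usd_to_gbp_retail(249, 1.30, 0.20)

print(f"GTX 460 in today's money: £{gtx460_gbp_today:.0f}")
print(f"GTX 1060 at launch:       £{gtx1060_gbp:.0f}")
print(f"Difference:               £{gtx1060_gbp - gtx460_gbp_today:.0f}")
```

With these assumed inputs the gap comes out in the same ballpark as the £50 quoted; the exact number obviously moves with whichever rates you plug in.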


Actually it's not French, it's Gaulish, and records show the phrase's origins come from how Asterix explained the events that would unfold when Obelix got hungry and chased the herd for his dinner. He said there'd be nothing but bull **** as far as the eye could see. True story.
 
I didn't realise that was what got cut out after Kepler. When you consider that all it needs is a driver update to reinstate that capability, it makes me feel like they've been screwing the pooch by switching it off in the first place.

That is what happens without good competition. One company can just take the **** and do what they like :(
 
Are we past the mining BS yet? I am waiting for 1080 Tis to be £600 tops.

Nah. It seems that this "fad" is now here to stay for the foreseeable future.

Before Eth, GPU mining was a fad. But now... well, it seems GPU mining is being deliberately forced by some coins. With no ASICs, the only way to get ahead of the curve is to have as many GPUs as possible.

And if Eth stops using GPUs, there are other coins which will take its place to use GPUs.

From my limited understanding, it does seem that there is now no end in sight for GPU mining.
 
But everybody else is using the naming convention to say that cards haven't got more expensive.

Listen, all I know is that a GP106 (3rd tier chip) costs £50 more than a 2nd tier chip (GF104), adjusted for inflation, VAT and the new exchange rate.

And that the xx80 cards are no longer the top tier as they once were. There is now a tier above them. Consequently all the lower tier cards (xx70, xx60) are devalued by being shunted one tier downwards. Yet the pricing does not reflect this.

If we're concentrating on performance, well, a 460, as said, chewed through all the games of its time. Nobody is going to say that the 1060 is all that great, even for 1080p. It's a GP106, and it performs as well as a 3rd tier chip might. It's well below what you might expect if your last xx60 was the 460/560. The 660 was the first xx60 card they squeezed out on their 3rd tier chips, and the xx60 cards have been (excuse my French) bull crap ever since.

What I find funny... I had this GTX 470 card (actually I ended up with about four of them). Its core was 529mm² (~500mm² being about the biggest it's economical to make for a consumer card, outside of stuff like Titans). It was ~60-70% faster than the previous generation's high end, it had 93.333% of the SMs enabled compared to the highest-end card on that process*, and the launch price was about £325, though that slumped just after release to about £250 before going back up again. You know what other card is ~60-70% faster than the previous generation's high end, with an approx. 500mm² core and 93.333% of the SMs enabled? The GTX 1080 Ti, and you might get change out of £650 if you are lucky.

nVidia has been taking people for mugs ever since Kepler, and some are even saying "yes please" :( It isn't like they are struggling to make bigger GPUs, as GP100 showed. Granted, 16nm production costs nVidia a bit more per card than previous generations, but not hugely so.


* The fully unlocked chip was actually supposed to have 16 SMs, but the yields weren't there, so it was 15 until the refresh with the 580, when all 16 SMs were unlocked.
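The 93.333% figure in the post works out like this (a quick sketch; the 470-vs-480 and 1080 Ti-vs-Titan Xp pairings are my assumption of which cards are being compared):

```python
# Fraction of SMs enabled on a cut-down card versus the top card on the
# same process. SM counts are my assumed pairings, per the post's figures.

def sm_fraction(enabled, full):
    """Fraction of streaming multiprocessors left enabled."""
    return enabled / full

# GTX 470: assumed 14 of the GTX 480's 15 enabled SMs on GF100.
gtx470 = sm_fraction(14, 15)

# GTX 1080 Ti: assumed 28 of the Titan Xp's 30 SMs on GP102.
gtx1080ti = sm_fraction(28, 30)

print(f"GTX 470:     {gtx470:.3%} of SMs, 529mm² die, ~£325 launch")
print(f"GTX 1080 Ti: {gtx1080ti:.3%} of SMs, ~500mm² die, ~£650 launch")
```

Both ratios reduce to 14/15, which is where the identical 93.333% comes from on both cards.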
 
Does anyone know if this is quieter than the
Gigabyte Aorus GTX 1080 Ti?

Not sure which to go for.

I would pay the extra for the Aorus cooler; well worth it for the extra £30 to £40. This Gigabyte Black would be better used with water or an AIO. If you are thinking about getting a Gigabyte, be careful where you buy it from; they have had too many screamers with this power phase design, and you do not want to have issues with returning the card.
 
nVidia has been taking people for mugs ever since Kepler, and some are even saying "yes please" :( It isn't like they are struggling to make bigger GPUs, as GP100 showed. Granted, 16nm production costs nVidia a bit more per card than previous generations, but not hugely so.

But you can still get the big GPUs (i.e. the 1080 Ti) for the same price as the big GPUs from previous generations 10 years ago (adjusting for inflation). Remember the 8800 GTX was $599 and the 280 was $649. Also, remember the GTX 1080 Ti would not be the UK price it is today back when the 400 series came out; using the exchange rate and VAT rate for March 2010, it would have been ~£550.
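The ~£550 figure is easy to sanity-check. The $699 MSRP, the $1.50/£ rate and the 17.5% VAT rate below are my assumptions for March 2010, not the poster's stated inputs:

```python
# Rough check of the "~£550 at March 2010 rates" figure.
MSRP_USD = 699           # assumed GTX 1080 Ti US launch price (pre-tax)
USD_PER_GBP_2010 = 1.50  # assumed March 2010 exchange rate
VAT_2010 = 0.175         # UK VAT rate in early 2010

gbp_2010 = MSRP_USD / USD_PER_GBP_2010 * (1 + VAT_2010)
print(f"GTX 1080 Ti at March 2010 rates: ~£{gbp_2010:.0f}")
```

Under those assumptions it lands just under £550, consistent with the post.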

Of course the reason this started with Kepler was because AMD's best card could be rivalled by the GK104 (680). Ever since then, AMD have been struggling to keep up with Nvidia's big dies. They just about managed it with the 290X (but even that had a $549 release price), but they never really rivalled the 980 Ti, and it certainly doesn't look like they will with the 1080 Ti.

I also think people are getting a bit too hung up on die size and how they perceive what a certain die size should cost.

Remember, even before Kepler, AMD released the 7970, which had only a 365mm² die, yet AMD still wanted $549 for it. Nvidia then had the 300mm² die 680, which rivalled it in performance, so released it at $499. So why does that make Nvidia the big bad wolf in that scenario?
 
I also think people are getting a bit too hung up on die size and how they perceive what a certain die size should cost.

It isn't just about die size; I also talked about the performance uplift for a given die size. Sure, it really takes a deeper analysis, but it doesn't take that deep a look to get a good idea.
 
You're sure about that?

http://www.trustedreviews.com/reviews/xfx-geforce-6800-ultra

How about a top-end card being £400 (in 2004) and the 2nd top (6800) being £250 (I had one, bought from OcUK)?

Instead of the top card (actually the Titan) being £1000+, and the 2nd top (the Ti) being £700.

The exchange rate was different then, and you have to adjust for what money is worth today. Last time I looked at dollar prices adjusted by year, the 8800 Ultra was still the highest-priced card if you ignore Titans. In fact the 1080 Ti is cheap in comparison.
 
But you can still get the big GPUs (i.e. the 1080 Ti) for the same price as the big GPUs from previous generations 10 years ago (adjusting for inflation). Remember the 8800 GTX was $599 and the 280 was $649. Also, remember the GTX 1080 Ti would not be the UK price it is today back when the 400 series came out; using the exchange rate and VAT rate for March 2010, it would have been ~£550.

Of course the reason this started with Kepler was because AMD's best card could be rivalled by the GK104 (680). Ever since then, AMD have been struggling to keep up with Nvidia's big dies. They just about managed it with the 290X (but even that had a $549 release price), but they never really rivalled the 980 Ti, and it certainly doesn't look like they will with the 1080 Ti.

I also think people are getting a bit too hung up on die size and how they perceive what a certain die size should cost.

Remember, even before Kepler, AMD released the 7970, which had only a 365mm² die, yet AMD still wanted $549 for it. Nvidia then had the 300mm² die 680, which rivalled it in performance, so released it at $499. So why does that make Nvidia the big bad wolf in that scenario?
If Volta's GV106 is (somehow) a bit faster than the 1080 (OK, it's a stretch/unlikely), would people be OK with the 1180 being GV106? And then having the Titan as the GV104?

That is, with the 1180 still costing £500+ and the Volta Titan still costing £1,000. Then maybe the GV102 as a "Titan Extreme" for £1,800.

Or would people then start to say, "Hold on, this is getting a bit much. It's a tiny, low-end chip in their new range, but now it's the xx80 card!?!"

Or instead would people say, "Good for them. If they can make more money, they should do it! I'll still pay."
 
Nope, they'd just be getting bigger....

[image: kH8ctTB.jpg]


They really wouldn't have; there were a few prototypes of Rampage (their next-gen card) doing the rounds, and they were substantially smaller than the Voodoo 5 6000.

Dunno why, but the screw terminals where the Molex power connects to the board always make me go WTF when I see that board - it just seems so inelegant, like something from another era of electronics compared to GPUs.


Because that's the unofficial workaround to get it working. It was originally going to be externally powered with its own adapter and plug. When 3dfx went under, some working samples remained without the adapter, and that's how people got them working.

[image: 7aZCOcH.jpg]
 
If Volta's GV106 is (somehow) a bit faster than the 1080 (OK, it's a stretch/unlikely), would people be OK with the 1180 being GV106? And then having the Titan as the GV104?

That is, with the 1180 still costing £500+ and the Volta Titan still costing £1,000. Then maybe the GV102 as a "Titan Extreme" for £1,800.

Or would people then start to say, "Hold on, this is getting a bit much. It's a tiny, low-end chip in their new range, but now it's the xx80 card!?!"

Or instead would people say, "Good for them. If they can make more money, they should do it! I'll still pay."

As long as each range brings options that are either:
- Faster than the previous option for the same price, or
- The same performance as before but for cheaper,
then I'm happy with what they are doing. If I as a consumer don't feel the jump was sufficient, then I don't have to upgrade.

Every fool knows that competition drives both innovation and efficiency. Given that, I'm impressed that Nvidia even bothered with Pascal or Volta; they could have just refreshed Maxwell and still been a year ahead of the competition.
 
The exchange rate was different then, and you have to adjust for what money is worth today. Last time I looked at dollar prices adjusted by year, the 8800 Ultra was still the highest-priced card if you ignore Titans. In fact the 1080 Ti is cheap in comparison.

I believe today you get $1.30/£, whereas in August 2004 it was $1.85/£.
 
I believe today you get $1.30/£, whereas in August 2004 it was $1.85/£.

That's about as good as it gets. I just bought some dollars at $1.29/£ for my holiday; if I had bought a fortnight ago, they were as low as $1.14/£ in some places. It's quite volatile right now.

Last time I bought dollars, maybe 10 years ago, it was about $1.66/£ IIRC, and even then we thought that was a bad deal.
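To see how much the exchange rate alone moves a UK price, here's the 8800 GTX's $599 MSRP converted at the two rates quoted above. I've applied 20% VAT in both cases purely for comparison (2004's VAT was actually 17.5%), so treat the output as illustrative:

```python
# Same USD price converted to GBP at the 2004 and 2017 rates quoted above.
USD_PRICE = 599  # 8800 GTX launch MSRP (pre-tax)
VAT = 0.20       # applied uniformly here for comparison only

prices = {}
for year, rate in [(2004, 1.85), (2017, 1.30)]:
    prices[year] = USD_PRICE / rate * (1 + VAT)
    print(f"{year} (${rate:.2f}/£): £{prices[year]:.0f}")
```

The weaker pound alone adds well over £150 to the same sticker price, before any inflation adjustment.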
 