GeForce Titan rumours

Why are people talking about low yields on this Titan card? What does "yield" even mean in relation to graphics cards?

And why is it bad for them... help a dummy out here.

Yield basically means, for every GK110 chip produced, the proportion that actually work and make the grade.

You have the Tesla cards that the GK110 was originally intended for, and there's a spec they want those chips to reach.

If a chip doesn't meet the spec they need for that product, it's not used there and is "binned", possibly to be reused in a lower-end product.

This is where the Titan supposedly comes in: they are using chips that didn't make the grade as Tesla chips, which would really be most of them. The GK110 is a very large chip, and large chips are always plagued with yield issues, because there's simply far more to go wrong on a bigger chip and far fewer chips are made per wafer:

http://news.cnet.com/i/bto/20100209/ibm-power-7-chip-wafer-small.jpg

It means that the chips are very expensive to produce if most of them are failing. What we don't know is how many are managing to make the grade for Titan's spec; it's very likely not many.
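To put rough numbers on that, here's a back-of-the-envelope sketch; the wafer cost, die sizes and yield rates below are invented for illustration, and the dies-per-wafer formula is just the standard edge-loss approximation:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Gross die per wafer, using the common edge-loss approximation."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST = 5000  # hypothetical cost of one processed 300mm wafer (USD)

# Big GK110-class die vs smaller GK104-class die, with made-up yields.
for die_area, yield_rate in [(550, 0.3), (294, 0.7)]:
    gross = dies_per_wafer(300, die_area)
    good = int(gross * yield_rate)  # chips that actually "make the grade"
    print(f"{die_area}mm2 die: {gross} per wafer, {good} good, "
          f"${WAFER_COST / good:.0f} per good chip")
```

Fewer candidates per wafer and a lower hit rate compound each other, which is why a big die that yields poorly gets so expensive per working chip.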
 
Why are people talking about low yields on this Titan card? What does "yield" even mean in relation to graphics cards?

And why is it bad for them... help a dummy out here.

You have to price chips at the total production cost divided by the number you can actually sell. The same amount of work goes into a faulty chip, so more duff chips means a higher cost per working unit.
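In other words, a minimal sketch of that division (all numbers invented for illustration):

```python
wafer_cost = 5000      # hypothetical: total cost to produce one wafer of chips
chips_made = 100       # every chip costs the same amount of work to make...
chips_sellable = 30    # ...but only the working ones bring in money

cost_per_chip_made = wafer_cost / chips_made      # $50
cost_per_chip_sold = wafer_cost / chips_sellable  # ~$167 - duff chips push this up
print(round(cost_per_chip_made), round(cost_per_chip_sold))
```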
 
Ah, thanks lads. I should have known, because... I work for one of the world's largest producers of semiconductors XD

Just a simpleton in finance though, not a techie.
 
To everyone saying it's not limited: it probably won't be an issue anyway, because they probably won't sell in the numbers most here think they will.

If Asus couldn't shift 999 of these:

1 x Asus MARS II GeForce GTX 580 X2 3072MB GDDR5 PCI-Express Graphics Card £1199.99

What makes Titan any different, considering the limited numbers of the Mars that were made?

99.5% of PC gamers have been priced right out of the game. Here they start at £830, which leaves it more or less firmly at the feet of hardcore benchmarkers:

I am buying at least two. I don't want to, but for benches I have no option. The pricing is disgusting.

I didn't hear anyone direct any flak at this guy (nothing to do with the fact he has a chest the size of an angry Bruce Banner! ;)) for voicing his opinion, so there's not much point directing it at me or anyone else either if you don't like what you read.

That is about the only bad thing I can honestly say about the actual product; apart from that, it's looking a mighty fine toy indeed.

Kudos to Nvidia for delivering a mighty card in a small, quiet package, which earns them the fastest-card title that's been sorely missed this gen.

Sorry to say this, but I really hope sales tank, because if they don't, the days of my yearly upgrades are numbered; the encouragement Nvidia get if this does sell well opens the door to absolutely abysmal (comical, even) future pricing. :(

If you do buy one, I'm not trying to take anything away from the value you place on it; it's your money. It's frustration at the route Nvidia are taking us down that's got my goat. I hope you enjoy your purchase; I'm sure it will be a fantastic ride. :)
 
I put myself in the "stupid" bracket when it comes to toys, but I don't know, maybe I just feel the price is unfair!
I've never had a problem paying twice as much for quality, but... buying locally costs close to an airline ticket plus picking one up in the States!!

TBH, in certain ways the price isn't unfair.

OCUK sell the K20 5GB for £2,999.

The Titan is a failed K20X and comes with 6GB of VRAM and a nice cooler for £830.

My bugbear with it is that by selling a single-GPU card for £830, they then set a price for future high-end GPUs.

It's funny, but so many people have brand amnesia on these forums. When the 7970 launched, AMD were completely shunned for daring to charge £439 for a card. Everyone was up in arms about it.

The problem, of course, is that they were only charging what had been set by Nvidia, who were at the time charging £439 for a 3GB 580.

AMD's last launch before that was the 5870 (ignoring the 69xx for a moment), and those were around £280–£350 at launch, depending on gouging. Then Fermi came out at £450 :rolleyes:

So whilst the few who have the money may feel it's a good idea to line Nvidia's pockets, they are only screwing themselves.

Because if Nvidia can get away with it they most certainly will.

It's getting to a point now where even the enthusiasts are being priced out.

You know, it's funny, but my wonderful fiancée made a point to me earlier.

"Hang on a minute. Isn't that the Overclockers forums? where people buy stuff and then overclock it to get good value for money?"

Smart cookie, my fiancée.
 
My bugbear with it is that by selling a single-GPU card for £830, they then set a price for future high-end GPUs.

It's funny, but so many people have brand amnesia on these forums. When the 7970 launched, AMD were completely shunned for daring to charge £439 for a card. Everyone was up in arms about it.

The problem, of course, is that they were only charging what had been set by Nvidia, who were at the time charging £439 for a 3GB 580.

You must have made all that up, proof or it didn't happen!
:rolleyes:

;)
 
TBH, in certain ways the price isn't unfair.

OCUK sell the K20 5GB for £2,999.

The Titan is a failed K20X and comes with 6GB of VRAM and a nice cooler for £830.

My bugbear with it is that by selling a single-GPU card for £830, they then set a price for future high-end GPUs.

This isn't really how it works, though. The professional chips carry a price "premium" for more than just the chip itself.

For example, look at Quadros and FirePros. They use the same GPUs as the desktop graphics cards do, but they come with a completely different level of support in terms of software and firmware.

Personally, though, I don't think this GPU will be available in the sort of numbers needed to set any precedent.

Speaking of brand amnesia, that's a good point too. There are quite a few people who get upset and moan about "nVidia bashing", but they never take the time to actually address what's being said; they simply don't like the fact that people are criticising nVidia. They seemingly don't care at all whether the criticisms are true or not, yet are then very happy to moan about AMD, with the usual "AMD drivers, herpederp".

I moan about nVidia enough, and it's known quite well that I really dislike them as a company, but if people are making stuff up, I'll point it out. Bashing doesn't bother me; it's when people talk smack and make things up that I get annoyed. The AMD drivers thing I find especially annoying, because there's never any actual proof for it; it's always based on something the uninformed read ages ago and keep repeating. I've pointed out on a few occasions that anyone who actually knows their graphics history knows that nVidia are the ones with the track record for bad drivers, and it's very easily found, but for some reason "AMD's bad drivers" sticks despite it not being true.

I dislike nVidia because, well, nVidia and the things they do. I don't like AMD, or dislike nVidia, because of anything to do with AMD. I like the truth, and a lot of the time nVidia and the truth don't really get along too well, which is where my nVidia-related issues come from.
 
Hmm, I do want this, and it'll be the first time I've gone green since my 470s, but I just got married this month and don't want the reason for divorce to be a graphics card worth more than my car. :D
 
And I don't believe those claims about the GK104 being a midrange part. The other claim that drives me nuts is from the people who believe that Nvidia held the high-end Kepler back because the 7970 was so bad.

If you look at all the previous nVidia high-end GPUs since the move away from fixed-function pipelines (and throw in a bit of knowledge of how those stats compared between high-end and mid-range parts as tech progressed), it's fairly telling:

8800: 484mm², 384-bit, G80, 24 ROP
<9800>
280: 576mm², 512-bit, GT200, 32 ROP
285: 470mm², 512-bit, GT200b, 32 ROP
480: 529mm², 384-bit, GF100, 48 ROP
580: 520mm², 384-bit, GF110, 48 ROP
680: 294mm², 256-bit, GK104, 32 ROP

I've excluded the 9800 series, as that was an optical-shrink refresh onto a massively smaller process and doesn't count. Obviously, given history, there are a lot of reasons nVidia want to stay away from a ~500mm² core, but taking everything into consideration it's very obvious that the progression from the 580 high end to the stats you'd expect from the 6x0 mid-range ends up at the 680 rather than the 660 as you'd expect. Looking at the previous mid-range parts:

8800GS: 324mm², 192-bit, G92, 12 ROP
460: 332mm², 256-bit, GF104, 32 ROP
560: 332mm², 256-bit, GF114, 32 ROP

Unfortunately it gets a bit complicated with the 200 series, as the card that takes the mid-range spot there is actually the GTS 250, which is based on the older G92 core rather than a GT2xx, and the 8800 series is a bit fractured, with cards that don't neatly fit in.

Whatever way you shake it (and I have my own reasons beyond this for believing it to be true), what you get with the 680 is in line with a mid-range card by any previous generation's metric.
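Just to put that claim in numbers, a quick sketch using only the die sizes quoted above (nothing here beyond the post's own figures):

```python
# Die sizes (mm2) taken from the lists in this post.
high_end  = {"8800": 484, "280": 576, "285": 470, "480": 529, "580": 520}
mid_range = {"8800GS": 324, "460": 332, "560": 332}
gtx_680 = 294

avg_high = sum(high_end.values()) / len(high_end)    # ~516mm2
avg_mid  = sum(mid_range.values()) / len(mid_range)  # ~329mm2

print(f"average high-end die: {avg_high:.0f}mm2")
print(f"average mid-range die: {avg_mid:.0f}mm2")
print(f"GTX 680 die: {gtx_680}mm2")  # sits below even the mid-range average
```

On those figures the 680's die is nowhere near the ~500mm² high-end bracket; it lands in the mid-range bracket.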
 
Agree.

Gibbo will say that, as he wants to sell as many as he can; it's a business after all.

The 690 was supposed to be in limited numbers, if you remember; now there are hundreds of them.

There will be plenty of Titans after a month or two, and I think the price will come down too.

The 690 is in limited numbers, mate; they just don't sell.

The reason? Well, it's quite complicated, but basically it goes something like this.

At the last Steam survey, multi-GPU users were around 5% of people using Steam.

Actual SLI users are very, very few. Most people know the pros and cons and tend to avoid multi-GPU solutions, just buying one card.

Then there's the heat, noise and power consumption. Multi-GPU cards don't exactly have a good track record with power usage; take a look at the 295, 590 and 6990. They all sucked down power like it was going out of fashion.

Fact: AMD lose money on cards like the 6990; they were ONLY made to enter the willy-waving contest. The R&D doesn't even get covered, on top of the cost of making custom PCBs that aren't mass-produced like, say, a 7970's. That means a custom cooler, double the materials, etc.

Then you can only sell them to a very select audience: those who don't care about much of anything (i.e. games all working first time, heat and power issues, and so on).

THAT is why there is no official 7990: AMD are pulling in their belts and concentrating on what is important, rather than entering costly willy-waving competitions with Nvidia.
 
This isn't really how it works, though. The professional chips carry a price "premium" for more than just the chip itself.

For example, look at Quadros and FirePros. They use the same GPUs as the desktop graphics cards do, but they come with a completely different level of support in terms of software and firmware.

Titan needs exactly that. It's a different core to the 6xx, so it needs completely different drivers. That means Nvidia now need a dedicated team of coders working on support for Titan, as well as SLI for it, and so on.

Sorry, I've nitpicked a bit of your post there, but I'm absolutely knackered :D
 
Whatever way you shake it (and I have my own reasons beyond this for believing it to be true), what you get with the 680 is in line with a mid-range card by any previous generation's metric.

We did cover this in another thread, and I will post it again:

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

I agree with you.
 
8800: 484mm², 384-bit, G80, 24 ROP
<9800>
280: 576mm², 512-bit, GT200, 32 ROP
285: 470mm², 512-bit, GT200b, 32 ROP
480: 529mm², 384-bit, GF100, 48 ROP
580: 520mm², 384-bit, GF110, 48 ROP
680: 294mm², 256-bit, GK104, 32 ROP

Whatever way you shake it (and I have my own reasons beyond this for believing it to be true), what you get with the 680 is in line with a mid-range card by any previous generation's metric.

That's all conjecture, though. It was no secret that nVidia knew they needed to get away from massive GPUs like that, because it was financially impacting them.

Look at AMD: they went for a 512-bit bus on the 2900 XT, then dropped to a 256-bit bus because it really wasn't needed at the time, and took a while before moving to 384-bit.

What your info actually shows is that nVidia's high-end GPUs have been decreasing slightly in size since the GTX 280, and that they were actually trying to reduce the size as much as they could.

It says it all there, really: 280 to 285 was quite a large reduction in chip size (wasn't that due to the move to 55nm?), 480 to 580 was another reduction, and after Fermi they really needed to bring the size down, because they got battered with criticism over that.

They didn't need a chip of that size on a desktop GPU at all.
 
We did cover this in another thread, and I will post it again:

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

I agree with you.

This doesn't really make any sense; plus, it's an editorial.

nVidia always use the same chip for the "80" and "70" parts, so it would have made no sense to do what's being suggested there, and a "70" would hardly make it mid-range either.

The theory is that it was supposed to be the 660-based card, not a 670; a 670 and a 680 would always use the same chip, just with a bit lasered off for the 70.

Seeing it called a 670 Ti doesn't really mean much either, as that would have been the first time the second-from-top chip was called a "Ti".
 