
Possible GTX 670 release on 10th May

Oh dear, the usual Nvidia greed, and the card looks ugly too.

For those who don't know, the GeForce GTX 670 is based on salvaged Nvidia GK104 GPUs. These are GPUs with malfunctioning units, or chips that don't clock up to Nvidia's reference GTX 680 clocks.

Therefore, the non-working units are disabled and the working frequency is lowered. This way, the GPU gets sold at a lower price instead of being tossed in the trash bin.

I need say no more!

Man, nobody was claiming this was a bad thing when the 8800 GT came out. :p
 
Man, nobody was claiming this was a bad thing when the 8800 GT came out. :p

Quite... and is it not the same with AMD and my 6950, which is a binned 6970? Hence some unlocked and some didn't. Anyway, this fact does not bother me at all.

I am very interested in getting one of these, as I am sick and tired of my HD 6950... it was -er- an experience going red for a while, but I've had enough now :(
 
A 670 should easily match a 7950, which must be Nvidia's target. Remember, a stock 7950 is only 800MHz; the true value in it is the overclocking headroom. At stock it is not that close to the 7970.
I can see it coming in at around 350 quid too, assuming they clock it so that it beats a stock 7950 by 5-15%.
 
Oh dear, the usual Nvidia greed, and the card looks ugly too.

For those who don't know, the GeForce GTX 670 is based on salvaged Nvidia GK104 GPUs. These are GPUs with malfunctioning units, or chips that don't clock up to Nvidia's reference GTX 680 clocks.

Therefore, the non-working units are disabled and the working frequency is lowered. This way, the GPU gets sold at a lower price instead of being tossed in the trash bin.

I need say no more!

you utter numpty

This is how ALL graphics cards and CPUs are made.

They make the chip, they test it, and depending on the results they bin it based on what clocks it will run at; worst case, they fuse off the faulty bit and it becomes a lower-spec chip.

E.g. my 3930K is an 8-core Xeon die with 2 cores fused off, a Phenom X3 is an X4 with one core fused off, a 6950 is a 6970... etc. etc.

Show me any top-spec CPU/GPU and I will tell you what its lower-spec, same-silicon equivalent is.

And actually you are wrong, because often, by fusing off one section of the chip, they clock BETTER than the full-fat version.
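To make that concrete, here's a toy Python sketch of speed binning. The cluster count matches GK104's eight groups of 192 cores, but the clock thresholds and test logic are purely illustrative assumptions, not Nvidia's actual criteria:

import random

# Toy model of post-manufacture binning: every die comes off the same wafer,
# gets tested, and becomes whichever SKU its defects and clock headroom allow.
FULL_UNITS = 8          # GK104 has 8 shader clusters of 192 cores each
REFERENCE_CLOCK = 1006  # MHz needed to qualify for the top SKU (assumed threshold)
SALVAGE_CLOCK = 915     # MHz needed for the cut-down SKU (assumed threshold)

def bin_die(working_units, max_stable_clock):
    # Decide which product a tested die becomes.
    if working_units == FULL_UNITS and max_stable_clock >= REFERENCE_CLOCK:
        return "GTX 680-class (full chip, reference clocks)"
    if working_units >= FULL_UNITS - 1 and max_stable_clock >= SALVAGE_CLOCK:
        # Faulty cluster fused off, clocks lowered: same silicon, lower SKU.
        return "GTX 670-class (one cluster fused off, lower clocks)"
    return "scrap, or salvage for an even lower part"

# Simulate a handful of dies with random defects and clock headroom.
random.seed(0)
for _ in range(5):
    units = random.choice([FULL_UNITS, FULL_UNITS, FULL_UNITS - 1, FULL_UNITS - 2])
    clock = random.randint(850, 1150)
    print(units, "units,", clock, "MHz ->", bin_die(units, clock))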
 
http://hexus.net/tech/news/graphics/38629-geforce-gtx-670-details-emerge/

How far do you all think the gap between the 670 and 680 will be, assuming the difference is only 192 CUDA cores?

I DSRd my 7970, and a 670 / 7950 plus a 128GB SSD looks more appealing than a 680 for me atm.

I think the gap will be similar to the 570-580.
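For a rough sense of scale: the 680 has 1536 CUDA cores, so 192 fewer would put a 670 at 1344, a 12.5% raw shader deficit. A quick back-of-envelope, ignoring the clocks and memory bandwidth that will also shape the real gap:

# Back-of-envelope: raw CUDA core deficit of the rumoured GTX 670 vs the GTX 680.
cores_680 = 1536             # GTX 680: 8 clusters x 192 cores
cores_670 = cores_680 - 192  # rumoured cut-down part
deficit = 1 - cores_670 / cores_680
print(f"{cores_670} cores, {deficit:.1%} fewer than the 680")
# -> 1344 cores, 12.5% fewer than the 680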


You would have thought that by now a process would have been invented where they do not have to bin chips and every chip is fully functioning.

+1

With these high tech labs making wafers, I would have thought every chip made would be capable of the same clocks.
 
+1

With these high tech labs making wafers, I would have thought every chip made would be capable of the same clocks.

It's cheaper to have machines with looser tolerances all making the same chips, which inevitably produces some defective chips that can be speed-binned into lower products, than to have separate production lines making perfect chips on each line (having tight tolerances is VERY expensive).
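A crude way to see the economics, with completely invented numbers (the wafer costs and yields below are assumptions, just to show the shape of the argument):

# Toy comparison: one "loose" line whose rejects get salvaged as lower SKUs,
# versus a hypothetical "tight" line that only ever ships perfect chips.
# Every number here is made up for illustration.
loose_line_cost_per_wafer = 5000   # cheaper equipment / looser tolerances (assumed)
tight_line_cost_per_wafer = 9000   # tight tolerances are very expensive (assumed)

# Loose line: out of 100 dies, 60 are perfect and 30 are salvageable as a lower SKU.
sellable_loose = 60 + 30
# Tight line: say 90 of 100 dies are perfect, but anything imperfect is scrapped.
sellable_tight = 90

print("loose line:", round(loose_line_cost_per_wafer / sellable_loose, 1), "per sellable die")  # ~55.6
print("tight line:", round(tight_line_cost_per_wafer / sellable_tight, 1), "per sellable die")  # 100.0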
 
I've been thinking about the card pictured; assuming that is the reference design, I actually quite like it. I swore I would get a custom cooling solution when I get a 670, but I may pick up a reference design one. Probably KFA2.
 
I've been thinking about the card pictured; assuming that is the reference design, I actually quite like it. I swore I would get a custom cooling solution when I get a 670, but I may pick up a reference design one. Probably KFA2.

My reference EVGA is cool and quiet; I wasn't expecting it to be like that. Man, it is just pig ugly though.
 
You would have thought that by now a process would have been invented where they do not have to bin chips and every chip is fully functioning.

If you knew anything about how they make processors, you would know how utterly impossible that is.

It's amazing that they've even managed to get down to 28nm and have plans in place for 22nm and 15nm... below that, they are really going to struggle to have any yield at all.

They are currently saying that anything below 5nm is "impossible", but give it another 10 years and no doubt they'll have worked that out too.

Also, don't forget that 40nm had terrible yield issues in its first 6-12 months and is now much, much better. 28nm will go the same way as it matures, by which point chip makers at the cutting edge will have moved on to newer nodes.
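For anyone curious why getting fully working chips is so hard, the classic back-of-envelope is a Poisson yield model: the fraction of defect-free dies is roughly exp(-die area x defect density). The defect densities below are placeholders I've made up, just to show how yield climbs as a node matures:

import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    # First-order yield model: fraction of dies that catch zero defects.
    return math.exp(-die_area_cm2 * defects_per_cm2)

die_area = 3.0  # cm^2, i.e. roughly a ~300mm^2 GPU die
# Hypothetical defect densities for a brand-new node vs the same node a year on.
for label, d0 in [("immature node", 1.0), ("mature node", 0.25)]:
    print(f"{label}: {poisson_yield(die_area, d0):.0%} of dies defect-free")
# immature node: 5% of dies defect-free
# mature node: 47% of dies defect-free
# ...which is exactly why salvage parts like a cut-down 670 matter so much early on.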
 
If that's how you feel about it, fine.

Fact is, the 7970 costs more to make. At least AMD have that as a small excuse.

If the card coming is a 670 Ti or a plain 670, then it will use the same silicon as the 680. Just because they laser-cut some of the cores and shaders, or whatever they want to call them, doesn't mean the wafer will be any cheaper to buy.

Thus, putting an already scarce chip on a PCB and charging a lot less for it won't be the game plan, and I can promise you that.

If this wafer shortage continues and affects both companies, you will see prices soar. Look what happened to hard drive prices when parts became scarce due to the flooding in Thailand.


Firstly, there's really no proof of wafer shortages. There aren't that many wafers on 28nm because there aren't nearly as many customers for 28nm as there are for 40nm. How much is enough? Who knows. In 18 months TSMC will probably be pushing out 200k+ wafers a month on 28nm, while right now they are probably around the 30k mark. The thing is, AMD/Nvidia aren't using 30k wafers a month now, and they won't be using more than 30k a month in 2 years either.

AMD/Nvidia are often TSMC's biggest customers on a new process, both because they don't need that many wafers and because the higher price of the product can absorb lower yields more effectively than chips that need to be produced for $5 a piece and sold at $5.50, a hundred million at a time, to make a decent profit.

Either way, AMD have huge supply, have had huge supply since January, and have added midrange cards that sell in far larger quantities, with many more wafer starts, which also have no supply problems.

No one except Nvidia is talking about yields negatively, which suggests Nvidia's yields are once again in the tank, in which case their chips most certainly don't cost less to produce than AMD's. Think of it this way: AMD had more supply at launch of a high-end part than the GTX 680 has had to date, and they've since launched two other products with vastly higher wafer starts that also have no supply problem. Nvidia has released one real product, which doesn't need that many wafers (a fraction of what the 7770/7870 would need), and can't supply it.

A 300mm^2 chip with terrible yield will cost many times MORE to produce than a highly yielding 370mm^2 chip. Who knows the actual numbers, but going by stock (when 28nm was "new", AMD sent out two 10k shipments before cards hit the shelves; Nvidia supposedly have only just managed 10k on a much more mature process with much larger wafer output), Nvidia is having massive yield problems.
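To see how that can happen, a quick sketch with invented numbers (the gross die counts and yields are assumptions, purely to show that die area alone doesn't decide cost):

# Toy example: a smaller die with terrible yield vs a bigger die with healthy yield.
# All numbers invented; real gross-die counts depend on wafer edge, scribe lines, etc.
wafer_cost = 8000          # illustrative wafer price
usable_wafer_area = 60000  # mm^2 of a 300mm wafer that actually yields dies (assumed)

def cost_per_good_die(die_area_mm2, yield_fraction):
    gross_dies = usable_wafer_area // die_area_mm2
    good_dies = gross_dies * yield_fraction
    return wafer_cost / good_dies

print(round(cost_per_good_die(300, 0.15)))  # small die, terrible yield -> ~267
print(round(cost_per_good_die(370, 0.60)))  # bigger die, healthy yield  -> ~82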

Of course, every wafer that Nvidia has already run could have dozens of GTX 670s ready to go. You seem to suggest the GTX 670 won't be cheaper for Nvidia, but it will.

Take a wafer that costs $8k or whatever. Take 20 GTX 680s off it and that's it: you've got $400 per GPU. Now take 40 GTX 670s off the SAME wafer after binning, and you've got 60 GPUs in total, so each core now costs about $133. They don't buy a wafer, take 20 GTX 680s off it and chuck the rest out, then buy a new wafer, take 40 GTX 670s off it and throw away the rest.
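The same arithmetic as a sketch (the $8k wafer price and die counts are the illustrative figures from the paragraph above, not real numbers):

# Salvaged dies spread the fixed wafer cost over more sellable GPUs.
wafer_cost = 8000      # illustrative figure from the post above
full_dies = 20         # dies good enough to ship as a GTX 680
salvaged_dies = 40     # dies that only qualify as a GTX 670 after fusing/downclocking

cost_if_rest_scrapped = wafer_cost / full_dies                  # 400.0
cost_with_salvage = wafer_cost / (full_dies + salvaged_dies)    # ~133.3

print(cost_if_rest_scrapped, round(cost_with_salvage, 1))
# -> 400.0 133.3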

As for comparing to hard drive prices: massive portions of HDD production shut down for essentially 3-6 months, whereas TSMC have enough supply for AMD/Nvidia, production (assuming it shut down at all) was only down for a couple of weeks, and AMD have since dropped prices significantly.

It's always completely impossible to tell where the profit line is and how much we the consumers are being screwed. 28nm costs more, everyone says that; yield-wise (judging largely from apparent supply), Nvidia are in the toilet and AMD are fine. I picked up a 7970 for just over £330, and a 5870 cost £300... how badly are we really being screwed by AMD, and how much are people LETTING retailers screw them?

You would have thought that by now a process would have been invented where they do not have to bin chips and every chip is fully functioning.

Read up on chip production, then think about the fact that 4-7 billion transistors, each far too small for the eye to see, will be on chips this generation, and then realise that getting ANY fully working chips is a goddamned technical marvel. Well, to be fair, Nvidia's roughly 7-billion-transistor chip is probably going to have a horrifically bad yield; we all know it :p
 