
The first "proper" Kepler news Fri 17th Feb?

Larger dies cost significantly more to produce than smaller ones, and the rumoured die size of the 680 is considerably smaller than that of the 480 and 580.
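To put some illustrative numbers behind that (a rough sketch only: a simple Poisson yield model with a made-up defect density, not real foundry data), the effect of die area on cost per good chip looks something like this:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross candidate dies per wafer, with a simple edge-loss correction."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, defects_per_mm2=0.004):
    """Poisson yield model: Y = exp(-D * A). Defect density is a placeholder."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

for name, area in [("~340mm2 die (the rumoured GK104 size)", 340),
                   ("~520mm2 die (roughly 480/580 sized)", 520)]:
    gross = dies_per_wafer(area)
    good = gross * yield_fraction(area)
    print(f"{name}: ~{gross} candidates, ~{good:.0f} good dies per 300mm wafer")
```

With those placeholder figures the smaller die gives roughly three times as many good chips per wafer, before binning or wafer pricing even comes into it.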

I repeat. And? You expect a corporation to care about such irrelevant things?

Pricing will come from where these cards stand in regards to Tahiti.

I would call Nvidia all sorts of rude and nasty names, but TBH after all of their bad luck with Fermi I actually wish them well.

I don't like wishing death on a company, as without them the 7970 would cost £800. And we would be using it for about ten years as there would be no reason to replace it.

They deserve a bit of good fortune IMO. Fermi must have been an absolute nightmare.
 
I wouldn't be, especially after buying 6970s for £280 on release day when they are still listed at £270 today. I don't like feeling like a mug (excluding the OcUK mug, which is the sex).

What matters is that when you bought your cards you saw them as worth the money you paid. You must have if you bought them.

When I bought my 7970 I needed a new card. I had enough money to pretty much buy any card on the market barring the two dual GPU roasters which I had vowed never to touch.

The 7970 when I bought it was the best, fastest card in its category. So I bought it. The only other option was a 3GB 580 which was a fair chunk slower no matter how you sliced it.

What happens after should never be taken into account. You can't rewind time and change your mind, chit happens.
 
Jesus H. Christ, people like to overblow simple things. Why is it called the 680gtx? Quite simple: how do Nvidia spin not having a high-end card, or not being able to produce one on a "working" architecture, for six months after their performance part? Answer: they don't, because the ONLY answer would be "we made it big and can't get yields", so they avoid the question. If you only have one Kepler out, there was never any chance of anything other than Nvidia calling it the x80.

Would AMD ever release a 7780, forgo the high-end naming, and let everyone talk about the "missing" parts above it? No.

As for the naming, most people correctly assumed months ago that they'd use 680gtx for whatever the highest-end part they can release now turns out to be, and that we'll then see a 780gtx name for "big Kepler" when it finally appears.

The other possibility a couple of people are touting, based on Fudzilla, is that Nvidia think 7970 performance is low enough that they can rename their whole range. Firstly, what would they call big Kepler, and what would they call the dual GK104? And what happens when AMD release an 1100MHz+ 7980 and, for the first time in what, six years or so, Nvidia's "highest-end" part loses to AMD's? It looks bad, but it looks less bad than not being able to manufacture a high-end part: one makes them look bad to enthusiasts, the other makes them look bad to investors. Guess which way they've gone.

The fairly simple explanation is that they can't NOT release a high-end part, so whatever the highest Kepler they can release turns out to be gets the high-end branding. It's got nothing to do with AMD, because against AMD it can only look bad: all it can do is make it look like AMD have closed the gap on Nvidia's top-end parts this generation.


As for the die size and "around 300mm2": the picture is rather daft, but either that is a G92b with a circa 265mm2 core, making GK104 WAY below 300mm2, or it is a G92 B stepping at 325mm2, which puts GK104 around the 340mm2 we've known it would be for some time.
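Purely to illustrate the comparison being argued over (the size ratio below is an invented placeholder, not something measured from the actual photo), the implied GK104 area simply scales with whichever reference die you believe is in the shot:

```python
# Hypothetical linear size ratio of GK104 to the reference chip in the photo.
# Placeholder value for illustration only, not measured from the real picture.
linear_ratio = 1.02

references_mm2 = {
    "G92b (~265mm2)": 265,
    "G92 B stepping (~325mm2)": 325,
}

for name, ref_area in references_mm2.items():
    implied_area = ref_area * linear_ratio ** 2  # area scales with the square of linear size
    print(f"If the reference is {name}: implied GK104 area ~{implied_area:.0f}mm2")
```

Same photo, same ratio: one reading lands well under 300mm2, the other right around 340mm2.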
 
How do NDAs work anyhow? Even superinjunctions were made a mockery of with Twitter etc - there is no way to trace the leak. But Nvidia has cordoned off the internet and will make sure that .... HOW??? :)
 
How do NDAs work anyhow? Even superinjunctions were made a mockery of with Twitter etc - there is no way to trace the leak. But Nvidia has cordoned off the internet and will make sure that .... HOW??? :)

You sign a non-disclosure agreement, meaning you leave yourself wide open to legal action should you open your mouth.

I've been under NDA many times. What I wasn't told was that the parts I received were marked and therefore completely unique to me; I would have had to sit and stare at someone else's part to realise the difference.

Which makes it a good thing that I kept to the NDA. Breaking one is not very clever, TBH.
 
How do NDAs work anyhow? Even superinjunctions were made a mockery of with Twitter etc - there is no way to trace the leak. But Nvidia has cordoned off the internet and will make sure that .... HOW??? :)

Most of the people who will be under these kinds of NDAs work in the industry, and that's their job, so leaking info could mean losing their job (NDAs are legally binding) and not being able to find future work in that industry because of it. In the past, companies like AMD and nVidia have worked out who has been leaking information by deliberately mixing slightly different false information in with the real information sent out to each person, etc.

In the past it was relatively easy to get some good info out of various partners/AIBs, third-party developers, etc., but these days it seems to be locked down a bit tighter, and those who aren't reliable get left out of the loop.
 
How do NDAs work anyhow? Even superinjunctions were made a mockery of with Twitter etc - there is no way to trace the leak. But Nvidia has cordoned off the internet and will make sure that .... HOW??? :)

I think NDAs are just a tease. Obviously they set a review release date and a retail selling date, but what's the rest of it about?
 
Most of the people who will be under these kinds of NDAs work in the industry, and that's their job, so leaking info could mean losing their job (NDAs are legally binding) and not being able to find future work in that industry because of it. In the past, companies like AMD and nVidia have worked out who has been leaking information by deliberately mixing slightly different false information in with the real information sent out to each person, etc.

In the past it was relatively easy to get some good info out of various partners/AIBs, third-party developers, etc., but these days it seems to be locked down a bit tighter, and those who aren't reliable get left out of the loop.

Absolutely. Common sense dictates that you should send out slightly different information to each individual.

As an example, when doing QC on software that was about to be released (can't mention which sector, sadly), the sound samples were different in each test version sent out. That way, if any sound samples or a video were leaked, it would be easy to find out whose version it was.

Each program was also watermarked in a completely different place, so even if you blurred the watermark out they would simply refer back to a "map" and you'd pretty much have had it.

Mind you, it went a lot further than that: 128-bit USB HASP security dongles and god knows what else.
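A minimal sketch of that per-recipient "map" idea, just to show the shape of it (the recipient names, file names and config field are all made up for illustration):

```python
import hashlib

# Each outgoing build carries a small unique tag, and a map lets you trace
# any leaked copy back to whoever received it.
recipients = ["reviewer_a", "reviewer_b", "partner_c"]

trace_map = {}
for who in recipients:
    tag = hashlib.sha256(f"press-build-1.0:{who}".encode()).hexdigest()[:12]
    trace_map[tag] = who
    # In practice the tag would be buried somewhere non-obvious: a tweaked
    # sound sample, a watermark position, a spare field in a config file.
    with open(f"build_{who}.cfg", "w") as f:
        f.write(f"build_tag={tag}\n")

# Later, a tag recovered from a leaked copy identifies its source.
leaked_tag = next(iter(trace_map))
print("Leak traced back to:", trace_map[leaked_tag])
```

The same trick works with deliberately varied false details in a briefing document: each recipient's copy differs slightly, and the leaked version tells you whose it was.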
 
Absolutely. Common sense dictates that you should send out slightly different information to each individual.

As an example, when doing QC on software that was about to be released (can't mention which sector, sadly), the sound samples were different in each test version sent out. That way, if any sound samples or a video were leaked, it would be easy to find out whose version it was.

Each program was also watermarked in a completely different place, so even if you blurred the watermark out they would simply refer back to a "map" and you'd pretty much have had it.

Mind you, it went a lot further than that: 128-bit USB HASP security dongles and god knows what else.

OK, but what stops people releasing just a performance comparison: say the Nvidia 680 gets XX fps in BF3, XX fps in Metro 2033, etc.? Surely you can't watermark performance (one card gets 55fps while another gets 58, etc.).
 
Sorry to go off topic, not like anyone else has :D

What was so bad about Fermi?

At launch it was far too expensive for the performance it provided. It was hot, it was loud but most importantly it was very late.

ATI had been making DX11 hay for months; Nvidia turned up months late and offered, at the very best, 10% more speed with large amounts of FSAA. Turn the FSAA down and at launch there was a fag paper between the competing cards (5870 vs 480, and 5850 vs 470).

That wasn't good enough.

Then they were found to have infringed Rambus patents and had to pay royalties on every card sold using the technology in question.

http://semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/

Their first samples were made from MDF. They were screwed and glued to a circuit board that made no sense. This one, to be precise.

[Image: Fermi_power_cropped.jpg – the mock-up board's power connector end]


Then have a read of this and I will come back to that pic in just a moment.

http://semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/

Right. Now, remember the pic? Look at the power plug: it doesn't even connect to the card. What can also be seen on that card, though, is that they have cut the PCB down considerably using nothing but a saw. If you look closely at this again,

[Image: Fermi_power_cropped.jpg – the same shot of the power connector end]


you can clearly see that at the end of the card there are two solder points with solder in them. Not only does this show clearly that the PCB used on the sample card being waved around had been cut down, it also shows that the card originally (when used in testing) had three sets of power connectors.

i.e. a six-pin set left unsoldered, then an eight-pin set that had indeed been soldered, then a further set, either six-pin or eight-pin, that had simply been cut off.

This indicates that SemiAccurate's article about Fermi being broken and unfixable had clear truth in it. I would imagine Nvidia spent the six months they were late trying to cut down how much power the card needed to work.
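For context on why a third connector would matter, here's the rough PCIe power arithmetic (standard spec limits per power source; treating the cut-off set as a 6-pin is just one of the two possibilities described above):

```python
# Standard PCIe power limits per source, in watts.
budgets = {"PCIe slot": 75, "6-pin": 75, "8-pin": 150}

# The shipping GTX 480 drew from the slot plus one 6-pin and one 8-pin connector.
shipping_480 = budgets["PCIe slot"] + budgets["6-pin"] + budgets["8-pin"]

# The extra solder points suggest a third connector on the test boards;
# assume another 6-pin here (it could equally have been an 8-pin).
test_board = shipping_480 + budgets["6-pin"]

print(f"Shipping 480 power budget: {shipping_480} W")  # 300 W
print(f"With a third 6-pin fitted: {test_board} W")    # 375 W
```

A board wired for more than the 300W the shipping card could draw within spec would fit with the idea that the engineering samples needed considerably more power than anything Nvidia could actually sell.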

The 560 Ti, 570 and 580 were hailed as excellent cards. All Nvidia actually did, though, was fit them with low-leakage transistors (to lower the power draw), add a TDP throttle to stop them drawing as much power when run in something like FurMark, and use a better cooler (a vapour chamber, oddly the same one they had used when they revised the too-hot, too-loud 260 and 280).


All of that smacks of fail. The fact that OCUK (almost two years on) has a huge batch of EVGA GTX 480s says it all.

At £450 they were horrifically overpriced, so anyone waiting for them with any sense just went 5870, as the 480 and 470 pretty much confirmed all of the rumour and speculation: they were too hot, too loud, too late and, most importantly of all, too expensive.
 
Epic fail then :eek:
Thanks for the info ;)

Yeah, it wasn't pretty. Not only that, but for some very odd reason during all that time they ceased production of their 200 range (275, 285, 295, GTS 250, etc.), and that led to BFG going out of business.

Then Intel gave them the Spanish archer (the el bow) and refused to license any Intel sockets to them (so nForce chipsets and boards were no more), and so on.

It really was a terrible time for Nvidia tbh.

The only people who bought the 470 and 480 were those that -

1. Absolutely hated ATI and would have bought a dog poo with Nvidia written on it.

2. Had rattled on for ages about how marvellous Fermi was going to be and how wrong all of the information doing the rounds was, so left themselves with no choice.

But there weren't many. Steam posted stats that something like 80% of people were running ATI cards, the rest "other".

So they deserve a bit of luck. Fermi can't have been cheap to develop, which is why (I imagine, at least) 5xx prices have barely dropped since launch.
 