
GTX680 Rumour

It's funny how DM's rants leave me feeling informed: I come away with a more balanced view of the topic, impartial and willing to give either side a go with their next generation.

Yet xsistor's rants make me want to see Nvidia crash and burn, because your style is more jumping to the defence than trying to explain the topic, which reeks of fanboyism.
 
Except Semiaccurate is tripe, and you linking it doesn't help his case one bit; rather, it hurts it. It's opinionated journalese dressed up in technical language for the masses, and only the non-technical would use it as a source. I am quite aware DM's rants sound like they were sliced off Semiaccurate. And that is part of the big problem. Of all the reviewers/tech journalists, AnandTech is an example of a few guys who actually understand engineering -- I think some of them are even electronics engineers -- and they are far more conservative. You should notice that.

Only the bottom-of-the-barrel journalists resort to opinionated rants like Semiaccurate's. You see it everywhere; Fox News comes to mind. If news is reported as a series of facts with the viewer drawing the conclusions, then that is something else. Semiaccurate reads like a blog post by some AMD shill ranting about how butthurt he is.

At any rate, it is impossible for anyone to know enough from a few journalistic pieces (be it tech journalism or otherwise) to be able to dictate how companies should do their engineering -- which is a much more complex affair. It's one thing to have an opinion, but to present that opinion overbearingly as fact is nothing short of sheer ignorance of the engineering process. Microelectronics (and nanoelectronics, which is the phase we are now entering) is a thoroughly specialised and complex affair even within electronics engineering, itself a very specialised field (compared to more general fields like mechanical or chemical engineering). If NVIDIA is building 500mm2 chips, they are doing so for real reasons dictated by all their research engineers with BEngs/MEngs and PhDs/EngDs. It is impossible even for an engineer who is an outsider to effectively judge the prudence of those design decisions. Design decisions go through a complex process, and no real engineer would dare to second-guess it knowing what the process is like. Much less an opinionated layperson and part-time forum troll.

It is experts in microelectronics and nanoscale systems who are behind the ITRS, and if they predicted larger chip designs, it's because they knew it had its uses and its niche. And if NVIDIA is using them (and successfully enough), then that is their affair. It would be presumptuous to a tremendous degree to think some forum troll or some idiot journalist on Semiaccurate knows better.

Well it's clearly you who doesn't know about this field. Firstly, the guy who runs Nvidia isn't an engineer; he's got a business degree, and he's always been on the business side of it.

The vast majority of Nvidia's, Intel's, AMD's and most other companies' design decisions ultimately rest on economic and business considerations, which is where yields and die size come in.

You presume no one can make presumptions, except we can: the laws of physics don't change just because Nvidia wants them to. Yields don't magically go up the larger you make chips, and you can't magically get the same number of big chips as small chips from the same number of wafers.
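To put rough numbers on that point, here is a minimal sketch using the common gross-die-per-wafer approximation. The two die areas (330mm2 and 520mm2) are illustrative assumptions, not confirmed figures for any real GPU:

```python
# Sketch of why bigger dies mean fewer chips per wafer.
# Standard approximation: dies = (pi * r^2) / A  -  (pi * d) / sqrt(2 * A)
# First term: wafer area divided by die area.
# Second term: correction for partial dies lost around the wafer edge.
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate number of whole die sites on a round wafer."""
    r = wafer_diameter_mm / 2
    return math.floor(
        (math.pi * r ** 2) / die_area_mm2
        - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)
    )

for area in (330, 520):  # assumed "small" vs "big" GPU die areas, mm^2
    print(f"{area} mm^2 die -> ~{gross_dies_per_wafer(area)} gross dies per 300 mm wafer")
```

With those assumed areas, the big die gets roughly 106 candidate dies per 300mm wafer versus roughly 177 for the smaller one, before any yield losses are even counted.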

Business, costs and sales drive most fundamental decisions in the industry: NOT the engineers, but the engineers' bosses. Which IS why Nvidia are going so big: they want all-out performance, and DP performance in particular, which is hard to wring out of an ultra-efficient design but easier to get from a simpler but bigger one.

You can most definitely presume a LOT about the decisions Nvidia make: firstly, because it's not illegal, and secondly because, again, wafer costs don't magically change because it's Nvidia using them, yields don't magically go up on the same production lines because it's Nvidia chips being made, and you can't magically get more space on a 300mm wafer than another company can.

You seem to believe the design rules are entirely different for Nvidia than for everyone else, and that they use completely different manufacturing equipment from everyone else. They don't, which means you can learn a lot from comparing other chips to theirs.

480gtx yields started off in the tank at 2% and never got above 20%; 580gtx yields are better, but every single piece of info suggests they are pretty poor as well. There is one fundamental reason for this, and a fundamental reason why the SAME architecture on a smaller scale has vastly higher yields.
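For what it's worth, that "fundamental reason" can be sketched with the classic Poisson yield model, Y = exp(-D0 * A): a random defect is more likely to land inside a big die than a small one. The defect density below is an assumed illustrative value, not actual foundry data:

```python
# Sketch of how die area drives yield under a Poisson defect model.
# Y = exp(-D0 * A), where D0 is defect density and A is die area.
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies expected to be defect-free."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

D0 = 0.005  # assumed defects per mm^2 on an immature process (illustrative)
for area in (330, 520):
    print(f"{area} mm^2 die: ~{poisson_yield(area, D0):.0%} of dies defect-free")
```

At that assumed defect density, the ~330mm2 die comes out around 19% defect-free versus around 7% for the ~520mm2 one -- the shape of the effect being described, even if the real numbers differ.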
 
I'm sorry, but you clearly do not know about this field. Take a step back and stop pretending you do. Try to unlearn everything you learned from Semiaccurate, because it is not even semi-accurate. Pick up a few books on circuit theory, differential equations, digital design and analogue electronics. Once you get some grasp of those topics, go study CMOS design. This will take you a few years. Then, when you're done, try churning out designs to engineering specs, and you will see what a complex process it is. It is not a bunch of nerds hashing things out in plain-English terms you can fit in an opinionated blog or forum post.

Sorry, but please point out where I said chip design wasn't complex. I dare you, or double dare you.

I said, and try to READ it, the FIRST STEP. If you are really trying to convince someone that the first step in designing a chip isn't deciding what you want in it and then working out what you can actually fit in it, it is you who should apparently re-read your books.

I said nerds not because it's bad, but because when everyone is listing everything they want, and it's about chips, it's basically geek porn. You quite simply get a bunch of people in a room, the right people if you can, and ask quality engineers what they, and the clients/devs, software people and console guys, want in future chips. Then they have their geek porn list of every awesome feature that could possibly be in their next chip, and they scale it back to something reasonable and do-able.

EDIT: lastly, can you name the last over-500mm2 chip, or hell, the last over-400mm2 chip, that actually got to market without problems?

Off the top of my head, there are the 2900xt, the 280gtx, the 480gtx, the 580gtx and SB-E.

- 2900xt: around 420-450mm2 iirc. Yield problems, heat problems, expensive and not good enough.
- 280gtx: they wanted 256 shaders but had to cut that down early, as Nvidia decided 256 was too big; it had some yield problems and marginally missed clock targets. The 285gtx proved hard to shrink: it took a LONG time to come out and was a very small-scale shrink.
- 480gtx: a disaster. Multiple respins, millions down the drain, horrible yields, and it never made it to market as a full product.
- 580gtx: officially called GF100B (or was). It's simply a full respin, a design a year late. A good design, who said it wasn't, but it was the end of a LONG run of severe and costly problems, and all indications are its yields are not particularly good at all.
- SB-E: 435mm2, yield problems, and it was supposed to be out 6 months ago. Many of the interconnects caused problems; PCIe 3 is on there and not working, as are a few other bits and bobs.

All in all, I can't easily name a single chip over 400mm2 that hasn't had problems since the 8800gtx, which came out on a mature 90nm process, what, over 5 years ago.
 
Well it's clearly you who doesn't know about this field. Firstly, the guy who runs Nvidia isn't an engineer; he's got a business degree, and he's always been on the business side of it.

The vast majority of Nvidia's, Intel's, AMD's and most other companies' design decisions ultimately rest on economic and business considerations, which is where yields and die size come in.

You presume no one can make presumptions, except we can: the laws of physics don't change just because Nvidia wants them to. Yields don't magically go up the larger you make chips, and you can't magically get the same number of big chips as small chips from the same number of wafers.

Business, costs and sales drive most fundamental decisions in the industry: NOT the engineers, but the engineers' bosses. Which IS why Nvidia are going so big: they want all-out performance, and DP performance in particular, which is hard to wring out of an ultra-efficient design but easier to get from a simpler but bigger one.

You can most definitely presume a LOT about the decisions Nvidia make: firstly, because it's not illegal, and secondly because, again, wafer costs don't magically change because it's Nvidia using them, yields don't magically go up on the same production lines because it's Nvidia chips being made, and you can't magically get more space on a 300mm wafer than another company can.

You seem to believe the design rules are entirely different for Nvidia than for everyone else, and that they use completely different manufacturing equipment from everyone else. They don't, which means you can learn a lot from comparing other chips to theirs.

480gtx yields started off in the tank at 2% and never got above 20%; 580gtx yields are better, but every single piece of info suggests they are pretty poor as well. There is one fundamental reason for this, and a fundamental reason why the SAME architecture on a smaller scale has vastly higher yields.


FAIL! AGAIN

http://en.wikipedia.org/wiki/Jen-Hsun_Huang
Jen-Hsun Huang, the CEO, has degrees in electrical engineering at both BS and MS level. He IS an engineer. Stop embarrassing yourself.

I don't know this field? Don't kid yourself. I am IN this field. I studied engineering. I WORK in engineering, and I'm a researcher in engineering. Which is also why I can see through tripe, which clearly only a few others can. Most of the rest are blinded by your glib journalese ripped off Semiaccurate.


I don't believe anything is any different for NVIDIA. If you were ranting about AMD and dictating design decisions to them, or to Intel, I'd be just as annoyed. I've met many engineers at Intel on many occasions when my girlfriend worked there. I've talked to them. And we didn't sit around poring over Semiaccurate thinking: "Gee whiz! This journalist sure has it all figured out!" Sorry to disappoint. That's not how it works. Nobody in the field cares what you think they should be doing. Write a letter to NVIDIA if you're so sure you know what needs to be done. Watch them laugh you out of town.
 
Anyway, I'm bowing out of this before things get any uglier, or posts get any longer than they have to be.

Believe what you want, but avoid ranting about a topic as if you have all the answers. If you can't do that, at least don't direct any of your rants at my posts, and I'll steer clear of yours. I have no respect for your opinion, and I really don't need to read your long-winded walls of vacuous text.

When I need a good laugh at material that rivals Lewis Carroll in absurdity, I'll go read Semiaccurate.
 
To be fair to Xsistor, he talks like an academic, a theorist. That's obvious from the way he completely ignores the fact that not all decisions are based on engineering sense. He should also realise that once you start claiming specific expertise in an internet debate, you'd better be prepared to prove it. Now, I've no idea if DM really knows a lot about the subject, but he offers reasons for his opinions, which goes a long way towards encouraging people to listen.

Besides which, Xsistor's sig always smelt like a troll... btw, my phone suggests "caustic" as an alternative to Xsistor ;)
 
This is just going to be so funny if AMD's new chip is bigger than NV's this time around.
 
FAIL! AGAIN

Look, yeah, DM et al have constantly shown you that large chip sizes don't really make any sense from OUR PERSPECTIVE. This is the point he is trying to make. Of course it makes sense to NVidia, as they are the ones doing it and they want to be the big boys in town with the biggest, fastest single chips.
If you truly believe what you are doing is correct, of course you are still going to go with massive chips. I think one of their problems is that they simply assume yield problems are going to be a lot smaller than they are, when really they are becoming massive and GIGANTIC (everyone is having yield problems, even Intel? And they are the fab kings! :P). Didn't ATI even have yield problems at first with the 5000 series? (And those are fairly small chips!)

Please stop calling DM rubbish. It doesn't look good. DM will never claim to know everything, but he talks logic from the outside point of view of a consumer. DM is one of the main reasons I still come to this forum, as he offers fairly balanced, interesting discussions that I can understand.

The fact of the matter is, as many people have said, the massive chips NVidia make are pretty insane, and everyone but you and most NVidia fanboys knows that.

Actually, you know what? I have an odd respect for NVidia sticking to their guns. It doesn't make any sense, and I hope they tone it down a little (just enough to make yields decent), but I have to respect them for sticking with their designs! (Not that they really have a choice.)

Another point I'd like to make is that, as a getting-to-late-20s sarcastic English person, I have a very sensitive "fraff radar". ATI have set it off lots of times. NVidia set it off almost every time they open their mouths (and I set my own off EVERY TIME I open my mouth! :D). Unfortunate really, as I'd like to have more faith in them ¬_¬
 
This is just going to be so funny if AMD's new chip is bigger than NV's this time around.

After reading the to-and-fro above, I was thinking just that :D AMD have churned out some big and/or inefficient chips before... look at Bulldozer.

Come on guys, instead of getting your bras in a twist, try and enjoy this golden age of technology we live in. You've never had it so good or so cheap.
 
After reading the to-and-fro above, I was thinking just that :D AMD have churned out some big and/or inefficient chips before... look at Bulldozer.

Come on guys, instead of getting your bras in a twist, try and enjoy this golden age of technology we live in. You've never had it so good or so cheap.

This post deserves a quote! Enjoy the good life we all have and the privilege of having access to all this technology.
 
Having a 500mm2+ chip makes me feel like I got my money's worth. :)

Why should the end consumer worry about yields when all they should bother with is performance?
Nvidia provides large chips on which they can't make a lot of profit. I'm OK with that, since I get more silicon for the money.

Do AMD/ATI cheap out on the silicon just to make more cash from the consumer? Probably not, but if I have the option of purchasing two cards, one from each corner, with similar performance at a similar price, I'd get the one with the bigger chip. That way I know the manufacturer has put more into the card and is making less of a markup.
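As a back-of-envelope check on the "more silicon for the money" idea, here is a hypothetical cost-per-good-die calculation combining a dies-per-wafer estimate with a Poisson yield model. The wafer price, defect density and die areas are all assumed round numbers, purely for illustration:

```python
# Hypothetical silicon cost per good die: wafer cost spread over the
# dies that both fit on the wafer and come out defect-free.
import math

def gross_dies(area_mm2, wafer_diameter_mm=300):
    """Approximate whole die sites on a round wafer."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r ** 2 / area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * area_mm2)
    )

def cost_per_good_die(area_mm2, wafer_cost=5000.0, d0=0.005):
    """wafer_cost and d0 are assumed illustrative values, not real data."""
    good = gross_dies(area_mm2) * math.exp(-d0 * area_mm2)  # Poisson yield
    return wafer_cost / good

for area in (330, 520):  # assumed "small" vs "big" die areas, mm^2
    print(f"{area} mm^2: ~${cost_per_good_die(area):.0f} of silicon per good die")
```

Under those assumptions, the big die costs the manufacturer roughly four times more per good die, which is consistent with the smaller markup this post is guessing at.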
 
Lol, surely you'd get the smaller chip?

- Less complexity means less chance of a fault.
- Less heat generated, meaning slower and quieter cooling solutions.
- Lower power requirements - this will depend on PSU but at the very least gives you more headroom for multi-GPU if it's an option.
 
Surely, mav, if the smaller chip has similar performance, more thought and effort went into it. It has to be harder to achieve similar performance along with all the other benefits of a smaller chip.

Anyhow, to me price/performance is the most important thing. I have a PC that should be OK to deal with any graphics card.
 
It's going to be tough to resist the temptation if AMD come out with a good card in January. I've been holding off purchasing that 27" screen because I want a single-GPU card that's powerful enough to do it justice. I usually get Nvidia cards, as I just find their drivers are of better quality, but the desire to scratch my upgrade itch is becoming unbearable.

And please, ffs, stop hijacking my lovely thread with your inane social politics. Thank you.
 
Looking forward to seeing how the wider memory interface turns out; I just hope the new AMD cards can handle AA without such a heavy performance hit compared to Nvidia.
 