Why couldn't they make today's best CPU ten years ago?

It'll be totally awesome in 10 years to have like 16+ core processors and tens of gigabytes of RAM. Wow, my word processing and porn surfing are going to go ballistic, I tell you.

Actually, it will go slower. Because in reality, all better, faster hardware does is allow software developers to become lazier :p
 
The "slowing" of moores law isnt related to a contradiction of it.

Silicon is coming to the end of its use in computing. It still has some life in it, and new techniques keep making it more useful, but in the long run I fully expect carbon nanomaterials to replace it; they're just far too expensive to use commercially right now.

Thing is, there's also the prospect of a complete change in how computing works in the first place: quantum computing. I'm not sure about it at the moment, but every few months another piece of the puzzle seems to be solved. It will make computing more difficult to understand beyond a certain point, though, considering the physics involved.

Going even further, we have photonic computing, which will be a truly wonderful thing.
 
Local storage devices are likely to be irrelevant in the future with mega bandwidth available; a local processing unit and RAM are all you'll need, worn as a wristband or a skin implant :)
 
Will it not get to a point where it's deemed we don't need any more processing power for desktop PCs?

I wonder that as well... For instance, most popular new uses for computers (Facebook, YouTube, etc.) are hardly very taxing. But as PermaBanned said, it will probably just let software developers get lazier. :p

I'd imagine the main area where uses for extra power will be legitimate is gaming, where ever more realistic simulations of the real world will be run at ever higher resolutions, and that's about all I can think of! It seems that connectivity (the speed of the internet connection) has become much more important than raw power for most modern uses of computers. I suppose the extra power might let big applications open faster, and you'd be able to edit video and images faster, but most people don't really do anything like that.

But maybe we will never see demand put a limit on desktop speed; people might just drift over to mobile computers, and desktops would become a thing of the past. So laptops, mobile phones or tablets may have replaced the desktop PC in a few decades, and all we'll keep at home is the big screen. (Or maybe we'll have really small projectors, or roll-up screens by then!) Or maybe the trend towards internet-based services will continue, and all the PC of tomorrow has to do is send input to a server and receive and display output. Who knows what the desktop of tomorrow will look like.
 
Also, you don't throw all your eggs into one basket. Intel probably have roadmaps 15+ years into the future! They might have the tech now, but why shoot themselves out of more profit when they can stage releases?
 
Also, you don't throw all your eggs into one basket. Intel probably have roadmaps 15+ years into the future! They might have the tech now, but why shoot themselves out of more profit when they can stage releases?

Well, that's not really true, is it?


If they could release a processor that's 15 years ahead of everything else, then within a few years they'd be pretty much the only high-end processor provider on the planet, as no one would use their competitors. More power usage, slower speeds, bigger sizes and higher cost: why would you buy that over Intel's cheaper, smaller, faster and more efficient system?
 
Well, that's not really true, is it?


If they could release a processor that's 15 years ahead of everything else, then within a few years they'd be pretty much the only high-end processor provider on the planet, as no one would use their competitors. More power usage, slower speeds, bigger sizes and higher cost: why would you buy that over Intel's cheaper, smaller, faster and more efficient system?

A processor that would be 15 years ahead of any software that could use it? People would buy one and keep it for as long as physically possible; even if they updated it, the speed increase would be negligible in terms of what you could actually notice, so people wouldn't bother. They wouldn't get nearly as much new money, so less money for research, and with competitors out of the way the market stagnates and there's no incentive for progression. No, incremental updates are where the profit's at, so incremental updates are what we shall see ;)
 
Also, you don't throw all your eggs into one basket. Intel probably have roadmaps 15+ years into the future! They might have the tech now, but why shoot themselves out of more profit when they can stage releases?

Why do people think this? Almost every piece of equipment in Intel's fabs making 32 nm chips today simply wasn't available three years ago (a year before 32 nm production started), and didn't exist two years before that. Intel don't make this kit; each fab costs 2-3 BILLION to fit out with new kit every two years, and the companies researching ways to make the equipment work take years to come up with improved versions, and a long time to build the equipment.

There's no staggering anything. EVERYTHING in high-end CPUs is dictated by how many transistors you can fit in a given space. Intel can't increase that with the equipment they have, and if they go over a certain die size they'll end up losing money overall. They make crazy profits per chip, but they have to pay for the R&D and the equipment, which is probably in the range of 20 billion every couple of years to keep 4-5 fabs up to date.

You double the size of the chip and you roughly quadruple costs: you get half as many dies per wafer, and a far bigger share of them hit a defect. You end up making a profit per chip but losing money when you factor in R&D.
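To put rough numbers on that, here's a quick Python sketch using a simple Poisson yield model (yield = e^(-defect density x area)). The wafer cost, wafer size and defect density below are illustrative guesses, not Intel's real figures:

```python
import math

# Toy model: cost per *good* die as die area grows.
# All constants are illustrative assumptions, not real Intel figures.
WAFER_COST = 9000.0              # cost per processed wafer
WAFER_AREA = math.pi * 150 ** 2  # usable mm^2 on a 300 mm wafer (ignores edge loss)
DEFECT_DENSITY = 0.003           # defects per mm^2

def cost_per_good_die(die_area):
    dies = WAFER_AREA // die_area                       # candidate dies per wafer
    good = dies * math.exp(-DEFECT_DENSITY * die_area)  # Poisson yield model
    return WAFER_COST / good

for area in (216, 432):  # a 2600K-sized die vs. one twice the size
    print(f"{area} mm^2: ${cost_per_good_die(area):.0f} per good die")
# Doubling the area halves the dies per wafer AND cuts the yield,
# so the cost per good die roughly quadruples (~$53 vs. ~$202 here).
```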

OK, for reference: just under 10 years ago we were still on the P4 Northwood, 55 million transistors in 145 mm². That's about 0.38 million transistors per mm². The i7-2600K is 995 million transistors in 216 mm², so transistors are about 12 times as dense in current chips, something physically impossible on older process tech and equipment that didn't exist until a few years ago.

If I've got this right, a 995-million-transistor chip at the density of Northwood's 130 nm process would be around 2,623 mm², or roughly 5 cm x 5 cm. Now, imagine somehow fitting that in your system: being 18x bigger than Northwood, it would use roughly 18x the power, call it 1,500 W. And iirc they were still on 200 mm diameter wafers back then, so you'd only fit a handful of dies per wafer (if the steppers could even expose a die that size), and with yields at that area you'd be lucky to get one working chip every 10+ wafers, at $8-10k a wafer.......... yeah, there's a reason they couldn't make the same chips back then.
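For anyone who wants to check the arithmetic, here's a quick sketch. The transistor counts and die areas are the ones quoted above; the ~80 W figure for Northwood and the scale-power-with-area step are my own rough assumptions:

```python
# Back-of-the-envelope check of the figures above. Transistor counts and
# die areas are from the post; everything derived from them is rough.
northwood_transistors = 55e6   # P4 Northwood, 130 nm
northwood_area = 145.0         # mm^2
i7_transistors = 995e6         # i7-2600K, 32 nm
i7_area = 216.0                # mm^2

northwood_density = northwood_transistors / northwood_area  # ~0.38 M/mm^2
i7_density = i7_transistors / i7_area                        # ~4.6 M/mm^2
print(f"Density ratio: {i7_density / northwood_density:.1f}x")  # ~12.1x

# A 995M-transistor die built at Northwood's density:
area_130nm = i7_transistors / northwood_density
side_cm = area_130nm ** 0.5 / 10
print(f"Die at 130 nm density: {area_130nm:.0f} mm^2 (~{side_cm:.1f} cm per side)")

# Scale Northwood's ~80 W with area (crude assumption, ~18x):
print(f"Power: ~{80 * area_130nm / northwood_area:.0f} W")  # ~1,450 W
```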
 
A processor that would be 15 years ahead of any software that could use it?


Even if you take 5 years to privately develop the software (which wouldn't really be an issue; as long as you had the instruction set done, software companies would leap on it, as would hardware manufacturers), you'd still be 10 years ahead of the curve. There's no way your competitors could keep up with that kind of lead.

People would buy one and keep it for as long as physically possible; even if they updated it, the speed increase would be negligible in terms of what you could actually notice, so people wouldn't bother.


If Intel released a processor today that was the exact same speed as (or even a little slower than) their current stuff, but 15 years ahead in size and energy use (plus production yields and the reduced cost per unit), it would still be the best-selling processor ever, as it would be ideal for data centres, servers and supercomputers because of the lower running costs.

A few percent off each CPU's power usage quickly adds up when you've got tens of thousands of them; see the rough sums below.
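For a sense of scale, here's a rough sum. Every number in it is a made-up but plausible assumption (hypothetical fleet size, tariff and PUE), not measured data:

```python
# Rough illustration of how small per-CPU power savings add up in a
# data centre. All numbers are illustrative assumptions.
cpus = 50_000               # CPUs in the (hypothetical) data centre
watts_saved_per_cpu = 5     # e.g. ~5% of a ~100 W server CPU
hours_per_year = 24 * 365
price_per_kwh = 0.10        # electricity tariff, illustrative
pue = 1.5                   # power usage effectiveness: cooling etc. scales with load

kwh_saved = cpus * watts_saved_per_cpu * hours_per_year / 1000 * pue
print(f"Saved: {kwh_saved:,.0f} kWh/year -> ${kwh_saved * price_per_kwh:,.0f}/year")
# ~3.3 million kWh, i.e. roughly $330,000 a year from a few watts per chip.
```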

They wouldn't get nearly as much new money, so less money for research,

They'd have the largest product launch in CPU history; they'd have plenty of money.
and with competitors out of the way the market stagnates and there's no incentive for progression.


It would mean they could charge less than their competitors for faster CPUs and still make more profit.


No, incremental updates are where the profit's at, so incremental updates are what we shall see ;)


No, incremental updates are necessary because we can't just pull products magically from the future; they have to be developed incrementally. Unless someone has a radical new idea fully formed, with every stage of manufacturing worked out, you have to devise it piece by piece from new research and discoveries.

You can't use what you don't know.
 
I remember that the Voodoo 5 came out at about the same time as the first 1 GHz processor, because I was doing some PC building at the time. A customer ordered a really expensive build with both of them in, and naturally we felt duty-bound to benchmark the system by playing some Quake. :)
 
I remember people getting excited when RAM was £1/MB, and people being wary of the radiation emitted by CPUs exceeding 1 GHz.
 
No, incremental updates are necessary because we can't just pull products magically from the future; they have to be developed incrementally. Unless someone has a radical new idea fully formed, with every stage of manufacturing worked out, you have to devise it piece by piece from new research and discoveries.

You can't use what you don't know.

Or you end up with systems that have a massive percentage of hardware failures.

The Xbox 360 for example :D
 
Interesting stuff. So what does the future hold? Anyone have an idea where it's going? Will we simply see more and more cores added, so that in the next ten years we have 16- or 32-core processors?

I'm hoping ray tracing and more advanced simulations in gaming will drive performance forward. I want Avatar-level graphics at 60 fps in the not-too-distant future; will it ever happen, and if so, when?
 
Or you end up with systems that have a massive percentage of hardware failures.

The Xbox 360 for example :D

The Xbox 360 wasn't exactly years ahead of its time; it was just rushed out the door without enough testing. In the Intel example we're running under the assumption that they were 'ready' to release the hypothetical CPU and weren't going to have a high percentage of failures. :)
 
I think the reality of the situation is that production processes are what limit hardware development. Yes, it's possible (probable, even) that some companies have preliminary hardware designs years into the future. That's great and all, but they have no way of producing those parts (at least, not without them being catastrophically hot and expensive to make). Hardware companies produce high-end parts that more or less reach the economically viable limits of their production processes; if they could develop better processes faster, they would, because better processes yield products that are cheaper to make. That way, even if they were 'drip-feeding' us, they'd be paying less to do it. At the moment, it seems more like their hands are tied by this thing called 'progress'.
 