
AMD 12 Core CPU - 1st Half 2010

Associate | Joined: 24 Jan 2007 | Posts: 1,490 | Location: Guildford
EDIT: ROADMAPS FOUND TO BE FALSE/IRRELEVANT

Just wanted to see if anyone else knew about this?

I just found the AMD Roadmap for 2008-2010 at: http://arstechnica.com/hardware/new...ne-amd-talks-shanghai-performance-roadmap.ars

Claims that later this year they're going to release a 6-core CPU, then in the first half of next year a 12-core CPU... If that's the case then why is everyone advising investing in an i7? If this is true, Intel are going to be crying in the corner lol. All I see Intel doing is DOWNGRADING their gear for the next year to year and a half, whereas AMD are going to have 3x as many cores within a year :S. (I'm not an AMD fanboy! lol I've never actually had an AMD, before anyone says anything ;) )

[Attached image: AMDRoadmap.jpg]
 
Twelve cores running slowly is perhaps worse than four cores running well. Nonetheless that is an interesting find.

I agree with you that Intel appears to be releasing slower chips rather than improving the current high-performance range, but disagree that people are advising investing in i7. I haven't seen i7 described as future-proof for a while, at least. Perhaps Intel's move to 32nm will lead to massive improvements.

I expect multithreading to eventually get to us, at which point the 12 core system would be quite something. I'd expect it to be cheaper than the 6 core i7s too.

I recognise most of the names in the first link, and none of the ones in the second. However the first picture is clearly labelled server/workstation, and the second desktop/notebook.
 

You do realise that article is over a year old and plans can change.
 
ah silly me! for some reason I thought a workstation was a desktop lmao, sorry. I did post it first thing in the morning though, so I had just woken up lol XD... and another thing..

Why is it I'm seeing so much 'x86' energy efficient rubbish... x86 is 32-bit... whatever happened to 64-bit... jeez, we should be going FORWARD in technology, not BACKWARDS lol. Why not make a big leap for once: Intel, AMD, Microsoft (and others) all develop a REAL 128-bit WORKING system, actually have some cooperation and some real improvements. And I wish more pressure was put on software companies to develop 64-bit software, instead of pumping out 32-bit-only rubbish, especially drivers, when you could have paid 300-odd quid for a sound card (like someone I know) and then they don't bother to develop 64-bit drivers, so you are stuck with this out-of-date pile of **** lol

I'm all for helping the environment and making less power-hungry computers, I mean we need it these days with the cost of electricity, but come on... this is taking the biscuit. I think all these 'green' issues are just a good excuse for companies to develop cheap, underperforming rubbish which doesn't cost them anything to make, and then make it a tad cheaper to make the customer feel they are getting a good deal, when really they are making twice the profit and we are lumbered with a slow 'green' computer, which is so 'green' it may as well be used as compost. NOW there's an idea! Biodegradable CPUs: go on then Intel, go away and develop us a cheapy plastic processor that operates at 12MHz and is 'healthy for the environment' but 'crap for your computer' lmao

Bottom line, hardware companies and software companies: stop pumping out rubbish we already have, give us some real innovations, and stop just thinking of your financial budget and what time lunch is at lol.
 
Why is it I'm seeing so much 'x86' energy efficient rubbish... x86 is 32-bit... whatever happened to 64-bit... jeez, we should be going FORWARD in technology, not BACKWARDS lol. Why not make a big leap for once: Intel, AMD, Microsoft (and others) all develop a REAL 128-bit WORKING system, actually have some cooperation and some real improvements.

The 64-bit technology we're using today is just an extension of x86 - x86-64. Therefore, when people refer to modern x86 chips they do in fact mean x86-64. Even our 64-bit processors are still x86, so nothing is going backwards at all ;) I can't even think of a recent x86 chip that hasn't had x86-64 extensions, other than the Atom, but even the new Atoms are x86-64 now.

As for 128-bit, well, many people are still skeptical about 64-bit. 128-bit isn't going to do anything for us today, or even for many, many years to come, seeing as 64-bit CPUs can already address ridiculous amounts of RAM.
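To put a rough number on "ridiculous amounts of RAM": a 32-bit address wraps back to zero one byte past the 4 GiB mark, while the same value held 64-bit wide just keeps counting. A minimal C sketch of the difference (the function names here are illustrative, not any real API):

```c
#include <stdint.h>

/* One byte past the 4 GiB boundary: a 32-bit address wraps to zero
   (unsigned arithmetic is modulo 2^32 in C), while a 64-bit address
   keeps counting. */
uint32_t next_addr32(uint32_t addr) { return addr + 1; }
uint64_t next_addr64(uint64_t addr) { return addr + 1; }
```

This is the whole reason 4 GiB is the hard ceiling for a 32-bit address space: there is simply no 33rd bit to carry into.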
 

Ah, I didn't realise it meant it like that; I do understand what you mean. After all, it's just a bigger calculation method.

But at the same time, after studying 32-bit vs. 64-bit and binary code in college, 64-bit 'should' be twice as fast as 32-bit, and thus 128-bit would be 4x as fast. However, the problem is that the software companies refuse to adopt it because basically it's a lot of work for them to do. They have all been brought up in a 32-bit environment, so I doubt there are many people out there who actually know how to program in 64-bit, and that's the problem. If we could properly utilise 64-bit, and even 128, computers would be FAR more advanced, but it's happening all too slowly if you ask me; the only thing companies want to do is make easy money for easy work. That's alright for them but it sucks for us customers lol.


I was actually informed by my lecturer that Windows Vista, as it stands, actually has a few 128-bit components hidden away inside its architecture; however, they aren't being used (which seems extremely pointless).

This is one thing that Microsoft CAN'T be blamed for. They actually want to push for 128-bit, and they are trying their best to make 64-bit work, but other companies and 3rd-party software developers cba to make 64-bit applications, so then Microsoft are made to look like the guilty culprit by having 2 different operating systems, one of which not a lot of stuff works on. 64-bit is getting better though, BECAUSE software developers are adopting it, but so far there has been no noticeable gain in anything except video encoding etc.

Microsoft need to push forward to 64-bit, maybe even 128-bit, but they get left standing when software developers say they won't support it. Trust me, if 128-bit was being used in, say... Windows 7, and it was fully utilised by both them and 3rd parties, we would have a FIRE of an operating system. Slow loading and decoding times would be a thing of the past. Shoddy A.I. would be a thing of the past; everything would be much better than it is now. But until Microsoft physically force 3rd parties to make the switch (i.e. making an operating system 64-bit only or 128-bit only) we are always going to be a step behind.

Admittedly I am going Microsoft, Microsoft, Microsoft lol, but we have to admit they are the leading front when it comes to operating systems, whether you love them or hate them lol. Once Microsoft go 64/128-bit then others will follow.
 
There was an article some time ago saying that Microsoft will go 64-bit only in Windows 8, or whatever they decide to call it. Developers are now wasting a lot of time making 32- and 64-bit versions of the same application. If everything goes 64-bit in the near future, that will allow developers to focus on supporting multicore systems better, so that we can benefit from these multicore CPUs.
 
But at the same time, after studying 32-bit vs. 64-bit and binary code in college, 64-bit 'should' be twice as fast as 32-bit, and thus 128-bit would be 4x as fast.
While studying these architectures, did you also research the conditions under which 2x or 4x performance would be realised?

Under many conditions the 64-bit processor will be no faster than a similar 32-bit one.

In fact, moving from 32-bit to 64-bit datatypes (when not needed) requires 2x the storage, 2x the bandwidth and 2x the cache to perform equally.

The only benefit of having 64 bits emerges when:
a) processing data too large to fit into a 32-bit data type
b) requiring more than 4GiB of addressable memory

These applications are becoming more common but are by no means a compelling reason to upgrade for most people.
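The 2x storage point is easy to see in numbers. This sketch (the million-element count is arbitrary, chosen for illustration) compares the footprint of the same data held as 32-bit versus 64-bit integers:

```c
#include <stddef.h>
#include <stdint.h>

/* Footprint of one million counters: doubling the datatype width
   doubles the memory required, and with it the cache space and
   memory bandwidth needed to move the same logical data. */
#define COUNT 1000000
size_t footprint_32bit(void) { return COUNT * sizeof(int32_t); }
size_t footprint_64bit(void) { return COUNT * sizeof(int64_t); }
```

If the values never exceed the 32-bit range, the wider layout buys nothing and costs exactly double.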

I was actually informed by my lecturer that Windows Vista, as it stands, actually has a few 128-bit components hidden away inside its architecture; however, they aren't being used (which seems extremely pointless).
Maybe he was referring to the SSE units in most modern x86 processors. These use 128-bit registers, though they are by no means 128-bit components as you may be imagining them.
 

I'm sorry but that's the biggest load of drivel I've ever seen posted.

Can you explain what college work you've done to show that an x86-64 processor is, or can be, twice as fast in 64-bit mode as in 32-bit? All it is is double-width registers and the ability to address more than 4GB of memory; there is nothing there to indicate anything more than a couple of percent improvement, and that's the best-case scenario.

How would your lecturer know about the innards of Vista? And even then, why would Microsoft expend effort when there are no true 128-bit processors around? The few processors that can work on 128 bits of data are all 'SIMD' processors/registers/units, so it's not a single 128-bit value anyway.
 
Bit confused by your reasoning here VortX. 64-bit code isn't twice as quick as 32-bit; if anything it introduces more overhead during calculation and can be expected to run slower, and 128-bit more so.
Execution speed comes from coding elegance, not from moving to a larger data bus. I had a fairly long fight with an Oxford compsci over this; I assumed 64-bit code on an amd64 processor would be far quicker. I lost that argument conclusively.

As an example: a 32-bit Debian minimal install, using a kernel carefully compiled to only support your hardware and requirements (therefore being very small itself), is far, far quicker than Windows. If you go up a few notches to Gentoo you'll have an operating system significantly quicker again.

Windows is useful for programs written so exclusively for Windows that Linux cannot yet run them. For me that means computer-aided design. Beyond that it's all pretty and 'user friendly' until one tries to do something the average user doesn't want to, at which point it's very hostile to the user. The system being so ludicrously difficult to configure is why I dislike it so much; even something really basic like putting temporary files in a ramdisk requires 3rd-party software or coding it from scratch.

I strongly disagree that Microsoft are the leading front for operating systems. They did a very good job of bringing computers to the masses, but they are no longer achieving this, as the masses now know what a computer is. Linux has been supporting 64-bit code for rather longer than Windows has, and as '64-bit code is better' is your main argument, I think this rather breaks the 'Windows leads the way' idea. http://en.wikipedia.org/wiki/X86-64#Linux

Well I clearly write too slowly, comprehensively beaten to the point.
In fact moving from 32 bit to 64 bit datatypes (when not needed) requires 2x the storage, 2x the bandwidth and 2x the cache to perform equally
is exactly the grounds I lost my argument on.

Which college was this you were learning at?
 
OK, I understand your arguments; as for the 128-bit comment, don't aim that at me, as I am only forwarding what a lecturer has told me.

As for 32-bit to 64-bit, I did say IF it was utilised PROPERLY, by both hardware and software, and the operating system.

Quite simply, a 32-bit CPU can send 32 bits of information in one instance; a 64-bit CPU can send 64 bits of information in one instance. Whether current CPUs actually do this or not I don't know, but that is the theory behind it. To say that 32-bit is better than 64-bit, however, is pure nonsense. The only reason 32-bit is so good is because everyone, including software and hardware developers, is using it. If there is no difference then why don't we just go back to the good old days of the Atari and use an 8-bit OS...

Fact is, it DOES make a difference. And in theory it should be twice as fast, maybe not in real-world terms and in every situation; I mean, obviously drivers aren't going to be any better etc. But if an operating system has a CPU that can calculate 64 bits in one instance as opposed to 32 bits, then technically it's twice as fast.

I hope this makes sense and what I am saying is actually relevant; please don't shoot me down in flames, as I am only telling you what I have learnt and what I have been taught.


Edit: @ JonJ678

Just a quick mention: when I said Windows was leading the front in OSs, I meant in the non-commercial market, such as you and I. Obviously the likes of Linux and Apple can make their operating systems 64-bit/128-bit, but I bet it wouldn't make much of an impact compared to if Microsoft did. By this I mean things such as general day-to-day applications, games (including A.I.), video editing etc. Linux and Mac are mainly for workstation/business environments; I'm talking about something that would impact everyone who uses a computer. The main things I think the jump to 128-bit would bring, if utilised properly, would be to games, as mentioned, in A.I., physics etc, and maybe more so pure calculation applications used on workstations, things such as video encoding, not to mention things like... (can't remember what it's called, will edit when I remember or someone says; the program developed to connect people all over the world to use their computers to calculate scientific algorithms for medical purposes).
 
Fact is, it DOES make a difference. And in theory it should be twice as fast, maybe not in real-world terms and in every situation; I mean, obviously drivers aren't going to be any better etc.
I'm afraid this isn't quite correct. I think someone's sold you the '64 bits = 2 x 32 bits' story! Btw, being able to 'send' twice as much data at once is just not right!

Anyway, besides the increase in memory space, the most significant change from 32- to 64-bit is that the part which does arithmetic can handle larger numbers. This is only relevant in certain situations.
Consider this (practical) example: 32-bit number * 32-bit number = 64-bit number.

With a 32-bit CPU this can be done with one 32-bit multiply instruction.
However, if you want to do 64-bit * 64-bit number = 128-bit number on a 32-bit CPU you've got to do:
1) multiply the low 32-bits of the left number by the low 32-bits of the right
2) multiply the low 32 of the left by the high 32 of the right
3) multiply the high 32 of the left by the low 32 of the right
4) multiply the high 32 of the left by the high 32 of the right
5) combine the results with ands, shifts and ors

with a 64-bit CPU this would be just one 64-bit multiply. So that's a minimum of 5 steps on a 32-bit machine to do a task which would take just one on a 64-bit machine - similar things apply when you want to add 64-bit numbers together with a 32-bit CPU.

However, the number of times you actually need to work with 64-bit integers as a programmer is quite low: 32 bits is normally more than you need, and 16 is often quite adequate. The reason going from 8 to 16 and 16 to 32 was important was because 16- and 32-bit maths are actually useful (and accelerating them is good), but 64-bit integer maths is quite useless.
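The five steps above can be sketched in C. This illustrative routine (the name is made up for the example) computes the low 64 bits of a 64x64-bit product using only 32-bit multiplies, the way a 32-bit CPU must; the fourth partial product (high x high) lands entirely in the top 64 bits of the full 128-bit result, so it is omitted here, which lets the answer be checked against a native 64-bit multiply:

```c
#include <stdint.h>

/* Low 64 bits of a 64x64 multiply built from 32-bit multiplies,
   mirroring the steps a 32-bit CPU has to take. */
uint64_t mul64_via_32bit(uint64_t a, uint64_t b) {
    uint32_t al = (uint32_t)a, ah = (uint32_t)(a >> 32);
    uint32_t bl = (uint32_t)b, bh = (uint32_t)(b >> 32);

    uint64_t lo_lo = (uint64_t)al * bl;   /* step 1: low x low   */
    uint64_t lo_hi = (uint64_t)al * bh;   /* step 2: low x high  */
    uint64_t hi_lo = (uint64_t)ah * bl;   /* step 3: high x low  */
    /* step 4 (ah * bh) only affects the high 64 bits, skipped here */

    /* step 5: combine with shifts and adds */
    return lo_lo + ((lo_hi + hi_lo) << 32);
}
```

On a 64-bit machine the whole function collapses to the single native multiply it is checked against.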
 

Well I certainly haven't reached this level in my course yet lol, thanks for the heads-up (love learning).

But even after reading that... surely if you have 64-bit code that needs calculating, right... on a 64-bit CPU that's 1 step, fair enough...

But what happens when you combine a string of even 8-bit instructions?

Does it calculate one at a time no matter whether the CPU is 32-bit or 64-bit?

I had the impression that if, say, you needed to calculate 8 x 8 bits, then a 32-bit CPU and a 64-bit CPU would do the following:

32bit:

Step 1 - 8bit + 8bit + 8bit + 8bit
Step 2 - 8bit + 8bit + 8bit + 8bit

Steps taken: 2

64bit:

Step 1 - 8bit + 8 bit + 8bit + 8bit + 8bit + 8 bit + 8bit + 8bit

Steps taken: 1

Thus making the 64-bit CPU twice as fast? Or does it not work like this?
 
Ah, I think you're talking about a SIMD CPU, right? If you had a 32-bit wide SIMD CPU which could do 8-bit multiplies, then yeah, you can often do 8/8/8/8 followed by 8/8/8/8 as you said, in two steps. If you had a 64-bit wide SIMD machine then you could do 8/8/8/8/8/8/8/8 as you say.
A 64-bit WIDE SIMD machine isn't the same as a 64-bit CPU; the individual multiplication is only 8 bits wide. I don't actually know of any CPU which can do 8 integer multiplies at once. Often you have two instructions for doing this: one which does the even elements, and one which does the odd ones. So to do eight you'd still have to use two instructions. Like this:
ODD: N/A / 8 / N/A / 8 / N/A / 8 / N/A / 8
EVEN: 8 / N/A / 8 / N/A / 8 / N/A / 8 / N/A

But again, this is still different to a 32/64/128-bit machine. Even though your Pentium 4 is '32-bit', it can still do 128-bit SIMD SSE since that's really 32/32/32/32. Does this make any sense?
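The odd/even split above can be modelled in plain C. These helpers are hypothetical (they mimic the shape of SIMD even/odd multiplies, not any particular ISA's instructions): each one multiplies half of the 8-bit lanes of a 32-bit vector, widening the products to 16 bits, so covering all the lanes takes two calls:

```c
#include <stdint.h>

/* "Even" instruction: multiply bytes 0 and 2, widening to 16-bit products. */
uint32_t mul_even_bytes(uint32_t x, uint32_t y) {
    uint16_t p0 = (uint16_t)((uint8_t)x         * (uint8_t)y);
    uint16_t p2 = (uint16_t)((uint8_t)(x >> 16) * (uint8_t)(y >> 16));
    return ((uint32_t)p2 << 16) | p0;
}

/* "Odd" instruction: multiply bytes 1 and 3. */
uint32_t mul_odd_bytes(uint32_t x, uint32_t y) {
    uint16_t p1 = (uint16_t)((uint8_t)(x >> 8)  * (uint8_t)(y >> 8));
    uint16_t p3 = (uint16_t)((uint8_t)(x >> 24) * (uint8_t)(y >> 24));
    return ((uint32_t)p3 << 16) | p1;
}
```

The widening is the reason for the split: four 8-bit lanes produce four 16-bit products, which no longer fit back into one 32-bit register, so each instruction only handles every other lane.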
 
I *think* the Cell SPUs can do 8 integer multiplies at the same time; in fact I think it can do 16 (128-bit wide registers/execution units).

The problem lies in how many times you need to do either a 64- (or 128-) bit integer operation, or even multiple, consecutive 8-bit integer operations.

I get now where your original thoughts were coming from, but the majority of code simply doesn't work that way, in a similar sense that the majority of code is inherently serial and will not gain much/any speed from multiple pipelines/execution units or even CPU cores themselves.

And as OrphanBoy has mentioned, even non-64-bit architectures can do SIMD instructions on more than 64 bits' worth of data (eg multiple smaller values, like 4x 32-bit); the '64-bit' tag generally comes from the ability to address 64 bits of memory address space and nothing more. What you're talking about is SIMD, which is already comfortably into the 128-bit region...
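One way to make the "multiple smaller values in one register" idea concrete without any SIMD hardware at all is the classic SWAR trick (SIMD within a register): a single 64-bit add performs eight independent 8-bit adds, with the top bit of each byte masked off first so carries cannot spill between lanes. A sketch (the function name is made up for the example):

```c
#include <stdint.h>

/* Eight lane-wise 8-bit adds in one 64-bit add: mask off each byte's
   top bit so carries stay inside their own lane, then patch the top
   bits back in with XOR. */
uint64_t add_8x8_lanes(uint64_t x, uint64_t y) {
    uint64_t low = (x & 0x7F7F7F7F7F7F7F7FULL)
                 + (y & 0x7F7F7F7F7F7F7F7FULL);
    return low ^ ((x ^ y) & 0x8080808080808080ULL);
}
```

Note that a lane which overflows simply wraps within its own byte rather than carrying into its neighbour, which is exactly the lane isolation a real SIMD add instruction provides for free.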
 
Oh my, it's starting to go in one ear and out the other lmao; didn't sleep at all last night, so bear with me...

So what you're saying is that a general CPU can only calculate 1 value at a time (I think that's the right term), so even if you have a 256-bit CPU, if the code is 8-bit followed by another 8-bit, it would do them in 2 steps, one at a time, whereas a SIMD machine can do them simultaneously?

Also, if this is the case, then surely if software developers were to make their applications using 128-bit coding, then 128-bit CPUs would be very effective and much faster?
 