AMD 12 Core CPU - 1st Half 2010

I *think* the Cell SPUs can do 8 integer multiplies at the same time; in fact I think they can do 16 (128-bit wide registers/execution units)
Haha, almost! They can do 8 16-bit multiplies, but only four at one time: four odd in one instruction, and four even in the other. In fact there are no 32-bit * 32-bit multiplies, so you have to build them from 3 multiplies and then 2 adds. I program these for a living :(
But yeah you're totally right - the majority of code won't make any use of these features. Unless the CPU is 100% loaded and it's banging through lots of sequential data (serial as you say), extra execution units and so on often don't make a lot of difference, which is why recent consoles have cut fancy stuff like this out to save money.

VortX said:
So what you're saying is that a general CPU can only calculate 1 string value at a time (I think that's the right term), so even if you have a 256-bit CPU, if the code is 8-bit, followed by another 8-bit, it would do them in 2 steps, one at a time, whereas SIMD can do them simultaneously?
Yeah that's right. You can only do this if there are no dependencies between the two numbers though; otherwise you need to do them sequentially. For instance if I wanted to do:
X = A + B as well as,
Y = C + D
then I could do these in parallel on a SIMD machine since there's no dependencies between the two sums. But if I wanted to do:
X = A + B as well as,
Y = C + X then I'd have to do the upper sum first before doing the lower one, since the lower one uses the result of the upper one. You couldn't use SIMD for this. Can you dig it?
 
But what happens when you combine a string of even 8-bit instructions?
I think you may be confusing operations performed on data (instructions) with the data being operated on.

An instruction such as ADD(a,b)->c can be applied to datatypes of any size.

Within the cpu a,b and c are represented by registers (basically temporary areas to store data). On a 64 bit cpu these will be 64 bits wide. If you are working on 32 bit numbers these registers will only be half filled. Each register is a slot to store a single piece of data so the remaining space cannot be filled with another number.

Instructions must be ordered sequentially (one after another) as an instruction may depend on the output of a previous instruction in its calculation.

For example
ADD(a,b)->c
ADD(c,d)->e

would have to be run in sequence as the second instruction relies on the output of the first.

On the other hand, not all instructions rely on each other, and modern "Out-of-Order" processors do run them in parallel. The way they do this is very similar to a dual/quad core processor: they have multiple execution units.
ADD(a,b)->c
ADD(a,d)->e

But wait: in the second example we want to add "a" to more than one piece of data. There must be a better way, and there is. SIMD/SSE/vector processing allows us to apply a single instruction/operation to multiple data. This is a lot like what is being talked about above.

ADD(a) to (b,c,d,e)
 
Oh my, it's starting to go in one ear and out the other lmao, didn't sleep at all last night so bear with me...

So what you're saying is that a general CPU can only calculate 1 string value at a time (I think that's the right term), so even if you have a 256-bit CPU, if the code is 8-bit, followed by another 8-bit, it would do them in 2 steps, one at a time,
Yep

whereas SIMD can do them simultaneously?
SIMD can apply a Single Instruction to Multiple Data. For example I want to add "5" to the brightness of every pixel in a photo.

Also, if this is the case, then surely if software developers were to write their applications using 128-bit code, then 128-bit CPUs would be very effective and much faster?
SIMD is only useful if you have extremely parallel, non-interdependent data, like image/video processing, cryptography, SETI@home etc. This is the way graphics cards are architected these days and why they are fast at these workloads.

The majority of desktop applications don't display this kind of parallelism and show no benefit from SIMD.
 
Wow thanks a lot guys, amazing how much you can learn in one day :)

I'm studying networking at college so I don't know much about software in that respect (as you may have noticed lol).

But thanks for the heads up I now understand why 64bit isn't that much better than 32bit.

Basically the only reason you get a slight performance boost in 64-bit is if the programmer has put in some 64-bit string values; if they're anything less, like 32-bit (which they most likely are), then 64-bit makes no difference because it has to process them 1 at a time, a bit like a calculator: you can't type 3 sums in at the same time :).

Thanks! I think I may even get a book from the college library on this now because I want to learn more about it :P
 
I'm afraid this isn't quite correct. I think someone's sold you the 64-bits = 2 * 32-bits story! Btw being able to 'send' twice as much data at once is just not right!

Anyway, besides the increase in memory space - the most significant change from 32->64-bit is that the part which can do arithmetic can handle larger numbers. This is only relevant in certain situations.
Consider this (practical) example: 32-bit number * 32-bit number = 64 bit number.

With a 32-bit CPU this can be done with one 32-bit multiply instruction.
However, if you want to do 64-bit * 64-bit number = 128-bit number on a 32-bit CPU you've got to do:
1) multiply the low 32-bits of the left number by the low 32-bits of the right
2) multiply the low 32 of the left by the high 32 of the right
3) multiply the high 32 of the left by the low 32 of the right
4) multiply the high 32 of the left by the high 32 of the right
5) combine the results with ands, shifts and ors

with a 64-bit CPU this would be just one 64-bit multiply. So that's a minimum of 5 steps on a 32-bit machine to do a task which would take just one on a 64-bit machine - similar things apply when you want to add 64-bit numbers together with a 32-bit CPU.

However, the number of times you actually need to work with 64-bit integer numbers as a programmer is quite low - 32 bits is normally more than you need, and 16 is often quite adequate. The reason going from 8 to 16 and from 16 to 32 was important was that 16- and 32-bit maths are actually useful (and accelerating them is good), but 64-bit integer maths is quite useless.

Yes, great stuff, but having the ability to access 6 gig of RAM instead of 3-3.5 is a good step forward and a reason to advance, rather than plugging the 64-bit calculations.

I deffo need 4 gig of RAM at least to stop out-of-memory crashes with FSX. So I need a 64-bit machine.
 
To be fair, there's no reason why Microsoft shouldn't be releasing 64-bit Windows as standard. Those who don't understand computers do understand one key thing... more RAM!! Although it's very easy to flog someone 8GB of RAM with a 32-bit OS and they would be none the wiser. With systems constantly being released with 4GB of RAM in the mid-range and soon the lower sectors, is 32-bit becoming redundant? The only reason I got Vista 64 and 8GB of RAM was that 64 was the exact same price as 32.
 
SIMD can apply a Single Instruction to Multiple Data.

SIMD is only useful if you have extremely parallel non interdependent data...This is the way graphics cards are architected these days and why they are fast at these workloads
Click. Good man, that's just made many things make sense. Thank you.


Just a quick mention, when I said about Windows leading the front in OS's, I mean in a non commercial market such as you and I. Obviously companies such as Linux and Mac can make their operating systems 64bit/128bit, but I bet it wouldn't make much of an impact compared to if Microsoft did. By this I mean things such as general day to day applications, games, including A.I, video editing etc. Linux and Mac are mainly for workstation/business environment, I'm talking about something that would impact everyone who uses a computer. etc

I fear we may have to agree to disagree here. There are a few very large assumptions in your argument, from Linux not being able to run applications or do video editing, through everyone in the world using Windows, to Linux being a company (!). Based upon this I suspect your ideas are a consequence of very limited experience outside of the M$ world rather than trolling, so I'll have a go at explaining myself.

The scientific community (physicists at least) all seem to run a version of Fedora (ok, Red Hat..) called Scientific Linux. I've yet to meet a computer scientist running Windows unless forced to. Servers tend to run Unix of one form or another, and the artistic design community is in love with Macintosh.
Windows is only universal for the 'home user' with minimal interest in how computers work, who would be using whatever operating system their computer shipped with, none the wiser; for gamers, who sadly are bound to it by game developers and economics; and as an interface layer in thin client networks to provide a familiar face to employees.

Sadly I am bound to it by the computer-aided design work I enjoy, but in time Wine will catch up, VirtualBox will be able to run 3D cleanly, or CAD will be compiled for Unix as well as for Windows, and I'll be gone. Even now my Windows install doesn't have access to the internet; it's just not worth the effort of protecting it.

Interest in computers is a good thing I think, and I'm sorry if the above looks quite harsh. I'm confident that you'll look back in a few years and laugh. I highly recommend giving Linux a try: the system lets you control far more of your computer than Windows does, and the variety out there is astonishing. You can download an Ubuntu live CD, boot from it, and after a few minutes you'll see a very alien desktop which behaves exactly as you expect it to. Or you could attempt to compile Gentoo from scratch and have a very rough time of it, but it's all very well documented and well worth getting into. Especially for someone studying computing.
 

No, I understand your argument and you make a lot of sense. I have had experience with Linux (not Mac), however there was no point me adopting it because the main things I do on a computer are my college work (which I have to use Microsoft Office for) and playing games... which I also have to use Windows for. I also do the odd bit of video/picture editing, which Windows is just fine for, so there's no point me getting a different operating system.

After being taught a little bit about 64-bit vs 32-bit and so on by the people in this thread, I will agree with you that 64-bit+ would make more of a difference on operating systems such as Mac and Linux. It would be nice, however, if Microsoft would properly utilise it. However Microsoft seem to be more interested in fancy pictures, themes and animation, by which time half your PC is used up already lol
 
Even now my windows install doesn't have access to the internet, it's just not worth the effort of protecting it.

It always bemuses me when people grossly exaggerate the flaws of Windows like this. You make it sound like Windows is unsuitable for connection to the internet without some kind of complex and laborious security regime. I'm sure utilising a decent firewall, running anti-virus software, keeping the OS up to date and not executing unknown programs would be sufficient. At least three of those four measures are sensible on any OS, even Linux.
 
Interesting asides:
because the main things I do on a computer is my college work (which I have to use Microsoft Office)

Will they accept PDFs as documents? You'd think they should, really. If they don't, press them on this; normally they only say 'MS Office only' because that's what they have.

OpenOffice [google it] is a really rather good Office alternative, free [as in beer] and outputs PDFs quite happily for most documentation. And if you Save As Office 2000/XP format, 90% of the time it doesn't utterly **** up the formatting, unlike a couple of years ago.

Jon is spot on though regarding the Linux userbase - it's used in everything other than home PCs. Got a PVR in your dorm? Runs Linux. Mate got a T-Mobile G1? Linux-based OS. Most websites are hosted on Linux stacks, with large ones especially keen on running LAMP farms - Linux, Apache [web], MySQL [database], PHP [scripting]. I can't be bothered to check, but I bet we are abusing a LAMP stack right now on this very site. Google uses a custom, parallelised version for, er, everything, IIRC - feel free to correct me on this, anyone, but I don't believe I'm wrong.

Also, every VMware ESX box out there [a LOT] is basically running a highly optimised virtualisation hypervisor with a custom Linux kernel doing the base-level loading and interface kit. Wanna go into your ESX box and manually change something not in the Infrastructure Client UI? You're in Linux userland to do that...

In my experience, Windows tends to be used as a server mainly where you have a Windows-based LDAP structure, and that's mainly because Novell NetWare, or anything *nixy, scares people a bit. It's lovely, lovely stuff when you click with it, but most IT techs find WinServer familiar and easier to admin in the short term - unless they just have no choice in the matter, as I have several times.

I will never understand why people insist on making a simple web server platform run on IIS with the likes of BizTalk and MS SQL Server behind them - you could spend the £8000 on getting a decent programmer in for a month to come up with an open source alternative that won't bleed you dry in licensing costs every year, eat your flesh when it comes to bug fixing and patching, then sew your skin into its chassis when it comes to app/OS upgrade time. If you're lucky, it's in that order too...

In my experience, WinServer takes a hell of a lot more admin long term. Linux kit, while not infallible [like any complex system, careful setup is key in both environments...], tends to just run and run when you get it right. But you can't run GPO or SMS on Linux LDAP stacks, which companies like as it gives them easy tickboxes when it comes to ITIL conformity. Which CTOs like, as it makes them look like competent IT managers as opposed to glorified beancounters. ;)

It's well worth playing with though - get yourself a copy of VirtualBox [free, as in beer, again - google it] and install Ubuntu in it - it's a nice easy ride into the world of Linux and does everything you need to do day to day, with the option of ripping the guts out of it should you wish. Want to see how something works? Right click, open with a text editor, bang, there's the code. Which is very handy when it comes to troubleshooting. Don't get that with Windows DLLs :cool:

On the subject of VirtualBox, VB 3.0 is out in beta now, and it has hardware DX9/OpenGL2 support.

I'll be playing with it as soon as I get a chance - I'm hoping the DX9 support is hardware passthrough for guests, i.e. run Linux, install VB3, get DX9 support in Windows guests. I'm going to check it later and confirm/deny. If it is DX support in Windows guests on Linux hosts, I'll be well pleased. I think I may have read it wrong though and DX9 will only be supported on Windows host machines :(

And another thing: what can our forum programming experts tell us about the impact OpenCL will have on application acceleration? I am assuming you are all aware of this fun, being implemented [in Apple's way] in Snow Leopard this year, with other implementations [including MS's presumably non-standard version called DirectXXX, arf] following.

Reckon it's a quantum shift? Or will it not be of much use to anyone till they program for it properly? I got the impression that it's basically an API that allows general-purpose instruction sets to be run on GPU hardware [GPGPU] without masses of parallelised coding being involved - anyone care to enlighten us further?
 
It always bemuses me when people grossly exaggerate the flaws of Windows like this.

Sure, I'll react to that :D

Windows + antivirus + antispyware + firewall is safe enough, perhaps. It doesn't give me much faith that without these it fails rather fast.

There's no root password on Windows, so if access is gained the intruder can do as he pleases. It is therefore significantly more difficult to secure Windows. Running Firefox in a chroot jail, for example, is not possible, so one must rely on competing firewall products. I trust iptables more than a firewall which repeatedly asks me questions like whether 'firefox' should be permitted to access the internet, though they do a similar job.

Re. antivirus etc., I consider prevention better than cure. I'd rather have a system that is very difficult to attack with viruses than use one which certainly appears to be rather easily assaulted, with hundreds of tools to remove the infection after the event.

So when Linux is available to access the internet with, I see no point in going to the effort of attaching antivirus programs, firewalls and antispyware systems to Windows when I could just leave it without ethernet drivers. Were I an online gamer I would feel differently, but as all I use Windows for is CAD work, it's just not worth the time.

P.S. Executing unknown programs isn't so scary when you can see their source code and restrict their access to the rest of your system, and Debian stable seems to do just fine with very few updates. Failing to configure iptables is unlikely to hurt either, so I don't think you need to worry about any of those things so much as typing a command in wrong and suddenly killing things. Sadly I frequently break things, but I'm reckless and ill-educated so it's hardly a surprise :)

On the subject of VirtualBox, VB 3.0 is out in beta now, and it has hardware DX9/OpenGL2 support.

If I understand that right, when that's running on Linux hosts with Windows guests at least, CAD work will finally be possible in VirtualBox. That would be wildly exciting. Sadly I fear I'm going to be disappointed; I can't see accessing Quadro drivers being important to VirtualBox.
 
It always bemuses me when people grossly exaggerate the flaws of Windows like this. You make it sound like Windows is unsuitable for connection to the internet without some kind of complex and laborious security regime. I'm sure utilising a decent firewall, running anti-virus software, keeping the OS up to date and not executing unknown programs would be sufficient. At least three of those four measures are sensible on any OS, even Linux.



I tend to agree - I look after about sixty machines, and none of my users have had any problems, thanks to judicious firewalling [IE blocking anything that isn't essential] at the gateway, decent AV, and threatening anyone who downloads stuff with ultraviolence. They're a pretty good lot.

I have worked in a school before though, and I wish to god we could have dumped Windows. I spent more time rebuilding virus-riddled machines [the resourceful little tykes always found a way around the measures we used...] than I did actually doing useful stuff, like, you know, making sure their exam results got through to the qualifying authorities on time.

No, Linux/Mac wouldn't have solved the problem, but by god the attack vectors would have been much slimmer!
 
Jon, RE the Quadro drivers: if it's given direct access to the PCI bus via an abstraction layer, then it shouldn't be an issue - the Windows drivers will take care of it from the guest OS, much like plain CPU VT-based hypervisors. That's a mighty big IF though.

As I say, I'm going to have a play just now - see if I can get Compiz working in an Ubuntu guest under Win7 VB, and then see if I can get Aero working in a Win7 guest under Ubuntu VB. If I can get Live for Speed working under DX9 mode, I will be jumping for bloody joy!
 
I really hope so, that would be fantastic. I'm scared of beta software though, so I think I'll wait. The last versions implementation of 3D graphics didn't go so well.

This could mean computer games in virtualbox, something I didn't expect for years and years
 
Well, providing they can get the rest of the DirectX spec [DirectShow, etc.] to work, so that joypads etc. will work out of the box. They already have USB support.

Beta? That's good enough for me. I was running Ubuntu 8.04 when it was alpha code because I am a hardcore ninja hacker, or something.

Although slackware users tell me that Ubuntu is for girls.

But then slackware users tend to smell of wee and don't wash food stains out of their clothes...;)
 
OH MY GIDDY AUNT :cool::cool::cool:

So far: VB 3.0 and Ubuntu Karmic alpha. Took a bit of work [multiple segfaults, etc., could be either seeing as both are prerelease] but a bit of patience got me this:
[screenshot: 3d.png]


Graphics acceleration - that is, full OpenGL passthrough - works a treat under Win7 host with Ubuntu guest.

Right, off to boot into Ubuntu and see if I can get Win7 Aero effects to work...;) updates soon kids.
 
Windows-under-Linux-host update: it works. Not quick though - MX440 speeds [remember those...?]

3DMark01... [screenshot thumbnails]

Needs work, but looking promising :)

Apologies for the thumbnails, but I was too lazy to edit the pictures properly :cool:
 