DP, what kind of computational equivalent do you think would be comparable to the processing and computational power of the human brain?
It is hard to answer, because the processing power of the human brain comes from massive parallelisation of simple computing units running at relatively slow speeds, while even supposedly parallel supercomputers have a relatively small number of very fast, very complex computing units.
A neuron takes on the order of 10 ms to fire and pass a signal on, so each individual unit runs at only around 100 Hz at best. In the human brain signals are generally fed through 5-10 neural layers, giving an end-to-end processing latency of 50-100 ms, so the brain as a whole behaves like a processor running at roughly 10-20 Hz. Hence 10 Hz or so is about the limit of human perception, and a TV at 25 Hz is perceived as fluid motion.
Silicon transistors switch at gigahertz rates, so they are massively quicker at passing a simple signal through, but millions of transistors and clock cycles are needed to compute even basic information. The biggest supercomputers have thousands of processors, while the human brain has billions of neurons, so there are several orders of magnitude difference in the number of computing units.
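As a rough back-of-the-envelope comparison (every figure below is an order-of-magnitude assumption, not a measurement), the trade-off between many slow units and few fast units looks something like this:

```python
# Rough, order-of-magnitude figures chosen purely for illustration.
neurons        = 1e11   # ~100 billion neurons in a human brain
neuron_rate_hz = 1e2    # each fires at most ~100 times per second
cpu_cores      = 1e4    # a big supercomputer: tens of thousands of cores
core_rate_hz   = 3e9    # each core clocked at a few GHz

print(f"computing units: brain {neurons:.0e} vs supercomputer {cpu_cores:.0e} "
      f"({neurons / cpu_cores:.0e}x more units in the brain)")
print(f"unit speed: neuron {neuron_rate_hz:.0e} Hz vs CPU core {core_rate_hz:.0e} Hz "
      f"({core_rate_hz / neuron_rate_hz:.0e}x faster per silicon unit)")
```

On those assumed numbers the brain has around seven orders of magnitude more computing units, while each silicon unit is tens of millions of times faster, which is exactly why the two architectures are so hard to compare directly.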
IBM have a project in which they are trying to simulate a human brain:
http://www.scientificamerican.com/a...ulates-4-percent-human-brain-all-of-cat-brain
But that is fundamentally much slower than a real brain, because it simulates the chemical and physical processes within each neuron and synapse, which takes a lot more computation than merely computing the information itself. In the brain, the chemical processes within a neuron and synapse don't require any computation at all; they are just chemical reactions, so it is quite inefficient to model them explicitly. As an aside, the speed of chemical reactions has sparked great interest in building chemical processors.
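To see how cheap an abstract neuron can be once the chemistry is ignored, here is a minimal sketch (in Python, with made-up illustrative parameters, and not a description of how the IBM simulation works) of a leaky integrate-and-fire neuron: each time step costs a single arithmetic update, whereas a biophysical simulation has to integrate many coupled ion-channel equations for that same step.

```python
# Minimal leaky integrate-and-fire neuron: one arithmetic update per time step,
# in contrast to the many coupled equations needed to model the real chemistry.
# All constants are illustrative, not fitted to biological data.
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=-0.065,
                 v_reset=-0.070, v_threshold=-0.050, resistance=1e7):
    v = v_rest                      # membrane potential in volts
    spike_times = []
    for step, i_in in enumerate(input_current):
        # One leaky-integration step stands in for all the underlying chemistry.
        v += dt / tau * (v_rest - v + resistance * i_in)
        if v >= v_threshold:        # fire and reset when the threshold is crossed
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant 2 nA input for 200 ms produces a regular train of spikes.
print(simulate_lif([2e-9] * 200))
```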
I think the only real way to judge the computational power of the human brain is to build computational systems that can surpass human abilities in a range of tasks normally associated with human intelligence. Biological brains are fundamentally different to artificial processors, so trying to get them both to do the same thing with the same method is flawed. Trying to get them both to do the same tasks using whichever method is appropriate has some merit.
IBM's Deep Blue chess-playing computer and the new Watson Jeopardy-playing computer show how it is possible to reach human-level performance within a single domain. My manager created the world's first Bridge-playing software, which succeeded in beating the best human experts in the world. More recently he developed a crossword solver that ranked highly in the world championships, and his software runs on a standard laptop.
The coming decades will see a massive rise in the ability of computer software to beat human experts. These systems will also become smaller and simpler, and will run on future iPhones and the like rather than on desktops or large servers.
These systems may not be intelligent in a natural sense and will solve these tasks by alternative means, but that is to be expected from the differences in hardware.
New hardware designs may alleviate many of the limitations of current machines. Chemical or genetic computers would allow massive parallelisation.
Massive parallelisation allows computer systems to break some of the hard rules of sequential computational theory. E.g., it is provable that comparison-based sorting on a sequential machine can be no faster than on the order of N*logN operations, where N is the size of the set of numbers to sort; with N simple units working in parallel, alternative computing technologies can in principle do this faster, perhaps in something closer to logN time.

Another example is the Travelling Salesman Problem (given a set of cities, find the shortest tour that visits all of them). This is an extremely hard problem for computers, NP-hard to be exact: brute-force search over N cities takes on the order of N! (N factorial) steps, so it is impractical for large values of N. Slime moulds (giant single-celled organisms rather than bacteria) have been reported to find good approximate solutions to such routing problems with search times that grow only slowly with the number of cities, because huge numbers of parts of the organism effectively explore routes in parallel.
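To make that factorial blow-up concrete, here is a minimal brute-force TSP sketch in Python (the city names and coordinates are invented purely for this example): it enumerates every permutation of the cities and keeps the shortest closed tour, so the work grows as N!.

```python
# Brute-force TSP sketch: enumerate every possible tour and keep the shortest.
# The city list and coordinates are invented purely for illustration.
from itertools import permutations
from math import dist, factorial

cities = {
    "A": (0.0, 0.0),
    "B": (3.0, 4.0),
    "C": (6.0, 1.0),
    "D": (2.0, 7.0),
    "E": (8.0, 5.0),
}

def tour_length(order):
    # Total length of the closed tour visiting the cities in the given order.
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + order[:1]))

names = list(cities)
best = min(permutations(names), key=tour_length)
print("best tour:", best, "length:", round(tour_length(best), 2))
print("tours examined:", factorial(len(names)))  # N! grows explosively with N
```

With 5 cities that is only 120 permutations, but at 20 cities it is already around 2.4 x 10^18, which is why conventional machines have to fall back on approximations for anything but tiny instances.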