The paradox of self-replicating machines - An OcUK theorem

Like the arm robots in Lexx?

Stanley, I'm armed! "We fight for Xev":p

I have lots of favourite Lexx episodes, but one of my favourite favourites was "The Game", where Kai plays chess with Prince - a tribute to the Ingmar Bergman film "The Seventh Seal".
 
Japanese sumo bots. Watch this. Seriously, just watch it. This is mental. And going by the feet in the background, this isn't sped up either.

 
Had a few beers and was pondering...

I think self-replicating machines of any significant complexity must be paradoxical, which disproves the whole von Neumann probes theory of why the universe should be teeming with life... the great silence... Kardashev scales etc. You name it, it is gone!

Let's break it down... Suppose we define the complexity of a machine by its abilities, such as the following list (for example). Obviously, the more features, the greater the complexity sum, if you will.

Propulsion and navigation
Sensing/resource hunting
Decision making
Material and Energy Extraction

(to name but a few)

Let's say we add all of these together and get a combined complexity of U+V+W+X=Z... But now we have to incorporate the self-replicating parameter. The lay person might think, OK, just add Y, so the total complexity becomes U+V+W+X+Y=Z+Y... Simple, right? Well, no, because the paradox states that the fundamental need to make something self-replicating will influence every parameter. So maybe we should say (U+Z1)+(V+Z2)+(W+Z3)+(X+Z4)=Z+Z1+Z2+Z3+Z4... And now we have the paradox, because we also need to include the added complexity of self-replicating the self-replication machinery itself... and then we just get complexity that equates to a sum to infinity (by definition).

In simple terms, the added complexity of self-replicating the various abilities adds an ever-growing level of complexity that must itself be self-replicable... and so on and so on.

Did a quick Google but didn't hit any citations for self-replicating machine paradoxes, so I think I must be on to something... I haven't fully thought this through and I am sure the maths is a little sketchy, but I really believe I have come up with a unique bit of academic work here. Is anyone on here willing to help me flesh this out, and perhaps we can look to publish a peer-reviewed journal paper? Would be pretty cool if we could list OcUK in the acknowledgements section - maybe we get some free stuff?

Recursion is not a paradox. Nor have you shown that added complexity must be greater than the ability to manage it.

If I give you a series such as 1 + 0.1 + 0.01 + 0.001 + 0.0001 [...], that is an ever-increasing sum. It goes on forever (shut up, Planck!) and gets larger at every step. And yet it tends towards 1.111111 recurring. It gets larger all the time, yet never reaches 2. Your conceptual error (the chief one, anyway) is to presume that the increase in complexity does not tend towards a maximum. Your Z1, Z2, etc., can be smaller every time.
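A minimal sketch of that convergence in code (purely illustrative):

```python
# Partial sums of 1 + 0.1 + 0.01 + 0.001 + ...
# Each term adds something, so the sum grows at every step, yet the
# limit is the geometric-series value 1 / (1 - 0.1) = 10/9 = 1.111...
def partial_sum(n_terms):
    return sum(10.0 ** -k for k in range(n_terms))

for n in (1, 2, 5, 10):
    print(n, partial_sum(n))

# The sum increases forever but never reaches 2 - in the same way,
# the Z1, Z2, ... overhead terms can shrink fast enough that total
# complexity stays bounded.
```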

Furthermore, as an engineer rather than a scientist, I assess things by comparison to reality. ;) I can look in the mirror and see a self-replicating machine any time I choose. If theory says something is not possible but evidence says otherwise, then the theory, no matter how much you love it, is wrong.
 
I disagree with the OP's statement that all the factors are changed by the addition of self replication as a function. Taking the list of functions given in the original post:

Propulsion and navigation, sensing/resource hunting, and material and energy extraction - all of these are generic functions which would be needed for general travel and self repair. As such, you don't need to make changes to them; you just change the incoming required "shopping list" in the case of a replication versus a self repair.

Decision making - this would require an extension to allow for the choice to fully replicate. Information on how to create the replica would just be a superset of the information on how to repair individual parts of the unit.

You would need some physical capacity to create the replica taking the input from above functions but again there would be a large overlap with any self repair function. Think about building a processing unit for a replica compared with creating one to replace part of the original unit. The only difference is the platform it is being installed into, the gathering of resources and their processing is the same in both cases.
 
Good, practical post.
 
If we look at known examples on Earth that already do something similar, then it's inevitable that self-replicating machines will continually improve themselves over time.

All life on this planet, given the right circumstances, improves itself with each generation (more correctly it adapts to the environment with each generation). I think it's inevitable that we will one day create machines capable of that.
 
I don't see self replication as a paradox. Hell, we're here, along with many forms of organic machines barely identifiable as 'alive': viruses, bacteria. Self-replicating the self-replication process seems trivial, and not that paradoxical. I'm not well versed in the subject, but I wonder if we've synthesised self-replicating molecules already.
 
Organic life self replicates, so your theory is flawed.

The self replicating machines that we eventually create will use single celled organisms for their inspiration.
 
Don't worry, I haven't abandoned this... I have worked on the theory a bit more... I need to put the maths together, but I can give you a basic explanation.

Machine needs to make a drill bit (for example) - let's call it part C... Machine needs to use part A + part B in order to make part C... Machine now able to make part C... Machine needs to make part A in case it breaks during the making of part C... Machine needs to make part D in order to make part A... Machine needs to make part E in order to make part D... And so on and so forth... We haven't even considered making part B... But what if part A + part B were actually part(100)? We would have to consider 100 parts! SO! Not only do we have a paradox, we also have the question of what happens if a part of the self-replicating machine fails during replication... Machine basically screwed, right? If we include a breakage kernel in our equations, we essentially consign self replication to the galactic dustbin. Bye bye von Neumann indeed! It's like playing Ultima Online on the Siege Perilous server, but with the odds stacked against you so that it is impossible to mine more material than it takes to replace your tools before they break...
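The part chain above can be sketched as a dependency walk (the part names and the dependency table are made up for illustration):

```python
# Hypothetical dependency table: to make C you need A and B,
# to make A you need D, to make D you need E, and so on.
needs = {
    "C": ["A", "B"],
    "A": ["D"],
    "D": ["E"],
    "B": [],
    "E": [],
}

def all_parts_needed(part, needs):
    """Transitive closure of dependencies: every part required,
    directly or indirectly, to make `part`."""
    seen = set()
    stack = [part]
    while stack:
        p = stack.pop()
        for dep in needs.get(p, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(sorted(all_parts_needed("C", needs)))  # → ['A', 'B', 'D', 'E']
```

Note that as long as the parts list is finite, the walk terminates with a finite (if large) set, rather than running on forever.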

And that is just for basic tools... With self replication you have the additional maths that I have thrown in... Seriously, I think I am on to something here... We can prove it and get a journal article in Nature etc. if we can get the maths together. What do you reckon?

PS I have been drinking gin tonight as opposed to beer; I think it has given me more insight. Would love to hear your opinion.
 

I'm afraid the above is not mathematics. Mathematics is precise. It is THE most precise of sciences. Throwing in undefined terms, quantities and relationships immediately prevents you drawing solid conclusions.

As a quick example of how this leads to flaws:

"Kernel breakage". I will guess that you mean if a necessary component breaks preventing replication. But you do not quantify the likelihood of this. Suppose the chance of it breaking is 50% after each time it is used. A makes B and either it breaks in doing so, or it doesn't. Now for the next iteration either A and B both make a new one or (if A broke its component), only B does. And both may or may not individually break in doing so. So you have a 50% chance of ending up with two functioning machines on the second generation. Even if A breaks and you're left only with B functioning, you're only back where you started and have a fifty-fifty chance of ending up with B+C on the third generation (second iteration). If A doesn't break then you now have two functioning machines and at worst you end up with C and D on the third generation, but there's now a 75% chance of ending up with three or more.

So you see, even if there is a 50% chance of any given machine becoming inoperable after every single reproduction ("kernel breakage"), you will inevitably hit critical mass and get a self-perpetuating population of machines.

All this is a colossal simplification. The point is to show that loose arguments such as "include kernel breakage" cannot be called maths in any sense, nor used to draw any useful conclusions. For it to be maths, you need to start doing what I do above, which is to deal in actual specifics such as "let the chance of a machine becoming unable to reproduce after creating a copy of itself be X".
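To make the point concrete, here is a rough Monte Carlo sketch of the 50% breakage argument above (the model and numbers are illustrative, not a serious simulation):

```python
import random

def simulate(generations=20, p_break=0.5, seed=42):
    """Each working machine builds one copy per generation, then becomes
    inoperable with probability p_break. The expected growth factor per
    generation is 1 + (1 - p_break) = 1.5, so the population grows."""
    rng = random.Random(seed)
    population = 1
    history = [population]
    for _ in range(generations):
        copies = population  # every working machine produces one copy
        survivors = sum(rng.random() > p_break for _ in range(population))
        population = copies + survivors
        history.append(population)
    return history

# The population can never shrink (the copies always exist), and on
# average it multiplies by 1.5 each generation.
print(simulate())
```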
 
I think you need to do more reading, probably whilst sober.

Still, it is good for someone to be interested in this stuff and try to puzzle it out, at least. I applaud people thinking about such things rather than celebrity gossip and other such ephemeral topics.
 