Friday, November 2, 2007

Introduction to Mind Uploading

INTRODUCTION

Mind uploading refers to the hypothetical transfer of a human mind to an artificial substrate, such as a computer simulation.

Thinkers with a strongly mechanistic view of human intelligence, or a strongly positive view of robot-human social integration, have openly speculated about the possibility and desirability of mind uploading.

In the case where the mind is transferred into a computer, the subject would become a form of artificial intelligence, sometimes called an infomorph. In a case where it is transferred into an artificial body, to which its consciousness is confined, it would also become a robot. In either case it might claim ordinary human rights, particularly if the consciousness within felt (or convincingly simulated feeling) that it was the donor.

Uploading consciousness into bodies created by robotic means is a goal of some in the artificial intelligence community. In the uploading scenario, the physical human brain does not move from its original body into a new robotic shell; rather, the consciousness is assumed to be recorded and/or transferred to a new robotic brain, which generates responses indistinguishable from the original organic brain.

The idea of uploading human consciousness in this manner raises many philosophical questions that people may find both intriguing and disturbing, such as matters of individuality and the soul.

Many people also wonder whether, if they were uploaded, the upload would carry their own sentience or merely a copy of it.

Even if uploading is theoretically possible, there is currently no technology capable of recording or describing mind states in the way imagined, and no one knows how much computational power or storage would be needed to simulate the activity of the mind inside a computer.

On the other hand, advocates of uploading have made various estimates of the amount of computing power that would be needed to simulate a human brain, and based on this a number have estimated that uploading may become possible within decades if trends such as Moore's Law continue.

COPYING VS MOVING

With most projected mind uploading technology it is implicit that "copying" a consciousness could be as feasible as "moving" it, since these technologies generally involve simulating the human brain in a computer of some sort, and digital files such as computer programs can be copied precisely.

It is also possible that the simulation could be created without the need to destroy the original brain, so that the computer-based consciousness would be a copy of the still-living biological person, although some proposed methods such as serial sectioning of the brain would necessarily be destructive. In both cases it is usually assumed that once the two versions are exposed to different sensory inputs, their experiences would begin to diverge, but all their memories up until the moment of the copying would remain the same.

By many definitions, both copies could be considered the "same person" as the single original consciousness before it was copied. At the same time, they can be considered distinct individuals once they begin to diverge, so the issue of which copy "inherits" what could be complicated. This problem is similar to that found when considering the possibility of teleportation, where in some proposed methods it is possible to copy (rather than only move) a mind or person.

This is the classic philosophical issue of personal identity. The problem is made even more serious by the possibility of creating a potentially infinite number of initially identical copies of the original person, which would of course all exist simultaneously as distinct beings.

Philosopher John Locke published "An Essay Concerning Human Understanding" in 1689, in which he proposed the following criterion for personal identity: if you remember thinking something in the past, then you are the same person as he or she who did the thinking. Later philosophers raised various logical snarls, most of them stemming from the application of two-valued (Boolean) logic, under which one either is or is not the same person. It has been proposed that modern fuzzy logic can dissolve those problems, showing that Locke's basic idea is sound if one treats personal identity as a continuous rather than discrete value.

In that case, when a mind is copied (whether during mind uploading, afterwards, or by some other means), the two copies are initially two instances of the very same person, but over time they will gradually become different people to an increasing degree.
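The continuous view of identity sketched above can be made concrete with a toy model. The model below is purely illustrative and not from any proposed uploading scheme: it treats "being the same person" as a fuzzy value in [0, 1], here computed as the fraction of memories two copies share, which starts at 1.0 at the moment of copying and falls gradually as each copy accumulates its own experiences.

```python
def identity_overlap(shared: int, unique_a: int, unique_b: int) -> float:
    """Toy fuzzy measure of personal identity between two copies.

    Returns the fraction of all memories that the two copies have in
    common (a Jaccard-style overlap), as a value between 0.0 and 1.0.
    """
    total = shared + unique_a + unique_b
    if total == 0:
        return 1.0  # no memories at all: trivially identical
    return shared / total


# At the moment of copying, both copies hold only shared memories.
print(identity_overlap(1000, 0, 0))        # 1.0 -- the very same person

# As each copy gains its own experiences, identity diverges gradually,
# never flipping abruptly from "same" to "different".
print(identity_overlap(1000, 250, 250))    # ~0.667
print(identity_overlap(1000, 9000, 9000))  # ~0.053
```

The point of the sketch is only that a graded measure avoids the Boolean snarls: there is no single instant at which the copies stop being "the same person", just a smoothly decreasing degree of identity.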

The issue of copying vs moving is sometimes cited as a reason to think that destructive methods of mind uploading such as serial sectioning of the brain would actually destroy the consciousness of the original and the upload would itself be a mere "copy" of that consciousness. Whether one believes that the original consciousness of the brain would transfer to the upload, that the original consciousness would be destroyed, or that this is simply a matter of definition and the question has no single "objectively true" answer, is ultimately a philosophical question that depends on one's views of philosophy of mind.

Because of these philosophical questions about the survival of consciousness, some would feel more comfortable with a method of uploading in which the transfer is gradual, replacing the original brain with a new substrate over an extended period during which the subject appears to be fully conscious. This can be seen as analogous to the natural biological turnover of molecules in our brains, in which matter taken in through eating and breathing may replace almost all the matter in the brain in as little as a few months.

This would likely take place through gradual cyborging, either nanoscopic or macroscopic, in which the brain (the original copy) is slowly replaced, bit by bit, with artificial parts that function in a near-identical manner. Assuming this were possible at all, the person would not necessarily notice any difference as more and more of his or her brain became artificial.
