Name: Anonymous 2008-04-21 16:24
Copying vs. moving
With most projected mind uploading technology it is implicit that "copying" a consciousness could be as feasible as "moving" it, since these technologies generally involve simulating the human brain in a computer of some sort, and digital files such as computer programs can be copied precisely. It is also possible that the simulation could be created without the need to destroy the original brain, so that the computer-based consciousness would be a copy of the still-living biological person, although some proposed methods such as serial sectioning of the brain would necessarily be destructive. In both cases it is usually assumed that once the two versions are exposed to different sensory inputs, their experiences would begin to diverge, but all their memories up until the moment of the copying would remain the same.
By many definitions, both copies could be considered the "same person" as the single original consciousness before it was copied. At the same time, they can be considered distinct individuals once they begin to diverge, so the issue of which copy "inherits" what could be complicated. This problem is similar to that found when considering the possibility of teleportation, where in some proposed methods it is possible to copy (rather than only move) a mind or person. This is the classic philosophical issue of personal identity. The problem is made even more serious by the possibility of creating a potentially infinite number of initially identical copies of the original person, which would of course all exist simultaneously as distinct beings.
Philosopher John Locke published "An Essay Concerning Human Understanding" in 1689, in which he proposed the following criterion for personal identity: if you remember thinking something in the past, then you are the same person as the one who did the thinking. Later philosophers raised various logical snarls, most of them caused by applying Boolean logic, the prevalent logic system at the time. It has been proposed that modern fuzzy logic can resolve those problems,[11] showing that Locke's basic idea is sound if one treats personal identity as a continuous rather than a discrete value.
In that case, when a mind is copied (whether during mind uploading, afterwards, or by some other means), the two copies are initially two instances of the very same person, but over time they gradually become different people to an increasing degree.
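The continuous view of identity described above can be illustrated with a toy model (this is an illustrative sketch, not something proposed in the text or in reference [11]): if we represent each copy's experience as a set of memories, the "degree of sameness" between two copies can be measured as the overlap of those sets, which starts at 1.0 at the moment of copying and falls as their experiences diverge.

```python
# Toy model of personal identity as a continuous value (illustrative only).
# Each person is represented by a set of memories; the degree to which two
# copies are "the same person" is the Jaccard similarity of their memories.

def identity_degree(memories_a: set, memories_b: set) -> float:
    """Return a continuous identity value in [0.0, 1.0]."""
    if not memories_a and not memories_b:
        return 1.0  # two empty minds are trivially identical
    return len(memories_a & memories_b) / len(memories_a | memories_b)

# At the moment of copying, both copies share every memory.
original = {"childhood home", "first job", "day of the upload"}
copy = set(original)
print(identity_degree(original, copy))  # 1.0 -- fully the "same person"

# As the copies are exposed to different sensory inputs, identity
# decreases gradually rather than flipping from "same" to "different".
original.add("walked on the beach")
copy.add("explored a virtual city")
print(round(identity_degree(original, copy), 2))  # 0.6
```

The point of the sketch is only that a graded measure avoids the all-or-nothing snarls of Boolean identity: the two copies are neither strictly the same person nor strictly different people, but the same person to a degree that shrinks over time.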
The issue of copying vs. moving is sometimes cited as a reason to think that destructive methods of mind uploading, such as serial sectioning of the brain, would actually destroy the consciousness of the original, and that the upload would itself be a mere "copy" of that consciousness. Whether one believes that the original consciousness would transfer to the upload, that it would be destroyed, or that this is simply a matter of definition with no single "objectively true" answer, is ultimately a philosophical question that depends on one's views in the philosophy of mind.
Because of these philosophical questions about the survival of consciousness, some would feel more comfortable with a method of uploading in which the transfer is gradual, replacing the original brain with a new substrate over an extended period during which the subject appears to be fully conscious. This can be seen as analogous to the natural biological replacement of molecules in our brains with new ones taken in through eating and breathing, which may lead to almost all the matter in our brains being replaced in as little as a few months.[12] As mentioned above, this would likely take place through gradual cyborging, either nanoscopically or macroscopically, in which the brain (the original copy) would slowly be replaced, bit by bit, with artificial parts that function in a near-identical manner. Assuming this were possible at all, the person would not necessarily notice any difference as more and more of their brain became artificial. A gradual transfer also raises questions of identity similar to the classical Ship of Theseus paradox, although the natural replacement of molecules in the brain through eating and breathing raises these questions as well.
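The gradual-replacement scenario can be sketched as a small simulation (a toy illustration under assumptions not in the text: discrete components and one functionally identical swap per step). At every step, behavior is unchanged while the fraction of original biological substrate falls smoothly from 1.0 to 0.0, which is the Ship of Theseus situation in miniature: there is no single step at which the person obviously stops being the original.

```python
# Toy sketch of gradual substrate replacement (illustrative assumption:
# the brain is modeled as discrete components, each swapped one at a time
# for a functionally identical artificial part).

def gradual_replacement(n_components: int):
    """Yield the remaining fraction of biological substrate after each swap."""
    substrate = ["biological"] * n_components
    for i in range(n_components):
        substrate[i] = "artificial"  # near-identical function, new material
        yield substrate.count("biological") / n_components

# The fraction declines smoothly; no step marks a sudden loss of identity.
for fraction in gradual_replacement(4):
    print(f"original substrate remaining: {fraction:.2f}")
# original substrate remaining: 0.75
# original substrate remaining: 0.50
# original substrate remaining: 0.25
# original substrate remaining: 0.00
```

Note that the same smooth decline describes the natural molecular turnover mentioned above, which is why the paradox arises for ordinary biological brains too.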