32
$\begingroup$

My story has brain uploading, but in practice it serves a purpose more like reincarnation: recreating the original recorded state of the scanned brain in a pre-prepared clone of the deceased. I need a reason why these people's uploaded consciousnesses wouldn't be able, or wouldn't be allowed, to be conscious while their information is stored on a digital medium (as they are in most fictional accounts of brain uploading), so that the dead can't be consulted or talked to until they reincarnate in a new body.

Maybe running brain simulations of multiple deceased individuals demands too much power or is an inefficient use of resources? Maybe uploaded minds pose some sort of cybersecurity risk? Are there any good reasons why they wouldn't want, wouldn't be able, or wouldn't be allowed to be conscious until reincarnating?

$\endgroup$
4
  • 21
    $\begingroup$ The reason is the same as the reason a USB flash drive can store a program but not run it: storing information requires memory, while executing based on that information requires a processor (of some kind) that can understand and act on it. They require different hardware. $\endgroup$ Commented Sep 14, 2016 at 14:58
  • $\begingroup$ Perhaps a lockdown inhibitor chip in everyone's brain inserted at birth when the cranial bones are still separated? $\endgroup$
    – TylerH
    Commented Sep 14, 2016 at 20:54
  • 3
    $\begingroup$ (1) Technology might be available for exporting and importing a person's personality but not for allowing that personality to "run" on a computer. (2) The personality may develop psychological problems in all systems thus far developed for "running" the information the personality is composed of. (3) People may have found out that a metaphysical or immaterial part of a person exists outside of the physical universe and cannot be replicated with physical technology. (4) In line with reason 3, the metaphysical component of a person would take the energy of a star to successfully simulate or run. $\endgroup$ Commented Sep 15, 2016 at 19:18
  • $\begingroup$ I wonder how you can feel if you are "awake" but without any limb or muscle. Can a mind in this state feel "ghost pain" from the missing organs? $\endgroup$
    – jean
    Commented Sep 15, 2016 at 19:40

15 Answers

61
$\begingroup$

Are there any good reasons why they wouldn't want, wouldn't be able, or wouldn't be allowed to be conscious until reincarnating?

Consciousness isn't a static thing. It's a process that modifies your brain (as in, it changes the strengths of synapses).

For an uploaded brain to be conscious, it has to be modifying itself... constantly. If you're doing that, you're not storing a brain, you're simulating it.

Storing a brain digitally implicitly hits the pause button on consciousness. When you download this brain to a clone, it'll snap back into consciousness without any awareness of how long it's been paused.
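A minimal sketch of that distinction, assuming nothing about real neuroscience (the "plasticity rule" below is a placeholder, not a model):

```python
import copy

class BrainState:
    """A static snapshot: synapse strengths frozen at scan time."""
    def __init__(self, synapse_weights):
        self.synapse_weights = synapse_weights  # e.g. {synapse_id: strength}

def store(brain):
    # Storage is just a faithful copy; nothing in it changes over time.
    return copy.deepcopy(brain)

def simulate_tick(brain, inputs):
    # Consciousness-as-process: every simulated tick MODIFIES the state,
    # so a brain being simulated is no longer the brain that was stored.
    for syn_id, signal in inputs.items():
        brain.synapse_weights[syn_id] += 0.01 * signal  # toy plasticity rule
    return brain

scan = BrainState({"s1": 0.5, "s2": 0.9})
archived = store(scan)                      # paused: no ticks ever run on it
revived = simulate_tick(scan, {"s1": 1.0})  # conscious: state now diverges
```

From the archived copy's point of view, no time passes between the scan and the moment it is written into a clone.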

$\endgroup$
7
  • 4
    $\begingroup$ I suddenly realized that sleep hits the pause button on consciousness. If you wake after having unintentionally fallen asleep, about the first thing you do is check what time it is. But sleep is very far from storage, or even from deep unconsciousness: your brain remains busy and is modifying itself / yourself. $\endgroup$
    – nigel222
    Commented Sep 14, 2016 at 8:30
  • 14
    $\begingroup$ Sleeping does not stop your consciousness; sleeping turns off most (not all) incoming external signals, allowing the brain to spend more energy on self-organizing rather than on storing information. During sleep you enter a different conscious state, but you are still conscious at a certain level. Anesthetics instead block most brain activity (and that's why they can kill if overdosed: they simply stop brain activity). $\endgroup$ Commented Sep 14, 2016 at 8:50
  • $\begingroup$ Rewiring is how we make long-term memories. Short-term memory, I think, is more like the echo of talking to ourselves. I'd expect a ‘frozen’ brain to be able to do pretty much anything that does not involve forming new permanent memories. $\endgroup$ Commented Sep 15, 2016 at 8:34
  • 1
    $\begingroup$ @Mindwin Yes, the inverted commata were meant to imply metaphor. $\endgroup$ Commented Sep 15, 2016 at 16:05
  • 2
    $\begingroup$ @DarioOO I'm pretty sure anesthetics kill you by stopping your heart when overdosed, not the brain. Though of course the brain stops not long after... They're just a pretty tricky poison that needs to be administered very carefully to avoid damage to various internal organs. $\endgroup$
    – Luaan
    Commented Sep 16, 2016 at 6:11
56
$\begingroup$

Everyone here has missed two of the most patently obvious reasons why you wouldn't even be able to run someone's uploaded brain.

Imagine you image the entire drive of an Android phone and store that .iso on your PC's hard drive. Is that file doing anything? Nope, it's just sitting there, frozen in its last state, waiting to be used. Can you run it on the PC? Nope, not without an emulator: it's an image of software that was compiled to run on a completely different architecture.

But Adam, you might protest - surely if we can image a brain, we can run an emulator of the aforementioned brain?

Good heavens, no. Imaging is easy; cross-compiling is hellish. This is why convergence has been such a huge issue in computer software up to now. So you could easily have an image of a brain that you could store, without being able to actually run it - until and unless you installed it back into a brain which can run it.
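If it helps, here is the same point as a toy sketch (the names and "architectures" are purely illustrative):

```python
def store_image(image, archive):
    # Storing needs no understanding of the image's contents:
    # any medium that can hold bytes will do.
    archive.append(dict(image))

def run_image(image, host_arch):
    # Running requires hardware (or an emulator) matching the target.
    if image["arch"] != host_arch:
        raise RuntimeError(
            f"image targets {image['arch']!r} but host is {host_arch!r}; "
            "an emulator would be needed, and nobody knows how to build one"
        )
    print("running", image["name"])

archive = []
scan = {"name": "alice.brain", "arch": "human-cortex"}
store_image(scan, archive)             # fine: bytes are bytes
run_image(scan, host_arch="x86-64")    # raises: wrong architecture
```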

Now, for a special bonus round, courtesy of your local neighborhood CogSci fanatic (that's me):

  1. If you download a brain's image into a brain, does that overwrite the brain image that was there before? Can you even have a working brain without any image on it? Can you store multiple brains in partitions on a single hardware assembly? Do they dual-boot or run together on top of some kind of hypervisor instead?

  2. Can you install a child's brain image into an adult brain? What about the reverse? How about a man's image into a woman's brain? If you image hemispheres from different people, can you splice them together to create a "gold image" of a brain that will work once installed?

  3. If you can cross-compile a human brain image to run on a computer, could you also cross-compile for a dolphin target? What about running a dolphin image in a human target?

  4. If you back up someone's brain onto a computer and overwrite their existing brain a year later, do they lose all changes in the intervening time period, or is it a selective overwrite?

  5. Can you put viruses into people's brain images?

  6. Can you access data on a brain image without running it? You can totally do that for computer .iso images, for example. If so, what happens when everyone is uploading their brain and algorithms from Google are crawling our memeplexes, making the entire human experience searchable, indexed, and hyperlinked?

  7. Can you compress a human brain image to save space? What about encrypting it? Licensing it?

  8. Are altered brain images the property of the person they took the base image from, or the property of the individual/group which made the alterations? What about when they are used in bodies?

  9. Do brain images have to include the rest of the CNS? Can you take different size images with different resolutions?

  10. Could you cross-compile Windows (obviously not, but this is a concrete stand-in for "synthetic software" of some kind) to run in a human brain? Whose property is the resulting individual? What if it is released as OSS?

$\endgroup$
20
  • 3
    $\begingroup$ While I agree with this answer, I think it is too hard to understand for laymen. The only people who can understand this answer are the ones who knew it already. $\endgroup$
    – Erik
    Commented Sep 14, 2016 at 10:41
  • 2
    $\begingroup$ I joined this community just to upvote this post! Dual-boot brain made my day! $\endgroup$ Commented Sep 14, 2016 at 17:36
  • 1
    $\begingroup$ @Erik There will be people with computer science knowledge but no cog sci knowledge who would find this useful. And the existence of "you did not consider" is useful even if you don't understand what you didn't consider. $\endgroup$
    – Yakk
    Commented Sep 14, 2016 at 19:24
  • 1
    $\begingroup$ #6 - en.wikipedia.org/wiki/Mindkiller and its sequels. Written in the early '80s. A good treatise on the subject, without getting bogged down in the technical details. $\endgroup$
    – Lord Dust
    Commented Sep 14, 2016 at 20:34
  • 1
    $\begingroup$ @LordDust as long as we're swapping cogsci novels (and thanks for that one, btw), do yourself a favor and go read Peter Watts's Blindsight, Vernor Vinge's A Fire Upon the Deep, William Gibson's Idoru, and K. W. Jeter's Noir. $\endgroup$
    – Adam Wykes
    Commented Sep 14, 2016 at 20:44
15
$\begingroup$

We Run Into (a Similar) Problem Every Day NOW - So You're OK

In the field of quantum computing, algorithms and entire programs have been constructed that we have no way of running because we don't have the right type of computer. We can upload the instructions to our laptops or supercomputers, but they just won't work because they are not designed to - and we don't know how to design computers that could run them!
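To make the quantum example concrete, here's a hedged back-of-envelope (a naive state-vector simulation needs 2^n complex amplitudes at 16 bytes each; real simulators have cleverer tricks, but the blow-up is the point):

```python
# Memory to naively simulate an n-qubit quantum state on a classical machine.
def sim_memory_gb(n_qubits):
    return (2 ** n_qubits) * 16 / 1e9  # 2**n amplitudes, 16 bytes each

for n in (20, 30, 40, 50):
    print(f"{n} qubits: {sim_memory_gb(n):,.2f} GB")
# 30 qubits fit in a large workstation's RAM; 50 qubits need ~18 PB.
# Storing the algorithm is trivial; the machine to run it doesn't exist.
```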

Closer to this question: today we can watch neurons firing (maybe not EVERY neuron, but go with me on this) and can map a lot of them out in terms of general function with systems like MRI machines... but we can't actually re-create a thinking brain.

Simply put, storing data about how something works and allowing a computer to ACTUALLY turn that data into a simulation of something real are two very different things. So feel free to upload your human brain information all you want and NOT have those brains "turn on" while in the computer.

$\endgroup$
5
  • 2
    $\begingroup$ In addition, it's very easy to make simple programs that describe how to solve a complex problem, but are so computationally expensive to execute that we can't actually run them. $\endgroup$
    – Erik
    Commented Sep 14, 2016 at 10:03
  • $\begingroup$ @erik Actually we CAN run them. They just don't complete in a usefully short period of time. $\endgroup$
    – Tonny
    Commented Sep 14, 2016 at 11:12
  • 4
    $\begingroup$ I would consider "we can't run it" and "we can't complete it before the sun explodes" to be equivalent, but yeah, technically, we could START running them. ;) $\endgroup$
    – Erik
    Commented Sep 14, 2016 at 11:18
  • 1
    $\begingroup$ You can emulate a quantum computing BQP-time algorithm in PSPACE on a traditional computer. Which means you just need a pile of hardware, a really small problem, and lots of time to run it. $\endgroup$
    – Yakk
    Commented Sep 14, 2016 at 19:25
  • $\begingroup$ True, but the OP wants to ensure people couldn't interrogate brains while they are stored in a computer. This would be like trying to emulate a quantum computer to break modern cryptography: given that the computer running that simulation would not find the answer before the sun burns out, I would argue it is impossible to do useful work (aside from the useful work of debugging the programs/algorithms themselves). $\endgroup$
    – GrinningX
    Commented Sep 15, 2016 at 15:18
12
$\begingroup$

It may be ethically or morally wrong to do so.

The brain works by processing information that arrives as a very large, broadband collection of signals passing up the spinal column from every nerve ending in the body. Without a body, there is no incoming signal to process. Allowing a brain in silico to be conscious without being attached to a body would be equivalent to total sensory deprivation and could be viewed as cruel and inhumane.

Edit

For clarity: sensory deprivation, as practiced voluntarily for short periods in devices like sensory deprivation tanks, can help people relax. However:

[...] extended or forced sensory deprivation can result in extreme anxiety, hallucinations, bizarre thoughts, and depression.

$\endgroup$
10
$\begingroup$

4.7 bits of information per synapse, and 100 trillion synapses in total. This seems like a lot, but it works out to only about 58 terabytes.


As you can see, the "state" of a brain can be recorded on a hard disk array that is available already. It's cheap; it is doable with today's technology. Want more? Record the levels at 100 times that resolution. It's still under 6 petabytes: large, but you can readily buy 57-petabyte disk arrays. If that "trillion" was on the long scale, it's pushing it, but still doable.
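As a sanity check, the arithmetic in a few lines of Python (the figures are the ones quoted above, not independent measurements):

```python
bits_per_synapse = 4.7
synapses = 100e12                  # short-scale 100 trillion

base_bytes = bits_per_synapse * synapses / 8
print(f"{base_bytes / 1e12:.1f} TB")        # ~58.8 TB at base resolution
print(f"{base_bytes * 100 / 1e15:.1f} PB")  # ~5.9 PB at 100x resolution
```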

On the other hand, look at how voice commands work. Apple and Google have put a lot of money into making their "assistants" useful, but they are far, far from AI. Simply: not enough processing power, not enough understanding.

So if you only add a reader-writer device, the current technology level explains why you can store a brain state but not simulate how it works.

$\endgroup$
2
  • 2
    $\begingroup$ Even simpler: disk random seek time. The OP's story has no true brain uploading, only brain storage plus restoration to a new body. Make them sequential operations! Use hard drives because of the prohibitive cost of RAM (SSDs are impractical: massive writes). Now you cannot update the state of neurons, because synapse B depends on synapse A, which was stored one physical disk track behind, and on synapse C, which was stored one track forward (synapses are 3D, disk storage is 2D), so the disk head needs to physically reposition. The emulation "framerate" becomes unbearably slow and impractical. $\endgroup$
    – kubanczyk
    Commented Sep 15, 2016 at 11:48
  • $\begingroup$ Where do you store the structure of the 100 trillion synapses? The 4.7 bits are only the estimated cost of storing a synapse's state. Are you assuming that every human brain has the exact same synapse network? I doubt it. $\endgroup$
    – Vandroiy
    Commented Sep 18, 2016 at 10:38
6
$\begingroup$

Simply make the amount of data that is recorded "very large" compared to the processing capability. Storing data is easy, thinking about data is hard.

I'm not sure on the specifics for a "science based" answer, but recording the position and state of every atom/molecule in a brain takes A LOT of memory, and trying to run a brain on a technological "brain hardware emulator" is bound to have issues, especially if you're running on a binary-type processing scheme. (Hint: brains don't work in binary, so massive and slow translations to/from binary need to happen.) Also, I bet you will corrupt the data quicker than anything once part of the simulated brain "hogs" all the processing power and leaves the rest in a wait state.
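For a sense of scale, a deliberately rough sketch; every number below is an assumption for illustration, not a measured figure:

```python
# A ~1.4 kg brain is mostly water, so on the order of 1e26 atoms.
atoms_in_brain = 1e26
bytes_per_atom = 32     # say: position plus some per-atom state

total_bytes = atoms_in_brain * bytes_per_atom
print(f"{total_bytes / 1e21:,.0f} ZB")   # ~3,200 zettabytes
# Vastly more than all the digital data humanity has ever stored, just to
# HOLD one atomic-resolution snapshot, never mind simulating its dynamics.
```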

There are of course ways around the above problems, such as massive parallel computing, but such setups would take the size of whole stadiums (<- that's a guess) with today's technology, and consume absurd amounts of power.

The question for your future people is not really "can we do it"... it's more "do I have a trillion dollars a month to spend doing this?"

$\endgroup$
4
  • $\begingroup$ "size of whole stadiums (<- that's a guess) with today's technology" wait what? That wouldn't even get close. We hardly understand the brain at all at this point, but what we definitely do know is that we can't yet build that kind of processing power to run such complex interactions. $\endgroup$ Commented Sep 14, 2016 at 8:02
    $\begingroup$ @david mulder I'm guesstimating an array of en.wikipedia.org/wiki/Field-programmable_gate_array's at two to three times the count of neurons. Also, google what "that's a guess" means. $\endgroup$
    – Marky
    Commented Sep 14, 2016 at 13:35
  • $\begingroup$ At the very least, we know that it is possible to run a human consciousness on hardware that is smaller than a human head. You're doing it right now. $\endgroup$
    – Brian
    Commented Sep 16, 2016 at 15:39
    $\begingroup$ @Brian Sure, but the trick is to make that hardware in a way that it's not already "occupied", isn't it? It also takes an amazingly complex molecular assembly machine upwards of 19 to 20 years to make one in its "finished" state, and even then it's error-prone. We also don't know how everything fits together with that tech base, so using a tech base we DO understand confers advantages you would otherwise lose or pay more for. $\endgroup$
    – Marky
    Commented Sep 16, 2016 at 21:40
4
$\begingroup$

Fully immersive VR is addictive

If you were conscious with no bodily movement, and no choice over what you see and hear or whom you interact with, you might well go mad, become depressed, or want to die. The trauma would stay with you even after being reimplanted in a body.

So a conscious upload would need a VR in which to live. But emulating exactly the degrees of freedom of reality would be really hard, and even if the VR was not perceived as outright better than the real world... its citizens would meet people and form relationships, and being yanked out and dumped into a flesh body, because the person they were once a copy of had died, would be traumatic.

Finally, many stored copies will never be reincarnated. They'll be overwritten with newer backups or tipped in the garbage. No moral or ethical problem, unless you let them execute. The moment you do that, they are people. They will pretty soon become different individuals from their originals. Stopping and overwriting them would be murder.

So the answer is not to let the copied brain state experience anything until it wakes up in a new body and is told that its original is dead. (Or wakes up in VR heaven and is asked to decide between a new body or donating all its worldly wealth to keeping the simulator running.)

$\endgroup$
1
  • $\begingroup$ I like this answer a lot. It goes beyond "you just don't have the hardware" to something far more practical. Besides, if we're really downloading/uploading people's brains, I think technology is not an issue, so we could very well have an emulator, or an intermediary host (like just a brain connected to a computer), or something like that. The trauma thing makes a lot of sense, and it gives the story a lot to talk about: ethical implications, people doing it no matter what, even a 'brain police'... endless possibilities here. $\endgroup$ Commented Sep 16, 2016 at 9:20
3
$\begingroup$

Pain. A brain needs to be able to communicate with the rest of the body. A downloaded brain has all the information but none of the connections to nerves, and would be in too much pain to think and communicate properly. If you turned it on, that painful experience would be such a shock that it would kill it.

$\endgroup$
2
$\begingroup$

If you accept the Chinese room argument, then simulating a mind doesn't give you consciousness. That is, you can fully simulate the mind, but you don't get a conscious entity, just like if you simulate water you don't get wet. John Searle, the philosopher who devised the Chinese room argument, suggests that the brain has causal properties which computers don't necessarily have. Until those causal properties are known, it would be impossible for the uploaded brain to have consciousness while in a computer.

In addition, you'd need to suggest that consciousness provides a crucial characteristic which, since it can't be simulated, is vital to the personality functioning properly. You could say that such simulated personalities come across as empty and don't possess the true character of the original person.

$\endgroup$
12
  • $\begingroup$ The Chinese Room argument is pretty interesting. But it does not say whether we ourselves are the simulated thing. Whether we are simulated or not, the problem remains open even with the Chinese Room argument. $\endgroup$ Commented Sep 14, 2016 at 8:57
  • $\begingroup$ I see a fairly serious flaw in the Chinese Room argument. I'm certain that the guy in the room, while executing the problem on paper, would learn to read and write Chinese before he completed the first 3 messages. $\endgroup$
    – Erik
    Commented Sep 14, 2016 at 10:13
  • $\begingroup$ @Erik the Chinese Room argument has been around since 1980 and has withstood a lot of objections. Instead of imagining the person in the room directly reading Chinese, imagine that he is hand executing a program, perhaps in machine code. $\endgroup$
    – ThomasW
    Commented Sep 14, 2016 at 10:24
  • $\begingroup$ Even then. Receiving unknown inputs and then figuring out which unfamiliar outputs to give in response to them is literally how humans learn to communicate with one another as toddlers. He would either learn to understand the symbols through execution, or he would have proven that the machine is smarter than he is, because it has given him literally every step of the process and he still can't replicate what the machine does. $\endgroup$
    – Erik
    Commented Sep 14, 2016 at 10:38
  • 1
    $\begingroup$ @monodokimes The problem is that the initial program IS figuring a lot of things out. A program can't pass the Turing test unless it contains instructions to understand the context of the conversation, which means that by manually following the instructions, you will be forced to do the steps that have context interpretation and answer construction. Which are pretty much all the steps you need to learn Chinese. $\endgroup$
    – Erik
    Commented Sep 14, 2016 at 11:34
2
$\begingroup$

Brain state is also bound to received input.

Simulating just the brain, without the body, the environment, and all the incoming signals, would be much like removing the brain from someone and keeping it alive. We do not know whether that is painful, or whether, once we do it, we can revert the brain to its previous state (once you detach the brain, the signals become different and it starts to react accordingly); we also do not know whether it would end in cerebral death (the cessation of all signals).

It is much easier to store the brain state, then restore it and let the "simulation" be the real brain in a real body in a real world.

$\endgroup$
1
  • 1
    $\begingroup$ "removing the brain from someone and keeping it alive" - has been (kind of) tested and proven to not do much. Pick any person maimed in accident or war, with a severed spinal cord, maybe blinded and deaf for good measure. They certainly will be as intelligent/conscious as before, if perhaps quite a bit more crazy and less happy about their life. $\endgroup$
    – AnoE
    Commented Sep 15, 2016 at 9:32
2
$\begingroup$

There might be legal or privacy concerns.

First of all, a stored consciousness would be functionally identical to the living consciousness at the time of backup. Understandably, an awake copy might want to do some things to pass the time (watch space Netflix, play games, gamble, etc.), which would be a real problem for the living individual. If the backups were awake, there'd be lawsuits in no time over who is the rightful owner of certain items purchased in that time. There'd be people suing their own backups, people freaking out because someone "hacked" into their bank account, and so on. And of course there'd be the 'smart' ones who get a backup taken, change their important passwords, get killed, and then figure out that the backup can't access their assets.
All of those are good practical reasons why you wouldn't want the backup awake.

But privacy is a better reason (or a different one at least). Imagine if your backup was awake and someone malicious (a branch of the government, criminals, a sleazy insurance salesperson, your mother-in-law, etc.) got access to it and started pulling data from it. It doesn't even have to be anything dramatic. You go in for a backup, the machine turns on, and the next thing you (backup you) know, you're in a room being interrogated, or even tortured, for days upon days until you tell the man in the black suit everything you've ever done wrong. Once they have a nice big list of things to fine you for (or worse: 'behaviorally correct' you for), they delete the interrogated backup, restore the previous one, and BAM, you find yourself on the wrong end of the government's sights.
The same could happen with criminals extracting passwords, blackmail material, security features for your workplace, etc.

For those reasons, I imagine that human rights activists, privacy advocates, and other interest groups would lobby strongly for heavy encryption on backups, and would likely also push through rules and regulations to ensure backups remain 'asleep' until they are needed.

$\endgroup$
2
$\begingroup$

The core of my solution lies in throwing away Cartesian dualism.


Forget the viewpoint that treats the consciousness as the software and the brain as the hardware; this sort of thinking is long outdated. The consciousness constantly "rewires" the hardware of the brain, and, while we might be able to read the consciousness out of the living tissue, and we might also be able to simulate a brain to run the consciousness on, the consciousness will rewire this virtual brain, and we might or might not be able to apply this rewiring to the target physical brain.

In the latter case, the implanted consciousness that ran on a simulated brain for a while will suffer significant losses, not only of its development during the simulated state but also of its former abilities, thanks to the associative nature of human memory. So a consciousness run on a simulated brain will almost surely be handicapped when planted into a physical brain.

$\endgroup$
1
$\begingroup$

So, you want to create a world similar to the one described in "Altered Carbon", where a person can be stored and restored later in a different body. There, people remember only the things they experienced up to their last remote storage update, or up to their death (if their storage was saved; if not, then it is a real death).

In this world, it is possible to recall "dead" people back to life, and to update their "memories" so they "remember". It is also possible to put people in virtual environments, where they continue to "live" without a body. They have to pay for their virtual apartments and life.

Also, it is possible to run all kinds of simulations to interrogate or torture people. And it is possible to get people out of the virtual prisons (real prisons were abandoned, since they tie up the criminal's body).

$\endgroup$
0
$\begingroup$

Copyright concerns. Laws could be in place that would make "minds" not carried by a cyborg body impossible to activate, thereby preserving the integrity of, and belief in, digital reincarnation.

Having multiple copies would undermine belief in a reincarnation like this. To avoid such a thing, laws have been put in place that simply bypass it by making "non-attached" minds illegal. The technology to download/copy a mind could be designed with fail-safes to preserve the integrity of digital beings.

Is the soul unique? Is its copy?

$\endgroup$
0
$\begingroup$

This would make an excellent case for a split in society.

The Transcendents

They want to live in the simulation, forever. Not much to say about them; it is what it is. They need the Engineers to keep the computers running. Pre-Transcendents spend their physical lives working toward the goal of transferring into the simulation.

The Engineers

They like the idea, but are happy to provide services for the Transcendents, keeping their computers running. They likely get some perks out of it. They are religious fanatics and will do anything to keep the simulation running. Most of them are Pre-Transcendents.

The Conservationists + answer to your question

They oppose the idea for moral reasons. God does not wish it (or he would have done it himself). Also, imagine the fraud that could be possible. Imagine terrorist attacks of whole new dimensions (e.g., putting a virtual mind into a torture chamber with high processing power, to be tortured for years in-simulation during a span of only minutes of real time). The ones staying behind will never be able to let their virtual relatives go. And the waste of energy for all this madness! Besides, they surely aren't really conscious anyway, and it's just a hoax to collect the monthly fee from the deaders...

The Realists

They see the economic problems: what happens if all we have left is a world-spanning computer, and the physical humans (whether Engineers or Conservationists) number fewer than the critical minimum?

.

You certainly can come up with more. It has been done in literature before. ;)

$\endgroup$
