29
$\begingroup$

My story takes place in a near future where robots have become so realistic that it is impossible to tell them apart from humans visually. Hyperreal human robots are the latest fad: the robot has its own bodily functions, needs to eat, and can die. Where the story takes a turn is that a mechanic selling robots at an affordable price is actually selling real people who have been convinced that they are robots.

Would it be possible to convince someone that they are non-human?

Would the mechanic have to acquire them at a young age, or does that not matter, depending on the individual and the drugs available?

$\endgroup$
14
  • 3
    $\begingroup$ Classic episode en.wikipedia.org/wiki/Insane_in_the_Mainframe $\endgroup$ Commented Sep 25, 2020 at 14:07
  • 10
    $\begingroup$ Relevant xkcd: xkcd.com/329 $\endgroup$ Commented Sep 25, 2020 at 14:09
  • 1
    $\begingroup$ There's a pretty disturbing scene in Ex Machina where the protagonist starts to wonder whether he's a robot like Ava and cuts his face to see if he bleeds. He does. $\endgroup$
    – F1Krazy
    Commented Sep 25, 2020 at 16:30
  • 6
    $\begingroup$ I assume you haven't seen Blade Runner? $\endgroup$
    – njzk2
    Commented Sep 25, 2020 at 21:47

9 Answers

33
$\begingroup$

Yes

The simple answer is yes. People have an image of themselves. This image changes over time, and the idea of being human is part of it. But plenty of irregular personal views coexist with it: some are harmless, some are merely strange, and some turn the world upside down.

There is a person who is convinced she is a cat and is trying to become one as far as possible through plastic surgery. Some people think they are genuinely living in a TV show, as in The Truman Show. Some think they are clones of themselves. Each of these is simply part of how we perceive ourselves.

The concept of being a robot isn't far removed from any of these. There are undoubtedly already people who think this. The introduction of ever more realistic robots will only increase their number. As you say, how can you be sure? With relatively simple methods, many people can be convinced that they are one, just as people can be convinced of horrible things that never happened to them, such as patients who received the wrong kind of psychological help and became convinced they had been mistreated or raped, which is a very sensitive subject.

This can be done at later ages, but it works best during the formative periods of our self-image: childhood and puberty, but also the years between twenty and thirty, when the search for who and what we are brings great changes. The midlife crisis, or old age, when people look back at what they were and decide who they want to be in the last stages of life, are also strong contenders. But our self-image is always changing, and a crisis of identity can appear at any age.

Believing you are a robot in a world of hyperreal robots? It isn't just likely; it will happen once they arrive.

$\endgroup$
12
  • 3
    $\begingroup$ I hate to be negative, but… (consistent with the opening comment) many of the examples here involve an existing pathology. (It is not so difficult to cause a pathology, but it is difficult to do so without the subject knowing that they are being assaulted.) $\endgroup$
    – Carsogrin
    Commented Sep 25, 2020 at 0:39
  • 2
    $\begingroup$ @Carsogrin you can't fool all the people all the time (I hope) but some people are vulnerable to having their self-image and "memory" changed and often do not recognise the "gaslighting" as being an assault on their self-image. If they are lucky and receive help later then they may retrospectively understand what was done to them. $\endgroup$ Commented Sep 25, 2020 at 4:42
    $\begingroup$ @Carsogrin not entirely true. It is much easier for some people, as they have a predisposition to it. But if the Milgram experiment teaches us anything, it's that in half an hour you can make someone shift their perspective from "good person" to "murderer". This goes for supposedly well-adjusted people as well. If you target them with the right psychological methods during a formative period, you can convince a lot of people. Even just seeding doubt will be enough for some to then convince themselves. $\endgroup$
    – Trioxidane
    Commented Sep 25, 2020 at 6:06
  • 6
    $\begingroup$ Just so you know, gender dysphoria is formally not considered pathological, and is not a "disorder". In the vast majority of cases, this is set from birth and does not change over time. It appears at the point a child becomes aware of the difference between boys and girls, around 5 or 6, and remains true. It may take much longer for the child to feel they can express it, of course. This may not be something you intended to be offensive about, sure. But it's fair to let you know it is, and it's worth deleting that sentence from an otherwise decent answer. $\endgroup$
    – Graham
    Commented Sep 25, 2020 at 7:46
    $\begingroup$ @Graham yeah, I didn't want to word it as a disorder. I didn't remove it, but I changed "disorder" into "irregular". $\endgroup$
    – Trioxidane
    Commented Sep 25, 2020 at 7:57
15
$\begingroup$

Yes.

As long as the question is "Can a person be brainwashed sufficiently to switch their own identity?", the answer is yes. It won't work reliably in 100% of cases, and it may take a long time, but at least some fraction of subjects will be successfully convinced that they are, in fact, robots.

However, this will work only if this society does not have any "litmus test" to tell humans and robots apart. It is apparent that in your society the legal standing of humans and non-humans is vastly different (probably similar to "A.I. Artificial Intelligence" or "Cloud Atlas"), so there would be a strong demand for such a test.

$\endgroup$
5
  • 4
    $\begingroup$ Even if society has such a test, and it's ~100% accurate, some fraction of the population won't believe in it. I would indeed argue that it's even easier to instill a conspiracy theory that dismisses such a test than it is to convince someone they're a robot, and certainly no reason why it couldn't be done as a package deal. $\endgroup$
    – Gene
    Commented Sep 24, 2020 at 17:52
  • 1
    $\begingroup$ @Gene If we focus on what a person thinks, then you are correct. But for the whole scheme to work, some others have to be convinced too, otherwise such a person would be known to others as a "denialist". $\endgroup$
    – Alexander
    Commented Sep 24, 2020 at 18:00
  • 1
    $\begingroup$ Why not conspiracy theory? How about a hyper-intelligent AI who is convincing humans they are robots in order to gain equal rights for artificial intelligences? If the "robots" who are people test as people, while insisting they are robots, the testing is discredited. Plus they have voting rights but self-identify as robots, so they can sway elections. I think I just came up with a story plot... $\endgroup$
    – DWKraus
    Commented Sep 24, 2020 at 20:05
  • 1
    $\begingroup$ This is a good point. If it happens in the early roll-out stages of the hyperreal fad, a test may not be available yet. Part of my story that blurs the line even more is that the robots are assembled from prosthetics that are grown for human use. So essentially it's an AI downloaded into a human body that was never alive. $\endgroup$
    – Alex
    Commented Sep 24, 2020 at 20:17
  • $\begingroup$ You can fool some of the people all of the time, and all of the people some of the time, but you cannot fool all of the people all of the time. $\endgroup$ Commented Sep 24, 2020 at 21:43
4
$\begingroup$

Yes, people could be fully convinced of such things

Most people are usually reasonable and believe true things, but that's not universal, and exceptions are quite plausible. Of course, someone who is strongly convinced of something false would generally be considered delusional, but delusions, including unusual ones, are not that rare.

A particular example with some similarity is the Cotard delusion, in which someone is fully convinced that they are dead, that they do not exist, or that some of their (actually existing) body parts are missing.

If there are at least a hundred people who believe that (here's a study of 100 such patients), then it seems quite plausible that in a robot-filled society there might be some people who falsely believe that they are robots.

However, that would be a very unusual state caused by fundamental mental problems; it's not something that could be caused by someone convincing them that it's the case - just as you can't simply convince someone with Cotard delusion that they are in fact real and alive.

$\endgroup$
2
$\begingroup$

Yes, and it already happens. Mental illnesses are messy things and can cause the brain to behave differently than usual. Of course, that's a bit different from what you want; alternatively, you could seek out people who already have such conditions and use them to make the job easier.

In short: yes, and some people already believe they are robots due to mental issues.

$\endgroup$
4
  • $\begingroup$ If it's possible for someone to believe they are dead then it's possible for someone to believe they are a robot since the first is quite a bit more far fetched than the second. $\endgroup$
    – DKNguyen
    Commented Sep 25, 2020 at 15:37
  • $\begingroup$ "some people already do think they are robots" — Odds are you're right, but still, {{citation needed}}. $\endgroup$ Commented Sep 27, 2020 at 18:26
    $\begingroup$ @Quuxplusone I'll see if I can find a good source; the closest thing I can find is species dysphoria. My previous source was peers of mine, which isn't that good. I'll let you know when I find a good source. $\endgroup$
    – Topcode
    Commented Sep 27, 2020 at 19:39
    $\begingroup$ As I understand it, prolonged mental illness affects a person by making irregular brain patterns the norm. I know some people have an illness that would make it easier to convince them they're a robot, but a problem would be their potentially unstable state of mind, which would make them poor robots. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 13:53
1
$\begingroup$

Yes, because all these things work the other way round.

The limits of "humanity" are largely artificial. A society may treat as "human" only the members of the tribe, only the males, only the adults, only the citizens, only the free men, only the followers of a particular religion, and so on. Human beings excluded from the "real humans" group can become thoroughly resigned to that state of affairs, even more so if they were born into such an environment. History is full of examples.

The progress of society, and of human rights in particular, can be traced by the widening of the definition of "human". Today, more or less civilized jurisdictions simply consider every Homo sapiens to be human, but this approach is rather new and not truly universal.

Should we widen the definition of "human" even further? Should it include sophisticated machines or some animals? We don't know yet, but the debate has already started.

So yes, you pretty much CAN shift the limit back to wherever you see fit. Technically.

Resetting a formed personality into not being a human is surely possible, but it may be easier and safer to breed slaves just like slave owners did (and probably still do).

$\endgroup$
1
  • $\begingroup$ I see what you are saying; these are some of the themes I hope to cover. The reset of personality is an evolution of slavery/human trafficking, where it is more beneficial to have fewer humans who are more subservient than a large number used as company labor (factory, agriculture, sex). Most of the mundane robot tasks are completed by non-humanoid robots, leaving the more human tasks for the humanoids. If robots were designed not to always obey, how would you tell the difference? $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 12:58
1
$\begingroup$

I want to say No.

From the setting you're describing, the robots are only imitating humans. They are very good imitations and from the outside there is no way to tell the difference.

But from the inside? Carsogrin talks about it a bit in his answer, but "Cogito ergo sum". Even now, in a world where everyone is human (or so it seems?), most people will, at some point, think something along the lines of "What if I were the only real human and everything else were fake?". But now you want people to think that they are the fake ones? It seems unlikely.

Moreover, your robot servants would probably have some kind of programming that makes them serve and obey people. Humans wouldn't. I'm not saying programming humans isn't possible; it's just very hard and very visible. Breaking or raising a human into servile obedience will make them stand out compared to real robots, which don't need that kind of treatment. And while a robot will never be able to rebel, any human who tries will discover that they can.

(I'm not saying you won't ever have humans who think they are robots. But these people will be a minority, and it won't be possible to consistently train humans as robots.)

What you could achieve, though, is having humans who fake being robots that are faking being humans. But deep down they would know, in my opinion.

$\endgroup$
8
  • $\begingroup$ If people couldn't convince themselves that they're something they actually are not, we would have fewer mental asylum patients overall. $\endgroup$
    – user28434
    Commented Sep 25, 2020 at 16:16
    $\begingroup$ @user28434 Read the part in brackets. Some people will think they are robots, obviously. But the OP is asking whether you could mold people into thinking it reliably enough that it becomes a business. $\endgroup$
    – Jemox
    Commented Sep 25, 2020 at 17:34
    $\begingroup$ One piece I am still working out is how prevalent the robots are in society. On one hand, the girlfriend robots would be able to walk the streets freely without anyone really knowing, due to the realism; on the other hand, sex robots are expensive, and very few people today walk around with their pleasure asst. tools. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 12:41
    $\begingroup$ Imagine one person who is super subservient and obedient to a fault (lots of these people exist already). The base code for the robots being manufactured is an edited version of that person's consciousness, left primarily intact. The robots are conflicted at some level about whether they are truly human, but the personality that was copied would never challenge authority. Individual tweaks at a finer level allow for variations, which make it pretty much impossible to tell whether someone isn't derived from that consciousness. Also, people have always served people; that isn't a robot characteristic. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 12:50
    $\begingroup$ Yes, but where a robot will be built in its final form (adult) with a very servile mind, a human will have to be raised from birth or broken as an adult to become very servile. The very fact that they need to be raised or broken contradicts the idea that they could be robots. As I said, you might end up with a final product that acts as if it were a robot, but it will know deep down that it isn't, and anyone enquiring will find the truth very quickly (unless you also build your robots with false memories of being tortured from birth, but that would be weird). $\endgroup$
    – Jemox
    Commented Sep 28, 2020 at 13:09
1
$\begingroup$

p.s. The core question is whether or not you are okay with a rather dark account of how the human beings are persuaded to behave as robots.

————

I think I am not unusual in being aware that I am self-aware, and knowing that there are serious issues around making a machine that is (genuinely!) self-aware. (It is (trivially?) easy to get a robot to be able to talk about itself as a distinct entity, as though it is self-aware.)

On those terms, the issue is that the subject would realise that, since they are self-aware, they must not be a robot.

One easy option, theoretically speaking (assuming the required medical knowledge and technology), would be to actually lesion the part of the brain that does this (without opening the skull, of course), such that the subject actually is, so to speak, a robot.

Perhaps it is possible to ambush a sleeping person and do this while they are asleep, without them knowing… the question then being whether they would thereby also be unaware that their faculties had been diminished. Otherwise, if they were sufficiently young, they might not even remember.

One difficulty is that, unless one does actually remove the personhood of the subject (by whatever means), there will always be the possibility of them coming to the point of revolting.

Another approach would be to try to protect the subject from ever learning that robots are not (genuinely) self-aware, but this would be out of one's control once the subject had been sold. It would help if the general public thought that the robots' self-awareness actually was genuine. In that vein… it might be a workable strategy to convince the subject that the belief that robotic self-awareness is not genuine is itself false.

Actually, you could have it that robots actually are self-aware. I am definitely not in this school (albeit not closed to being persuaded), but there is a respectable school of belief that, given that human beings are (genuinely) self-aware, it certainly must be possible to make robots that are. However, some of these individuals simply fail to grasp the difference between being able to refer to oneself as a distinct entity and actually being self-aware. Some are so convinced of this that, in the computer game "The Talos Principle", an argument is made that one certainly could make a self-aware machine out of string, as long as it mechanically replicated the pertinent brain functions. To me, this is more of a demonstration of how stupid the position is. (Then again, in "The Talos Principle", this might be exactly what the authors intend; apart from the inordinate difficulty, I was turned off the game by the fact that one never knows what the philosophical commitments of its authors are… and that the game is designed poorly enough that this matters. Or maybe that is what they want you to think…)

Overall, I think the least violent scenario is one in which the general public is convinced that robots actually are genuinely self-aware (when in fact they are not). Indeed, as I have said, it is not only entirely possible in real life, but actually to be expected, that many people who saw a robot referring to itself (without being genuinely self-aware) would strongly believe that it was indeed genuinely self-aware, such that they could not be convinced otherwise.

By the same token… in real life, many readers would find it perfectly plausible that robots might be made in the future that indeed are genuinely self-aware.

The corollary of all this is that, if indeed a robot is self-aware, it is defined as a person, and people start campaigning for it to be treated as such and released from slavery.

————

So…

You can take the position that robots can be genuinely self-aware. This makes it easy to convince a human being that they are a robot, but opens up a can of worms politically (inside the story).

You can take the position that robots can not be genuinely self-aware. Ostensibly, this requires a dark account of what the “robot” seller does to their victims (whether it be psychological oppression or brain lesions or what-have-you).

You can take the position that it is philosophically a contentious question. Within this, one option is to have the human “robots” kept in the dark about this (with the noted attendant difficulties). Another option is to have this a live question for the human “robots”.

As “chasly-reinstate-monica” has observed, as long as there are physical differences, that is a point of weakness for the “robot” seller.

[I am not quite 100% at the moment (somewhat distracted). I think I have covered my material, and done so in an orderly fashion, but the reader should be aware that any confusion might mean either that they need to read again more carefully or that my account actually is flawed.]

p.s. Using drugs instead of (e.g.) brain lesioning is initially plausible (for the subject), but would become a difficulty when the subject had been sold (unless robots have to take pills as well). (You could hand-wave a drug that did the brain lesioning, but this is not a pivotal issue.)

Edit_01

Possibly there is a distinction to be made between being self-aware and being autonomous. (I don’t know offhand.)

$\endgroup$
4
  • $\begingroup$ Wouldn't it be darker if there was no brain operation? The research I have done since posting the question seems to suggest that no drugs would be required; most people would conform to their captor's wishes given enough time. And the mechanic then has pretty much unlimited time to mess with their minds before selling them as robots. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 12:34
    $\begingroup$ Robots at this point are self-aware. People want robots who resist, who think that they are human. Some people want to live out strange fantasies; some people just want a human companion but can't have one because of social anxiety. There is little physical difference between the two as well; common human prosthetics used for enhancements are used to build the robot bodies. Look up Project MK-Ultra and the Unabomber for examples of how high doses of drugs can leave permanent effects on how people process the world around them. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 14:00
  • $\begingroup$ @Alex. • I am thinking that, in the case of any given dehumanisation… it is less dark if it is reversible. • Ceteris paribus, greater dehumanisation is morally worse and thus more dark. $\endgroup$
    – Carsogrin
    Commented Sep 28, 2020 at 17:13
  • $\begingroup$ @Alex. If the robots are self-aware, then [to the degree that they are sophisticated (like humans)] they are slaves [except with associated arguing around reason created], and the whole picture around human “robots” is just euphemistic slavery. No fancy psychology nor drug stuff is required… although it is more interesting. $\endgroup$
    – Carsogrin
    Commented Sep 28, 2020 at 17:21
0
$\begingroup$

What if robots are actually indistinguishable from humans in every sense by then?

They have cognitive processes, they have sentience (subjective experiences), they have emotions, they are self-aware.

At this point, our definition and general perception of what a robot is would not be too different from what a human is. The only difference might be something minor, such as the place of origin: suppose robots are created in a lab, while humans are born via reproduction.

Someone who does not remember their childhood could be fed false information as to whether they came from a womb or a lab, and they would believe it.

$\endgroup$
4
  • $\begingroup$ Yeah, one of the main themes is whether or not robots can give consent, and what would happen if people get robots that can never say no while being programmed to feel emotions. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 12:37
  • $\begingroup$ @Alex Nice, so does my answer satisfy your question? $\endgroup$ Commented Sep 28, 2020 at 12:55
    $\begingroup$ No, unfortunately. The first question was whether it would be possible to convince someone they are non-human, not about the similarities between robots and humans. I was looking for examples of where this has worked in the past, ideally real studies where people were convinced that they were part of a separate reality. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 13:06
    $\begingroup$ The second question, "... acquire them at a young age or does that not matter depending on the human and potentially the available drugs?", isn't answered either. Telling someone their mother gave birth to them when they were actually grown in a tube wouldn't impact their day-to-day life; adopted people I know weren't told they were adopted until around 10 years old. These facts are easy to prove or disprove with records, photos, etc. Whether you are human or not is, I think, more complicated and would require a lot more work on the part of the mechanic. $\endgroup$
    – Alex
    Commented Sep 28, 2020 at 13:08
0
$\begingroup$

If the robots are indistinguishable from humans, then they are basically clones. In other words, those robots are humans, and the humans are robots. In that case, it shouldn't be too hard to convince a child that they were born of cloning.

Of course, you'd face the same slavery charges whether you are selling the robots or the humans.

$\endgroup$
