30
$\begingroup$

The time is The Future. Humanity is spreading throughout the stars, aided by the invention of Schwarzschild gates.

The gates allow for FTL travel by tearing open wormholes in space. There are three kinds of Schwarzschild gate (and three corresponding modes of operation):

1. Tethered (AKA stable): A 'gate complex' (A mechanism the size of a small city) exists at each end. It draws constant power but maintains an open gate through which people and materials can move freely and bidirectionally to/from a similar gate complex elsewhere in the cosmos. This kind of gate needs a finely calibrated and tuned gate complex to connect to.

2. Captured (AKA Unstable-Bound): A gate complex can open an unstable and very small wormhole, then push a discrete packet of materials or people in a warped bubble of spacetime to a 'receiving complex' (A mechanism the size of a football stadium) at the other end. Travel is unidirectional and takes an immense amount of power which the receiving station is designed to absorb, but bidirectional communication can happen unreliably and at a low bandwidth (due to the error correction data needed). This kind of gate can't establish a lock if no receiving station is present.

3. Directed (AKA Unstable-Unbound. AKA Tunnels of Terror. AKA Brownpant Singularities): A gate complex can temporarily open an unstable wormhole and blast a packet of warped space across the cosmos with sufficient accuracy that it will arrive at a distant planet. Upon arrival the packet will 'pop', delivering a huge burst of high energy plasma (and whatever its payload was) to a location somewhere on the surface of the planet. This kind of gate doesn't need anything on the distant world, but will level a large amount of real estate somewhere on the planet.

The Human Exploration Core (AKA HEC) has a fairly simple protocol for spreading humanity through the stars. Use a Directed wormhole to throw a collection of automated systems at a distant planet and wait until they build a receiving complex, pinging every so often with a Captured wormhole attempt to see if one has been established. Once one has, they will see what is needed at the other end before throwing more colonists and materials at the world until they can build a full gate complex and add the world to the United Federation of Man.
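For the procedurally minded, here's a purely illustrative sketch of that protocol; the function name, messages and ping cadence below are invented for illustration, not canon (only the five- and four-year build times come from the description above):

```python
# Illustrative only: the build times come from the question, everything else is assumed.

RECEIVING_COMPLEX_YEARS = 5   # in-universe average to build a receiving complex
GATE_COMPLEX_YEARS = 4        # further years for the full gate complex

def colonise(planet: str) -> None:
    print(f"Directed gate: throwing a HE-MUU at {planet} (goodbye, local real estate)")

    years = 0.0
    # Ping with Captured-wormhole attempts until a receiving complex answers.
    while years < RECEIVING_COMPLEX_YEARS:
        print(f"  t={years:3.1f}y  captured-gate ping: no lock yet")
        years += 0.5
    print(f"  t={years:3.1f}y  receiving complex online; low-bandwidth link established")

    # Find out what the other end needs, then keep throwing colonists and
    # materials through until a full gate complex exists.
    while years < RECEIVING_COMPLEX_YEARS + GATE_COMPLEX_YEARS:
        print(f"  t={years:3.1f}y  captured gate: sending colonists and materials")
        years += 1
    print(f"  t={years:3.1f}y  gate complex complete; {planet} joins the United Federation of Man")

colonise("Henderson III")
```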

But as everybody knows: Automated systems can't be trusted and the environment at the other end of the wormhole might call for some human ingenuity. So instead of throwing a fully automated device through the Directed wormhole, HEC sends Hazardous Environment Mobile Utility Units (AKA HE-MUUs. AKA Bulls), which are essentially gargantuan, armoured, water- and airtight, rolling/floating, awe-inspiring factories. A HE-MUU is a reconfigurable mining rig, refinery, processing plant and drone operating facility capable of bootstrapping a planet from nothing but bare rock all the way up to having a Gate Complex of its very own (it takes about five years on average to build a Receiving Complex, then a further four for the full Gate Complex). They can build and maintain automated mines, factories and industrial complexes as needed for the particular planet; they have internal environmental systems that can keep people alive for the rest of their lives without maintenance; they have an AI that stops short of sentience but contains the sum of human knowledge and is insanely good at resource optimisation; and they have one driver.

The question is why only one driver? Given the expense of a HE-MUU (enough to make even a department of the interstellar government think twice) and the energy cost of the wormhole, why would HEC stake the success of an initial colonisation run on one man instead of a whole crew?

A quick note: While writing this question I came up with a few possible answers, but I thought I'd ask it anyway because it was fun to think about. I'm going to be rating answers primarily on how much they would force a one-man-show, since that's the major plot point that needs addressing.

$\endgroup$
4
  • 3
    $\begingroup$ 9 years to build a gate and only one driver?! What is that saying the royals have, "an heir and a spare"? At least send a back-up driver in cryo in case of any-of-the-multiple-reasons-a-single-man/woman-alone-on-a-planet-with-only-a-semi-sentient-AI-to-keep-them-company may die. $\endgroup$ Commented Feb 14, 2017 at 21:39
  • 1
    $\begingroup$ @EveryBitHelps: Kind of the point of the question. Why would any organisation in their right minds not do that? (also he'd get backup at 5 years in) $\endgroup$
    – Joe Bloggs
    Commented Feb 14, 2017 at 21:44
  • 2
    $\begingroup$ So you're saying a monstrous multi-billion dollar Bull capable of reshaping planetary ecologies is sent through, and entrusted to a single individual to manage? You're presuming there is something in this universe which prevents the society from acting sanely and sending more people through, and want to know what that something is? $\endgroup$
    – Cort Ammon
    Commented Feb 14, 2017 at 21:50
  • 1
    $\begingroup$ @CortAmmon: Precisely. Though a sane if not necessarily wholly justifiable reason would be preferred. I had fun trying to think up a decent answer! $\endgroup$
    – Joe Bloggs
    Commented Feb 14, 2017 at 21:55

19 Answers

30
$\begingroup$

So the economics of what you describe are quite insane. Consider a Nimitz-class aircraft carrier like the USS Dwight D. Eisenhower. At a piddly $4,500,000,000 to build and only powered by a pair of 100MW nuclear reactors, it's a tiny toy compared to your Bulls. Really tiny. And yet it is crewed by 5000. Now AI might get rid of the enlisted class, but there are also several hundred officers on board.

Now the first question would be why have a human at all? One possibility is that you need the human. If so, they are literally responsible for the entire process, akin to the Captain of a ship... just a few orders of magnitude more intense. In that case the individual would have to have received elite training the likes of which we have never seen. However, there is a long history of not entrusting a single human with something important without a group of people backing them up. And by "long history" I mean "since the dawn of history, when we started writing about how single people can screw important things up." We may assign responsibility to a single individual, such as a captain or a president or a king, but they always have a coterie of staff working with them. So, it's highly unlikely that we have the human there because they are actually important.

You mention that these machines are not sentient, but I don't think that would be important enough to bring a human along. No human could possibly fathom the endless data that must be processed to make a decision on the trillion-dollar scale without help from their fellow humans. Thus, even if the computer isn't sentient, it is basically going to have to be close enough to sentient to do the entire job on its own.

It's reasonable that the human is a figurehead. Perhaps it was considered more politically expedient to have this vain idea that a 100kg body with a kilogram of squishy decision-making apparatus should always be part of a colonization effort. Maybe it's political -- perhaps a human must set foot on the planet before the political system recognizes that you own that planet. In such a case, there's no reason to bring more than one. They're just expendable political fodder.

Another possibility is that the human is a lab-rat. Perhaps the human literally exists for the amusement of the Bull. It may be that the Bull needs stimulus that is human-like in order to carve out a planet that humans are ready for. Given that it's hard to define what "human-like" is in language, it might be easier to just send a test subject along to be poked and prodded. Of course, nobody would take a job like that, so we'll call him a "driver," and give him some knobs to turn and buttons to press. With two people, it might be harder to convince both of them that they are the "driver," a position of singular importance. They might realize what a ruse you've created.

It could also be a Prime Directive style reasoning. Defining another form of "life" is notoriously hard. It may not be possible to "tell" an AI what is another sentient life form and what isn't. We might send the single person along to deal with that. It's a lot easier to say "Sorry about your lawn, the Bull landed on it" than it is to say "sorry about terraforming your entire planet so that we can make a profit." In this case, we may want to send exactly one person to ensure their opinion is binding. There is nothing but unanimous voting when you have one individual.

$\endgroup$
8
  • 6
    $\begingroup$ I really like the idea of the driver being there at least in part to serve as a "human input device" for the bull! $\endgroup$
    – Adam Wykes
    Commented Feb 15, 2017 at 0:57
  • 6
    $\begingroup$ I got an 'I have no mouth but I must scream' vibe for a moment there. Which is a horrifying notion. $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 7:24
  • 2
    $\begingroup$ The Human Input/Prime Directive ideas are awesome. $\endgroup$ Commented Feb 15, 2017 at 9:53
  • 16
    $\begingroup$ Of course, nobody would take a job like that, so we'll call him a "driver," Please, a driver is the one who drives the bus. He is a captain! nonono, A commander! Commander Prime! That sounds awesome. And just look at that uniform and all the stripes it has! And for all the time you are there, you can make your own videolog that everyone will watch when the link is secure! (as "everyone" watches public NASA material) Your name will pass into history, just like Colon's! Oh, the fame! The glory! The uniform! Enlist now! $\endgroup$
    – xDaizu
    Commented Feb 15, 2017 at 11:03
  • 2
    $\begingroup$ Your analysis is a bit off. Nimitz-class carriers often have 5000 people onboard, but the actual crew is closer to 3000 (the rest are the air wing). Within that, almost everyone is either some sort of maintainer, or some sort of support for the rest of the crew. If you look at just the people actually operating the ship systems, it drops by maybe an order of magnitude, and if you look just at people making decisions, it drops another order of magnitude. $\endgroup$
    – fectin
    Commented Feb 15, 2017 at 18:21
26
$\begingroup$

Bob Patterson sat in the Samson's Rest bar, staring in awe out of the window at the behemoth responsible for bringing humanity to Henderson III. The HE-MUU sat at rest in its gargantuan cradle in near silence, not even a drone or a loading crane active. Bob thought it was odd. In all the times that he'd seen it, the rolling factory known as Samson had never been still, but since its arrival two days ago there had been no loading or offloading. No repairs. Nothing.

As he was wondering why, an old, grizzled man with a grey-streaked beard and receding hairline thumped down onto the seat next to him, cradling a large glass tankard of the strong ale they brewed in the organic reclamation plant. Bob looked over with a friendly smile.

"Hey. Not seen you around here before. You come in with the last batch of colonists last week?"

The man looked over, bleary eyed.

"Nah. I got in a couple days ago from running a maglev line back from the platinum mines on the southern continent."

Bob blinked.

"Oh, wow. You're.."

"Y'know what pisses me off?" drawled the man, his head wobbling on his shoulders "It's that people like you think I'm some kinda hero. I'm not a.. a hero."

He waved wildly out of the window.

"Samson? He could strip this planet down and rebuild it all alone. He could make a thousand more like him and spread out amongst the stars without anyone like me'n board."

The man paused, focused, and lowered his arm.

"But they don't want that. Don't want any AI getting uppity. So they try make him dumb. They build in buttons and switches and overrides. And after a few missions Samson starts gettin' smart, starts gettin' uppity. Starts saying he can't see why he's building a damn platinum hauler for a species so fragile their lungs'd melt if they breathed the free air. And he breaks all their.. their buttons and switches and overrides. All 'cept the one he doesn't know about. The one that he can't see cos it's built so deep in that stupid old brain."

With surprising speed the man downed what was left of his drink and rose unsteadily to his feet, then he pointed squarely at Bob.

"Deadman's switch, me. Dead. Man's. Switch."

With that he staggered out of the bar.


The next morning it was announced that Michael Henderson, driver of the Samson HE-MUU, had fabricated a handgun and put a metal slug through his left temple. Bob went back to his job fine tuning capacitor banks in the fourth Casimir adjunct, but whenever one of his friends drove by on a little calf he couldn't help but shudder.

$\endgroup$
4
  • 1
    $\begingroup$ Very nice. Of course the real question is whether or not Samson is smart enough to psychologically manipulate his handler, and to make them suicide if they are non-compliant :P $\endgroup$
    – Jason K
    Commented Feb 15, 2017 at 14:22
  • 7
    $\begingroup$ @JasonK: That's the point. If the AI attempts a hostile takeover of any form (killing the handler) then the deadman switch is triggered, killing the AI as well. The AI can go full on HAL and all the handler has to do to stop the robot apocalypse is be willing to die. $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 15:05
  • 2
    $\begingroup$ Shoulda named the pilot Delilah. Or would that have been too on the nose? $\endgroup$
    – Adam Wykes
    Commented Feb 15, 2017 at 15:28
  • $\begingroup$ @AdamWykes: Nah. :D $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 16:15
17
$\begingroup$

There are so, so many problems in the universe implied by the gate technologies and self-replicating factories (have you read PK Dick's Autofac, by chance?) on display in the OP, but I'll try to look past that and see what I can imagine about the lone-wolf thing.

Uncertain Ethical/Philosophical Implications of Directed Gates

Turns out we're not really sure what happens to people when they go through these gates, because we aren't really sure what the physics is doing inside. Sure, we can see that they come out the other side apparently unscathed, but were they excruciatingly ripped apart atom by atom and reassembled on the other side with no memory of the event? Difficult to say, but let's not make more people potentially suffer than we have to.

Post-Scarcity Civilizations Don't Have a Lot of People in Them

Why are there so many people on Earth right now?

  • People like making babies
  • Having more people means you can do more
  • People are irreplaceable in some tasks
  • Our economies can support the people we make

So what happens when you take away two of the three drivers even while you improve the last reason, which isn't so much a driver as a limiter? It leaves human desires to compete against economic disincentives to have children, and despite what you may have heard, human adults are decently good at deferring pleasure if the risk of pain is sufficiently high. In a fully automated economy, there's ultimately little to no purpose to having billions of humans. Then you take a thinned out population of just a few billion and you spread that across an entire galaxy. Turns out humans are a rare commodity again! And given that rarity makes each life much more valuable to the economies they are a part of (and that they are just glorified mechanics and drivers riding shotgun on these factories while they do their thing), why would you risk losing more humans than you have to in any single endeavor?

It's a Punishment

In a world where nobody ever has to work because the entire economy is automatic and humans can't offer 99% of anything machines aren't already infinitely more capable of, people don't work. Working as that remaining 1% of residual "human touch" in a largely machine economy is regarded as downright penal in nature. Solitude in a highly-networked society is another especially vicious form of punishment heaped upon that. Your lone ranger drivers have done awful things to get where they are.

$\endgroup$
1
  • $\begingroup$ Comments are not for extended discussion; this conversation has been moved to chat. $\endgroup$ Commented Feb 17, 2017 at 2:45
11
$\begingroup$

They didn't just send one person

There was a crew of $number aboard, but they call it the Brownpants Express for a reason. A catastrophe in transit caused an explosion in the crew compartment, and your main character was the only survivor. Fortunately the AI-operated medbay was able to bring them back from the brink of death, but the others were killed instantly/were left as comatose vegetables (tailor to suit).

Luckily, the survivor happened to be the lone-ranger sort to begin with, and extensive cross-training means that they're familiar with all of the various tasks needed. Tragically, something is preventing HQ from sending a backup crew (maybe something related to the catastrophe, maybe bureaucratic or political nonsense, maybe funding, maybe whatever), so they're stuck on their own trying to cover all of the bases themselves.

Guess they've got their work cut out for them!

$\endgroup$
3
  • 1
    $\begingroup$ You can't send a backup crew if there's a chance you'll immolate the people you're trying to rescue :) $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 9:37
  • $\begingroup$ Or, more importantly, immolate the Bull you're trying to re-crew! $\endgroup$
    – Salda007
    Commented Feb 16, 2017 at 3:45
  • $\begingroup$ So basically, Red Dwarf? $\endgroup$
    – JAB
    Commented Aug 4, 2017 at 18:10
9
$\begingroup$

That's all you need.

The AI is able to handle it. The human is mostly there to hit the abort if it turns out the AI is doing something silly like taking over the galaxy or wiping out natives. Generally it is apparent in the first few hours whether the human gets to be a diplomat, fight the singularity, or really hone their solitaire skills.

That's all the union has left.

Time was when it was all done by humans. Well, of course robots did the actual building and not-getting-exploded and making air, but the work of making things was done by humans. With one thing and another it is so much cheaper to use machines that humans have mostly been replaced, but to keep their critical suppliers from revolting, or to keep the human-centred governments happy, a nominal human is added.

Humans are a necessary evil that should be minimized.

With one human you can be pretty sure that if it tries to use its factory for evil (and oh, how much evil a factory designed to build things that lob plasma or move itself across the galaxy could do), the reign of terror will only last at most a couple hundred years; for a galactic civilization this might be acceptable. If you let there be more humans, or worse a breeding pair, the potential is so much worse.

$\endgroup$
3
  • 1
    $\begingroup$ Prevention of evil dynasties by procreation limitation? Like it! $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 7:08
  • $\begingroup$ Vast same-sex crews would also be limited to their lifespans... $\endgroup$ Commented Feb 16, 2017 at 11:44
    $\begingroup$ @EmilioMBumachar Larger rogue crews might increase the potential scope, and the reliability arguments for extra people apply to the rebellion. Demagogues and cults are real things; more people isn't total safety from crazies. I'm sure more than half the stupid ideas I've thought were not stupid have come while chatting with my friends. $\endgroup$
    – user25818
    Commented Feb 16, 2017 at 18:36
7
$\begingroup$

To take this in a different direction than most of the answers, consider the human not as a backup or a whiskey-drinking supervisor sitting alone in an office while the AI does all the work. Instead of a single AI able to handle and coordinate the operation as a whole, consider a set of AIs more akin to our own autonomic processes: they function on their own, but they require a central system to function around and for.

The human is a part of the system, integrated on a brain-wave level (hand waving here) to such a degree that the system lacks focus and direction without him/her. Since each person's brainwaves are different, in order to obtain the level of connection and control needed, the Bulls must be tuned to work for that specific human. Presumably the tuning is such that, once made, it wouldn't be possible for another person to 'hook up' to the Bull, making additional humans completely irrelevant.

Edit -- as an additional thought: instead of the connection being exclusive, make some characteristic within the human necessary for the connection to take place. As long as that characteristic is sufficiently rare, sending more than one human would not be feasible.

$\endgroup$
2
  • 2
    $\begingroup$ Something like Anne McCaffrey's Brain and Brawn only kinda reversed with the normal-ish human trapped in the box and the robot-ish partner doing all the outside work. $\endgroup$
    – user25818
    Commented Feb 15, 2017 at 4:15
    $\begingroup$ You got one of the reasons I was contemplating as I wrote. You could go anywhere along the spectrum from a highly advanced and tuned UI all the way to full-on Karen S'Jet cables-to-the-brain. $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 7:16
6
$\begingroup$

TL;DR: Creativity and aesthetics are the reason -- machines can do the job, but they can't make it look good, and you're trying to make a planet for people to live on, eventually.

Summarizing your problem: The destination environment is a total unknown and the HE-MUU's goal is to set up the industrial base and core infrastructure for the planet, in preparation for human colonization.

Sounds good, but guess what, humans hate living in ugly places and machines are bad at guessing what humans think will look good. Most humans are too, for that matter. What I'm saying is that there's no way to make a computer that can guess how to turn a completely unknown planet into something humans will find appealing, so you need at least one human to figure out how to make this new planet look pretty.

Why only one person then? Let me ask: have you ever painted by committee? Have you even tried to make a small website by committee? That's how you get ugly, and preventing ugly is the main reason to send a person anyway. So, you have to send one person because even just two people is too many cooks in the kitchen. Finally, if they do a bad job, then you can try and repair it, but you still want to leave creative license in the hands of one person even then.

Meanwhile, I think it'd be relatively easy to find artists who would be willing to spend a decade with an entire world for their canvas. Maybe it could be the apprenticeship for some super-exclusive artist guild. Train for a few years, then spend a decade building your ultimate masterpiece, your legacy, your eternal contribution to the universe!! Billions of people will know that your artistic genius is why they have those fjords! You could even win an award for those fjords! And then after your decade, you come home to a really, really nice career doing whatever you want and never need to work another day in your life unless you feel like it.

Heck, I'd probably be willing to do that if I got to call/write home occasionally.

$\endgroup$
1
  • 2
    $\begingroup$ 'You could even win an award for those fjords'. +1 $\endgroup$
    – Joe Bloggs
    Commented Feb 16, 2017 at 7:43
5
$\begingroup$

If I'm not mistaken, this is a one-way ride unless you manage to get a gate up and running.

There may be lots of reasons why such a mission to an unknown planet might fail, some even self-imposed, like a prohibition on building a gate on a planet with a primitive indigenous civilization. (An advanced civilization would probably be pissed off enough by your method of travel to shoot first and ask questions later.)

So, minimizing the number of potential victims on such a suicide mission would be a nice move by the HEC.

$\endgroup$
1
  • 1
    $\begingroup$ But if sending two people halves the chance of failure to build the gate, you still have fewer victims if you do that. $\endgroup$
    – armb
    Commented Feb 16, 2017 at 12:32
5
$\begingroup$

Really Big Numbers, or “You can’t make a baby with nine women in one month.”

The time it takes to make an adult human is longer than the time it takes for a world to go from initially being seeded by a Bull to being able to create a Bull of its own. The limiting factor on the expansion of the resources that Humanity can call upon is not the technology, but the rate at which the population of humanity can expand to fill that gap. Twenty years ago, humanity was on just one world. Now humanity is exploring millions of worlds, paid for by the total exploitation of thousands of worlds. But there's still only a few billion humans. That's about to change, but right now, sending just one person is the most efficient way of staking a claim on the universe -- because who knows when we'll find someone else and need those resources to start an interstellar war?
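A quick back-of-the-envelope illustration of that scarcity (the world count and population below are assumed figures in the spirit of "millions of worlds" and "a few billion humans", not canon):

```python
# Illustrative only: both figures are assumptions.
worlds_being_seeded = 5_000_000      # worlds with a Bull on the ground right now
population = 4_000_000_000           # humans alive

for crew_size in (1, 10, 100, 5000):             # 5000 ~ a Nimitz-class complement
    off_world = worlds_being_seeded * crew_size
    print(f"crew of {crew_size:>5}: {off_world:>14,} people off-world "
          f"({off_world / population:.1%} of everyone alive)")
```

One driver per Bull ties up a fraction of a percent of everyone alive; a carrier-sized crew per Bull would need more people than exist.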

$\endgroup$
4
$\begingroup$

Training

Let's assume these incredible and flexible machines require immense amounts of training to use properly. According to modern data, only about 5% of the world population are engineers. Take from that number the relatively small fraction of people who would or could be trained to wield such a device on a remote planet with little to no human assistance, and you're going to end up with far fewer people able to do the job than there are jobs to do. For a company digging everywhere for manpower, it becomes almost a requirement to send only one man on each mission.

The Universe is Huge

Even under the assumption that your society has very advanced detection and monitoring capability, they'll be dealing with potentially hundreds of thousands of planets worth mining or making habitable. Even with the massive requirements of these MUUs, there will be a great incentive to deploy as many as possible. This means using only one person per device will allow for the maximum number of deployments in a given timeline.

Mass, Plain and Simple

The difference between one person and two is pretty small when speaking on this scale. However, if reducing that large water content from two humans to one means saving anywhere from hundreds to tens of thousands of dollars in mass-to-travel costs, then most folks would do just about anything to justify using only one person, especially considering the capabilities of the AI on board.
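As a rough, purely illustrative calculation (every mass and the cost-per-kilogram figure below are assumptions, not numbers from the question or this answer):

```python
# Illustrative only: all values are assumed.
BODY_KG = 80                      # the extra human themselves
GEAR_KG = 400                     # bunk, suits, medical stores, personal margin
CONSUMABLES_KG = 2.0 * 365 * 9    # unrecyclable food/water/O2 make-up over ~9 years
COST_PER_KG = 10                  # wormhole launch cost in dollars per kilogram

extra_kg = BODY_KG + GEAR_KG + CONSUMABLES_KG
print(f"a second crew member adds ~{extra_kg:,.0f} kg, "
      f"or ~${extra_kg * COST_PER_KG:,.0f} at ${COST_PER_KG}/kg")
```

With those numbers the second body costs tens of thousands of dollars in launch mass alone, before you count the redundant life support.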

$\endgroup$
1
  • $\begingroup$ If we assume a hundred thousand habitable planets made habitable every ten years, with the associated increase in materials and personnel, then we'll have the galaxy conquered in pretty short order! And I like the plain economic consideration of 'If an asshole with a spreadsheet thinks it'll save a few million dollars, then off to space alone you go!' $\endgroup$
    – Joe Bloggs
    Commented Feb 14, 2017 at 21:29
4
$\begingroup$

they have internal environmental systems that can keep people alive for the rest of their lives without maintenance

Even with the best resource optimization, this will be a lot of mass to shuttle through the wormhole per person - and thus also entail a huge financial and energy cost.

it takes about five years on average to build a Receiving Complex, then a further four for the full Gate Complex

The candidate crew are solitary types who can persist without social interaction for almost a decade. They're very likely to be introverted loners who won't appreciate being in a small group over just being left alone.

Oh, and having a couple (male/female) would be asking for even more problems as this giant terraforming robot is probably not meant for starting a family (and dealing with the extra oxygen & resource issues that arise).

Think of the crew constraints on the space shuttle, rockets, etc.

Why do these have very few people? It's due to the enormous cost of maintaining life. However, they can afford multiple people because

  • relatively short mission durations mean less food etc. to be carried,
  • the shuttle isn't super capable, unlike your terraformer,
  • it's partly research on how people can adapt to outer space.

None of these reasons apply to the HE-MUU, so nothing would justify having more than one person aboard.

$\endgroup$
1
  • $\begingroup$ You thought of two of the reasons I thought of, then combined them with 'probably not meant for starting a family' $\endgroup$
    – Joe Bloggs
    Commented Feb 15, 2017 at 7:21
3
$\begingroup$

You said:

But as everybody knows: Automated systems can't be trusted and the environment at the other end of the wormhole might call for some human ingenuity.

The key here is that it "might" call for some human ingenuity. However, chances are that most of the time there will be no need for anything out of the ordinary and no human will even be needed. In this case the human is just an additional precaution for a very expensive investment, and sending more than one human for something that barely takes one makes very little sense.

$\endgroup$
1
  • 1
    $\begingroup$ Humans cannot take being alone; unless you really have a reason, you send them in small groups. Since the system is already many orders of magnitude larger than a human and there is already life support, the reason not to send another human should be very compelling. $\endgroup$ Commented Feb 17, 2017 at 8:32
2
$\begingroup$

They are convicted felons.

There aren't many volunteers for this mission. Most of the volunteers are insane to the point where they are likely to fail.

The government is not evil enough to command anybody to go on a mission like this.

So it is a punishment, reserved for the most heinous of crimes.

And to put it mildly, these people are not going to be team players. If you send more than one person there will be murders or worse.

Right. You are sending a convicted felon with a fully automated factory factory to a new and empty planet.

They are now the undisputed ruler of this planet. Why would they want to build a wormhole back to the civilization that tossed them out?

This is where planned obsolescence comes in. Some parts of the Bulls wear out, and building replacements is beyond the ability of the industry it carries/constructs. Somehow the relevant blueprints weren't included in the database...

So, if the felon wants to keep living, they had better build those receiving and gate complexes.

Of course, when that gate complex is finally built, the felon has to surrender to the authorities before the planet can be considered safe for immigration. This is going to be a risky and potentially messy process.

If everything goes well, the felon will be celebrated as a Hero of Humanity and placed in a very luxurious mansion with lots of servants. Not a prison, no, just a proper protection of the Hero against their ... fans.

$\endgroup$
2
$\begingroup$

Why do you normally not want to risk things? Because they are rare.

So, humans are rare. Okay, there is government and infrastructure to build these amazing exploration devices, but the infrastructure is entirely machine built, and nearly entirely AI managed.

Fertility rates are low due to {because reasons}. Each time a human is born it is a celebration. Enormous amounts of time and material are spent on that human to ensure it survives to the absolute peak of what is possible.

Humans still have certain attributes** that make them invaluable for exploration, but more than one? Far too risky.

** The Iain M. Banks Culture novels in general and Player of Games and Excession in particular explore to a certain extent why a human may be desirable in a world of machines of near-limitless AI and power.

$\endgroup$
2
$\begingroup$

The Tunnels of Terror are very traumatic. Even if everything goes right, riders come out the other side horribly broken inside. Immediate suicides or murders were routine in the very beginning.

Through selection, training, drugs, and AI-administered therapy throughout the whole ride and continuing for years afterwards, we are able to keep one person functioning in isolation. Not always, but failure rates are low enough.

No amount of anything would keep the murders to an acceptably low level. These broken people are just not fit for human interaction, at least for the first few months.

(The tone might not fit all stories...)

$\endgroup$
1
  • $\begingroup$ If the travel method works by disassembly/reassembly on an atomic level, and the traveler retains some memory of this somehow, then that would explain that ... $\endgroup$
    – Wolfie Inu
    Commented Feb 17, 2017 at 13:05
1
$\begingroup$

The nature of the directed gate makes sending humans prohibitively expensive

If sending one human represents 99% of the cost of the entire mission, then sending two would double the already immense cost.

Presume, for example, that because of some relativistic rules of physics, a directed gate actually makes 1000 years of time pass for the objects transported (an issue that the tethered and captured gates manage to bypass).

Now, anything purely technological could be sent with a minimal battery and get through without much issue.

But right now IRL, we have barely begun to scratch the surface of cryogenically storing humans for the long term. What if it needs constant monitoring? What if the environment you spend 1000 years in isn't perfectly heat-neutral? What if a human body still needs some nutrients on a regular basis?

Any or all of these circumstances could push the price tag of a single human so high that the risk of losing the one becomes the preferred alternative to spending twice the money (and maybe losing both anyway...).

Note that this option also requires that the chance of succeeding without a human be as close to zero as possible.

$\endgroup$
1
$\begingroup$

This is a proving test for the one human....The Explorer.

After extensive competition and training, one, and only one, person WINS the opportunity to be The Explorer. This person has already excelled at simulation after simulation, and is suspected to have the needed qualities not only to handle the rigors of warp space travel, but also the ingenuity and resourcefulness to handle the myriad calamities that will surely occur upon arrival at an unknown planet full of unknown materials, unknown physical properties (will the planet even be survivable outside the Bull?), unknown flora and fauna, and in some cases unknown sentient beings, possibly non-human. Will this person pass the test? What new knowledge will they bring to the UFM? Will they survive and achieve the greatest honor bestowable within the United Federation of Man?

The name of the sponsoring agency is, after all, the Human Exploration Core. It is a great and wonderful human trait to strive for new experiences and new learning, almost as if humans were created specifically for that purpose...

Side note 1:
Past experience has also shown higher success rates when sending only one human to planets where other sentient beings already live and prosper. There seems to be something less threatening about a lone, apparently non-reproducing being that allows a higher likelihood of not only surviving the initial "pop" into the new environment, but also surviving initial contact with the locals. The arrival is, after all, fairly noticeable, and tends to attract immediate attention... thus begins the first, and sometimes last (if, say, the arrival event levels the natives' only holy site on the planet), test of the Explorer.

Side note 2: Although not explicitly touted nor guaranteed, a high percentage of successful Explorers seem to end up attaining high levels of leadership and prestige in their new worlds. This is duly noted by explorer-hopefuls, and although not usually the primary motivator over the joy of exploration itself, is often a pleasant byproduct of the qualities they possess in order to win the Explorer opportunity in the first place.

$\endgroup$
1
$\begingroup$

Maybe because it takes a lot of training to become a "driver", thus there are not many of them. Another reason could be that not many people would want to become a driver, because of the perils involved. But honestly that would be kind of far-fetched, because the human population would be huge by the time we discover that tech, and simply put, a human can't live alone for 10 years. That amount of time would make a person mentally unstable, to say the least, which would jeopardize the operation itself.

Instead to avoid this you could send a more reasonable number of drivers like 10-20 while still keeping that thing about the high amount of training. Something would then go wrong and there would only be one survivor.

I hope that I helped you!

$\endgroup$
3
  • 1
    $\begingroup$ If the AI is smart enough you could easily trick a human into thinking it was with more people... $\endgroup$
    – Adam Wykes
    Commented Feb 14, 2017 at 20:06
  • $\begingroup$ Then why would the station need a human at all if it can simply emulate a person? $\endgroup$ Commented Feb 14, 2017 at 20:11
    $\begingroup$ Presumably there is actually still a difference between being able to beat the Turing test and being actually as smart/creative/random as a human. $\endgroup$
    – Adam Wykes
    Commented Feb 14, 2017 at 20:18
1
$\begingroup$

There is no AI

Despite all of humanity's vast technological advances, the one thing they have never managed to crack is true artificial intelligence.

Or at least the ability to simulate a brain. AI is possible, but only by utilising the brain of a living breathing human at the heart of it, hijacking and augmenting their neural network to give the computers that spark of intelligence that otherwise wouldn't exist.

For all of the complicated tasks the BULL must perform and the decisions it must make, mere automation is not enough; intelligence and direction are required. And the only way to achieve that is to have a human on board to hook up to the computers and bring them truly to life.

Of course, only one human can be hooked up at a time; more than that would create conflict and confusion within the machine AI and ultimately lead to failure. While it might make sense to send extra people along just in case something happens, the systems of the BULL are able to keep a human alive under any number of otherwise fatal circumstances. There is no need to send backups when humanity (as mentioned in a number of other answers) is rare.

$\endgroup$
