4
$\begingroup$

So I've been thinking up an (alt-history) setting that aims to be a classic space opera, with the caveat that it will be completely hard science, the only handwaving being which technologies do or do not exist. I don't want much computerisation or automation, since the lack of it will encourage manned spaceflight to develop. Arthur C. Clarke speculated about telecommunications satellites, but thought they would have to be manned stations, since at the time only vacuum-tube based computers were available. In my setting this is the case, as solid state electronics were never invented / are not possible for some reason.

My question is this: if solid state electronics like transistors and diodes imply the possibility of eventually creating microchips (since they operate on the same fundamental principles), does the existence of vacuum-tube based computers also imply that solid state devices would eventually be developed?

I don't want there to be an inherent contradiction in having the one but not the other. Imagine how little sense it would make if we had two-stroke engines but not four-stroke; that is the kind of thing I'm trying to avoid.

$\endgroup$
4
  • 4
    $\begingroup$ In electronics, "solid state" and "semiconductor" are synonymous. While a diode can be either a vacuum tube or a semiconductor device, a "triode" is always a vacuum tube; the corresponding solid-state device is called a "transistor". Crystal radios have been known since the beginning of the 20th century. Lilienfeld patented a primitive FET between the two world wars. Shockley and his team discovered the bipolar transistor in 1947; the rest is just technological development. $\endgroup$
    – AlexP
    Commented Nov 26, 2017 at 4:23
    $\begingroup$ You can produce free electrons in a piece of hot metal, and the electrons will be attracted to another piece of metal (the anode); in a transistor, by contrast, charge carriers move through a solid semiconductor (T&C apply). Don't get me wrong, both are quantum mechanics, but the cost, size, efficiency, and energy consumption differ greatly... $\endgroup$
    – user6760
    Commented Nov 26, 2017 at 5:57
  • 1
    $\begingroup$ Microchips are solid state electronics like transistors. Miniaturization happened within a decade or so of the invention of the first BJT. Would've happened sooner, but TI thought it was just a fad and sold the rights to Sony after sitting on the technology for over 5 years $\endgroup$
    – nzaman
    Commented Nov 26, 2017 at 14:10
    $\begingroup$ This question is deeply problematic - you can't make semiconductors impossible without changing physics and invalidating the hard-science premise, so the only real reason a technological society doesn't end up with them is that it develops something better first. But to do that in the realm of hard science, you need something that seems technically plausible, which raises the obvious issue of why we aren't using that instead of semiconductors. Hence you're limited to the thin band of ideas we think should work, but which haven't yet been reduced to practice. $\endgroup$ Commented Nov 26, 2017 at 21:39

5 Answers

6
$\begingroup$

Your alternate history is very plausible because most people don't really think about the key product that changed all our lives: transistor radios.

Those of us who lived through the late 60s to mid 70s remember transistor radios. They were amazing! A radio you could hold in one hand, that ran for a decent amount of time on a battery! You can make a lot of arguments about how history transpired, but most technology is only created in a laboratory; it's developed due to need.

And need comes in many packages. The military needed a fast way to compute artillery trajectories (ballistics). That drove early computation (aka, computers). But that didn't drive the fundamental technology behind modern digital computers. Transistor radios did. And the need was commercialism. The bipolar transistor came along at a time when vacuum tube technology was stagnating and once people started buying the little bounders in the bazillions... vacuum tubes were dropped like yesterday's used underwear.

So, rather than figuring out a way to remove semiconductors from history (which is impossible), let's change the order of how things happened and relegate digital computing to the backwater that analog computing enjoys today.

The path we're replacing is this:

  • Vacuum tubes lead to radio & early computing.

  • Bipolar transistors make radios small, cheap, and practical.

  • Teenagers buy transistor radios by the bazillions.

  • Miniaturization uses bipolar and FET transistors to create integrated circuits.

  • Millions begin to play solitaire without ever having to buy "the devil's picture book."

A lot of tube development in the 40s was stymied by competing patents and the all-too-common habit of manufacturers creating unique pin-outs (the configuration of how the pins on electronic devices are used). Let's assume that royal mess didn't happen. Further, let's assume that someone makes a world-altering breakthrough: tube miniaturization.

This is an important issue. Transistors needed (simplistically) only one miniaturization path: semiconductor manufacturing. Tubes would require multiple miniaturization paths: vacuum tech, glass/enclosure tech, anode/cathode element tech... etc. But what if some enterprising German (it's always a German, right? And the Nazis were renowned for their scientists...) came up with a way to mount the anode and cathode arrays on a substrate rather than standing them up like a light bulb....

  • Vacuum tubes lead to radio & early computing.
  • Fictional scientist Herr Franz Voight invents inexpensive vacuum tube miniaturization in the late 40s, scaring the crap out of the Allied military. He's smuggled to the U.S. via Operation Paperclip, where he continues his research. Vacuum tubes are replaced by glass-coated, substrate-mounted radiotronic arrays. The military starts calling these devices "glarrys" (glass-array devices).
  • General Electric introduces the first miniaturized glarry radio in the mid 50s.
  • Teenagers buy these new radios by the bazillions, driving post-war consumerism.

And here's the second thing we need: to replace binary computing with something more valuable. Transistors made binary computing an easy path, but what if someone has the foresight to see what computing could become? Now that we have small, fast, efficient glarrys, we need to build a computer based on decimal math rather than binary math....

  • Texas Instruments develops analog computing in response to the need in the financial markets for greater computing capacity.
  • Texas Instruments creates the first glarry integrated circuit in the 60s. The world comes to know them as "glics" (pronounced "glicks," which is much easier to say than "eye-seas").
  • Steve Wozniak and Steve Jobs create the first popular home computer, the Banana, using off-the-shelf glics.
  • Bill Gates yet again addicts the world to solitaire.
  • Silicon-based digital computing is popular as a Heathkit through the 90s, but never really picks up steam.

Market inertia makes the adoption of digital computing nearly impossible, but it wouldn't stop the development of semiconductor-based diode technology, which has uses outside of computing. Power supplies and light-emitting diodes come to mind....

$\endgroup$
11
  • 3
    $\begingroup$ Problem with this approach: vacuum tubes work on thermionic emission. That means they generate a lot of heat. It wasn't that they didn't want to make them smaller; it was simply that that was the smallest they could get without spontaneously combusting. Even compact fluorescent lamps came to be through the use of high-frequency switching generated by an electronic circuit to minimise heating effects $\endgroup$
    – nzaman
    Commented Nov 26, 2017 at 14:15
  • 2
    $\begingroup$ @nzaman, it was the smallest they could get with the materials that they had under the conditions and technologies available to them. The same could be said for early transistors which required more power and generated more heat. Early engineers felt they couldn't make smaller ICs due to heat ... until they did. Make the "tubes" smaller, using better materials and more precise manufacturing, and you can reduce the power and thereby the heat. Besides, we're not looking for a perfect solution, we're looking for a believable solution for a story. $\endgroup$
    – JBH
    Commented Nov 26, 2017 at 14:24
  • 1
    $\begingroup$ @MiguelBartelsman, I've been an electrical engineer for 30 years. In that time I've seen technology shrink from LSI to VLSI to ULSI to the point where we no longer apply "...LSI" anymore because it just keeps getting smaller. Note that I doubt radiotronics could shrink anywhere near to where semiconductors have (there are physical limits to how close an anode/cathode can be and still have the radiothermic effect), but I do believe they could easily shrink to the millis and possibly to micros. Note that the efficiency of analog computing would delay the need to shift to semiconductors. $\endgroup$
    – JBH
    Commented Nov 26, 2017 at 14:50
  • 1
    $\begingroup$ @ChrisStratton, did I miss your answer, Chris? I'm sure it's around here somewhere. It'll be the one that proves tubes couldn't be miniaturized further than they were and that heat can't be localized more than it was (despite tubes undergoing both miniaturization and heat localization during their history). I'm sure your answer also posits a "suspension of disbelief" quality description of a non-semiconductor basis for computers that's better than semiconductors themselves. If I could just find that answer.... $\endgroup$
    – JBH
    Commented Nov 26, 2017 at 20:52
  • 1
    $\begingroup$ Let us continue this discussion in chat. $\endgroup$
    – JBH
    Commented Nov 26, 2017 at 21:35
2
$\begingroup$

This is a tough one. The reason being, there are several elements which can evince semiconductor effects. According to the U of Illinois, some of the biggies are silicon and germanium, or compounds such as gallium arsenide and cadmium selenide.

These elements are hard to get rid of, as they come from different element groups:

[image: periodic table highlighting these semiconductor elements]

Note also that this kind of behavior was discovered early.

"The first documented observation of a semiconductor effect is that of Michael Faraday (1833), who noticed that the resistance of silver sulfide decreased with temperature, which was different than the dependence observed in metals" (https://djena.engineering.cornell.edu/hws/history_of_semiconductors.pdf)

Meaning that the effect is hard not to notice.
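
To see why the effect stands out, here is a minimal sketch contrasting how resistance responds to temperature in a metal versus an intrinsic semiconductor. All constants are rough, assumed order-of-magnitude values for illustration, not measurements of silver sulfide or anything from the Cornell paper:

```python
import math

# A metal's resistance creeps up with temperature, while an intrinsic
# semiconductor's falls off exponentially as more carriers are thermally freed.
# Constants below are assumed, illustrative values.

K_B = 8.617e-5      # Boltzmann constant, eV/K
E_GAP = 1.0         # assumed band gap of the semiconductor, eV
ALPHA = 0.004       # assumed temperature coefficient of the metal, 1/K
T_REF = 300.0       # reference temperature, K

def metal_resistance(temp_k, r_ref=1.0):
    """Metal: resistance rises roughly linearly with temperature."""
    return r_ref * (1 + ALPHA * (temp_k - T_REF))

def semiconductor_resistance(temp_k, r_ref=1.0):
    """Intrinsic semiconductor: resistance drops exponentially with temperature."""
    return r_ref * math.exp((E_GAP / (2 * K_B)) * (1 / temp_k - 1 / T_REF))

for temp in (300, 350, 400):
    print(f"{temp} K  metal: {metal_resistance(temp):5.2f}  "
          f"semiconductor: {semiconductor_resistance(temp):8.5f}")
```

Opposite trends like these are exactly the sort of anomaly a 19th-century experimenter stumbles over, which is why hiding the effect entirely is so hard.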

So my best first advice is to posit a shortage of gallium, germanium, arsenic, and selenium on your planet. I'm not totally happy with this, because it's pretty arbitrary, but it's a start.

$\endgroup$
1
  • $\begingroup$ A decent theoretical start, but all you really need are silicon, plus phosphorus and boron (or maybe even aluminum) as dopants. Silicon seems unavoidable, and phosphorus is key to life as we know it. And aluminum is absurdly abundant. $\endgroup$ Commented Nov 26, 2017 at 20:20
2
$\begingroup$

No, the existence of tube-based electronics does not necessarily imply that solid-state electronics will be developed, or even are physically possible. A transistor isn't just a more highly-evolved triode, it's a completely separate technology which happens to do the same thing using different underlying physical processes, so it's entirely plausible that one could work while the other doesn't.

Of course, if you fully reason things out, then removing the quantum tunneling effects used by solid-state electronics would also affect a large number of other processes, most likely including some of those required for life as we know it to exist. However, given that tube-based retro-sci-fi is a well-known subgenre, you should be able to simply handwave that without affecting suspension of disbelief.

$\endgroup$
1
$\begingroup$

Yes

Technology allows us to do things we wouldn't be able to do otherwise, as well as make what we can do easier. My opinion is that any civilization that attempts to enter space will have a fundamental drive to do one or both of these things. Progress is inevitable. Particularly in the case of space flight, where you're trying to skim as much weight off of things as possible.

I'm having a tough time playing devil's advocate with myself. On the one hand, I try to think about a world with less competition, less of a hurry. However, I'm getting stuck with that one person who looks at a room full of vacuum tubes and doesn't think "there has got to be a better way". How did they get the room full of vacuum tubes if they were happy with pen/paper?

$\endgroup$
1
$\begingroup$

Well, I interpret the question as "what can computing be based on?". And the answer is: literally anything.

Vacuum tubes and miniaturisation

Tubes were around for quite a long time, and there were some efforts to miniaturise them. Also, as popular lore has it, they are less affected by EMP, which matters in a Cuban-Missile-Crisis era.

Real computers were built with tubes, and efforts were made to miniaturise them. The most prominent non-computing special-purpose applications: radar and microwave ovens.

Why transistors?

Transistors can be manufactured relatively easily, they can be scaled down easily (creating a microchip), and they can switch very fast, raising the clock frequency and hence the computational power.

However, at least in the very beginning, other technological bases were viable.

Pneumatics

Neal Stephenson gave a fictional, but very convincing, example of an early computer based on compressed air and valves, i.e. pneumatics.

Gears

Your favourite steampunk example is Babbage's Difference Engine (never actually built at the time, though modern replicas have to some extent been constructed).

Relays

Follow in the footsteps of Zuse: why bother with transistors (not there yet) or tubes (meh) if you can use relays, those clacking switching things?

Biology

Use cells capable of switching or accumulating signals; if you go from very soft sci-fi toward biopunk, maybe neural cells as such.

Analogue computing

Stretch a resin sheet of a given surface with given properties. Place some balls of given weight at finely calculated positions. Watch them move. Read off the final positions. Voilà, you have just solved a differential equation with an analogue computer.
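
To see why "watch the balls, read off where they end up" counts as computation, here is a minimal numerical sketch of the same idea: a ball on a curved, slightly sticky sheet physically obeys a damped-oscillator differential equation, and letting it run is solving that equation. The mass, damping, and stiffness values are arbitrary illustrative choices, not anything from the answer:

```python
# A ball on a curved, damped sheet obeys  m*x'' = -c*x' - k*x.
# The analogue computer "solves" this ODE simply by existing; here we
# imitate it digitally with a crude Euler integration.

m, c, k = 1.0, 0.4, 2.0   # mass, damping (stickiness), sheet stiffness (arbitrary)
x, v = 1.0, 0.0           # initial displacement and velocity of the ball
dt = 0.01                 # time step

for step in range(3000):
    a = (-c * v - k * x) / m   # acceleration from damping + restoring force
    v += a * dt
    x += v * dt
    if step % 500 == 0:
        print(f"t = {step * dt:5.1f}  x = {x:+.4f}")

print(f"final position: {x:+.6f}")   # read off the answer where the ball settles
```

The digital version has to grind through thousands of tiny steps; the physical sheet does all of that "for free", which is the whole appeal of analogue computing.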

Many more schemes were viable, from abacus to somewhat advanced physical simulations. They all died when digital computers became feasible enough.

Quantum computing

Even we (as in: humanity) have not really got there yet, but that does not mean someone else might not get there faster or do it better. If you focus on reactors and jet engines instead of silicon, you might develop the physics for quantum computers faster than we did, even with some... deficits in computing power.

Basically, quantum computing repurposes the states of matter to do your computing tasks, sort of similar to analogue computers, but on a wholly different level.

$\endgroup$
1
  • $\begingroup$ Biological (possibly DNA-based) alternatives likely have the most story potential, being something that seems vaguely possible but with the engineering not yet worked out in our own world. Quantum computing is likely to be more of a computational "accelerator" than a basic technology, i.e., it will require something simpler for interface. $\endgroup$ Commented Nov 26, 2017 at 20:22

