
I have a world where both computers and biological technology have matured to serve the needs of my people, and both have numerous people researching and making use of them. For example, biological computers can implement logic gates, process in parallel, store massive amounts of data, and simulate things like neural nets on a truly massive scale while simultaneously running games, movies, dreams, and so on. On a more machine-like scale, a biological exoskeleton of sorts exists that people can step into and use. Humans in this world discovered and used biological computers well before the first mechanical computer was even developed. Such biological technology is usually something a human interfaces with and operates with their mind.

Computers in my world also exist, but they obviously came later. The major advantage they have over the existing bio computers and technology is that they are persistent: a mechanical or digital computer doesn't have to be regrown, can operate without human input, survives certain austere environments better, and doesn't consume resources other than electricity once built. The major downside is that getting the metals and resources for computers requires expeditionary operations to other floating islands that the people's islands coast by. Even so, my floating islanders have been able to construct computers that resemble those of the early-to-mid 1990s, especially for military applications.

How can I stop or significantly slow computer development from advancing past the early to mid 1990s?

Year range is 1990-1995.

Notes:

  1. The primary users of computers are the military; they have the most incentive to push the boundaries of computer development. If anyone is impacted the most, it is them.
  2. All floating islands are moving, though they move slowly. Smaller islands eventually fall to the surface of the planet. Through a process, new islands are essentially lifted back into the sky.
  3. There exist two tiers of islands separated by multiple layers of clouds. Past the first few cloud layers is the first floating island; radio and wireless communications don't work in this area, only wired communications do. Past the cloud layer above it there exists another island. These islands are a bit barren, but all forms of radio and wireless communications work there, and this is also where most combat takes place. Clouds can "poison" both living organisms and non-refined metals when a newly formed island lifts up. The people's islands in both tiers have been tethered to each other and travel in the same direction.
  4. Silicon is the primary semiconductor used and harvested for computers.
  5. The setting takes place on a planet that is not Earth. Different materials, metals, weather, and particles exist.

  • What do you mean by limiting computer development past 2010? The computing technology of 2010 was not qualitatively different from what we have in 2022. In fact, a large part of the computers in use today were made before 2010. I have no idea how old you are, but in 2010 we had all the operating systems and usual applications in common use today (for example, Windows 7 is from 2009; Office, a.k.a. 365, with VBA is from 2000), we had smartphones which were not significantly different from those of 2022, we had massive storage arrays, high-performance digital cameras, and so on.
    – AlexP
    Commented Nov 29, 2022 at 12:25
  • @AlexP Silicon, my bad, I edited it to the correct form. As for 2010, the reason I chose it was because transistor counts in chips blew up, and chip sizes also got significantly smaller. GPUs became significantly more powerful as well, which allowed for an explosion in AI research and effectively brought it into the mainstream. IBM Watson, 4G, Azure, IPv6, cloud computing, and smartphone processors all increased in capability. Data also increased during this era, allowing us to start training automation and AI on previously collected information. Position services for Uber also took off.
    – FIRES_ICE
    Commented Nov 29, 2022 at 12:31
  • @AlexP The 2010 limit isn't a hard and fast rule, however, just an upper limit. Limitations could be set at 2000 as well, where both software and hardware were different compared to now. I'm mainly looking to avoid the massive computing power that we have now, especially on such small devices. Computers across the board have gotten significantly more powerful and capable.
    – FIRES_ICE
    Commented Nov 29, 2022 at 12:42
  • I think that the correct threshold would be at some point in the early 1990s; what happened in the 2000s is mostly the software catching up with the capabilities of the hardware. IBM Watson was in rude health in 2010, making ready for its famous public appearance in 2011. IPv6 was introduced in 1995 -- by 2010 I had already done some projects for customers who wanted to be prepared. Microsoft Azure became fully commercially available in 2010. And as for position services, I had a GPS-based turn-by-turn navigation app on my (cheap) 2006 Glofiish.
    – AlexP
    Commented Nov 29, 2022 at 12:44
  • @AlexP I can update it in that case. I'm just now realizing that broader infrastructure, and even something as critical as aviation or space systems, isn't running on the latest computer tech released by the likes of Nvidia or Intel. That said, an upper limit of the early '90s, even for hardware, would still give me what I want, which is limited computing capability.
    – FIRES_ICE
    Commented Nov 29, 2022 at 12:55

5 Answers


The number one way to limit computer development is to impose a hard lower limit on microchip feature size.

It was once thought that the Intel Pentium line was approaching that limit, before improved optics and UV, X-ray, and electron-beam lithography techniques were developed. If only visible light can be used, a 1995-or-so Pentium has features about as small as they are going to get, so adding capability to the chip means making the actual die larger (much larger), which cuts down on the number of dies per wafer and increases costs, as well as increasing power consumption and creating cooling bottlenecks.

If your floating island culture never developed technologies like electron microscopy (if they didn't have CRT television, they probably didn't) or lasers (a UV laser source was the first method used to push feature size below the limits of visible-light optics), they might come up against a physical limit that prevents building chips more capable than an original single-core Pentium at around a 300 MHz clock. RAM would be limited to around 1 MB on a single chip (8 or 9 chips on a module), which would put the practical limit for a computer at around 32-64 MB -- once again, right in line with mid-1990s development in our world.
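
To put a number on that visible-light ceiling, here is a minimal back-of-the-envelope sketch using the standard Rayleigh resolution criterion for projection lithography, CD ≈ k1·λ/NA. The wavelength, numerical aperture, and k1 values below are illustrative assumptions, not figures from any specific process.

```python
# Smallest printable feature under visible-light projection lithography,
# estimated with the Rayleigh criterion CD ~ k1 * wavelength / NA.
# All parameter values here are illustrative assumptions.

def min_feature_nm(wavelength_nm: float, na: float, k1: float = 0.6) -> float:
    """Approximate minimum printable feature size (critical dimension) in nm."""
    return k1 * wavelength_nm / na

# Visible mercury g-line (~436 nm) with a modest projection lens (NA ~ 0.5):
visible = min_feature_nm(436, 0.5)

# For comparison, a deep-UV excimer source (~248 nm) with the same optics:
deep_uv = min_feature_nm(248, 0.5)

print(f"visible light: ~{visible:.0f} nm")  # ~520 nm, i.e. a 0.5-0.6 micron process
print(f"deep UV:       ~{deep_uv:.0f} nm")  # ~300 nm, already past mid-1990s nodes
```

A floor of roughly half a micron lines up with the 0.6-0.8 micron processes the first Pentiums were actually fabricated on, which is why capping this culture at visible-light optics plausibly freezes it near that generation.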

  • An interesting factoid: the limit on the size of a silicon die was not cost -- it was distance. Today's processors are so fast that in the time it would take for a signal to get from one side of a large die to the other (we are talking centimeters here) the chip could have gone through maybe 100 cycles, and the information would be ancient history. Chips need to be small so that the signal travel time stays below the clock period (see the sketch after this comment thread for a rough numeric check).
    Commented Nov 29, 2022 at 20:20
  • Continued: on a modern gaming computer, by the time the image is displayed on the monitor, the game is probably many, many clock cycles ahead, and the processor has made umpteen more decisions, making the screen display a historical artifact based on where the actual game is. In other words, you are actually dead and out of the game in the virtual world long before you know it in the real world.
    Commented Nov 29, 2022 at 20:24
  • Interesting. Would the presence of LCD screens help offset the lack of CRT screens without causing any issues? As I understand it, the technology was developed in the early 1900s. I do have laser designators in the setting, though they are in the 1000 nm range. That can be pretty easily handwaved, though, as the reason lower wavelengths aren't used for etching.
    – FIRES_ICE
    Commented Nov 29, 2022 at 23:19
  • Why use LCD or CRT if you have OLED-like things? In your setting it is easier to get light-emitting biosystems than electronic ones, and you can give them input from a computer.
    – Kamitergh
    Commented Nov 30, 2022 at 8:15
  • @Kamitergh OLED actually makes a lot of sense and does remove the CRT issue. The negatives should be offset since new screens can just be made.
    – FIRES_ICE
    Commented Nov 30, 2022 at 9:45
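
As a rough numeric check of the distance-versus-clock argument raised in the comments above: the sketch below assumes on-chip signals travel at about half the speed of light, which is optimistic, since real RC-limited interconnect is considerably slower, so these figures are lower bounds.

```python
# How many clock periods does a single crossing of the die cost?
# Assumes signal speed ~0.5c; real resistive wires are much slower.

SIGNAL_SPEED = 0.5 * 3.0e8  # metres per second (assumed)

def cycles_to_cross(die_size_m: float, clock_hz: float) -> float:
    """Clock periods consumed by one signal crossing of the die."""
    return (die_size_m / SIGNAL_SPEED) * clock_hz

# Mid-1990s-class chip: ~15 mm die at 300 MHz
print(f"{cycles_to_cross(0.015, 300e6):.2f} cycles")  # ~0.03 -- distance is a non-issue

# Modern-class chip: ~25 mm die at 4 GHz
print(f"{cycles_to_cross(0.025, 4e9):.2f} cycles")    # ~0.67 -- most of a cycle per crossing
```

At 1990s clock speeds, die size is nowhere near the binding constraint, which supports the answer's point that the cap comes from lithography; only at modern multi-GHz clocks, and with slower real-world wires, does the comment's concern really bite.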

There is no reason to use classical computers more advanced than early-1980s ones when you have biological computers that good. There will be no pressure to make them better and more complicated.

Classical computers will be very good at calculations, but there is no need to make them big and energy-hungry. Read about RISC and ARM architectures; that is all that is needed for most calculations.

Organic computers will be much better at targeting, tracking, and any other military application. Even shielding can be thinner than on classical computers thanks to self-repair systems and redundancy.

  • Both radio and radar require metal parts as well as things like analog-to-digital converters. For military applications beyond visual range, outside of something in the infrared spectrum, you're going to need computers for number crunching. That said, most flight software and automation software can be written in lower-level languages like C. Am I right to assume that things like flight software and automation software could be written in pure assembly?
    – FIRES_ICE
    Commented Nov 29, 2022 at 12:39
  • Radio and radar only need shielding on the neural-net interface converter. The visible spectrum is a lot wider for biological applications than what is actually used in digital ones. Flight software can be done more easily on neural nets built from neurons -- and in your scenario it is easier to make it this way. There is no need for automation software if a small group of neurons can do that easily.
    – Kamitergh
    Commented Nov 29, 2022 at 12:50
  • Command guidance would be an issue, though, unless I can somehow connect a biological computer to a digital one. That said, I didn't realize that the visible spectrum could be that large. Biological computers inside a military aircraft strike me as something that would warrant at least a digital or mechanical backup, since these bio computers decay eventually, after all.
    – FIRES_ICE
    Commented Nov 29, 2022 at 13:06
  • Computers decay too: you have oxidation, differential thermal expansion, and lots of other factors. Biological systems can repair themselves; computers cannot. You can build bio-repair for a computer, but at that level it is easier to repair bio systems. As for sensors: in theory you can use any wave that is absorbed by an atom with electron emission in a mechanical system, and any wave that is absorbed with electron emission or electron agitation in a bio system :)
    – Kamitergh
    Commented Nov 30, 2022 at 8:02

Much of our computer development was driven by military requirements and then spun off for business use. For example, cannons needed range tables calculated, which was one of the first uses of computers. Nuclear weapons have driven computer development (see Livermore Labs). Even today, some of the biggest and fastest computers are being developed for military weapons development and for spying on everyone else. In our world, the only limit on military computing is how much money is available.

If you don't want that development, have different military technology, have no money for the military, find a way to do that kind of research with biological computers, or have alternative ways to settle disputes. (Borrow from the bonobos and have big orgies instead of fighting?)


The crucial limitation on computers was heat. Pre-1990s technology was based on five volts, and speed depended on the current. Even at only 5 volts, a LOT of heat was produced to get the speed. When chip voltages dropped to 3.3 volts, speed increased dramatically. The jump from 5 volts to 3.3 volts was not an intuitive thing; an entirely new chip manufacturing technology had to be developed. Without that, computing power would have been capped. We just cannot get today's performance at 5 volts.
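
As an illustration of why the voltage drop mattered so much, here is a small sketch using the standard CMOS dynamic-power relation P ≈ C_eff·V²·f, with switching activity folded into an effective capacitance. The C_eff value below is a placeholder assumption chosen only to give Pentium-class numbers, not a measured figure.

```python
# CMOS dynamic switching power: P ~ C_eff * V^2 * f.
# C_EFF below is a placeholder assumption, not a datasheet value.

def dynamic_power_w(c_eff_farads: float, volts: float, freq_hz: float) -> float:
    """Approximate switching power of a CMOS chip."""
    return c_eff_farads * volts ** 2 * freq_hz

C_EFF = 10e-9  # effective switched capacitance (assumed)

p_5v0 = dynamic_power_w(C_EFF, 5.0, 66e6)    # 5 V part at 66 MHz
p_3v3 = dynamic_power_w(C_EFF, 3.3, 150e6)   # 3.3 V part at 150 MHz

print(f"5.0 V @  66 MHz: {p_5v0:.1f} W")                   # ~16 W
print(f"3.3 V @ 150 MHz: {p_3v3:.1f} W")                   # ~16 W -- same heat, >2x the clock
print(f"energy per switch ratio: {(3.3 / 5.0) ** 2:.2f}")  # ~0.44
```

Under these assumed numbers, dropping to 3.3 V more than doubles the clock within the same thermal budget, which is exactly the jump described above; keep the setting stuck at 5 V logic and that headroom never appears.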


It's always possible to have social or political factors hinder technological progress: laws, taboos, some selective anti-science movement.

It might also be possible that particularly frequent and widespread electromagnetic pulses would prevent the development of electrical circuitry below a certain scale. The advancement of electronics in the real world has a lot to do with its components becoming ever smaller, but smaller components are also more susceptible to the "E1" component of EMPs. Electrical circuits with larger -- and hence slower -- components (particularly vacuum tubes) may be more robust against this.

That leaves the question of why the EMPs happen. Nuclear ordnance randomly going off in the ionosphere? Some other source of regular, extreme ionizing radiation up there? Perhaps a pulsar would have this effect if one of its beams sweeps over the planet at regular intervals. (This is very speculative.)

