
An environmental phenomenon, a cloud or mist of sorts, rolls into an area and greatly affects "thinking machines". Basically, any type of computer or logic instrument based on electrical impulses/inputs is severely degraded in performance, or stops working outright, for the duration of this phenomenon. While most machines regain their ability to work reliably after some repair, without adequate protection such machines are simply irreparable or so damaged that they're written off.

Low level electrical devices are fine for the most part. Sure, the light bulbs might flicker a bit, but it's not like they shut off completely or explode.

What type of weather phenomenon (natural or man-made, and obviously fictional) would cause technology that computes using electricity to become defective for the duration of said phenomenon, while leaving low-level electrical systems intact?

Basically, computers may not work the best, but lightbulbs still do.

Edit: Defective in this case means that a computer doesn't have to outright break with sparks coming out of it or anything like that. It could be as simple as the machine giving wrong outputs or wrong calculations (see the issue of bit flips, for example). Essentially this would make computers unreliable. Similarly, repair doesn't necessitate going in with a hammer or wrench. It could be as simple as a hardware or software reset.

Furthermore, any society living through such conditions wouldn't use integrated chips for their lower-level electrical applications like light bulbs. So, it can be safely assumed that the lower-level electrical devices would be fine, even if they're considered outdated by our standards. The main things affected here are computers.

  • Does solar weather count? If so, I direct your attention there.
    – Topcode
    Commented Nov 1, 2022 at 2:45
  • Your title asks "why?" while the body asks "What?". Those are two different questions.
    – L.Dutch
    Commented Nov 1, 2022 at 4:27
  • @L.Dutch Fixed the title. Ultimately any answer would answer the "why" and "what" right? By virtue of listing a name as an answer and its explanation. Or do I have things mistaken?
    – FIRES_ICE
    Commented Nov 1, 2022 at 4:49
  • @Topcode Would repeated solar attacks on Earth cause any major unintended consequences? This type of phenomenon wouldn't be something that happens once in a blue moon. It would be frequent enough such that people design their society around it.
    – FIRES_ICE
    Commented Nov 1, 2022 at 4:54
  • I strongly recommend A Fire Upon the Deep by Vinge, if you haven't read it!
    – Fattie
    Commented Nov 1, 2022 at 19:18

6 Answers


You have a problem, let's call it Problem #4, but we'll discuss that later. First, let's discuss what you mean by "low level" electrical applications.

Ignoring the issue I'll bring up momentarily, a "low level" electrical application differs from a "thinking machine" application from a physics standpoint in only two ways:

  1. "Low level" electrical applications have long wires.

  2. "Thinking machines" depend heavily (but less every day) on magnetism.

When you hear threatening terms like "electromagnetic pulse" (EMP) and the usual Hollywoodesque consequence that everything electrical dies, what you're really hearing about is an electromagnetic signal (no different from radio, other than that the amplitude is overwhelming) powerful enough that those long wires (like your house wiring) suddenly start acting like antennae and channel all that energy into unprotected circuits. Lights and cheap electronics like your clock radio (stuff that's slowly going out of style...) are all examples of unprotected circuits. Your lights are an especially good example: most of the time they're not a grounded circuit, other than to ensure the fixture never energizes if there's a short.

I've actually lived through this kind of event. A lightning strike occurred close enough to my house to fry phone and electrical wires inside the wall. It coupled energy onto the Centronics-style parallel cable between my computer and printer. It didn't hurt the protected computer, but it fried the data input board on the printer. Phones died. Lights died. I'm glad not to have suffered worse, because my house could have burned down.

In a similar manner, a strong magnetic pulse can damage computer circuitry. More specifically, it can damage memory. Hard drives, some on-board memory types, etc. (Stuff that's slowly going out of style.) A big enough pulse, such as one that can be created with a nuclear explosion, can scramble magnetism-dependent circuitry. To my knowledge, there's not a way to naturally generate a magnetic pulse without the electrical component.

But what's the opposite? I want to damage just the "thinking machines."

Let's assume you're looking to damage the machines and not simply wipe the memories clean. In that case, we're dealing with short wire antennae. Really short wire antennae. As in you're trying to couple energy onto the traces between the major components inside the computer chips. Or at least the data busses inside those chips (the longest wires short of the power and ground planes, which aren't really "wires").

This means the wavelength or frequency of the electromagnetic pulse is important.

Simplifying things a bit, the length of an antenna is 1/4 the wavelength it's expected to deal with. A 100 kHz radio station is broadcasting on a wavelength that's 3 km long. The ideal 1/4-wavelength antenna is therefore 750 meters long. We don't have the time to go into antenna physics here. Suffice it to say that you can use harmonic fractions of the ideal 1/4 wavelength... but now you know why old-style AM radios with short antennas had really limited range, or the stations had a honking tall antenna.

We're talking about picking up a wavelength that's 4X the length of the average bus wire in an integrated circuit. That might - maybe - be as long as 2 mm. That's a frequency of 150 GHz, which is in the microwave range.
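If you want to sanity-check those numbers, here's a minimal Python sketch of the quarter-wave arithmetic (the 0.5 mm trace length is an assumed illustrative value, consistent with the 2 mm wavelength above):

```python
# Quarter-wave antenna rule of thumb: antenna length = wavelength / 4.
C = 3.0e8  # speed of light in m/s

def quarter_wave_m(freq_hz):
    """Ideal quarter-wave antenna length for a given frequency."""
    return C / freq_hz / 4

def freq_for_wire_hz(wire_m):
    """Frequency whose quarter wavelength matches a wire of this length."""
    return C / (4 * wire_m)

print(quarter_wave_m(100e3))     # 100 kHz broadcast -> 750.0 m mast
print(freq_for_wire_hz(0.5e-3))  # 0.5 mm bus trace -> 1.5e11 Hz, i.e. 150 GHz
```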

  • And that brings us to Problem #1: an EMP capable of coupling energy onto the bus lines of integrated circuits would cook every animal within its range, including humans. It would vaporize water and shatter trees while it's at it. There's a price to be paid to do what you're trying to do. Why don't cell phones cook our brains? Well... ignoring certain realities[1][2][3], the amplitude of the signals isn't anywhere near what's needed to heat water, which is how microwave ovens cook. Let's ignore all this for now.

However, if you did this, your lights would flicker but not be damaged. The frequency is too high to efficiently couple energy onto lines as long as your house wires or cross-country power lines.

  • Problem #2: Well, maybe... The physics of coupling energy get a bit funny when you're trying to affect something really small (I have two words for you, just two words: impedance matching). Let's just sweep this under the rug.

When I mentioned cellphones, you might have thought, "if GHz transmissions are dangerous to computers, why doesn't my cellphone burn itself up?" Remember earlier when I mentioned protected and unprotected circuits? The amplitude of cellphone signals isn't high enough to get past protected circuits.

  • Problem #3: You have a real problem here. My computer wasn't damaged directly by the lightning strike because the frequency of the EMP was way too low (we'll ignore the fried surge protector that, had it not been in place, would have let the computer's power supply be damaged; but that's not damaging the computer per se). Here's your problem: every pin on every critical integrated circuit is protected with what's called ESD circuitry. "ESD" means "electrostatic discharge." Up until now, ESD meant things like picking up the chip without first grounding yourself. You know those little electrical sparks you used to annoy your younger siblings with? That kind of thing wreaks royal havoc with electronic circuits. That's ESD, and every pin has circuitry to protect the inner circuits from it. Problem #3 is that a GHz-level EMP looks an awful lot like ESD. Now, that could work in your favor from a suspension-of-disbelief perspective by suggesting an amplitude great enough that the ESD circuitry burned out, effectively disconnecting the pins from the internal circuitry. Of course, everyone's BBQ, but let's ignore all of this, too.

OK... so what's the real problem? Bring on Problem #4 already!

  • Problem #4: Everything's a "thinking machine" today.

You're 30 years out of date in the way you're thinking. LED lights have control circuitry that's nothing more than simple integrated circuits. Any natural EMP that could hurt a computer would hurt the computer in your LED lights. And your cell phone. And your car. And your washing machine. And your refrigerator.

Problem #4 is that you're too late. Even if we come up with a believable natural event that could damage short-wire computers and not long-wire house circuitry, all that long-wire house circuitry powers short-wire computers. Many electric razors have integrated circuits inside them. Everything would burn up. Instant 1968.

But you won't mind because you're BBQ anyway.

But let's ignore all that.

What natural phenomena could produce a 150 GHz pulse with enough amplitude to couple energy onto a significant number of computer chips, frying them like so much dough in hot oil?

To be honest, I don't know of one. Even a nuclear blast is slow compared to what you need. I'm sure I don't have complete enough meteorological knowledge to judge what conditions could exist, but everything else I can think of (electrically charging the atmosphere, raising the ground-plane voltage in the soil, solar emissions of any kind...) simply can't produce the nasty mess you need to cook humans, er, damage computers.

A fast enough lightning strike could do it, but (a) it must strike someplace where the energy can dissipate really quickly, like the middle of a lake where there aren't any computers to blow, and (b) lightning has a very limited range. A bolt with enough energy to affect a really wide area would require a lake of exactly the right salinity and purity to dissipate the energy while it's vaporizing. Water really isn't a great conductor.

Gratefully, you're using the tag

At the low end of suspension-of-disbelief is the idea of a super-lightning storm that produces strikes that achieve what you need.

Middle-of-the-road suspension-of-disbelief would be a condition that causes very powerful ball lightning using something along the lines of the proposed Microwave cavity hypothesis. This could conceivably produce a magnetic pulse (vs. an EMP). Remember, magnetic memory is disappearing. You'll probably see that in your lifetime. And the result would only be scrambled memory, not damaged computers. But, hey... it's something.

Also in the middle are solar events like flares or mass ejections. This stuff wreaks havoc with satellites, but very rarely affects the Earth. When it does, it tends to affect long power lines (antennae...) and not computers. But SciFi has been blaming solar flares for everything for a long time, so your readers won't notice.

At the high end of suspension-of-disbelief is an atmospheric meteor strike that produces an EMP equivalent to an atmospheric nuclear blast but for whatever reason (maybe it's not dense enough), it doesn't hit the ground causing an extinction-level event.

  • You're correct in that destroying the circuitry would be hard. Thus, impacting reliable output is fine. Bit flipping was something I was looking into; in this case it would cause computers to give wrong output. On a lower level, gates and flops use leading and falling edges to trigger; damaging, inserting extra energy, or impeding it would cause logical errors. Going lower, FETs rely on voltage for modulating conductivity. Messing w/ the voltage would cause issues. In this case would sci-fi lightning work at such a level? I have a CS background so I'm not entirely sure about the EE side of things.
    – FIRES_ICE
    Commented Nov 1, 2022 at 6:40
  • Unfortunately, your idea about interfering with transitions makes everything worse. FET gate transitions (and that's what you're dealing with, whether you think in terms of flip-flops or not) happen much, much faster than the clock. Most transitions in an IC happen in picoseconds (vs. GHz micro-to-nanoseconds). Besides, modern ICs have both hardware and software that detects unexpected errors and self-corrects. If you're going to delve into the ugly details, the answer is no natural event can do what you want. You're going to need to avoid the specifics.
    – JBH
    Commented Nov 1, 2022 at 7:13
  • Bit flipping or bit rot is where the memory cell loses its charge. Space radiation has been proposed as one cause of such. However, the level of radiation needed to make that a common problem (instead of just a datacenter-scale problem) would also cause a lot of problems with human health.
    – David R
    Commented Nov 1, 2022 at 14:21
  • @DavidR then, it's a new kind of radiation that affects silicon much more strongly than it affects DNA! Commented Nov 1, 2022 at 16:18
  • An LED lightbulb has a relatively simple integrated circuit in it. A computer has many very complex ones, and they all have to work properly or it crashes. Just a few bitflips in the right place could cause any computer to crash, while the lightbulb might just flicker for a moment because it's not nearly so intricate. (Wi-Fi lightbulbs excepted) Commented Nov 1, 2022 at 16:19

Bit flips induced by a radiation storm

A computer works by loading an instruction from its working memory (you know it as RAM), doing whatever that instruction says, and repeating that cycle several billion times per second. Instructions are very basic things like "add these numbers". Only after obeying some billions of instructions choreographed by computer engineers do we get pretty cat pictures displayed on our screen.

How do the instructions get into working memory? Other instructions put them there! Except for the first few instructions, which are hard-wired. When your PC or cellphone starts up, it loads its own instructions from the hard disk, and stores them into its working memory, so it can follow them later. (At least, that's the essence of what it's doing)

That means after the computer has started up, it's running on "circular logic". Everything relies on that working memory being intact, and should that memory somehow get corrupted (i.e. not contain the instructions it's supposed to) it has no reliable way to fix itself because the instructions to detect the corruption and fix it would also be stored in the working memory.
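To illustrate that "circular logic" point, here's a toy sketch (the machine and its instruction set are invented for illustration): the program lives in the same memory it manipulates, so one corrupted value silently changes what the machine computes.

```python
# Toy stored-program machine: instructions and data share one memory,
# so corrupting memory corrupts the machine's own logic.
memory = {
    0: ("LOAD", 5),      # accumulator = 5
    1: ("ADD", 7),       # accumulator += 7
    2: ("PRINT", None),  # output the accumulator
    3: ("HALT", None),
}

def run(mem):
    acc, pc = 0, 0
    while True:
        op, arg = mem[pc]  # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)
        elif op == "HALT":
            return
        else:
            raise RuntimeError("crash: garbage instruction")

run(memory)              # prints 12
memory[1] = ("ADD", 99)  # a "bit flip" lands in the program itself
run(memory)              # now prints 104, with no error reported at all
```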

At the same time, working memory is usually a technology called DRAM, which is physically quite delicate. Bits are stored using small electric charges. The more bits the designers fit into the same space, the smaller each bit is, and the smaller the charge used to store it. If the electric charge is disrupted, a 1 can be "flipped" to a 0 or vice versa.

Radiation from cosmic rays can induce or disrupt small electric charges. It's been shown that even accessing the same part of the memory over and over very quickly can flip bits in neighboring cells (the "rowhammer" effect). (In fact, charge even leaks away naturally, so every few milliseconds the memory system checks the charge level of every bit and resets it to a full charge or no charge.)

So, your environmental phenomenon can be a radiation storm. Make up a new kind of radiation that strongly affects silicon, but not DNA.

When a few bits get flipped, it can randomly crash programs, crash the entire computer, corrupt data, or have no effect at all. If it's happening a lot, though, it's guaranteed to crash the entire computer sooner or later. Your phenomenon is quite strong and flips bits all over the place.
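As a minimal sketch of what that corruption looks like (the flip count here is arbitrary), random bit flips applied to a buffer sometimes leave it "working" and sometimes turn it to garbage:

```python
import random

def irradiate(data: bytearray, flips: int) -> None:
    """Flip `flips` randomly chosen bits in place, like the storm would."""
    for _ in range(flips):
        i = random.randrange(len(data))
        data[i] ^= 1 << random.randrange(8)  # XOR toggles exactly one bit

buf = bytearray(b"ADD R1, R2   ; one machine instruction")
irradiate(buf, 3)
print(buf)  # e.g. bytearray(b'ADD Q1, R2   ; one$machine instruction')
```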


Many devices like microwave ovens also run on little computers, called microcontrollers. These use a different memory type called SRAM, which is pretty robust against radiation. The bits are also much bigger, since these are optimized for low-cost manufacturing instead of fitting lots of memory in a small space - it doesn't take much to run a microwave.

Microcontrollers often have the ability to detect crashes and reboot automatically, which takes some milliseconds (not seconds or minutes like a PC).

The instructions for a microcontroller are stored in a separate program memory, which is usually flash memory. Although flash memory uses electric charge, just like DRAM, it's done in a different way which requires huge electric forces to flip bits. On the other hand, it doesn't self-correct every few milliseconds, so cumulative exposure to radiation may still flip them, causing the device to malfunction. Assuming the chip isn't actually damaged, this can be repaired by re-programming it, by someone with specialized hardware and a copy of the correct program.


Although LED lightbulbs use microchips, they use much simpler microchips which are hard-wired for one purpose only - to convert electricity from one form to another. Often these chips don't have any working memory at all. Sometimes the chip does contain a few SRAM bits for various purposes - like remembering which part of a cycle it is in. If your radiation does manage to flip one, it will cause some little hiccup or glitch rather than a complete system "crash".

Wi-Fi lightbulbs are an exception, since they are based on microcontrollers.

  • No need to make up a new type of radiation. The old-fashioned stuff works just fine, since electronics are more sensitive to radiation than humans are.
    – Mark
    Commented Nov 1, 2022 at 22:14
  • @Mark Yeah, but I don't know how much is needed to make computers crash so frequently as to be unusable, without causing cancer. Commented Nov 1, 2022 at 23:11
  • Not much. The South Atlantic Anomaly is plenty powerful enough to crash spacecraft computers, but only produces a dose of around 250-400 microsieverts per day; measurable cancer risk starts at 100 millisieverts.
    – Mark
    Commented Nov 2, 2022 at 1:48

Solar wind-like radiation

Space probes we send have specially designed protections and fault recovery circuits because the radiation in space affects their circuits.

Earth is protected from this radiation by its ozone layer. So, if you have local holes in it, the radiation will pass through and can fry electronic circuits!

Some will say that this will also affect plant/animal life, and thus we will be fried. But they forget that even without the ozone layer, there is still the atmosphere.

We are safe, and all electronic circuits are cooked to your taste :-)

EDIT: Apparently real radiation is enough to affect circuits without affecting life, so I removed the section about having to handwave some of its details.

  • Ditch the "story-handwavium" nonsense, and you'd actually have a good answer. Microelectronics are far more sensitive to radiation than humans: when the International Space Station passes through the South Atlantic Anomaly, non-hardened electronic devices tend to crash or reset, while the astronauts merely get a tiny-fraction-of-a-percent increase in their lifetime cancer risk.
    – Mark
    Commented Nov 1, 2022 at 22:13
  • @Mark I thought solar wind was a big enough problem for us to require astronauts to have shielded suits. But OK, I removed the section. Commented Nov 3, 2022 at 13:27
  • Space suits provide protection against all sorts of things, but radiation isn't one of them. You might be thinking of UV protection: with no air to block ultraviolet, spacesuit helmets need more UV shielding than ordinary glass can provide.
    – Mark
    Commented Nov 3, 2022 at 22:16

Your run-of-the-mill RF jamming should do the trick. I'll focus here on jamming USB cables and Wi-Fi networks. This fits your definition of computers not working (since a computer without a keyboard is not much of a computer), but your toilet can still flush water since its microcontroller is alive.

From personal experience: I had a 50 W RF transmitter (probably at 14 MHz or 21 MHz) just below my computer monitor and next to a USB keyboard. It had a coaxial cable running across the room and out the window. When it was transmitting, the USB keyboard would start sending junk to my computer. I could fire up Notepad and see random key presses coming in. And all that wasn't even from the antenna; it was leakage through the coaxial cable. (Take that, all you EE teachers who claim coax doesn't leak!)

On the bright side, RF at these frequencies is not ionizing and shouldn't cause (a lot of) cancer. And the power might be low enough not to cook people alive. On the not-so-bright side, it would take about 1 hour for someone to ask a question on Stack Overflow about how to protect against this weird weather and get told to start using ferrites, shorten the cables, and build houses from reinforced concrete.

If for whatever reason you want to cause even more damage, the next easiest thing to attack would be Wi-Fi. I tried jamming my cell phone by placing it right next to a Wi-Fi router, but it totally did not care, so we know that 100 mW is not enough to jam. I have no idea how much power you'd need to apply there. AliExpress claims 1 W can jam a room. If they are correct, that should be low enough not to cook humans alive.

The problem here is that defense against this is also pretty simple. You either move to 5 GHz Wi-Fi, or you use your cellular phone, so it is a one-trick pony.

Unfortunately, I can't think of any natural phenomenon which would cause such an amount of interference over a long time (as opposed to an EMP, which just burns devices instantly). If you want to go deep into science fiction, plopping a nuclear reactor on a plane, attaching it to a giant amplifier, and flying it over whoever you want to jam might work.

Some back-of-the-napkin math: free-space path loss at a distance of 1 km is about 60 dB @ 24 MHz and about 100 dB @ 2.4 GHz. If you want to induce 100 mW in the device you are jamming, you'd need roughly a 100 kW transmitter @ 24 MHz (for USB jamming) and 1 GW @ 2.4 GHz for Wi-Fi jamming. If you want to be safe, maybe bump these numbers 1000x to compensate for those pesky reinforced-concrete walls.
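For reference, here's a minimal sketch of the standard free-space path loss formula behind those numbers (isotropic antennas, no walls; the figures above assume 1 km range and a 100 mW target at the receiver):

```python
import math

C = 3.0e8  # speed of light in m/s

def fspl_db(dist_m, freq_hz):
    """Free-space path loss between isotropic antennas: (4*pi*d*f/c)^2, in dB."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

def tx_power_w(rx_w, dist_m, freq_hz):
    """Transmit power needed to deliver rx_w at this distance, ignoring antenna gains."""
    return rx_w * 10 ** (fspl_db(dist_m, freq_hz) / 10)

print(fspl_db(1000, 24e6))           # ~60 dB
print(fspl_db(1000, 2.4e9))          # ~100 dB
print(tx_power_w(0.1, 1000, 24e6))   # ~1e5 W  -> 100 kW
print(tx_power_w(0.1, 1000, 2.4e9))  # ~1e9 W  -> 1 GW
```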

  • FWIW: random keypresses suggests noise on the wiring inside the keyboard, to the keys, rather than the USB cable. Noise on the USB cable is likely to cause it to disconnect entirely. Commented Nov 2, 2022 at 15:32
  • @user253751 why would adding ferrite or moving cables to be orthogonal instead of parallel to the coax solve the problem? Though it is an interesting train of thought.
    – vguberinic
    Commented Nov 2, 2022 at 16:29

From all the answers given so far, it's clear that a type of radiation (unless you go completely hand-wavy) is difficult to use unless you want to fry humans too. DNA is not more resistant than memory...

Then, issue #4 from @JBH's answer is a real problem, but it could maybe be the solution, at least for the next 10 years. You can use a super computer virus. It acts at a very low level, exploiting characteristics that all modern CPUs share (like the speculative-execution thing...), but it's fast to propagate, very fast, and it basically stops or even fries every CPU on the planet. In milliseconds.

Simpler things haven't got modern/complex CPUs yet. Or have they?


Are "thinking machines" necessarily silicon-based electronics like we have today?

With the surging popularity of machine learning, some tech companies are building custom chips optimized for the types of computation needed to simulate a neural network. Today, these are still silicon ICs with zillions of transistors.

But perhaps somebody will invent a neural network computer that uses a different underlying technology, and that's the part that's corruptible by an environmental factor.

For example, maybe they find a way to manufacture actual biological neural networks (which are more efficient because they directly model the solution rather than simulate it). Maybe those are only used in larger more demanding scenarios because they're not cost effective for small scale devices.

A manufactured biological net might be susceptible to microbes or enzymes that break down essential neurotransmitters.
