118
$\begingroup$

Let's assume, without revoking any of today's science, that the world is a simulation.

What would a bug look like?

I'm assuming that "the Eiffel Tower suddenly being bent at 45°" is rather unlikely, the same way you don't see a bunch of clowns appearing in the middle of a game of Need for Speed. So what is likely?

  • repetitions / deja-vus?
  • changes in gravity / speed of light?
  • ...
$\endgroup$
24
  • 38
    $\begingroup$ The speed of light is 300,000km per second. However, in the original spec, there were no upper limits. It turns out that there's a problem in the update loop with values above 300,000km/s, though.... $\endgroup$
    – user2839
    Commented Nov 6, 2014 at 23:37
  • 10
    $\begingroup$ A big blue sky with white writing on it...Abort, Retry, Fail. $\endgroup$
    – Oldcat
    Commented Nov 21, 2014 at 1:24
  • 13
    $\begingroup$ It could be a StackOverflow! $\endgroup$ Commented Nov 21, 2014 at 16:27
  • 7
    $\begingroup$ Quantum physics in its entirety - it often works in ways that don't match the 'universal' laws we see elsewhere. What better definition of a 'bug'? $\endgroup$
    – Sobrique
    Commented Jan 5, 2015 at 13:54
  • 11
    $\begingroup$ XKCD suggests it'd be a flash of light: xkcd.com/505 $\endgroup$ Commented Mar 27, 2015 at 4:08

29 Answers

125
$\begingroup$

A bug is just an undocumented feature.

Anything we see from within the simulation is just going to be a part of the simulation. The only way to tell that what you see is a bug is by knowing what the expected behaviour of the program is, and god alone knows what that is (literally, in this case). Even seeing the Eiffel Tower do a dance would more likely be caused by the interference of whoever is running a simulation, rather than a bug in the simulation itself.

Furthermore, assuming that the universe is simulated at the level of elementary particles, bugs would be most likely to show up there. It would be hard to trace how these bugs would affect the macroscopic world, and we'd probably just see them as particularly bizarre rules. Even if your neutrons occasionally violated the law of conservation of mass and disappeared, physicists wouldn't cry out "The world is wrong!" They'd figure out when and why this stuff happens.

That said, here are a few common bugs that simulation software written in a language like the ones we use today could have; a small code sketch of two of them follows the list. In all cases, I assume that the simulation manages not to crash. Also, various bugs assume different things as "fundamental" -- all this probably disagrees with real physics somewhere.

  • The machines running the software may run out of memory. If something were to split into multiple pieces, some pieces may mysteriously vanish.

  • Memory may be managed incorrectly, resulting in two objects appearing to exist in the same place in memory (not space!). Influencing one would influence the other as well. Wait, that sounds familiar...

  • The time counter may roll over. If the universal constants change over time, this could cause them to be reset to what they were at the Big Bang. I suspect humans wouldn't survive, though I'm not sure.

  • The world may have a maximal precision. In that case we can observe a particle at point a, or at point b, but not anywhere between the two. Or maybe a particle can have energy level 1, or 2, but not 1.5...

  • If the system is distributed, connection problems may lead to synchronisation issues. That is, those things simulated on server A see one sequence of events unfold, while those on server B see a different sequence, and then these are somehow merged into a single timeline.

  • Memory corruption can make things suddenly change value. That's not very specific, because memory corruption isn't very specific; pretty much anything could happen, though it would probably be a lot of chaotic changes.
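
As a loose illustration of the counter-rollover and shared-memory bullets above, here is a minimal C sketch (names such as `universe_tick` and `particle_a` are invented for this example; it is not meant to represent how any real simulation would be written):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* A 32-bit tick counter about to roll over: unsigned overflow wraps to 0,
     * which would look like the clock resetting to the Big Bang. */
    uint32_t universe_tick = UINT32_MAX;
    universe_tick += 1;
    printf("tick after increment: %u\n", (unsigned)universe_tick);  /* prints 0 */

    /* Two "particles" that are secretly the same memory: changing one
     * instantly changes the other. */
    double particle_a = 1.0;
    double *particle_b = &particle_a;     /* a second handle to the same object */
    *particle_b = -1.0;                   /* "measuring" b...                   */
    printf("particle_a is now %.1f\n", particle_a);  /* ...changes a as well    */
    return 0;
}
```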

It is unlikely that any of the above would be able to explain magic in the usual sense of the word. Most magic is highly structured, allowing you to create and direct complex systems. A bug that lets you shoot fireballs is very strange indeed: it basically means the universe "knows" what a fireball is and can keep one together for you. In a universe built up from particles, this is not going to happen reliably by mere chance.


In response to the suggestion that a different level of simulation would be more interesting: that could very well be the case. I just can't imagine how it would fit.

It is not that hard to suppose that quantum phenomena are fundamental and that they somehow add up to normality. I'm not a physicist and don't know how this happens, but I believe that's how it happens in reality, and so I'm willing to believe that simulating quantum phenomena will also simulate normality.

Going in the opposite direction is much harder. Suppose the main objects in a simulation are living beings. For some reason, lower-level phenomena are still observed. I see two ways this can play out:

The low-level phenomena may just be there for décor. They can be observed but they don't have any further effects on reality. This can be seen (at a somewhat higher level) in strategy games, when a unit constructs a building. The animation gives the impression of work being done, but it's just for the sake of the viewer. The building will go up even if the animation is changed to show something else.

In such a case, learning about how low-level things behave would give you only very tentative predictions about how the world behaves. Things like chemistry would be approximate at best.

Alternatively, the universe may be able to add arbitrarily precise details to any place which is observed, and these details have to have an actual effect on reality. The problem is that any inconsistency in these effects with the macroscopic approximation leads to observations influencing results.

Effectively, you end up splitting everything into three "sizes":

  • The décor: you can see, but what you see doesn't mean anything.
  • The inconsistent: you can see, but your results change if you do.
  • The normal: you can see, and can explain everything in terms of the smallest "normal" level.

If you put molecules at the normal level, the behaviour of humans is going to follow from the behaviour of molecules. If you put them at the inconsistent level, chemistry isn't going to work quite as well as it does. You can't have your cake and eat it too.

$\endgroup$
16
  • 72
    $\begingroup$ Also, not really bugs, but if our universe was a simulation, it would have distance culling optimizations (for example, finite speed of propagation of light) and lazy evaluation at the microscopic level (for example, a particle would be in an indecisive state until observed). Thank goodness we don't have that kind of nonsense. $\endgroup$
    – Sigma Ori
    Commented Oct 26, 2014 at 21:37
  • 16
    $\begingroup$ A bug is undocumented behaviour. A feature should be documented by definition. Do our lives come with documentation about how the world works? Nope - it is all a bug. $\endgroup$
    – Gusdor
    Commented Oct 27, 2014 at 8:07
  • 4
    $\begingroup$ In reality, quantum entanglement actually works differently from multiple references to a single object - just to be clear. $\endgroup$
    – David Z
    Commented Oct 27, 2014 at 8:10
  • 5
    $\begingroup$ My thoughts exactly @TheodorosChatzigiannakis. If I were creating a complex Universe simulation, I'd place most of the Universe at long distances so I didn't need to provide high levels of detail to the observers. Then I'd use "simple" probability formulas to manage the behavior of bits not normally observed and only switch over to actual particle motion when necessary due to direct observation. To observers inside the simulation, it'd look like there were two completely incompatible formulas managing things in the Universe. I'm glad we don't have that! $\endgroup$
    – Jim2B
    Commented Mar 31, 2015 at 15:04
  • 5
    $\begingroup$ @Gusdor As a software quality assurance tester, I document bugs all the time. They're still bugs after I document them, and remain so until they are fixed. (Sad truth of software quality assurance: The most consistently accurate definition of 'bug' we've been able to come up with is "something that your boss agrees is a bug.") $\endgroup$
    – user867
    Commented Feb 19, 2016 at 6:48
30
$\begingroup$

If a universe is a simulation, then, logically, it must have all the natural laws built into it. Agreed? Now, if it is a deterministic universe - that is a universe where, theoretically, you could predict its entire future if you knew everything about it at a certain point in time - these laws would be all that is needed to run the universe. It's sort of like The Game of Life - you input some data and let the thing go.

Now, we live in a universe where quantum mechanics exists, and thus probability exists. This has given a lot of people a lot of headaches, because there are loads of events we can't predict. In other words, you would have a harder time programming in natural laws than you would in a deterministic universe, because you would have to determine some random variables. If a universe is a simulation, then there would have to be an algorithm running in the computer(s) controlling it that determines these random variables - which would not make them random at all.

In a deterministic universe, it would be easy to see a glitch. In a certain spot at a certain time, some phenomenon would occur that violates at least one law of science. For example, perhaps a falling ball moves a few nanometers to one side when it shouldn't have. Given the complexity of a large enough simulation, this could happen quite a bit at small scales. Maybe a photon travels in a vacuum at a slightly slower or faster speed than it should have. Perhaps a new particle appears (or disappears) into (or out of) thin air. Any of these things could be a bug, and they would probably happen a lot. But they would be very minor. It would be very rare for a large-scale bug (e.g. the Earth suddenly moving 10 million miles in one direction) to happen.

But we live in a universe where quantum mechanics rules on some scales, which gives us a very nice little loophole. If there was a bug, it could actually follow the rules of quantum mechanics. How? Well, the Heisenberg Uncertainty Principle says in part that conservation of energy can be violated on tiny scales for tiny amounts of time. So a particle suddenly appearing and disappearing could actually fit right in. There is a tiny probability in the universe that a lot of odd things could happen - quantum tunneling, for instance - that shouldn't. A bug could masquerade as any of these.

So it's fair to say that small bugs could happen that would merely appear to be quantum phenomena. We would write them off as products of uncertainty and chance, and they would go by without anybody thinking that they were bugs. And in a simulation, small bugs would probably be very likely.


I'm a bit bored, so I thought I might come up with a list of some of the bugs that might show up in the simulation. Taking some inspiration from the Wikipedia article on software bugs:

  • Infinite loop - I guess the equivalent here would include time travel and all the assorted issues that come with it. This could include time paradoxes, which give everyone headaches, or closed timelike curves, which also give people headaches. Both would involve odd problems with causality - that is, either one thing causes another thing which causes the first thing or one thing causes another thing that makes the first thing impossible. Savvy?
  • Division by zero - This would be guaranteed to annoy the runners of the simulation. It annoys the heck out of me when I accidentally do it with a pocket calculator; on a scale like this, it would be catastrophic. But what would a manifestation of division by zero look like? Well, a singularity, probably. If they try to simulate what happens at the exact center of a black hole... Ouch. The computer wouldn't be able to handle it - just like if you asked a computer to figure out $f(0)$, where $f(x)=\frac{1}{x}$, if the computer wasn't pre-programmed to know that such a calculation will always lead to an undefined quantity. (A small code sketch of this follows the list.)
  • Incorrect code transfer - This isn't really a bug so much as an error on the part of one of the programmers, and it might not even turn out to cause a problem. Say I (one of the people working on the simulation) was assigned to transcribe the equations of what we, the simulated people, know as general relativity, to the final program. I would have to transfer the main equation,

    $$R_{ab}-\frac{1}{2}Rg_{ab}+\Lambda g_{ab}=\frac{8 \pi G}{c^4}T_{ab}$$

to the program. Now, I would also have to transfer some of the intermediate steps, too, such as calculating the Christoffel symbols. Let's say, though, that I didn't use Einstein summation notation but did everything out by hand. Let's also say that while I translated (in spherical coordinates) $$\Gamma_{abc}=\frac{1}{2}\left(\frac{\partial}{\partial x^c}g_{ab}+\frac{\partial}{\partial x^b}g_{ac}-\frac{\partial}{\partial x^a}g_{bc}\right)$$ correctly for $\Gamma_{ttt}$, $\Gamma_{ttr}$, $\Gamma_{tt\theta}$, and so forth, I made a mistake for the case of, say, $\Gamma_{rt\phi}$. This would mean that the computer would make weird calculations that it shouldn't have, which could throw everything off. Now, the reason I said that this might not count as a bug is that this program is like the Game of Life: you write up the laws and click 'start'. So that error would simply become part of the physical laws in the simulation. It wouldn't make sense, but it would be a law nonetheless.
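
A hedged C sketch of the division-by-zero point (purely illustrative; how real simulation code would be written is of course unknown): under IEEE 754 floating point the "singularity" quietly becomes infinity, while an integer division by zero is undefined behaviour and typically kills the program outright.

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double r = 0.0;               /* distance to the exact centre of the black hole */
    double density = 1.0 / r;     /* floating-point division by zero: +inf under IEEE 754 */
    printf("density = %f (isinf = %d)\n", density, isinf(density));

    /* int core = 1 / 0; */       /* integer division by zero: undefined behaviour,
                                     the truly catastrophic case, so it stays commented out */
    return 0;
}
```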

$\endgroup$
17
  • 1
    $\begingroup$ Why would an infinite loop cause any kind of time travel effects? $\endgroup$
    – Komi Golov
    Commented Oct 26, 2014 at 19:13
  • 1
    $\begingroup$ An infinite loop is when a computer keeps repeating the same sequence of instructions. There doesn't have to be anything wrong with this (a simulation probably consists of one big loop) but it can be a bug when the code in the loop doesn't do anything useful (and the program was supposed to progress). I'm not sure I see how this would have any kind of effects on time, though, except perhaps stopping it. $\endgroup$
    – Komi Golov
    Commented Oct 26, 2014 at 19:30
  • 1
    $\begingroup$ @AntonGolov Okay, I clearly did not explain my logic well. My hypothesis was that if there was a loop where A caused B and B caused A, the computer would never be able to run it because it would always have to simulate the other event. In other words, it would keep trying to simulate each event, realize it had to simulate the event before, and just keep looping. But perhaps I use incorrect terminology; I think you know computers better than I. $\endgroup$
    – HDE 226868
    Commented Oct 26, 2014 at 19:33
  • 2
    $\begingroup$ @HDE226868: Ah, I see. This kind of thing can happen with lazy evaluation. A world based on things happening because they are required by other things could be quite interesting. $\endgroup$
    – Komi Golov
    Commented Oct 26, 2014 at 19:40
  • 5
    $\begingroup$ When I was running a simulation that had singularities (infinities, negative pressures, or divide by zeroes), I wrapped an artificial damping routine around the singularity to make it behave. You could go with the theory that black hole event horizons are this damping routine, lol. $\endgroup$
    – Jim2B
    Commented Mar 30, 2015 at 23:28
24
$\begingroup$

There's a niche branch of research within theoretical physics that deals with exactly this sort of thing: if the universe were a simulation, what would be the physical effects of limitations in the underlying system? As an example, a few years ago this paper by Beane, Davoudi, and Savage got a lot of coverage in science media and blogs (much of it rather questionable, by the way). The paper assumes that the universe is simulated on a Cartesian grid in a particular manner, and identifies three consequences we might observe due to the nonzero grid size:

  • a modification to the magnetic moment of the muon
  • an inconsistency between different methods of measuring the electromagnetic coupling
  • anisotropy (i.e. a dependence on direction) in the maximum energy of cosmic rays

If I may give a brief self-plug, at the time I wrote a blog post that explains this in some more detail.

This all assumes that the simulation works kind of like lattice QCD, namely that it simulates the fundamental quantum fields rather than individual physical objects. There's no reason one has to make that assumption, of course. But real-life experience suggests that creating a simulation which is accurate across the full range of length scales, from the structure of protons to the entire universe, is very, very hard on the programmers if you use any method other than just simulating the basic ingredients. If you want to simulate a universe, it's a good bet that, rather than coming up with tricky algorithms to represent objects, it's easier to just build a bigger computer. This means the "obvious" bugs you might think of, like disappearing objects or different parts of the universe behaving identically, just won't happen.

$\endgroup$
9
  • $\begingroup$ Is there, in this case, anything macro that would happen? $\endgroup$
    – Sheraff
    Commented Oct 27, 2014 at 8:39
  • 1
    $\begingroup$ Not as far as I know, unless a small-scale error somehow cascaded into a much larger effect. But physical systems don't really work that way. (Of course that only applies to numerical issues like what I've mentioned here. An actual bug, like just a flat-out error in the code, could cause all sorts of things to happen.) $\endgroup$
    – David Z
    Commented Oct 27, 2014 at 8:54
    $\begingroup$ We've already developed adaptive gridding for simulations. Imagine an algorithm that increases grid resolution when people look closely at a particular problem. Then you might get a situation like we see in our universe: for macroscopic object behavior, general relativity holds. When you use experiments to examine fine-grained structure, the simulation would switch to QCD. Also, isolating large parts of the Universe (by, say, being really remote) means we never get to look at the fine-scale structure of matter and never need to use QCD-type fine detail for it. $\endgroup$
    – Jim2B
    Commented Mar 30, 2015 at 23:34
  • $\begingroup$ @Jim2B I suppose that's possible, but it still leaves open the question of finding some general theory that reduces to QCD/QFT when evaluated on a fine grid and GR when evaluated on a coarse grid. People are already working on that, but without the grid, because honestly it's easier if you have the freedom to assume continuous spacetime. Plus, if the grid adapts well enough to never be detectable, the simplest explanation won't include a grid at all. $\endgroup$
    – David Z
    Commented Mar 31, 2015 at 5:49
  • 1
    $\begingroup$ @Jim2B that's simply not true, although this is not the place to discuss it. $\endgroup$
    – David Z
    Commented Mar 31, 2015 at 15:40
16
$\begingroup$

I'll attempt to answer this as a programmer who deals with bugs daily.

How might we simulate a universe?

The universe is big. If I were trying to simulate it, I would make some optimisations to my code.

I would be tempted to only simulate in detail the portions of the universe that anyone is actually looking at, to the level of detail with which they are able to perceive that portion. I would make statistical generalisations to determine how things change when they are not being looked at. Objects that are not looked at would not be rendered so to speak.

Interestingly this actually ties up pretty nicely with the result of the double slit experiment.

This is rather like the way we encode a JPEG. Only the interesting regions are stored in detail; the lower-detail sections are "derezzed", so to speak, and we get the blocky JPEG corruption we are all familiar with. Imagine a dynamic resolution resolver that modifies the detail of any particular region of space depending on whether it is being observed.
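
A minimal C sketch of that "resolve on observation" idea, with invented names like `Region` and `observe` (this is only an illustration of lazy evaluation, not a claim about how such a simulation would really be built):

```c
#include <stdio.h>
#include <stdbool.h>

typedef struct {
    bool   resolved;      /* has anyone looked at this region yet? */
    double fine_detail;   /* expensive value, filled in lazily */
} Region;

static double observe(Region *r) {
    if (!r->resolved) {               /* first observation...                  */
        r->fine_detail = 42.0;        /* ...triggers the expensive computation */
        r->resolved = true;
    }
    return r->fine_detail;            /* later observations are cheap */
}

int main(void) {
    Region dark_side_of_the_moon = { false, 0.0 };
    printf("first look:  %.1f\n", observe(&dark_side_of_the_moon));
    printf("second look: %.1f\n", observe(&dark_side_of_the_moon));
    return 0;
}
```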

Preprocessing

I might also be tempted to engage in some preprocessing. I would prerender certain portions of the universe and mark them as such. I would make distant stars essentially static objects, since we can't perceive them in detail. I wouldn't bother rendering the dark side of the moon for example, or the core of the planet.

So what sort of bugs might we see?

Well we might expect to see different types of bugs depending on which portion of the code we are looking at. The detail rendered environment would likely be sound. When an object is not perceived we would experience the consequences of whatever simplifying assumptions the coder made about the universe and how it might change.

  • We might perceive a disjunction in the universe, a crack if you will, where time and space are not correctly joined up.
  • We might start to see unrendered portions of the universe, perhaps regions of the universe marked as rendered are actually not rendered at all. Perhaps an astronaut in orbit round the moon finds the dark side is just a blank void, impossible to look at or perceive.
  • We might see errors in simple laws of the universe. Perhaps we put the car keys down, turn away, and when we turn back they are missing (again a common experience).
  • We might see errors in arithmetic in unrendered portions of the universe. We might find 2 + 2 = 5, literally take two objects then another two, and we have five in our hand.
  • Complicated regions of space might crash; for example, your iPad might derez, and then reappear blank and clean.
  • Perhaps gravity or fire might occasionally not act correctly on an object if it is not observed. An object might be left suspended if the program fails to recognise that its support has been removed, then crash to the floor when a person enters the room.
  • You might see shadows of objects or people which are no longer there. Perhaps objects leave a hole, or a lightwell.
  • Damaged objects might be lost and replaced with a clean version from a buffer. A damaged car might become like new again. A scratch in some paint might be erased.
  • At a more extreme level, a human might completely disappear, all memories of that person erased from the program, except perhaps a ghost, a shadow.

Pausing

It's also interesting to consider that if the universe were a simulation, and our minds constructs within it, our perception of time would be tied to the simulation. It would be possible to perhaps pause the simulation for a thousand years and none of us would even notice.

It might take a billion years of real time to render a single frame, and none of us would be any the wiser.

This assumes of course that time and space exist outside of the simulator. Perhaps the real world is something altogether more exotic.

Reference

http://en.wikipedia.org/wiki/Double-slit_experiment

$\endgroup$
9
  • 1
    $\begingroup$ Yep, the double-slit experiment is what I thought of first... $\endgroup$
    – MGOwen
    Commented Oct 27, 2014 at 7:59
  • 2
    $\begingroup$ Best answer here! (Granted, I'm a programmer myself...) $\endgroup$ Commented Oct 27, 2014 at 14:09
  • 2
    $\begingroup$ From certain perspectives, our absolute measure of elapsed time might well not be absolute at all - it might 'simply' be a measure of entropy, and the reason we have 'before' or 'after' at all, is merely because entropy has increased. If you were to reverse entropy, you might also reverse time. $\endgroup$
    – Sobrique
    Commented Jan 5, 2015 at 13:51
    $\begingroup$ Your analogy with quantum mechanics is impressive, but in reality simulating quantum mechanics requires a lot more computational power than tracking every particle classically. $\endgroup$
    – Anixx
    Commented Apr 8, 2015 at 22:33
  • $\begingroup$ @anixx - interesting. Could you say why? $\endgroup$ Commented Apr 9, 2015 at 6:29
13
$\begingroup$

It would look like this:

[image: Blue Screen of Death]

Upon looking into the sky you would see this message. Soon, after a sensation of deja vu, everything would rewind 90 seconds (the last backup is copied in) and everything would go on as normal. Actually, this already happened before, when the dinosaurs died. Unfortunately the bug hit right in the middle of a backup, so the operators lost a lot of data. They decided on a workaround: simulating a meteor strike on Earth.

References:

Adams, D. (1985) The Hitchhiker's Guide to the Galaxy

$\endgroup$
2
  • $\begingroup$ this image hurts eyes… what if someone edit it to grayscale, for example. $\endgroup$ Commented Apr 8, 2015 at 18:12
  • $\begingroup$ This is definitely not the first time I have seen this Stop error screen. But it's been a while.... $\endgroup$ Commented Jan 12, 2022 at 21:45
6
$\begingroup$

*ahem...


Theoretical rendering of a black hole, which can supposedly divide by zero.

$\endgroup$
4
  • 4
    $\begingroup$ Ah, we never see the singularity (the divide by zero), it's always clothed to prevent us from seeing its private parts :D $\endgroup$
    – Jim2B
    Commented Mar 31, 2015 at 14:56
  • $\begingroup$ wait... there is a spot where division by 0 occurs irl science?!? $\endgroup$
    – user64742
    Commented May 31, 2017 at 4:33
  • 1
    $\begingroup$ @TheGreatDuck Yes and no. It's a limitation of relativity theory, a place where our current mathematics can't predict what will happen. (Remember, physics is a simplification of the real world. It doesn't define what actually happens.) $\endgroup$
    – Kent
    Commented Jun 2, 2017 at 14:23
  • $\begingroup$ @Kent of course. I thought maybe that physics had defined division by 0 to measure something reasonable and possible to exist. $\endgroup$
    – user64742
    Commented Jun 2, 2017 at 14:45
6
$\begingroup$

In general, calculations that simulate all particles at once are quite expensive to run. Most simulations therefore prefer to use shortcuts for calculations.

Instead of simulating every single atom, a bunch of atoms are simulated together. That produces errors relative to calculating every atom individually.

Another way to save computational resources is precaching. Instead of running a calculation every time, you run it once and give back the same result every time it gets run in the future.
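
A tiny C sketch of what a precache bug might look like, with invented names (`region_physics` and friends are purely illustrative): the expensive calculation is done once, and later calls get the cached answer back even when the inputs have changed.

```c
#include <stdio.h>
#include <math.h>

static double cached_result;
static int    cache_valid = 0;

static double region_physics(double input) {
    if (!cache_valid) {
        cached_result = sin(input) * 1e6;   /* pretend this is very expensive */
        cache_valid = 1;
    }
    return cached_result;   /* later calls ignore `input` entirely -- the glitch */
}

int main(void) {
    printf("%f\n", region_physics(1.0));
    printf("%f\n", region_physics(2.0));   /* same answer, different input */
    return 0;
}
```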

Allowing magic would mean that the simulation takes into account the mental states of people to make decisions. If nobody is looking at a particular place, then the simulation doesn't invest much resources into getting every little detail right.

Whenever Randi runs a high-stakes experiment, the simulation engine pours a lot of computational power into it to make the results come out right. If, however, nobody does real scientific investigation, paranormal glitches can happen.

Magic is nothing more than the simulation taking into account of the mental state of the people in it. Randi strongly believes that his experiments will turn out a certain way, so they turn out that way.

On the other hand, there could be other people who also get results by focusing their attention on getting a certain result, with the simulation then calculating the world to produce that result.

If you start with that frame you can adopt a lot of ideas out of the "Law of Attraction" community. That community mistakenly interprets the observer effect in quantum mechanics to mean that we live in such a world.

$\endgroup$
6
$\begingroup$

I didn't see these ideas posted so I thought I'd put them here.

Spotting a Simulated Universe
If the Universe is indeed a digital simulation, then the Universe's computations will be done with only a certain level of precision. Since the Universe must do these computations everywhere, it should be possible for someone doing computations within that Universe to do them, for very specific cases, at a higher level of precision than that used by the general Universe simulation.

To the researcher, this would look like small but unexpected & unexplained deviations between our calculations of a behavior and the observed behavior.
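
As a loose illustration (not a physics claim), here is a C sketch of the same sum carried out at two precisions; the difference between the two printed totals is the kind of small, unexplained deviation described above, assuming the "universe" computed in single precision while a researcher could compute in double:

```c
#include <stdio.h>

int main(void) {
    float  coarse = 0.0f;   /* the "universe's" working precision */
    double fine   = 0.0;    /* the researcher's higher precision  */
    for (int i = 0; i < 10000000; i++) {
        coarse += 0.1f;
        fine   += 0.1;
    }
    printf("coarse sum: %f\n", coarse);  /* drifts visibly away from 1,000,000 */
    printf("fine sum:   %f\n", fine);
    return 0;
}
```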

A wonderful example of this would be the Pioneer Anomaly. Currently we think this anomaly is explained by the radiation pressure exerted by the RTG. The observed effect is within the error bounds expected of this radiation pressure.

But for someone looking for story ideas, imagine that later refinements indicated this either does not explain the effect or only explains part of it.

We might be seeing a rounding error.

For story purposes, this could lead to a general search for other such phenomena in cases in which we are capable of extremely precise measurements.

Hacking the Universe
As for hacking the Universe...
The Universe would be the most complicated program that we could imagine (or perhaps more complicated than we can imagine). Such an unbelievably large body of code would be certain to contain errors. As someone above mentioned, use the "Stack overflow" approach or other methods that make use of flaws in the code. It might take a while to find one...

Greg Bear's novels The Forge of God and Anvil of Stars posit the ability to "write" into the registers of matter to change it. If a method of doing so was ever discovered, we could easily write to the registers giving location, etc. We could instantly teleport anywhere, or use it to change the mass/momentum of objects, enabling us to achieve any desired acceleration or velocity.

Jack Chalker's Well World series showed it took a super computer the size of a small moon to hack the simulator code to get the desired effects.

If we were successful in hacking the Universe simulation and gained access to the "OS" level, we could conceivably chat with other simulations running on the same "system". Alternatively, we could change our simulation or run others.

Philosophical Questions
Philosophical twists I haven't seen here:
Simulations are run for a reason. In my case, I ran a simulation to solve problems. What problem is our simulation solving?

Maybe other simulations are solving other problems. If we had access to their simulations, what might that do to solve our problems?

What happens when the creators discover that we've hacked their simulation and are no longer solving their problem?

Or that we're also hacking their other simulations and polluting their system?

What happens if we develop Taylor Algorithms, pass them up to our creators, and they discover that they too are simulations?

$\endgroup$
3
  • $\begingroup$ Every game is a simulation. They aren't purposefully solving anything high-level. $\endgroup$
    – SOFe
    Commented Aug 22, 2016 at 16:39
  • 1
    $\begingroup$ «twists I haven't seen here… What problem is our simulation solving?» see David Brin’s “Stones of Significance”. You can find the full text on line. $\endgroup$
    – JDługosz
    Commented Apr 21, 2017 at 5:18
    $\begingroup$ I'd like to think that if our universe is a simulation it's only accidental. We're just a buffer overrun from the computation of a much larger problem. We've gone unnoticed so far but eventually garbage collection will scrub us from existence. The progenitors of the original program will never know that we existed. $\endgroup$
    – Skek Tek
    Commented Oct 16, 2018 at 15:45
5
$\begingroup$

This is almost too short to be an answer, but I cannot help but channel the mind of a great Science Fiction author. Remember, we're looking at what the bug would look like from a perspective inside the universe.

In the words of the great Isaac Asimov:

The most exciting phrase to hear in science, the one that heralds new discoveries, is not 'Eureka!' but 'That's funny...'

$\endgroup$
4
$\begingroup$

Digital Physics posits that our universe is a computational device. More precisely it says that our universe is mathematically isomorphic to a universal Turing machine.

These theories state that our universe evolves from one state to the next in a way which is isomorphic to applying a finite number of simple rules for manipulating 1's and 0's. (A Turing machine actually uses seven rules, but there are equivalent formulations using fewer.) Physical phenomena are described by the informational content of bit strings. For example, flipping a few bits from a 1 to a 0 may describe the ionization of an atom.

The occurrence of a bug would mean that our universe would find itself in a state that is not computationally consistent with its previous state. In other words, its state does not follow from the correct application of the rules. Had the rules been applied correctly, then the universe would be different.
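
As a toy illustration of that idea (a hedged sketch only, using an elementary cellular automaton rather than a full Turing machine), the C program below evolves a bit string by a fixed rule and then flips one bit mid-run; the resulting state is one that no correct application of the rule could have produced:

```c
#include <stdio.h>
#include <stdint.h>

#define RULE 110u   /* elementary cellular automaton rule 110 */
#define N    16

/* Next value of a cell given its left neighbour, itself, and its right neighbour. */
static uint8_t step(uint8_t left, uint8_t self, uint8_t right) {
    unsigned idx = (left << 2) | (self << 1) | right;
    return (RULE >> idx) & 1u;
}

int main(void) {
    uint8_t state[N] = {0}, next[N];
    state[N / 2] = 1;                                  /* initial condition */
    for (int t = 0; t < 8; t++) {
        for (int i = 0; i < N; i++)
            next[i] = step(state[(i + N - 1) % N], state[i], state[(i + 1) % N]);
        for (int i = 0; i < N; i++) state[i] = next[i];
        if (t == 4) state[3] ^= 1;                     /* the "bug": a spontaneous bit flip */
        for (int i = 0; i < N; i++) putchar(state[i] ? '#' : '.');
        putchar('\n');
    }
    return 0;
}
```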

Assuming such a computational error is possible, the possible outcomes would range from the trivial, transient, self-correcting type of error, right up to the fatal, catastrophic, world ending type of event.

For example, if a few bits flipped causing an electron's charge to change from negative to positive just before it fell into a black hole, then it probably wouldn't matter much.

On the other hand, if the value of one of nature's fundamental constants was overwritten, then the effects would probably be catastrophic. For example, flip a few bits in the value of the strong nuclear force and we might see all atoms lose their coherence as their nuclei fell apart.

Somewhere in between, a bug would most likely manifest as a paradoxical state of affairs. Perhaps something like two different objects appearing to occupy the same volume of space, or some sort of localized infinite loop. If the program included error-correcting code, then any paradoxical behaviour would be localized and "removed from view" (computationally excluded), so that we could see whole galaxies vanish should a serious error arise. Indeed, localizing and removing from view is precisely what a black hole does.

$\endgroup$
2
  • $\begingroup$ How can a universal turing machine generate random numbers? $\endgroup$
    – JDługosz
    Commented Jan 3, 2015 at 6:07
  • 1
    $\begingroup$ @jdlugosz I don't believe that it could, though events may appear random from within. $\endgroup$
    – abcdefg
    Commented Jan 4, 2015 at 3:06
4
$\begingroup$

Surprised that no-one has mentioned quantum observable differences.

In a computer game, when part of the scene is not being rendered (because the player is not looking in that direction), the graphics 'dumb down' to produce the results of what's going on without needing to render each pixel correctly, because it's cheaper computationally. In older games you can see this in distant objects being rendered poorly.

In quantum physics, when particles are fired at a slit and watched by an observer, they behave as individual particles and obey particle physics laws; yet when no observer is present (including cameras) they obey wave physics, which would be a far cheaper computational result.

So it could be considered a bug that the particle gun's 'results' can be viewed with two different outcomes, based on whether an observer is present or not. Until a civilisation invents technology to observe particles, no-one would know.

$\endgroup$
1
  • $\begingroup$ I also thought of that. Btw, do you think gravitational lensing is a bug? $\endgroup$
    – Nulik
    Commented Jan 6, 2022 at 14:32
3
$\begingroup$

One quite nasty type of bug (which can especially be hard to find, and can give quite inconsistent results) is out-of-range indices, in a language which doesn't range-check indices (likely to be used in simulations because range-checks cost valuable computing time, and do nothing useful if your code is correct).

An out-of-range index ultimately means that values are read or written in a place where they should not have been read or written; this place may be completely unrelated to the place where the data is meant to go. Indeed, the infamous buffer overrun is a special case of out-of-range indices.

Inside the simulation, such out-of-range indices could for example manifest as strange influences between completely unrelated events (because the out-of-range read reads data belonging to the other event, or the out-of-range write alters data belonging to the other event). Such influences could violate otherwise strict laws (for example, they could easily result in faster-than-light effects, if the erroneously accessed memory belongs to a far-away event — after all, far away in spacetime doesn't need to mean far away in computer memory).
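
A deliberately minimal C sketch of that point (the layout and names are invented for illustration): two unrelated "events" sit next to each other in the simulation's memory, and an unchecked out-of-range index on the first silently alters the second.

```c
#include <stdio.h>

int main(void) {
    double memory[8] = {0, 0, 0, 0, 1, 2, 3, 4};
    double *event_a = &memory[0];   /* event A owns slots 0..3 */
    double *event_b = &memory[4];   /* event B owns slots 4..7, "far away in spacetime" */

    event_a[5] = 99.0;              /* out of A's range: no check, no error...  */

    printf("event B, slot 1: %.1f\n", event_b[1]);   /* ...but B has changed to 99.0 */
    return 0;
}
```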

Similar effects could be caused by reads of uninitialized variables which happen to contain unrelated data belonging to a different point in space.

Finally, while not really a bug, bit flips in memory (caused e.g. by — real, not simulated — cosmic ray particles crossing the memory chip and altering the charge of a memory cell) might also cause quite interesting effects in the simulation. Such events would be rare (but if the simulation runs quite slowly and the computer uses non-ECC memory, they might not be that rare when measured in simulated time). Since bit flips can also cause rather large differences in values, this would give random events that may well be measurable in-simulation (but of course would not be predictable; after all, they are not even predictable in the "outer" world).

$\endgroup$
1
  • $\begingroup$ "out of range" indices are part of higher level abstraction, like a programming language. In Assembly language there is no "out of range" error. Also this property is of only Von Neuman machines, in FPGAs/ASICs there is no "out of range" errors and they are also computing devices $\endgroup$
    – Nulik
    Commented Jan 7, 2022 at 1:08
3
$\begingroup$

As I brought up in a previous hacking-the-universe topic, perhaps any time an error is detected the state is rolled back, like a cancelled transaction or a restore from backup.

It would make any bugs or hacking attempts unobservable. Perhaps it can be detected by what doesn't happen, as the rollback avoids the bugs.

$\endgroup$
4
  • 3
    $\begingroup$ We're also constrained by being simulated observers as well. If the program 'hangs' then so will the observation subroutine. $\endgroup$
    – Sobrique
    Commented Jan 5, 2015 at 13:53
  • $\begingroup$ exactly my point. $\endgroup$
    – Jorge Aldo
    Commented Apr 1, 2015 at 22:45
  • $\begingroup$ This would explain deja vu... $\endgroup$
    – dcow
    Commented Dec 30, 2015 at 10:25
  • 1
    $\begingroup$ No, it would not explain deja vu since your brain state will be included in the universe. $\endgroup$
    – JDługosz
    Commented Dec 31, 2015 at 1:34
3
$\begingroup$

There have been jokes recently that the way the RF resonant cavity thruster (known as the EMDrive) works is a recently discovered rounding error in our universe. We see rounding errors in computers often.

To quote the reddit user NoHahForACrudite, in explaining this universe bug:

Basically, when something is accelerating (speeding up in one direction or changing direction at the same or higher forward velocity), in it's [sic] own frame of reference, it will get warmer. This change in warmth is (in simplified terms) "blackbody radiation". The longer the wavelength of BBR, the "cooler" the radiation. What the article seems to be saying is that because the acceleration imparted by the microwave radiation is so immeasurably small, and because the wavelength of heat it would generate would be physically impossible in this universe, instead of that acceleration being expressed as an increase in warmth (BBR), it becomes "quantized" as a change in the object's inertia (the object gains "movement"/"push" in a certain direction). Ostensibly, doing this at a high frequency would manifest a measurable change in inertia/acceleration.

$\endgroup$
2
    $\begingroup$ But the math promoted by the inventor doesn’t rely on rounding errors, but on improper modeling of the physics. More of introducing a reservoir for hysteresis that doesn’t conform to Lorentz invariance. $\endgroup$
    – JDługosz
    Commented Sep 3, 2016 at 13:13
  • 1
    $\begingroup$ do you think Planck's constant could be the evidence for the size of floating point precision of our Universe? $\endgroup$
    – Nulik
    Commented Jan 6, 2022 at 14:41
3
$\begingroup$

You can clearly see geometric artifacts from the texture mapping on the top of Saturn (the designers never intended us to look there):

[image: Saturn artifact]

$\endgroup$
2
$\begingroup$

There could be something like an off-by-one error that accidentally leads to there being more matter than antimatter.

$\endgroup$
1
  • $\begingroup$ This clearly happened during the big bang; for every million particles of antimatter there was a million and 1 particles of matter, which accounts for all the matter in the universe. (Without it, none of us would be here!) $\endgroup$
    – Murphy L.
    Commented Jan 11, 2022 at 19:12
2
$\begingroup$

A crash bug would look like the universe ending abruptly for no reason, so I guess it wouldn't look like anything to us, since our ability to perceive the bug would disappear as soon as it happens.

Other kinds of bugs can cause almost anything: missing textures, objects being too large or too small or not there or in a wrong spot. Colours can get screwed up, sound disappearing. Repetitions are possible, though they wo

$\endgroup$
2
$\begingroup$

Mandela Effect

This is something some people actually believe in.

The Mandela Effect refers to a phenomenon in which a large number of people share false memories of past events. It was named after Nelson Mandela, whom some people erroneously believed to have died in prison in the 1980s.

The explanation could be the answer to the question: we live in a simulated world and the Mandela Effect is just a bug, like corrupted memory on a hard drive.

There are many examples on the internet; here are the most popular:

  • C-3PO from Star Wars is remembered as all gold; actually one of his legs is silver.
  • The Queen in Snow White is remembered saying, "Mirror, mirror on the wall". The correct phrase is "Magic mirror on the wall".
  • "Berenstein Bears" instead of "Berenstain Bears".
  • "Looney Toons" instead of "Looney Tunes".
$\endgroup$
1
$\begingroup$

It's possible the universe is bug free or our view of the universe is unable to detect bugs because it has a checksum in place to detect errors which may have been introduced during data transmission and storage.

Ya that sounds weird, but here is a complex high energy physics paper on the subject: http://arxiv.org/abs/0806.0051

$\endgroup$
1
$\begingroup$

An explosion, a black hole, or false vacuum collapse

(This answer assumes that physics works as we perceive it, and that the program does not "cheat" and store large objects independently when we are not looking at them. If macroscopic objects are stored directly in the program, then none of this applies.)

Since the "base code" of our universe appears to operate on a subatomic level, with all macroscopic objects being emergent properties of those subatomic events, it seems unlikely that macroscopic objects such as buildings or cats would experience noticeable changes that maintained their overall structure, since "building" and "cat" are not actually encoded objects, but structures formed out of encoded objects.

A bug that affected the entire simulation would simply end the universe and would therefore not "look" like anything. The devs would likely fix the bug and then, if they were able, reboot from the last clean save.

The only kind of bug that might go unnoticed or unpatched by the developers (and therefore, noticeable by us) is something that changes a fundamental value of a single particle or region of space. Most of these bugs would go unnoticed by us as well, unless that value was changed a lot from its expected amount. Maybe a single particle spontaneously gains the mass-energy equivalent of a planet, or accelerates to within a trillionth of the speed of light.

This could result in one of several possible outcomes. If the value was changed to a high but reasonable level, it would result in an explosion of energy. This explosion could range anywhere from a tiny pop of heat and light all the way up to supernova levels.

If the particle's mass-energy equivalent was high enough, it might also generate a black hole. A small enough black hole would quickly explode. A larger one, if it appeared on Earth, would fall into the planet and consume it from the inside out. Looking in space, we might be able to tell the difference between a black hole that appeared naturally and one that appeared spontaneously, but unless we were looking at it as it appeared this is unlikely. New black holes are generally surrounded by the exploded remains of the star that produced them, but old black holes would be virtually indistinguishable from spontaneously-generated ones.

If the energy of a region of space fell below the base energy level, or "zero point", the results would be even more catastrophic. This would create an unstoppable chain reaction that spread out through a spherical region at the speed of light, destroying everything in its path. This phenomenon is known as "false vacuum collapse" and if it were to happen, it might destroy the entire universe (eventually).

If the devs managed to hotpatch this without resetting the entire simulation by simply deleting everything within the spherical region, and they did not recreate the destroyed space accurately, we might notice the result. Perhaps a spherical chunk of space would simply be devoid of stars, or the stars in it might shift around, or we might be looking at the aftermath of such a bug millions of years after the patch and notice that the stars from this region are not moving at the speed they should be relative to the space around them.

$\endgroup$
1
$\begingroup$

Odd interactions with itself

My take on this is how different parts of the program relate or react to each other. Some things that work well separately don't work "right" together.

Sodium is a highly reactive substance that has a tendency to kill things in its free state. Chlorine is also a highly reactive substance that has a tendency to kill things in its free state. Put them together, and we have an extremely stable salt that is a requirement for many life forms.

Mercury is a highly stable element. So is gold, which happens to be solid at "room temperature". Add gold to a puddle of mercury, and the mercury dissolves the solid gold.

Radioactivity

Radioactivity is parts of atoms flying away from the atom at random times, for not necessarily reasonable reasons. This screams "bug" to me. It doesn't matter if we (think we) know why it does this; it just doesn't act like most other parts of the program.

This is more of a buffer overflow issue. Atoms become too large and there just isn't enough room to hold the data. The array just doesn't have enough memory allocated, we just don't know how to ReDim the situation yet. (Sorry, I couldn't resist. I should have, though.)

On the other hand, maybe there isn't a routine that can take that many parameters.

Take away

We use science to come up with reasons, rules, laws, and guesses as to why things work like they do, but that's the wrong aspect to take when writing a program. Science is documentation of the current features, not a plan for what the program "should" be doing.

Unfortunately, the program is "in the wild", and users are dealing with work-arounds. Some of them even like the work-arounds and make them work for their needs.

The programmers are afraid for their job, so they aren't going to admit that there are any bugs. They know how fragile the code is, and making a change in one area might have unintended consequences in another area.

No users are submitting bugs, so it's fine. Prayers go to an unmonitored email account, or the address was entered incorrectly, so the form isn't actually sent, even though it looks like it was.

There's no error logging, so no way for the programmers to know the users are having issues.

$\endgroup$
1
$\begingroup$

In one of the Matrix movies, they mention that all the UFO and ghost sightings are glitches in the Matrix. Deja Vus are also seen as glitches, but they mainly happen when someone changes something. A few other things that I could see as glitches in reality:

  • Missing Socks.
  • Hawking Radiation.
  • Quantum Tunneling.
  • Special Relativity's slowing of time near the speed of light.
  • Dark Matter.
$\endgroup$
1
$\begingroup$

It is impossible to answer this question by definition.

A bug is, by definition, a departure from the intended behaviour of the program. Because we have no way to determine the intent of the creators of the simulation, we have no way to know what kind of behaviour of the system they would intend or desire. The entirety of existence as we see it now could very well be a bug if they intended for the universe to be empty and dead for eternity.

$\endgroup$
1
$\begingroup$

Stuff on Rails

Preamble

When programming videogames or simulations that follow rules, there are mainly three ways to program the motion of objects, in terms of complexity:

  1. Very complex movement: things follow an algorithm to decide their next move, with the routine being run on every frame. The algorithm takes into account the state of the environment around the object, with no default. Since an object moving this way may have to track the state of multiple other objects, which have their own routines and which may all affect each other's motions, this is the most expensive mode in terms of computation. Example: "followers" in current-era open world games will usually move in a way to stay within a range of distances from the player, generally taking terrain into account.

  2. Semi-complex movement: the object follows a pre-calculated path, and only changes its motion in reaction to some very specific external stimuli - usually a collision with another object, or reaching the end of its current path. This is less expensive than the complex mode. Example: most "enemies" in platformer videogames, such as the koopas in most Super Mario games - they will move forward until they bump into a wall or another non-player character, or until they reach an edge, and then they will reverse their course.

  3. Railed objects: the object simply moves according to one or more equations, never being affected by other objects' motions, positions or state. Since this mode does not need to consider the state of any other object, and it all boils down to mathing out a fixed path (which computers excel at), this mode is the cheapest in processing. We call the paths these objects follow "rails"; a tiny sketch of a railed object follows this list. Example: the Sun and Moon(s) in games such as The Elder Scrolls and Fallout series.
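
Here is a minimal C sketch of a railed object (invented names and an arbitrary orbit; purely illustrative): its position is a pure function of time, so it never has to look at anything else in the scene.

```c
#include <stdio.h>
#include <math.h>

/* A "railed" moon: position depends only on the time t, nothing else. */
static double railed_moon_x(double t) { return cos(t / 27.3); }
static double railed_moon_y(double t) { return sin(t / 27.3); }

int main(void) {
    for (double t = 0.0; t < 3.0; t += 1.0)   /* three frames of pure maths */
        printf("t=%.0f  moon at (%.3f, %.3f)\n", t, railed_moon_x(t), railed_moon_y(t));
    return 0;
}
```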

In real life

The universe is very big, and contains a large number of objects moving around. We are not sure how many, but it's over nine thousand. So in order to keep things very realistic while still saving money, the devs set some faraway objects to move in semi-complex and railed modes.

Some examples:

Dark Matter

When you count the mass of all visible stars in the Milky Way and plug it into Keplerian equations, adjusting for Einstein's theories of relativity, you figure that the stars should move in a certain way. But many stars are orbiting the galactic center at speeds that are much greater than the maths predict.

Boffins have been debating for decades whether there is a great number of invisible garage dragons and/or too-small-to-detect china teapots exerting an extra gravitational tug upon those stars, thus explaining the extra speed they have. In order to obtain research grants more easily, they have named the dragons Dark Matter and the teapots String Theory, respectively, because those names sound scientific in the same way that Pym particles from Marvel comics sound scientific. But the truth is that most stars far from the Sun are moving on invisible rails, in order to use less CPU power. We were never intended to figure it out, but we did, so the Universe is handling this exception by castigating us with overthinking.

Oumuamua

Oumuamua is an asteroid that came from outside the solar system. It flew by us and is now making its way out of the solar system. Its path is not consistent with Kepler and relativity: it somehow accelerated on its own at some point. That's because it's using the semi-complex mode of motion. It is programmed to visit another star system in a few millennia, but the path it was taking would have led to it missing the next star, so its algorithm changed its velocity to correct its path.

Electrons (and most other particles)

Electrons seem to behave as either a wave or a particle - until you start to observe them. That's because when calculating their movement, both wave and particle modes are really cheap to process. But there isn't enough GPU power in the universe to render every observed electron as a volumetric wave in space. So when loaded graphically they fall back to particle mode only, which requires just one pixel per particle. But in this mode they go from complex to semi-complex motion too, changing all measurements we make of their paths.


And then there is a special case, which does use the very complex movement mode:

The common swift

The common swift (Apus apus) is a lesser-known relative of the Taylor swift (Apus cantrix) that is said to spend its whole life in flight[citation needed]. They are even able to mate while flying. Have you ever seen one land? Me neither (the fact that I don't live where these birds do doesn't matter). That's because the designers forgot to make the animations for these mobs when they are grounded, and because of that the devs never bothered coding it either.

$\endgroup$
0
$\begingroup$

From the perspective of my own comprehension I find time, particularly in the relativistic sense, very hard to grasp. Conceptually difficult areas are a ripe breeding ground for bugs and undefined behaviours in code because you can only model based on your understanding.

Consequently if I were worldbuilding for this type of setting I might look at some time-related problems. Let us imagine for a moment that something tangible could - thanks to a bug in the temporal system - pass between two disconnected points in time. So you might get a little light from a past - or future - event passing into any given present. A witness to this event might see a vague image of a person in old fashioned clothes going about their daily activity or perhaps they see the form of some distant future flying machine pass over in the sky. The light may well only escape in a specific direction, so something may be visible from one place but not others. Other tangible effects could also arise - unexplained scents, sounds, cold spots in the air and so on.

So things like ghost and UFO sightings could be explained as errors in the spacetime implementation of a universal simulation, allowing you to take that ghost story into some less predictable directions.

$\endgroup$
1
  • $\begingroup$ The Matrix describes ghost and UFO sightings as glitches, so that was the first thing I thought of! $\endgroup$
    – Murphy L.
    Commented Jan 11, 2022 at 19:03
0
$\begingroup$

Simulating a universe inside the universe itself can only be done at a higher abstraction level, stripping it of its actuality. A simulation of the entire universe can only be the universe itself.

Some of the accidentals of the universe probably exist only in the universe itself, like the categories of time and space. Some laws of nature may be valid only locally.

Gödel's incompleteness theorems: "The theorems are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible."

Wikipedia Gödel's incompleteness theorems

$\endgroup$
3
  • $\begingroup$ You're assuming that the universe the simulation is being run from is identical to the universe we're experiencing, but it doesn't have to be the case. $\endgroup$
    – Sheraff
    Commented Dec 21, 2016 at 15:33
  • $\begingroup$ Probably there are all possible simulations running as the universe itself in parallel. According to quantum theory we live in a multiverse and probably the very reason why there are all possible universa is the impossibility to run them as abstract model. But maybe every of our univera is a highier abstraction level of the higher 'enclosing' universe. We are so to speak a simplification of the universe of higher order simulating all specifications, exactly out of this reason, because the abstract model is to fuzzy, blurred by abstraction, absence of parameteres existing in the higher universe $\endgroup$
    – yeosan
    Commented Dec 22, 2016 at 8:23
  • $\begingroup$ The question was 'what would a bug look like' - the assumption is that this would be visible from within the universe. And yes we can only see it from within. If we take the Gödel's incompleteness theorems seriously we would not notice any of the bugs from within, we can simple not prove if they not belong to the universe. We can only speculate, what it would look like from outside, but never calculate or know. $\endgroup$
    – yeosan
    Commented Dec 22, 2016 at 8:34
0
$\begingroup$

The Random Number Gods Really Would Hate Us! No, seriously: RNGs aren't truly "random". They are at best pseudo-random, and can be pulled from a list with a seed or key, in the same order for the same key. This was used in a Doctor Who episode as the giveaway for this exact problem. Our "heroes" were really just models in a hyper-realistic simulation of the world, but the RNG for the human response to the prompt "Pick a random number" was not properly seeded for each human, resulting in every answer to the question involuntarily following the same values in the same order. This is an easy mistake to make in coding, and it is only detected on multiple runs of the same code, as an unseeded RNG will produce the same list of values in the same order in all iterations.
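
A tiny C sketch of that mistake (assuming only the standard library `rand`/`srand`; the "humans" are of course invented): a generator that is never seeded starts from its default state, so every "human" picks the same numbers in the same order.

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* "Human 1" is asked to pick some random numbers; rand() was never seeded. */
    printf("human 1 picks: %d %d %d\n", rand(), rand(), rand());

    /* srand(1) restores rand()'s default starting state, standing in for a
     * second, identically unseeded human -- who picks exactly the same numbers. */
    srand(1);
    printf("human 2 picks: %d %d %d\n", rand(), rand(), rand());
    return 0;
}
```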

$\endgroup$
0
0
$\begingroup$

Just wanted to channel the Stalker SF video game series because I see that something like this hasn't been mentioned.

In the game, due to Chernobyl and other experiments there are areas near the disaster site where the rules of physics have been corrupted. They are called anomalies.

This is less physicsy than other answers but could still work. Perhaps, due to a buffering error or something similar, you can get these anomalies.

$\endgroup$
1
  • $\begingroup$ Any reason for the downvote? I was showing another way how glitches could manifest in a simulation world $\endgroup$
    – NotCras
    Commented Jan 11, 2022 at 20:25
0
$\begingroup$

The presence of a bug in a computer program means that results are normally according to our expectations, but under certain conditions the program gives unexpected results and behaves in unintended ways.

Bugs in a simulated world can cause fatal effects. In the 1973 movie 'Westworld', the human-like androids get bugs in their programs and, as a result, many humans die.

In the universe, objects are following certain laws (Kepler's laws of planetary motion, Maxwell's equations, Relativity, Gravity etc.).

When a bug is encountered, some law will be broken and some objects will behave in unexpected ways (could be fatal).

$\endgroup$
