18

The title says it all, but why does restarting a computer tend to fix things? It seems like IT folks always ask, "Have you restarted your PC?" But why?

1
  • And yes.... This was a lame ploy to get some rep on SU when the site launched...
    – RSolberg
    Commented Apr 3, 2010 at 6:07

7 Answers

20

Basically, because anything that's gotten itself into a mess gets the chance to start over. Imagine you're making toast and you burn it. Throwing it away and starting again is one way to fix that problem, and it will always work out better than scraping the burnt bits off the toast.

5
  • 1
    And tasty too! Like the pizza you dropped before it went in the oven... far better not to pick up all the grated cheese and tomato sauce... oh wait, I think I went too far.
    – geocoin
    Commented Jul 16, 2009 at 11:19
  • Can anyone come up with a better but similar analogy? I'm not 100% happy with this one. Commented Jul 16, 2009 at 19:50
  • Yeah, restarting your computer is like scraping the burnt bits off the toast and putting it back in the toaster. What you described was reinstalling the OS ;-) Commented Jul 17, 2009 at 18:40
  • 3
    Ok, imagine you have a whiteboard with space to write 5 things you need to do. Every so often you scrub out a task you've completed and replace it with a new one. Now say you accidentally pick up a permanent marker rather than a water-soluble one to write your new task. When you come to scrub out this task, you can't until you wipe the whole board clean with some alcohol. Restarting your computer is "the same" as wiping the board clean: it removes all the "stuck" code. Commented Jul 17, 2009 at 21:37
  • Continuing the analogy contest, you can try to gather the spilled milk back into the cup or you can pour yourself another glass of milk.
    – emallove
    Commented Nov 8, 2013 at 15:24
8

One of the major reasons your computer slows down is that its Random Access Memory (RAM) fills up. The operating system, as well as the programs you're running, all use RAM. However, there's only so much of it, and it can only be accessed so fast. If your computer tries to use more RAM than is available, it needs to create extra swap files on the hard drive to act as additional, but far less efficient, "RAM". This, among other things, slows your computer down.
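
To make that concrete, here's a minimal sketch of checking memory pressure yourself. It assumes the third-party psutil package is installed, and the thresholds are made up for illustration:

    import psutil

    ram = psutil.virtual_memory()
    swap = psutil.swap_memory()

    print(f"RAM:  {ram.percent:.1f}% used of {ram.total // 2**20} MiB")
    print(f"Swap: {swap.percent:.1f}% used of {swap.total // 2**20} MiB")

    # Nearly full RAM combined with heavy swap use is the slow-down
    # scenario described above: the OS is using the disk as overflow "RAM".
    if ram.percent > 90 and swap.percent > 25:
        print("Likely swapping heavily; expect sluggishness.")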

Closing some programs should free up RAM space, but memory leaks may have occurred. That means a program may have accidentally taken up RAM that it didn't, or couldn't, free up when it closed. "Ahhh," you say, "it's going to eat up all my RAM!" Nope. If you restart the computer, all the RAM is cleared out. You've got more available RAM, so your computer can run faster.
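
If you want to picture what a leak looks like in code, here's a hypothetical sketch (all names invented) of a long-running program whose cache only ever grows; nothing ever removes entries, so its memory use climbs until the process, or the whole machine, is restarted:

    # A long-running process with a cache that only ever grows.
    leak_cache = {}

    def handle_request(request_id, payload):
        leak_cache[request_id] = payload   # stored "temporarily", never evicted
        return payload.upper()             # stand-in for real work

    # Memory use rises with every request and is only reclaimed on restart.
    for i in range(1_000_000):
        handle_request(i, "some data " * 100)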

There are other problems that could be fixed by a restart, too. For example, a program may somehow begin to use a huge number of processor cycles (each cycle carries out a calculation, and all of these calculations are what make your computer "compute", a.k.a. work). When the computer is restarted, control of the processor is unconditionally given to the bootloader, and then it's handed off to the OS, which can start from scratch. It's no longer being dominated by the greedy program.
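
Before reaching for the power button, you can often spot the greedy program. A sketch, again assuming psutil, that samples per-process CPU usage over a one-second window and prints the five busiest processes:

    import time
    import psutil

    procs = list(psutil.process_iter())
    for p in procs:
        try:
            p.cpu_percent()                # first call primes the counter
        except psutil.NoSuchProcess:
            pass
    time.sleep(1)                          # measurement window

    usage = []
    for p in procs:
        try:
            usage.append((p.cpu_percent(), p.pid, p.name()))
        except psutil.NoSuchProcess:
            pass                           # process exited during the window

    for pct, pid, name in sorted(usage, reverse=True)[:5]:
        print(f"{pct:5.1f}%  pid {pid}  {name}")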

Yet another possibility is that the computer was overheating. Overheating, simply put, isn't good for the computer. Turning the machine off and leaving it to cool for a few minutes couldn't hurt. In fact, some (if not all) computers are set to shut down if they reach a certain internal temperature.
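
You don't have to guess about heat, either. A sketch of reading the temperature sensors with psutil; note that sensors_temperatures() only exists on some platforms (notably Linux), so treat this as illustrative:

    import psutil

    if hasattr(psutil, "sensors_temperatures"):
        for chip, entries in psutil.sensors_temperatures().items():
            for t in entries:
                # critical may be None for some sensors
                crit = f" (critical: {t.critical}°C)" if t.critical else ""
                print(f"{t.label or chip}: {t.current}°C{crit}")
    else:
        print("This platform doesn't expose temperature sensors via psutil.")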

In summary, a restart puts the computer into a state where the right software is controlling the right (possibly cooler) hardware, a state that is already known to work.

2
  • I once had a Dell Inspiron with a Pentium 4 inside. In summer, it would occasionally switch off without warning. It turned out dust had built up inside, causing it to heat up until it hit 75 degrees Celsius, which is the temperature at which P4s automatically switch off.
    – John Fouhy
    Commented Jul 16, 2009 at 1:40
  • 1
    The memory leak issue isn't really that relevant with any NT-based (Windows 2000 and onwards) or Linux OS. Sure, it used to be the case for DOS, but modern OSes will recover all the memory a program was allocated, leaked or not, when it closes*. It's theoretically an issue for services and the like, but these are generally pretty solid in the first place. * Because the memory allocation algorithms these OSes use are not the simple mem allocs you might expect.
    – user2630
    Commented Jul 17, 2009 at 18:19
6

Good question! The short answer is "it depends".

The longer answer is that Windows has limited resources for applications to use (memory, window handles, file handles, etc.). If a badly written application doesn't give these resources back to Windows when it's finished, Windows runs out of resources, which causes problems with other applications. Obviously the same applies to all other operating systems too.
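
To see that failure mode first-hand, here's a deliberately badly written, hypothetical sketch that hoards file handles until the OS refuses to hand out more; a leaky application does the same thing by accident, just more slowly:

    import tempfile

    handles = []
    try:
        while True:
            handles.append(tempfile.TemporaryFile())  # opened, never closed
    except OSError as exc:
        print(f"Out of handles after {len(handles)}: {exc}")
    finally:
        for h in handles:
            h.close()   # what a well-behaved application does all along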

6

Two reasons:

  • The OS and software get to start with a clean slate
  • Any OS / driver updates or installs that have occurred since the last reboot get a chance to become part of the boot sequence (see the sketch below for one way to check this on Windows)
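
For the second point, here's a Windows-only sketch using the standard-library winreg module. The registry paths are the commonly cited pending-reboot markers; I'm treating them as an assumption rather than gospel:

    import winreg

    # Keys whose mere existence conventionally signals that installed
    # updates are waiting for a reboot to take effect.
    PENDING_REBOOT_KEYS = [
        r"SOFTWARE\Microsoft\Windows\CurrentVersion"
        r"\Component Based Servicing\RebootPending",
        r"SOFTWARE\Microsoft\Windows\CurrentVersion"
        r"\WindowsUpdate\Auto Update\RebootRequired",
    ]

    def reboot_pending() -> bool:
        for path in PENDING_REBOOT_KEYS:
            try:
                winreg.CloseKey(winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path))
                return True                # key exists: a reboot is pending
            except FileNotFoundError:
                continue                   # key absent: check the next one
        return False

    print("Reboot pending:", reboot_pending())
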
3

I know this is an ancient thread, but I feel like this post by a Microsoft developer explains why:

  1. Restarts are often necessary after software upgrades/changes.
  2. This is by design.
  3. This is the way it should be.
  4. This is better than the alternative (and how the alternative works).

Gradual slowness and other restart-needing issues can often be chalked up to memory leaks. Contrary to @user2630's comments, this is still a very real problem in modern Windows. Whether from services/system components that stay running, preventing their memory from being reclaimed, or just from the many running applications a user started, leaks occur all the time, sometimes severely. In the latter case of running applications, it's often just simpler for an IT guy to say "just restart it" instead of "close all of your apps, check the task tray to make sure they're really gone, make sure they're not running any background processes or services..." You get the idea.

As was mentioned elsewhere here, a lot of other restart-needing problems come from plain old bad/broken software (hung services, infinite waits on shared resources, etc.). I think that leaks and pending library changes explain the majority of the boilerplate restart troubleshooting out there, though.

5
  • Thanks for the link to Raymond Chen's article. I don't think your summary accurately reflects the author's views. He doesn't say it should be this way. He concludes: "So it's not that Windows has to restart after replacing a file that is in use. It's just that it would rather not deal with the complexity that results if it doesn't. Engineering is a set of trade-offs." It makes me wonder: What trade-offs did the Linux developers choose? (Linux is noted for requiring a restart less frequently.) Do they deal with the complexity, or do they just break things? Commented Sep 28, 2013 at 14:01
  • This is opinion, but a few things come to mind: Linux systems that upgrade libraries in-place can often leave other programs running that are linked to old versions of those libraries. There are a lot of systems that try to prevent this, but the complexity discussed in the Microsoft post is still present and isn't always abstracted away, so library-versioning bloat is something that occurs often, for better or worse.
    – Zac B
    Commented Sep 29, 2013 at 15:58
  • Linux also tends towards a stricter regime of dependency modularity, rather than proliferating "used by everything ever" libraries. Those still exist (as do problems caused by in-place upgrades leading to reload-related problems), but they are less prevalent than on Windows. IMO, a lot of that reduced prevalence has to do with Windows being developed in a much more agglomerated way (with a persistent goal of backwards compatibility) than Linux, which has an architecture that is, if not more consistent, usually interacted with in a more consistent way.
    – Zac B
    Commented Sep 29, 2013 at 15:59
  • TL;DR: Linux often makes the tradeoff in favor of the rigor and development time necessary to engage with the complexity you mentioned. Having a modular, consistent architecture helps as well.
    – Zac B
    Commented Sep 29, 2013 at 16:00
  • MinWin appears to be Microsoft's push in the same direction. Robert McLaws on Windows Server Core: "Microsoft started to chart out the entire Windows dependency graph. And as they saw things that started calling up the stack when they weren’t supposed to, they rearranged the APIs to create a clean separation in the OS. MinWin is the result of that work. It is not a complete rewrite of the kernel, but a reorganization of the APIs, so that components only call down the stack, and not up it." Commented Nov 15, 2013 at 13:50
2

Have you ever watched "The IT Crowd"?

IT support people use "Try restarting it" as the first response because:

  1. It often will make the problem go away, at least temporarily.
  2. They don't need to exert any further effort.
  3. They don't need to have face-to-face contact with a human being.
1
  • 1
    My wife worked in a place where 'Have you tried turning it off and on?' was the official first response. She had a problem that caused her desktop to bluescreen, causing loss of work on a regular basis; however, she could never get a fix, as 'turning it off and on' always 'fixed' the bluescreen!
    – geocoin
    Commented Jul 16, 2009 at 11:23
-1

To clear the RAM, probably.
