Hard-won experience has taught me that almost everything belongs in source control. (My comments here are colored by a decade and a half developing for embedded/telecom systems on proprietary hardware with proprietary, and sometimes hard-to-find, tools.)
Some of the answers here say "don't put binaries in source control". That's wrong. When you're working on a product with lots of third party code and lots of binary libraries from vendors, you check in the binary libraries. Because, if you don't, then at some point you're going to upgrade and you'll run into trouble: the build breaks because the build machine doesn't have the latest version; someone gives the new guy the old CDs to install from; the project wiki has stale instructions regarding what version to install; etc. Worse still, if you have to work closely with the vendor to resolve a particular issue and they send you five sets of libraries in a week, you must be able to track which set of binaries exhibited which behavior. The source control system is a tool that solves exactly that problem.
Some of the answers here say "don't put the toolchain in source control". I won't say it's wrong, but it's best to put the toolchain in source control unless you have a rock solid configuration management (CM) system for it. Again, consider the upgrade issue as mentioned above. Worse still, I worked on a project where there were four separate flavors of the toolchain floating around when I got hired -- all of them in active use! One of the first things I did (after I managed to get a build to work) was put the toolchain under source control. (The idea of a solid CM system was beyond hope.)
And what happens when different projects require different toolchains? Case in point: After a couple of years, one of the projects got an upgrade from a vendor and all the Makefiles broke. Turns out they were relying on a newer version of GNU make. So we all upgraded. Whoops, another project's Makefiles all broke. Lesson: commit both versions of GNU make, and run the version that comes with your project checkout.
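The fix can be as simple as each project carrying its own copy of the tool in its checkout. A hypothetical layout (paths and version numbers are illustrative, not from any particular project):

```
project-a/
  build             # entry point; invokes tools/bin/make (GNU make 3.80, committed)
  tools/bin/make
project-b/
  build             # entry point; invokes tools/bin/make (GNU make 3.81, committed)
  tools/bin/make
```

Each project's `build` script runs its own `tools/bin/make`, so upgrading one project's make can never break the other's Makefiles.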
Or, if you work in a place where everything else is wildly out of control, you have conversations like, "Hey, the new guy is starting today, where's the CD for the compiler?" "Dunno, haven't seen them since Jack quit, he was the guardian of the CDs." "Uhh, wasn't that before we moved up from the 2nd floor?" "Maybe they're in a box or something." And since the tools are three years old, there's no hope of getting that old CD from the vendor.
All of your build scripts belong in source control. Everything! All the way down to environment variables. Your build machine should be able to run a build of any of your projects by executing a single script in the root of the project. (`./build` is a reasonable standard; `./configure; make` is almost as good.) The script should set up the environment as required and then launch whatever tool builds the product (make, ant, etc.).
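Here is a runnable sketch of that single-entry-point idea. It sets up a throwaway "repo" whose `./build` prepends the checked-in `tools/bin` to `PATH` before launching make; the fake make stands in for a committed vendor binary, and all the names are illustrative.

```shell
set -eu
repo=$(mktemp -d)

# The "checked-in" make (in real life, the vendor binary you committed).
mkdir -p "$repo/tools/bin"
cat > "$repo/tools/bin/make" <<'EOF'
#!/bin/sh
echo "pinned make invoked"
EOF
chmod +x "$repo/tools/bin/make"

# The single script at the project root: set up the environment,
# then launch whatever tool builds the product.
cat > "$repo/build" <<'EOF'
#!/bin/sh
set -eu
root=$(cd "$(dirname "$0")" && pwd)
export PATH="$root/tools/bin:$PATH"   # checked-in tools shadow system ones
exec make all
EOF
chmod +x "$repo/build"

"$repo/build"   # prints: pinned make invoked
```

The point of the `PATH` line is that the build uses the make committed in the checkout, not whatever happens to be installed on the machine, so every developer and the build machine run the identical tool.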
If you think it's too much work, it's not. It actually saves a ton of work. You commit the files once at the beginning of time, and then again whenever you upgrade. No lone wolf can upgrade his own machine and commit a bunch of source code that depends on the latest version of some tool, breaking the build for everyone else. When you hire new developers, you can tell them to check out the project and run `./build`. When version 1.8 gets a lot of performance tuning, and you tweak code, compiler flags, and environment variables together, you want to make sure the new compiler flags don't accidentally get applied to version 1.7 patch builds: the flags depend on the code changes that go along with them, and without those changes you see some hairy race conditions.
Best of all, it will save your ass someday: imagine that you ship version 3.0.2 of your product on a Monday. Hooray, celebrate. On Tuesday morning, a VIP customer calls the support hotline, complaining about this supercritical, urgent bug in version 2.2.6 that you shipped 18 months ago. And you still contractually have to support it, and they refuse to upgrade until you can confirm for certain that the bug is fixed in the new code, and they are large enough to make you dance. There are two parallel universes:
In the universe where you don't have libraries, toolchain, and build scripts in source control, and you don't have a rock-solid CM system.... You can check out the right version of the code, but it gives you all kinds of errors when you try to build. Let's see, did we upgrade the tools in May? No, that was the libraries. Ok, go back to the old libraries -- wait, were there two upgrades? Ah yes, that looks a little better. But now this strange linker crash looks familiar. Oh, that's because the old libraries didn't work with the new toolchain, that's why we had to upgrade, right? (I'll spare you the agony of the rest of the effort. It takes two weeks and nobody is happy at the end of it, not you, not management, not the customer.)
In the universe where everything is in source control, you check out the 2.2.6 tag, have a debug build ready in an hour or so, spend a day or two recreating the "VIP bug", track down the cause, fix it in the current release, and convince the customer to upgrade. Stressful, but not nearly as bad as that other universe where your hairline is 3cm higher.
With that said, you can take it too far:
- You should have a standard OS install that you have a "gold copy" of. Document it, probably in a README that is in source control, so that future generations know that version 2.2.6 and earlier only built on RHEL 5.3 and 2.3.0 and later only built on Ubuntu 11.04. If it's easier for you to manage the toolchain this way, go for it, just make sure it's a reliable system.
- Project documentation is cumbersome to maintain in a source control system. Project docs are always ahead of the code itself, and it's not uncommon to be working on documentation for the next version while working on code for the current version. Especially if all your project docs are binary docs that you can't diff or merge.
- If you have a system that controls the versions of everything used in the build, use it! Just make sure it's easy to sync across the whole team, so that everyone (including the build machine) is pulling from the same set of tools. (I'm thinking of systems like Debian's pbuilder and responsible usage of python's virtualenv.)
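For the virtualenv case, "pulling from the same set of tools" can be as simple as a pinned manifest committed next to the code. A hypothetical `requirements.txt` for a Python-based toolchain (package names and versions are illustrative):

```
# requirements.txt -- committed; pins exact tool versions for the whole team
sphinx==4.5.0
pyserial==3.5
```

Each developer (and the build machine) recreates the environment from this file, e.g. with `python3 -m venv .venv && .venv/bin/pip install -r requirements.txt`, so nobody builds with a stray local version of a tool.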