
I use Gentoo as my primary home desktop OS, and have since about 2004. I appreciate how well the emerge command manages a source-based OS. Every now and then I check on other distributions, and I'm particularly fond of Linux From Scratch (for those of you who don't know it). Granted, I've never been through the entire book, because using Gentoo has spoiled me in that respect; I consider Gentoo to be LFS plus a package manager. I finally decided I'm going to complete the book, so I stuck Xubuntu on a VM to simulate the newness and ...


I'm following along in the release candidate for Version 3 - Systemd of CLFS, and it hit me at Chapter 6 - Temporary System - Make: if a user needs make to compile a version of make, the chicken-and-egg problem appears. This leads me to my next logical questions.

  1. When Stuart Feldman created make in 1976, how did the computing public compile his program if their OS did not contain an OS-dependent make? Am I to assume that the Wikipedia article quoted below is true for every OS?

  2. Did he have to package make to include every OS-dependent version of make to complete #1? (See below.)

  3. If I needed Program A, but it was only available to compile on OS A, did I have to buy OS A, even if I use OS B? (Please ignore Windows here if possible.)

Update

  1. Based on jimmij's comment: did OS-specific compilers exist in the same way that make was OS-dependent?

Wikipedia says:

Before Make's introduction, the Unix build system most commonly consisted of operating system dependent "make" and "install" shell scripts accompanying their program's source. Being able to combine the commands for the different targets into a single file and being able to abstract out dependency tracking and archive handling was an important step in the direction of modern build environments.

Also, please note that I am looking for some historical perspective here, as I was born in 1976.

  • What you pasted from Wikipedia answers your question: there was a shell script to "make" the program.
    – psusi
    Commented Oct 10, 2014 at 18:35
  • @psusi then I'll edit it, as Wikipedia only partly answers #1, which is why I asked it. See edited Q1.
    – eyoung100
    Commented Oct 10, 2014 at 18:41
  • There's an underlying question inside your question. You could also ask "If I need GCC to compile GCC, how did GCC ever spread?" or "If I need GCC to compile GCC, how did they get GCC 0.9, in order to compile GCC 1.0?" and maybe the answer has different details, but illustrates the same principle. T-diagrams or Tombstone diagrams can be used to illustrate this general principle: en.wikipedia.org/wiki/Tombstone_diagram
    – user732
    Commented Oct 10, 2014 at 18:46
  • Needing make to make make is not an XY problem. That's a chicken & egg problem.
    – phemmer
    Commented Oct 10, 2014 at 18:51
  • I'm not sure what you mean by "every OS" nor "OS dependent version of make". The program compiled on any unix the same way. I also have no idea what you are asking in the third question.
    – psusi
    Commented Oct 10, 2014 at 18:51

1 Answer


Make is simply a convenience. You can build software without it; it is just more tedious.

When you run make to build something, it shows you the commands it is running. You can run those commands manually and get the same effect.

$ echo "int main() {}" > test.c
$ make test
cc     test.c   -o test

This created a file called test.
I can get the same result by running the command myself:

cc test.c -o test


If make didn't exist, you could either instruct users to run this cc command by hand, or you could distribute a shell script, and instruct them to run the script.

