
Compiling takes computing power and, to a lesser extent, storage and memory. Back in the 70s and 80s, personal computers weren't powerful enough to compile code in high-level languages, or, if capable, took a huge amount of time. Thus small, simple compilers were often more welcome than sophisticated ones generating optimized binaries. Some companies had to install minis for their programmers instead of compiling on the targeted machines, and bootstrapping has always been an achievement for a new language and its compilers — such as Small-C, which was first refined on Unix and then worked toward bootstrapping part by part.

Employers and students sent their cards and tapes to batch-processing mainframes to compile. People have cross-compiled since the 50s. Were there any cross-compiling services for the public rather than for internal users, free or subscription-based, that one could connect to via teletype, telex or modem, or that received letters and packages, so that code written, checked and interpreted locally could be compiled and debugged on their much more powerful machine?

  • I don't know that the premise of this question is truly valid; I remember several business systems for the Commodore PET/CBM written in 6502 assembler and CBM BASIC; I also remember business systems written for the Apple II (pre-plus) in Apple Pascal/UCSD Pascal. Turbo Pascal was an early useful compiler for CP/M systems, and was also one of the early ports to the IBM PC, and Microsoft had an assembler and compilers that ran on and for the PC for Pascal, FORTRAN, and COBOL, with C coming later. Commented Feb 15, 2023 at 11:48
  • @JeffZeitlin: I think there is a somewhat earlier time period where it could have made sense, e.g. mid- to late 70s. A hobbyist might only be able to afford a small microprocessor and a couple KB of ROM, enough to control their toaster or whatever but not enough for a development environment. Though on the other hand, if you only have a couple KB of ROM, it's probably not too much work to just write the code on paper and assemble to machine code by hand. Commented Feb 15, 2023 at 16:12
  • The question talks about "small simple compilers" vs "sophisticated ones generating optimized binaries". It should be pointed out that the "sophisticated ones" back then were not at all in the same ballpark as today's optimizing compilers — not even for mainframes or larger minis. "Optimizing" technology was in its infancy: common subexpression elimination, loop reduction/unrolling, stuff that today is totally basic. No link-time code generation! No profile-guided optimizations! And those optimizations that did exist were generally for large numerical codes. Nothing you'd be doing on a micro.
    – davidbak
    Commented Feb 15, 2023 at 23:24
  • Anyone that had a real need for this, which was probably mostly games developers, used something like Andy Glaister's PDS in-house. retro-hardware.com/2019/05/29/…
    – Alan B
    Commented Feb 16, 2023 at 13:45
  • RE: "Back in the 70s and 80s personal computers weren't powerful enough to compile codes in high-level languages" — nope, not what happened. The real "problem" was that they were not considered appropriate targets for some languages (specifically COBOL). In fact, though, there were compilers on PCs for virtually every other language (and yeah, there were probably even some COBOL compilers out there too). They just didn't sell very well. Commented Feb 16, 2023 at 19:54

5 Answers


I too doubt the premise of the question, on several counts:

The notion of using a teletype for access to remote compiling services seems ill-advised, since punching the object module on paper tape (the only option on a teletype) would take hours.

Assuming you'd actually use the target computer to receive the object code (thus writing it to disk, not paper tape), the economics would seem to be against you. You'd need transmit time plus wait time (for your slot) plus compilation time plus receive time to be significantly less than local compilation time.
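The economics argument above can be sketched as a quick back-of-envelope check. All numbers below are hypothetical, chosen only to illustrate the shape of the trade-off; the 10 characters per second figure corresponds to a 110-baud teletype line.

```python
# Back-of-envelope check of the round-trip economics (hypothetical numbers).
# A teletype-era line at 110 baud moves roughly 10 characters per second.

def remote_wins(transmit_s, queue_wait_s, remote_compile_s,
                receive_s, local_compile_s):
    """Remote compilation only pays off if the whole round trip
    beats simply compiling on the local machine."""
    round_trip = transmit_s + queue_wait_s + remote_compile_s + receive_s
    return round_trip < local_compile_s

# 10 KB of source at ~10 chars/s is ~1000 s each way; even a slow
# local compiler taking 15 minutes (900 s) comes out ahead.
print(remote_wins(1000, 600, 30, 1000, 900))  # prints: False
```

With line speeds that low, transmit and receive time alone dwarf any plausible local compile time, which is the point the answer is making.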

Bureau services were also not cheap. You could likely buy a few more fast micros for the annual bureau fees.

So my answer to the posed question is "no".

With respect to "Employers and students sent their cards and tapes to batch-processing mainframes to compile": as a student — no, we didn't. We sent our programs to be run. The fact that they had to be compiled before running was incidental. The nature of student programming tended to be "after one successful run we are done with it", so keeping object code, even if possible, had little point.

  • Loading object modules from tape was really a thing on very early micros. But then again, not exactly what someone did who had remote access to a large(r) machine. That was usually based on a far more integrated process. Beside that, yes, spot on.
    – Raffzahn
    Commented Feb 15, 2023 at 12:36
  • +1 for: "Bureau services were also not cheap." Such services as there were, were aimed directly at enterprises. Not hobbyists, not students. If for no other reason than that they would have had no way to invoice you, credit cards (back then) being not nearly the universal currency facilitator they are now.
    – davidbak
    Commented Feb 15, 2023 at 17:02
  • "Home computers weren't powerful enough" is, quite bluntly, rubbish, /particularly/ by the early 1980s, when discs were beginning to be affordable. By the end of the 80s you could buy a '386 and use e.g. Ada with a DOS extender. Earlier than the 80s... well, there's plenty of accounts of people ferrying card or tape to a mainframe for a nighttime run, and plenty of accounts of individuals or small businesses hiring minis: but the state of modem comms precluded submitting source and having the binary sent back to you. Commented Feb 17, 2023 at 21:00
  • "punching the object module on paper tape (the only option on a teletype) would take hours" No, if you needed large quantities of output in whatever format (green bar printouts, tape, disk pack, cards, microfilm), the bureau would send it via courier, and you'd get it within hours, if so desired. Similarly, large inputs would be picked up. Often you'd have an overnight cycle, whatever was done by the end of the day was sent in, and delivered the next morning, basically the same as the "nightly" builds before continuous integration.
    – user71659
    Commented Feb 17, 2023 at 21:25
  • Well, ok, but that's scarcely 'using a teletype'; my own access to an IBM 7094 was via the UK postal service. And I think it unlikely that the target micro, the one that is too underpowered to run the compiler (the premise of the question), will have a card reader, proper magtape drive, or removable disk pack drives.
    – dave
    Commented Feb 17, 2023 at 23:13

TL;DR: Yes, such services existed. But usage was quite limited in time and audience.

Those that existed may rightfully be considered exotic fringe cases for very special situations.


Back in the 70s and 80s

Now, that covers too wide a range of applications and use cases for a coherent answer.

personal computers weren't powerful enough to compile codes in high-level languages or if capable took a huge amount of time.

Is that so? A 1 MHz 6502 can assemble its own BASIC in a few minutes — nothing one couldn't wait for. Once there were personal computers, compilation happened there; after all, the main advantage of personal computers was their immediate availability.

Some companies had to install minis for their programmers instead of compiling on the targeted machines,

This was rather due to two factors:

  • a unified development environment, and, more often than not,
  • target machines that had no OS or programming environment.

Employers and students sent their cards and tapes to batch-processing mainframes to compile.

Not really. Students usually did jobs on their institute's machines, not on some different one. Likewise, at the time real cards were still a thing in programming; there were no micros and not much cross development.

Usually the only time programming happened with cross-compilers on large remote systems was when new CPUs were introduced and the manufacturer did not have any development system already at hand.

So Intel, for example, offered cross assemblers for the 4004 and 8008 before their ISIS systems became available. After that, development was intended to happen there. Likewise, when new CPUs like the 8048 or 8086 became available, Intel offered software packages to compile for these new models — often in combination with hardware adaptors, programmers and ICE probes.

Zilog in contrast just used Intel systems in the beginning before their own development systems were ready - the Z80 is just an extended 8080, isn't it? :))

Apart from that, cross-compiling was only a thing for game systems, as these did not really provide a way to develop on target. An exception might be development for similar systems, like developing on a PET for the VIC-20.

Multi-platform game development was essentially the only area where cross-compilation was a big issue - and here it was always an internal one. Not just to secure the game from leaking, but, as mentioned, to provide a single development environment for all platforms.

Were there any cross-compiling services for the public instead of internal users, free or subscribed,

All of that does of course rely on your definition of "the public". In any case, these services were rather limited and only viable for a short time. A good example might be the (cross) assembler MOS offered early on to their customers via a GE-based dial-in service. As so often when it's about 6502 material, Hans Otten has the documentation — here, the 650X Cross Assembler Manual.

Notably, and unlike what the question assumes, that assembler was not comfortable at all. It was a very primitive linear beast. This makes sense, considering that the step from "no assembler" to "some assembler" helps far more than the step from "some assembler" to "comfortable assembler".

The dial-in solution was soon superseded by MOS' Resident Assembler developed for the MDT650 — the history of MOS assemblers can be followed on Michael Steil's Pagetable.

  • It's interesting about the cross-assembler dial-up service; I would never have supposed it to exist. Was the rationale more that the target system had no persistent storage for source code?
    – dave
    Commented Feb 15, 2023 at 13:03
  • Students usually did jobs on their institute's machines, not for some different one That depends. For university students, yes. For me as a high-school student, the 7094 was at a remote university. (Applying the word 'student' to high school might be an Americanism; in England we were 'pupils').
    – dave
    Commented Feb 15, 2023 at 13:11
  • I find it somewhat interesting in retrospect that while some few computers were equipped to start and stop two cassette drives independently, this ability does not seem to have been substantially exploited in software development. If one were to build a cable to interface a second cassette drive to the VIC-20's user port, it would have been possible to design a Pascal compiler cartridge for the VIC-20 which could read a sequence of source files of arbitrary size from one datasette and produce on the other a tape that could be loaded and executed in a single pass, provided that...
    – supercat
    Commented Feb 15, 2023 at 15:59
  • ...the total size of the symbol data was small enough to fit in the VIC-20's 5K of RAM. If one had a number of short tapes, each with part of a source program, one could load the output tape drive with a blank tape, feed in all the source tapes one by one, and end up with a single executable tape, without needing a disk drive nor sufficient memory to hold everything at once.
    – supercat
    Commented Feb 15, 2023 at 16:03
  • @GerardoFurtado Not really. That's tokenizing and interpreting. "Assembling its own BASIC" means building the BASIC interpreter, like the 12 KiB of Applesoft, from its assembler sources.
    – Raffzahn
    Commented Feb 16, 2023 at 12:16

The main ambiguity here for me is "the public".

I have a microprocessor course book of 1978 (A.J. Dirksen's "Microprocessors"). A part of the book presents some ways of developing:

  • Time sharing
  • Using an in-house system (from PDP-11 to IBM 370)
  • Using a development system

Rodnay Zaks ("Programming the Z80", possibly also in "From Chips to Systems") adds the single-board computer and the home computer (but he dismisses the software support of home computers).

Both authors speak about rented terminals, but neither talks about the data communication cost. However, just the fact that it is mentioned in both a US book and a European book is proof enough that such services did exist.

But renting time and a terminal was rather expensive. Dirksen quotes 100 NLG to 500 NLG per hour, which translates into current prices of 150 EUR to 750 EUR per hour. So, de facto, not really accessible to "the public" — only to companies which could justify the costs.
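As a quick sanity check of Dirksen's figures (this is my own arithmetic, not from the book): both endpoints imply the same flat factor of 1.5 from the guilder prices then to the quoted euro prices now.

```python
# Dirksen's quoted hourly rates: NLG then -> EUR at current prices.
# Both endpoints imply the same flat conversion factor.
rates = {100: 150, 500: 750}  # NLG/hour -> EUR/hour

for nlg, eur in rates.items():
    factor = eur / nlg
    print(f"{nlg} NLG/h ≈ {eur} EUR/h (factor {factor})")  # factor 1.5
```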

  • In some areas of the US, unlimited local calling was available. If someone managed to get a shell, communication costs sometimes might not have been a worry.
    – Schezuk
    Commented Feb 16, 2023 at 0:26
  • If dialed up to a bureau, you can bet you're paying for connect time, since you're tying up real equipment: a modem port in a finite number of modems.
    – dave
    Commented Feb 16, 2023 at 5:52
  • @Schezuk there’s telephone time, and then there’s machine time. Even with unlimited local calling, it cost a lot to rent time on the remote machine.
    – RonJohn
    Commented Feb 16, 2023 at 18:13
  • Another aspect was that anyone who called you would get a “busy signal.” Voice mail did not exist. Even when answering machines were invented, they could only pick up and take a message if the line were not already in use. Neither could the modem disconnect and let the call through. In practice, if you were going to use a modem at a time of day when people would be trying to call you, you needed to pay the phone company for a second line.
    – Davislor
    Commented Feb 17, 2023 at 18:37

To give you some idea of how things worked back then: In 1973, a kid who lived not far from where I do now sent a letter to a ’zine called The People’s Computer Company, which they printed on page 5 of their November issue, with a handwritten note, “Somebody help him out!”

Dear Sir(s),

I have recently moved from Corvallis, Oregon to Bellevue, Washington. In Corvallis I had access to a CDC 3300 and a Digital PDP 12. A friend of mine gave me some old copies of your newspaper.

I have not had any luck in finding a computer to use. So I would greatly appreciate it if you could send me a list containing the names & addresses (and possibly more information) of your subscribers in the Seattle, Bellevue area, to aid me in my “search”, Any and all efforts will be appreciated!

Thank you!

A Friendly Computer Freak,
Stuart A. Celarier
Age 13, Grade 8

And, in 1973, they published his full home address. Corvallis is a college town, the home of Oregon State University, which is presumably where he got access to computers.

  • Nice, but just where is the relation to cross-compiling? AFAICS it's about access to any computer, without specifying a use case, isn't it? (BTW, interesting find, had to read every page :))
    – Raffzahn
    Commented Feb 15, 2023 at 23:12
  • @Raffzahn That's all Stuart Celarier said about it in his letter (although he still lives nearby, so I suppose I could try to get in touch and ask). So I don't know if he ever used a cross-compiler when he was twelve years old and borrowing time on two mainframes in Corvallis, Oregon. However, the PDP-12 did provide “an ultra-powerful, general-purpose assembler, editor, and monitor for both PDP-8 and LINC programming,” including a Fortran compiler. So he might have had access to that.
    – Davislor
    Commented Feb 15, 2023 at 23:27
  • Hmm. Many did not have any computer access (might be 99.999% of the world's population at the time) and some were looking to get it. Local or remote. But I wouldn't see how doing so is related.
    – Raffzahn
    Commented Feb 15, 2023 at 23:51
  • @Raffzahn Respectfully, if it was that difficult to get access to a computer at all, I think that implies something about whether there were public cross-compilation services. The closest thing to that would be university time-sharing systems open to their students.
    – Davislor
    Commented Feb 16, 2023 at 0:01
  • (Originally in Latin:) A "sine qua non" does not make the whole story, though. One may have needed remote access, but not always for compilation. [My Latin may be old, but it's still there]
    – Raffzahn
    Commented Feb 16, 2023 at 21:30

Depending on your definition of "public", Motorola provided development systems via their timesharing facility for companies who'd taken their training course on the 6800 family. This ad is from Electronics, April 1976:

Motorola Training ad

(Image source: Motorola M6800 Training ad April 1976 - File:Motorola M6800 Training ad April 1976.jpg - Wikimedia Commons)
