13

I’m trying to figure out when IBM switched to ASCII and when ASCII became a worldwide standard.

Moreover, did IBM make ASCII a worldwide standard?

What I have found:

According to Wikipedia, the IBM System/360 used the EBCDIC character set, an eight-bit character encoding developed separately from the seven-bit ASCII encoding scheme.

On March 11, 1968, U.S. President Lyndon B. Johnson mandated that all computers purchased by the United States Federal Government support ASCII:

All computers and related equipment configurations brought into the Federal Government inventory on and after July 1, 1969, must have the capability to use the Standard Code for Information Interchange

Also I have found the following:

operating systems running on the IBM PC and its descendants use ASCII, as did AIX/370 and AIX/390 running on System/370 and System/390 mainframes

Is it safe to say that IBM moved to ASCII starting with the System/370?

If so, is it safe to say that IBM started using ASCII in the 1970s?

And if so, is it safe to say that the System/370 had many clones, and therefore ASCII became popular worldwide?

11
  • 1
    AIX is a variant of Unix. I wouldn't be at all surprised if their proper operating systems (most 370's et al do not run AIX natively) still use EBCDIC, or, if not, they'll have moved to some variant of Unicode.
    – JeremyP
    Commented Jul 8, 2020 at 7:46
  • 3
    Perhaps it is incorrect to say ASCII became a worldwide standard. Better to refer to ISO 8859 (sometimes referred to as "ANSI") as the standard. The first 128 characters are the same as ASCII, but the last 128 incorporate many other elements of Western European alphabets such as accented vowels, ñ, ß, etc. Then, as ANSI extended ASCII to Western Europe, Unicode extended ANSI to the world.
    – RichF
    Commented Jul 8, 2020 at 15:25
  • 2
    "operating systems running on the IBM PC and its descendants use ASCII" - It doesn't seem that way to me when you look at the plethora of DOS code pages. They may be similar to ASCII in the first 128 or so codepoints, but that does not make them ASCII.
    – dave
    Commented Jul 8, 2020 at 17:49
  • 1
    @another-dave That's all the codepoints ASCII has. If an OS supports a superset of ASCII, a fortiori it supports ASCII.
    – Rosie F
    Commented Jul 8, 2020 at 19:50
  • 1
    "When did IBM start to use ASCII?" (in any model of computer? teletypes?) and "When did IBM mandate using ASCII across all hardware, computers and teletypes?" are different questions. The answer to the former is apparently ~1964 (the 2260 terminal) but the answer to the latter is probably "sometimes between 1968-early 1970s".
    – smci
    Commented Jul 8, 2020 at 21:36

5 Answers

20

TL;DR:

  • ASCII was never intended as a processing code, only as an interface standard for data exchange (hence the name American Standard Code for Information Interchange).
  • IBM never switched; it still uses EBCDIC within mainframes and ASCII for communication.
  • IBM was a major proponent of ASCII, but not the sole force, and especially not an international one.
  • ASCII soared in international use in the 1970s after being recommended by ISO and ECMA - the latter being the driving force, due to the huge variability within Europe.
  • (Later) minicomputers and especially microcomputers simply started out using ASCII for processing as well, as there was no reason to invent a different code (*1).

In Detail:

I’m trying to figure out when IBM switched to ASCII and when ASCII became a worldwide standard.

Well, IBM never switched.

EBCDIC is used for historic reasons within a /360 mainframe, but all outside connections (except proprietary ones) are ASCII. In fact, as IBM was a driving force behind ASCII, the /360 was prepared to switch to ASCII when it came to BCD handling. Except it never became useful and was dropped in the late 80s.

Moreover, did IBM make ASCII a worldwide standard?

IBM was a major force for ASCII from the start, pushing for a standard. This included the use of preliminary standards in terminal systems. ASCII was also intended to be used for the /360 and any later machine, but standardisation took longer than expected, so IBM had to go forward with its own 8-bit code, EBCDIC, based on prior 6-bit and punch-card codes.

While being a main player, IBM alone would not have been able to force it. Similarly, a US purchasing order could not do it - after all, it only required compatibility with ASCII for information interchange, not operation in ASCII. A loophole big enough for anything in existence, or yet to be invented, to slip through. All that was needed was some interface to accept ASCII data and reply using it.

ASCII only became an international standard when ECMA, the European Computer Manufacturers Association, recommended an ASCII-based international variant in 1965, which became an international recommendation in 1967 as ISO 646 and was finally accepted in 1972. Here ISO 646-IRV defines a compatible base for all participating (Latin) scripts. ASCII is at that point simply the US variant, called ISO 646-US, and relates to the 1968 version.

From the early 1970s on, ISO 646 became the major encoding standard used for anything except mainframes.

Is it safe to say that IBM moved to ASCII starting with the System/370?

No. Besides the fact that the /370 is more of a renaming game than really a new series, the /360 was able to handle ASCII from the start. In fact, ASCII as a hardware-supported feature was dropped from the line around the time it was renamed /390.

If so, is it safe to say that IBM started using ASCII in the 1970s?

No. IBM used ASCII already before it was a standard and continues to do so today.

And if so, is it safe to say that the System/370 had many clones, and therefore ASCII became popular worldwide?

No, as /370s are still using EBCDIC as the default character set. Unix on the /370 and later is an exception. But ASCII can be (and is) used for all communication with external systems (which satisfied the mentioned law).

In the IBM mainframe world (*2) two basic codesets were used:

  • EBCDIC for everything within the system, that is CPU, memory, disks, tapes and other storage, as well as remote systems.
  • ASCII for all communication to terminals and remote systems.

And this continues until today.

Modern (post-1970) EBCDIC became a full superset of ASCII. EBCDIC's structure reflects ASCII, and it is also the reason why ISO 8859 contains two areas of control characters :)
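
For illustration, a minimal Python sketch (assuming only the standard library; the cp037 codec stands in here for one modern EBCDIC code page) of what that superset relationship means in practice:

    # The printable ASCII repertoire round-trips through EBCDIC;
    # only the byte values differ.
    text = "HELLO, WORLD 123"

    ascii_bytes = text.encode("ascii")    # 'H' -> 0x48, 'E' -> 0x45, ...
    ebcdic_bytes = text.encode("cp037")   # 'H' -> 0xC8, 'E' -> 0xC5, ...

    print(ascii_bytes.hex())
    print(ebcdic_bytes.hex())
    print(ebcdic_bytes.decode("cp037") == text)   # True: same characters, different codes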


*1 - Then again, some, like Commodore (PET) or Apple (Apple II), did think private codes might be helpful - except these got confined to special areas and hidden beneath.

*2 - That is IBM and all hardware compatible systems like Hitachi, Fujitsu, Bull, Univac, RCA, Siemens, ...

5
  • 1
    Indeed, IBM were the (only) ones who actively opposed the complete removal of trigraphs from C++: in 2009 and in 2014.
    – Ruslan
    Commented Jul 8, 2020 at 19:16
  • @Ruslan True. IMHO the removal of trigraphs is a completely useless measure, bringing no improvement but breaking compatibility in some strange corners.
    – Raffzahn
    Commented Jul 8, 2020 at 19:58
  • 1
    I quibble with "EBCDIC for everything within the System", because of z/OS UNIX System Services, which can be used to create an "island of ASCII", and because of Linux on z. Commented Jul 9, 2020 at 21:43
  • @MichaelGraf True, especially the 'island of ASCII' part. Being the island it is, the Unix services have to translate to and from the overall EBCDIC environment all the time. So Unix is rather the exception that proves the rule. In fact, even ASCII-using UNIX programs have to translate from EBCDIC every time they use BCD conversion instructions :)
    – Raffzahn
    Commented Jul 9, 2020 at 22:17
  • 1
    Also machines can work in ASCII. It is just yet another code page with no special status. Commented Jul 13, 2020 at 8:17
19

IBM started using ASCII before 1970; the 2260 terminal, released in 1964, used the unpublished (but ratified) 1965 version of the ASA X3.4 standard.

IBM mainframes still use EBCDIC, so I don’t think their popularity had much bearing on the popularity of ASCII (but other encodings’ popularity influenced IBM mainframes: their instruction set includes conversion instructions). The popularity of ASCII is overestimated from a Western perspective too: most Asian markets used other character encodings, and even European markets used more than ASCII (but European encodings include ASCII as a subset).
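
Those conversion instructions are table-driven: for instance, the System/360 TR (translate) instruction replaces each byte of an operand with the byte found at that position in a 256-entry table. A rough Python sketch of the same idea (the table here is built from the standard cp037 codec purely for illustration; real systems carry their own translate tables):

    # Build an EBCDIC(cp037) -> Latin-1 translate table: byte i of the
    # table holds the Latin-1 code of the character that i means in EBCDIC.
    ebcdic_to_latin1 = bytes(range(256)).decode("cp037").encode("latin-1", "replace")

    ebcdic_record = "INVOICE 42".encode("cp037")
    translated = ebcdic_record.translate(ebcdic_to_latin1)   # one table lookup per byte

    print(translated.decode("latin-1"))   # INVOICE 42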

3
  • 3
    Adding to this, even modern usage of 'ASCII' is not always purely ASCII; it's almost always a subset of another encoding that, for compatibility reasons, happens to utilize ASCII for the 7-bit space represented by bytes with the MSB being 0 (such as UTF-8, Shift JIS, or an ISO 8859 variant). Commented Jul 8, 2020 at 22:12
  • What is the current status of mainframes - are they still manufactured and improved?
    – i486
    Commented Jul 10, 2020 at 10:38
    @i486 Of course. They are still (and for all foreseeable future) the backbone of most large systems. Whenever you do a credit card transaction, chances are good that a /360-style mainframe is involved at some - or several - points.
    – Raffzahn
    Commented Aug 17, 2020 at 20:58
4

In the 1960s, IBM used a crazy variety of character codes. IBM was the king of punched cards, commonly known as "IBM cards", so many codes related to the sparse 12 bit codes used for those. However, even these were not fully standardized: different keypunch models used different character sets! 6-bit BCDIC was designed to easily map to the most common card code characters. Many IBM peripherals used codes closely related to BCDIC, but they generally required some translation specific to the peripheral. But printer chains could be changed. Then there were Selectric tilt/rotate codes, which only told the mechanism how to move: what character you got depended on what typeball you had on the spindle.

EBCDIC, extended BCDIC, was IBM's internal data exchange code for the 1130 and System/360, but IBM machines could also use ASCII as an external code.

I think it was the Teletype Model 33 that drove the popularity of ASCII. IBM's business was card-based batch processing mainframes, but time-sharing and single-user interactive systems were becoming available. In the beginning, the Model 33 was the terminal of choice because it was inexpensive and good enough for the job. If your focus is interactivity, there are no punched cards, and your keyboard, printer, and paper tape all use ASCII, it makes sense to make ASCII your common character code.

4
  • 1
    And one may add that true blue professionals referred to ASCII terminals as DUMB terminals (as they could not probe the mainframe functions directly)
    – MKhomo
    Commented Jul 8, 2020 at 23:29
  • It should be mentioned that BCDIC is an acronym for "Binary-Coded Decimal Interchange Code". Commented Jul 9, 2020 at 11:45
  • 1
    @MKhomo Nope, they were "dumb" because they couldn't process text without help. The "smart" terminals could do simple text manipulation themselves, without help from the mainframe.
    – John Doty
    Commented Aug 17, 2020 at 12:48
  • I suppose you're right if one discounts as processing, the holding on to what you've 'keyed' until you 'Hit' the Xmit key. My mainframe exposure is limited, but I remember when they wanted to provide me with a remote 'smart' terminal, the thing was as big as my desk and had more keys than an alligator had teeth, and the mainframe techs liked it because it could waltz with the consoles, in an arcane console-speak VTAM is all I remember of it; whereas I was more used to the little pedestals that fed little punched tapes with pre-ASCII BCD for a mini (HP3000 I think).
    – MKhomo
    Commented Aug 18, 2020 at 14:37
2

Answering one of your questions "When did ASCII become a worldwide standard", the answer is: never. The "A" stands for "American". At the time the US adopted ASCII, other countries were adopting their own variants, substituting different characters according to national needs: for example in the UK, "£" was substituted for "#". These variations were endorsed and harmonized by the international standard ISO 646. For many years if you bought a printer, for example, you would have to make some fiddly settings on installation to configure it to your preferred national variant of ISO 646. (Of course, many people, especially Americans, confused the terminology, and thought of all these standards as "ASCII with variations").
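
To make the variant point concrete, here is a hand-written illustration (the tiny mapping below is typed in for this example, not taken from any registered codec): the same 7-bit bytes render differently depending on which national variant a device is configured for.

    # Illustrative only: ISO 646 leaves a handful of code points to national
    # choice. 0x23 is '#' in the US variant (ASCII) but '£' in the UK variant.
    def render(data: bytes, substitutions: dict) -> str:
        return "".join(substitutions.get(b, chr(b)) for b in data)

    US_VARIANT = {}             # plain ASCII, no substitutions
    UK_VARIANT = {0x23: "£"}    # '£' in place of '#'

    data = b"PRICE: #42"
    print(render(data, US_VARIANT))   # PRICE: #42
    print(render(data, UK_VARIANT))   # PRICE: £42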

All of these were 7-bit standards, and in the 1980s they were largely superseded by 8-bit standards such as ISO 8859-1 also known as Latin-1. These too had regional variants, though with 8 bits a single variant was good enough for the whole of Western Europe. These standards generally had ASCII as a subset (or at least, the printable ASCII characters: control characters are another question). But the term "ASCII" persisted in popular usage, and you will see plenty of StackOverflow questions using the term "ASCII" to refer to characters with codes above 127 - indeed some people pretty well use "ASCII character" as a synonym for "character". But if you're talking standards, then ASCII per se was never a standard anywhere except the US.
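
Both halves of that can be checked with nothing beyond Python's built-in codecs (a minimal sketch): Latin-1 agrees with ASCII below 0x80, and it is only the upper half where "ASCII" stops being the right word.

    # Bytes 0x20-0x7E decode identically as ASCII and as ISO 8859-1 ...
    low = bytes(range(0x20, 0x7F))
    assert low.decode("ascii") == low.decode("latin-1")

    # ... but a byte above 0x7F is Latin-1 (here 'é'), not ASCII.
    b = b"\xe9"
    print(b.decode("latin-1"))            # é
    try:
        b.decode("ascii")
    except UnicodeDecodeError as err:
        print("not ASCII:", err)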

2

I'm going to provide a terrible answer, but include a couple of references that might be great for nostalgia. One is NostalgiaNerd on YouTube; he provides a British viewpoint of IBM's shift to ASCII (OK, they only did it through code pages, not really fully/completely ASCII).

The video is strangely titled, with nothing about ASCII or EBCDIC in the name: "These keys shouldn't exist" (it's about the pipe sign, which has a gap halfway down it on some keyboards) https://youtu.be/BktIY7VbrUs [Do note, this video might have a lot of wrong info (per the comments below); I wasn't sure if I should strike it or completely remove the link.]

Even if the video is error-filled, NostalgiaNerd at least gives us some references to handwritten notepads from the early development days of ASCII here: https://longstreet.typepad.com/thesciencebookstore/2012/03/heres-the-link.html

1
  • 4
    Ouch. That video gets so much wrong. It should be deleted right away before spreading any further - if only for his lazy way of justifying the few errors he found. He mixes up early PC versions and how punch cards work, calls paper tapes punch cards, ignores the wide use of 6-bit codes based on ITA2, assumes that codes need to be sorted in alphabetic order, uses never-used 6-hole tape, ... All in the first 4 minutes, when I stopped counting. And all to support some rather exotic theory of influence. Bottom line: people will end up less informed than without watching.
    – Raffzahn
    Commented Jul 9, 2020 at 6:17
