26

In an answer to "Why did IBM System 360 have byte addressable RAM" I wrote, regarding the choice of byte size:

7 bits would be a perfect match for ASCII, but engineers would instinctively recoil from basing the word sizes on a prime number.

(Okay, someone else pointed out that EBCDIC actually uses 8 bits, but that would only be a concern for IBM in particular.)

Now it occurs to me to wonder, did anyone ever build a computer with a 7 bit byte? Or with a 14, 28 or 56-bit word?

14
  • 6
    There were a number of computers with 6-bit bytes early on.
    – Hot Licks
    Commented Jul 8, 2020 at 13:14
  • 5
    @smci In my era (1970s) the eighth bit was used on paper-tape and comms channels as a parity bit, to detect (but not correct) single-bit errors on the media. Wrong parity would stop a papertape reader on the faulted row, but for comms it would send ASCII NAK to elicit retransmission. I worked on ICL 1900 series mainframes, which had a word size of 24 bits (4 x 6-bit chars) so 7-bit data used (effectively) wide chars to distinguish upper and lower case and control characters, which did not exist in the native 64-char set. Commented Jul 8, 2020 at 15:01
  • 7
    @smci, Re, "I thought ASCII was a kludge to fit 7 bits of alphanumeric data inside an 8-bit byte." ASCII wasn't created for computing. ASCII was created for the telegraph system. en.wikipedia.org/wiki/ASCII#History It was the wide availability of telegraph equipment (especially Teletype model 33 machines) that could be adapted as computer I/O devices that dragged ASCII into the computing world. Commented Jul 8, 2020 at 15:30
  • 8
@Paul_Pedant: I think the idea behind parity was that in many cases, having a character be recognizably unreadable would be tolerable, but having it appear as the wrong character would not. If an "error character" appears in a place where the meaning is obvious, the recipient of a transmission may just fix it, but if such characters would render the meaning unclear, a retransmission can be requested.
    – supercat
    Commented Jul 8, 2020 at 16:33
  • 3
    I once programmed on a CDC 6600 which had a word length of 60 bits. When coding in Fortran, character strings were represented as 10 6-bit bytes per word (upper-case only!).
    – catnip
    Commented Jul 8, 2020 at 21:12

8 Answers

48

The PDP-10 had 'byte instructions' that could process a sequence of bytes of size 1 to 36 bits. The byte pointer was a word containing an 18-bit word address (and the usual index/indirect indications) plus position and size of the byte within the word.

It was common to use 7-bit byte sequences for ASCII text, which gave 5 characters per word and one (usually) unused bit.

There were monitor calls (system calls) that accepted strings of 7-bit characters packed in this way.

So: at the hardware level, bytes were variable-sized. At the software convention level, a byte was frequently 7 bits.

See section 2.3 in the KA-10 system reference manual.
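
To make the packing concrete, here is a minimal sketch in C (not PDP-10 code; the 36-bit word is modeled in a uint64_t, and the function names are mine) of the 5-characters-per-word convention, with the five 7-bit characters filled from the high end and the low bit left unused:

    #include <stdint.h>
    #include <stdio.h>

    /* Pack 5 ASCII characters into one 36-bit PDP-10-style word held in a
     * uint64_t: 7 bits each, filled from the high end, lowest bit left zero. */
    static uint64_t pack5(const char s[5]) {
        uint64_t w = 0;
        for (int i = 0; i < 5; i++)
            w |= (uint64_t)(s[i] & 0x7F) << (29 - 7 * i);
        return w;
    }

    /* Recover the 5 characters. */
    static void unpack5(uint64_t w, char s[6]) {
        for (int i = 0; i < 5; i++)
            s[i] = (w >> (29 - 7 * i)) & 0x7F;
        s[5] = '\0';
    }

    int main(void) {
        char out[6];
        uint64_t w = pack5("HELLO");
        unpack5(w, out);
        printf("%012llo -> %s\n", (unsigned long long)w, out); /* octal, as on the -10 */
        return 0;
    }

On the real machine this is what stepping a size-7 byte pointer through a word (ILDB/IDPB) did for you in hardware; the sketch just shows the resulting layout.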

6
  • 4
    PDP-10 flexibility in this way was very useful as operating systems also used sixbit encoding in places, e.g., for representing filenames in the file system in TOPS-10.
    – davidbak
    Commented Jul 8, 2020 at 11:35
  • 5
    See also RFC 4042. :-) Commented Jul 8, 2020 at 16:08
  • 2
    Side note: In a BASIC program, the line number was stored as 5 7-bit bytes in a word, and the left over bit was set to 1 to indicate this was a line number. I discovered this when I tried to create a BASIC program with a conventional text editor. It looked fine, but it didn't work because that extra bit wasn't set. (Wow, this was like 40 years ago.)
    – Jay
    Commented Jul 8, 2020 at 20:15
  • 1
    Sure. SOS used to set line numbers too. I think most of the language processors knew to ignore them.
    – dave
    Commented Jul 9, 2020 at 0:32
  • 1
The same answer applies to the PDP-6. It had byte manipulation instructions like those of the PDP-10. Commented Jul 9, 2020 at 11:42
15

The VT52 text terminal certainly doesn't qualify as a full computer, but it does have a processor running software out of a ROM. The RAM holding the displayed text is 2048 7-bit bytes. The character generator ROM is also 7 bits wide.

5
  • But were those 7-bit bytes or 7-bit words? ;-)
    – dave
    Commented Jul 8, 2020 at 12:16
  • 8
    @another-dave Yes. ;-) Commented Jul 8, 2020 at 15:27
  • @another-dave, maybe and? Commented Jul 8, 2020 at 16:04
  • 3
According to the VT52's maintenance manual, the machine had an 8-bit data bus and used the full 8 bits internally for its microcode instructions. So it could be said to have an 8-bit word size. However, only 7 bits were used for its character set. Commented Jul 9, 2020 at 10:07
  • 1
    @PaulHumphreys, I see 7-bit data buses in the block diagram in figure 4-1. Yes, the instructions are 8 bits. Program addresses are 10 bits. Commented Jul 9, 2020 at 11:52
11

The second-generation Soviet computer Minsk-32 (2,889 machines built, 1968-75, for civilian use; one of the rare early mainframes noted for use in Antarctica) used a 37-bit word and a 7-bit representation of alphanumeric characters (5 per word). Yes, the concept of a "byte" is difficult to apply to such an old machine (it continued a line of vacuum-tube computers), but its instruction set did include special instructions for conveniently operating on 7-bit blocks.

7

The well-known IBM 1401 technically had a 7-bit byte (plus parity). It was designed around the format of the IBM punched cards it processed: these had ten "digit" rows and two "zone" rows, and in each column one digit and optionally one zone (with the zero row also able to serve as a third zone) could be punched simultaneously.

This essentially meant that each column of a card contained a value that could be encoded in 6 bits: 4 representing a BCD digit, and two more indicating the zone (no zone, first zone, second zone, or zero row). To this, the 1401 added a "word mark" bit for internal use, for a total of 7 bits.
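
As a hedged sketch of that encoding (the function name and bit assignments are illustrative, not the 1401's internal wiring, and the real machine had further special cases around the 0 punch and special characters), the mapping from punches to the 6-bit code might look like this in C:

    #include <stdio.h>

    /* Map one card column to a 6-bit BCD code: zone bits B,A above digit
     * bits 8,4,2,1.  zone is 12, 11, 0, or -1 for none; digit is 1..9,
     * or 0 for no digit punch. */
    static unsigned column_to_bcd(int zone, int digit) {
        unsigned code = digit & 0x0F;      /* 8-4-2-1 digit bits */
        if (zone == 12)      code |= 0x30; /* B and A */
        else if (zone == 11) code |= 0x20; /* B only  */
        else if (zone == 0)  code |= 0x10; /* A only  */
        return code;  /* the word-mark bit (0x40 here) is added separately */
    }

    int main(void) {
        /* 'A' was punched as 12-zone plus 1: B A 1 = 110001 = octal 61 */
        printf("A -> %02o\n", column_to_bcd(12, 1));
        return 0;
    }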

6

Yes; there have been several (although, to my knowledge, none in the simplest sense, where seven binary bits are treated strictly as a base-7, Peano-like number system). Instead, they are systems in which at least one bit (typically two or three) is treated as a separate state-modification bit.

The oldest and simplest example (although it may not meet the definition of a Turing-complete computer) is the ancient 5/2 abacus.

More recent examples are generally cases where some form of binary-coded decimal is used, particularly those that use Chen-Ho encoding (which fits the conception of the system being "7-bit" better, since Boolean logic and operations can still be applied relatively easily, as opposed to more densely packed (or packed/padded) 7-bit numbers, which require a variable number of instructions to extract particular binary/two's-complement values).

Of these, the "two-out-of-seven" approach is the most common. Examples: the IBM 650, the FACOM 128, and the 7070/7074 compatibility feature (hardware emulation) built into the IBM System/370.
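
To illustrate "two of seven": in bi-quinary coding, as on the IBM 650, each decimal digit sets exactly one of two "bi" bits (worth 0 or 5) and one of five "quinary" bits (worth 0-4), so exactly two of the seven bits are ever on, which also makes any single-bit error detectable. A sketch in C (the bit layout is illustrative, not the 650's actual ordering):

    #include <stdio.h>

    /* Bi-quinary: bits 6-5 are the "bi" pair (values 0 and 5), bits 4-0
     * the one-hot "quinary" part (values 0..4). */
    static unsigned biquinary_encode(unsigned digit) {   /* digit in 0..9 */
        unsigned bi  = (digit < 5) ? 0x40 : 0x20;
        unsigned qui = 1u << (digit % 5);
        return bi | qui;
    }

    static int biquinary_decode(unsigned code) {         /* -1 if invalid */
        unsigned bi = code >> 5, qui = code & 0x1F;
        if ((bi != 1 && bi != 2) || qui == 0 || (qui & (qui - 1)))
            return -1;                                   /* not two-of-seven */
        int q = 0;
        while (!(qui & 1)) { qui >>= 1; q++; }
        return (bi == 2 ? 0 : 5) + q;
    }

    int main(void) {
        for (unsigned d = 0; d < 10; d++)
            printf("%u -> %02X -> %d\n", d, biquinary_encode(d),
                   biquinary_decode(biquinary_encode(d)));
        return 0;
    }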

3

The Norsk Data ND-505 had a 28-bit address bus.

5
  • 4
So it did! But only physically, not architecturally: "The only significant 28-bit computer was the Norsk Data ND-505, which was essentially a 32-bit machine with six wires in its address bus removed."
    – rwallace
    Commented Jul 8, 2020 at 1:13
  • 5
    @rwallace six? not four? Commented Jul 8, 2020 at 6:41
  • 15
    It has taken 14 years for the arithmetic in en.wikipedia.org/wiki/Special:Diff/56879895 to be challenged.
    – JdeBP
    Commented Jul 8, 2020 at 13:41
  • 4
    The question here talks about the bit size of bytes, not how many of them could be addressed. It seems to me that a machine having a 28 bit address bus wouldn't be particularly relevant to it. Commented Jul 9, 2020 at 10:18
  • 5
    @OmarL - I haven't looked into this machine specifically, but machines with 32-bit wide memory interfaces frequently lack A0 and A1 address lines, i.e. if maximum address size is also 32 bits, they have 30 address lines. Presumably, this system had address lines A2-A27, hence the suggestion of 6 lines being removed.
    – occipita
    Commented Jul 10, 2020 at 1:23
2

The ADAU1701 is a 28-/56-bit DSP for audio processing. CHAR_BIT is probably 28 on that platform, as on most odd-word-size DSPs, but I'm not quite sure, since I couldn't find its programming manual.
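
If a C toolchain for such a part exists, the portable way to check is <limits.h>; a trivial probe like the following, compiled for the target, settles the question (this is generic C, not anything ADAU1701-specific):

    #include <limits.h>
    #include <stdio.h>

    /* A conforming C compiler must define CHAR_BIT even on word-addressed
     * DSPs; 24- and 32-bit parts commonly report 24 or 32 here. */
    int main(void) {
        printf("CHAR_BIT = %d, sizeof(int) = %u\n",
               CHAR_BIT, (unsigned)sizeof(int));
        return 0;
    }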

-1

Some famous Microchip PIC processors, such as the Microchip PIC 16F84 and the Microchip PIC 16F877A, have a "14-bit processor core". They execute all instructions out of a 14-bit-wide memory. In other words, their program memory has a 14-bit word.

In addition to executable code, programs running on some (but not all) "14-bit processor cores" can read and write all 14 bits of any word in the non-volatile program memory. A few programs pack character map image data into 14 pixels per program word or 7-bit ASCII text into 2 ASCII characters per 14-bit program word.

Alas, programs running on the original PIC16F84 cannot read or write their own program flash memory (only execute), so if they use constant character data at all, they generally store one ASCII character per 14-bit program word.
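
For illustration, here is a host-side C sketch of that packing: two 7-bit ASCII characters per 14-bit program word. The layout (first character in the high 7 bits) and the function name are my assumptions, not a PIC convention:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Pack 7-bit ASCII text two characters per 14-bit program word:
     * high 7 bits = first char, low 7 bits = second; an odd trailing
     * character leaves the low half zero. */
    static size_t pack14(const char *s, uint16_t *words, size_t max) {
        size_t n = 0, len = strlen(s);
        for (size_t i = 0; i < len && n < max; i += 2) {
            uint16_t w = (uint16_t)((s[i] & 0x7F) << 7);
            if (i + 1 < len) w |= s[i + 1] & 0x7F;
            words[n++] = w & 0x3FFF;       /* keep to 14 bits */
        }
        return n;
    }

    int main(void) {
        uint16_t buf[8];
        size_t n = pack14("HELLO", buf, 8);
        for (size_t i = 0; i < n; i++)
            printf("%05o%s", (unsigned)buf[i], i + 1 < n ? " " : "\n");
        return 0;
    }

On a part that can read its own flash, unpacking is the mirror image: fetch the word, shift right by 7 for the first character, mask with 0x7F for the second.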

8
  • 3
Sorry, but the PIC16F84 is an extremely far-fetched example which does not even have 7-bit bytes. It's an 8-bit processor, and its having 14-bit instruction words is irrelevant. It has no mechanism for reading program memory directly, so you can't store two 7-bit ASCII letters in a single program word. The only ways to return a value from program code are to load it in the W register, load it in the register file, or use the RETLW opcode, which also returns a byte.
    – Justme
    Commented Aug 25, 2023 at 11:59
  • 1
    @Justme: Indeed, even when later parts with 14-bit code store added the ability to access it via program control, each word was split into a 6-bit part and an 8-bit part. Code could have used each word to store two ASCII characters, but there was no inherent support for doing so.
    – supercat
    Commented Aug 25, 2023 at 15:31
While it is true that some PIC programmers used the 14-bit word to store two 7-bit characters, doing so is not a feature of the CPU or the ISA. It's a software application that may happen on any CPU.
    – Raffzahn
    Commented Aug 25, 2023 at 17:16
  • 1
    @Justme: The original poster literally asked "did anyone ever build a computer with a 7 bit byte? Or with a 14, 28 or 56-bit word?". When someone claims that "it having 14-bit instruction words is irrelevant.", I feel that person didn't actually read the entire question.
    – David Cary
    Commented Aug 27, 2023 at 2:27
@DavidCary The MCU in question is not in itself a computer. The MCU can only move around and do calculations with 8-bit bytes of data. The program memory opcodes are 14-bit words, but because the Harvard architecture has separate data and instruction buses, the 14-bit instructions and program memory are not accessible to the programmer or the program running on the MCU in any way, except that the MCU can read and execute the 14-bit opcodes.
    – Justme
    Commented Aug 27, 2023 at 5:45
