Why did IBM decide to use 8 bits for EBCDIC?
7 bits per character seems reasonable, since 2^7 = 128 symbols
which is enough for all English letters and special symbols.
Why did IBM decide to use 8 bits for EBCDIC?
Because the /360 ISA defined a byte as consisting of 8 bits - which in turn was chosen because it allows two decimal digits (BCD) to be stored in a single byte.
7 bits per character seems reasonable, since 2^7 = 128 symbols which is enough for all English letters and special symbols.
For one, there is not just English - a company like IBM surely did not want to restrict its sales to English-speaking countries. And even then, its customers did not want to serve only English speakers either, right?
On a more practical side, when it came to printable characters, EBCDIC covered everything ASCII did (*1) while leaving plenty of room for additional use cases.
Most important: there is simply no sense in defining a character code that uses fewer bits than a character cell offers.
*1 - At this point it's important to keep in mind that IBM was a major proponent of ASCII - in fact, next to all of its terminal equipment used ASCII. There was never the dispute people nowadays assume.