
I've read some English and Swedish computer magazines from the 1990s, even the late 1990s, and they frequently (maybe even consistently) use "Mb" to mean "Megabyte", even though "Mb" at least now means "Megabit", whereas "MB" is "Megabyte".

(Yes, I also know about the "MiB" stuff, but it never seemed to be used by anyone. And it's irrelevant for this case anyway.)

It was extra confusing for me, because I also read video game magazines, and they (particularly Nintendo) often talked about a "64 Mb cartridge" (literally meaning 64 Megabit, AKA 8 MB) and whatnot.
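
For concreteness, a minimal sketch of that conversion (my own illustration in Python, assuming the usual 8-bit byte; nothing here comes from the magazines):

    # Convert a marketing "megabit" size to megabytes,
    # assuming the usual 8 bits per byte.
    def megabits_to_megabytes(mbit: float) -> float:
        return mbit / 8

    print(megabits_to_megabytes(64))  # 8.0 -- a "64 Mb" cartridge holds 8 MB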

Wasn't the "B = byte, b = bit" standard already established back in the day, in the 1970s/early 1980s?

  • 20
    64 Megabits is 8 Megabytes.
    – Jon Custer
    Commented May 1, 2020 at 15:21
  • 3
    They possibly did mean 'megabit', especially when it came to 16-bit consoles where cart size was very often discussed that way, probably stemming from the marketing departments being economical with the truth and users not knowing the difference. There was a thing at the time where PC owners would point out that no, your N64 game isn't actually on a 512MB cartridge, console peasant :)
    – Alan B
    Commented May 1, 2020 at 15:24
  • 2
    Given that consoles weren't such a big deal in Europe prior to the Gameboy and Mega Drive — the Master System was a decent seller but those two were the watershed — perhaps it's just that there wasn't any real ambiguity? The pump-up-the-numbers labelling of 'megabits' wasn't really used by anyone in the 1980s home computer world.
    – Tommy
    Commented May 1, 2020 at 15:27
  • 3
    @Tommy Definitely Amiga/ST/Archimedes etc. users talked in MB meaning megabytes.
    – Alan B
    Commented May 1, 2020 at 15:29
  • 3
    How are you surprised by this if, quoting, you said: "Yes, I also know about the 'MiB' stuff, but it never seemed to be used by anyone"? That's the same thing: not agreeing on the symbols.
    Commented May 2, 2020 at 11:13

6 Answers


Wasn't the "B = byte, b = bit" standard already established back in the day, in the 1970s/early 1980s?

Not really. It existed (I think at least as far back as 1979's JEP100, but I don't have good sources), but even through the 90s I would say that it wasn't that strictly adhered to. There was a lot of variability all over the place, especially in consumer-facing materials. Even to this day, I write "B" for byte and "bit" for bit to try to minimize the potential for confusion, and never use "b".

  • 30
    I've been programming my whole life, most of it professionally - but I cut my teeth back when this was totally ambiguous to self-taught flunkies like myself. To this day I still transpose Mb for MB and don't even get me started on MiB... Bits and Bytes are measured in multiples of 1024 and anyone that says different is wrong wrong wrong! jk. :-P
    – Geo...
    Commented May 1, 2020 at 16:30
  • 12
    Except when you're seeing how fast the bits go down wires, and then it's k = 1,000, M = 1,000,000. (My first professional programming involved connecting computers to data communications links leased from the GPO)
    – dave
    Commented May 1, 2020 at 22:03
  • 10
    @Geo... I sympathize, but hard drive manufacturers, like 80s MIDI keyboard manufacturers, love to take any opportunity to be technically correct but deceptive about their numbers, so bytes in multiples of 1000 are too common to do anything but sigh and accommodate.
    – ssokolow
    Commented May 2, 2020 at 16:04
  • 2
    Being from the 80s: at the start of the decade, Mb usually did mean megabit. It wasn't uncommon to measure memory in megabits, because that was how you purchased it: a byte-addressable 16KB machine would have eight 16Kb chips (see the sketch after these comments). For larger computers, not all machines were byte-addressable, and word sizes varied considerably between models, so if you wanted an idea of the size of memory, expressing it in megabits was optimal. By the end of the decade the 8-bit byte was firmly established and byte-addressing was the norm apart from supercomputers, so one could then usefully say "megabytes".
    – vk5tu
    Commented May 3, 2020 at 13:06
  • 1
    @Geo... You are, of course, correct. Just that the prefix isn't k but Ki, and not M but Mi, and so on, and so forth... ;)
    Commented May 4, 2020 at 10:28
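
To make the chip arithmetic from the last comment concrete, here is a small sketch (a hypothetical Python illustration of mine, not anything from the comment itself):

    # A byte-addressable machine with 16 KB of RAM built from 16 Kbit
    # (16384 x 1) DRAM chips: one chip per bit position of each byte.
    total_bytes = 16 * 1024        # 16 KB of memory
    bits_per_chip = 16 * 1024      # a "16 Kb" chip stores 16384 bits
    chips_needed = total_bytes * 8 // bits_per_chip
    print(chips_needed)            # 8 -- eight 16 Kb chips make 16 KB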

Wasn't the "B = byte, b = bit" standard already established back in the day, in the 1970s/early 1980s?

Sure it was, but magazines and the like were not only consumer publications, they were also made by people with only partial technical training. Everyone wrote it however they thought fit. What's more, I don't think any country ever set up a spelling police for computer magazines, or did they?

:)

While some magazines let each author decide, others tried to establish their own conventions to present a consistent style to their readers. These conventions were based on different reasoning, from spelling (always writing Kb and Mb for bytes), through heritage (electronics/communications magazines strictly using decimal prefixes, as was common for transmission rates) and avoidance (always writing the unit in full, or as a word like Kbyte/Mbyte), to attempts to sidestep the collision (such as using the 'unused' SI combinations KB and mB for binary Kbyte and Mbyte).

In the end it differed from publication to publication and country to country. The Wikipedia entry for binary prefixes tries to shed light on some of the origins and on the attempts to straighten things out.

Yes, I also know about the "MiB" stuff, but it never seemed to be used by anyone.

Well, I do, and so do many others. Try it yourself. You'll soon get used to it and laugh at all the inconsistency others still produce.
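
If you want to try it, here is a minimal sketch of a formatter that emits IEC binary prefixes (a Python illustration of mine; the function name and cut-offs are my own choices, only the Ki/Mi/Gi prefixes themselves come from the IEC standard):

    # Format a byte count using IEC binary prefixes (KiB, MiB, GiB, ...).
    def format_iec(n: float) -> str:
        for prefix in ("", "Ki", "Mi", "Gi", "Ti"):
            if n < 1024:
                return f"{n:g} {prefix}B"
            n /= 1024
        return f"{n:g} PiB"

    print(format_iec(64 * 1024 * 1024))  # "64 MiB"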

  • 7
    As someone who wrote for computer magazines in the 1980s, can confirm ☺ But MiB is for Men in Black
    – scruss
    Commented May 1, 2020 at 22:55
  • 7
    It seems wrong to ascribe to lack of education the failure to know something that was not standardized in the 1970s, was de facto standardized the following decade (ironically for this answer, as promoted by computer magazines such as BYTE and InfoWorld), and was only de jure standardized (with bit not abbreviated to lowercase b, contrary to the question and to the de facto conventions) in the 21st century. That's lack of a time machine, not lack of education.
    – JdeBP
    Commented May 2, 2020 at 7:57
  • 1
    @JdeBP To be fair the quoted line says 1970s / early 80s. And there are standards using lowercase b for bit: 1 2
    – JBentley
    Commented May 2, 2020 at 10:55
  • 4
    You are side-stepping where you ascribed it to people not being educated, Raffzahn. And you are dodging and weaving, talking of lack of standardization in one breath and then claiming worldwide standardization in the next. M. Bentley has pointed to some 21st century years for you. And M. Bentley, you cannot be seriously proposing that standards from 2002 and 2004 are counterexamples to what I said about standardization in the 21st century. ISO/IEC 2382:1984, would be a counterexample, were it not that its entries for "bit" and "byte" do not give abbreviations. (-:
    – JdeBP
    Commented May 2, 2020 at 12:57
  • 1
    @JdeBP I think you misunderstood me. The standards I cited were counterexamples to this: "with bit not abbreviated to lowercase b". I was pointing out that they were abbreviated to lowercase b.
    – JBentley
    Commented May 4, 2020 at 6:40

At least for console games, it was actually common to measure size in bits, not bytes.

A reference can be found at https://atariage.com/forums/topic/167980-what-does-two-mega-cartridge-mean/


Since 1968 I was taught B = byte and b = bit.

The confusion with kb = 1024 bits comes from the digital world's binary arithmetic: 10000000000 in binary is 2^10, i.e. 1024 in decimal.

1000 decimal = 1111101000 binary. I don't know about you, but I prefer the standard kludge.
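
A short sketch of the two "kilo" values behind that confusion (my own Python illustration of the arithmetic above):

    # The two competing "kilo" values.
    k_decimal = 10 ** 3   # 1000, the SI kilo
    k_binary = 2 ** 10    # 1024, the traditional "computer" kilo

    print(bin(k_binary))   # 0b10000000000 -- a round number in binary
    print(bin(k_decimal))  # 0b1111101000  -- not round at all in binary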

  • 1
    NO .. I feel cheated by all hard-drive manufacturers advertising x GB or x TB and then delivering less capacity ... and they always "excuse" it with some BS about how your OS may display the size differently ... when they just wanted to use the bigger number as an advertising factor and save money at the same time.
    – eagle275
    Commented May 4, 2020 at 8:31
  • 1
    @eagle275, yes, but all manufacturers do that consistently. The party in the wrong here is, technically, your operating system, for displaying the unit as "GB" when it is actually showing "GiB" numbers (a worked example follows these comments). There are systems that - correctly - display the same number with the correct unit. Windows (in any version) is not one of them, afaik.
    Commented May 4, 2020 at 10:32
  • Not wanting to start a debate - but as far as I know the currently used OSes all behave similarly - some even "waste" more space on administrative structures (inodes, the MFT and so on). But I haven't seen a single OS report more space than the drive manufacturer claims (excluding compressed file systems).
    – eagle275
    Commented May 4, 2020 at 10:35
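
To make the discrepancy in these comments concrete, a worked sketch (the 500 GB drive is an invented example, not one mentioned above):

    # An advertised "500 GB" drive, measured two ways.
    advertised_bytes = 500 * 10**9            # manufacturer: decimal gigabytes
    displayed_gib = advertised_bytes / 2**30  # what many OSes actually report
    print(f"{displayed_gib:.2f} GiB")         # 465.66 GiB -- same bytes, smaller number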

I've worked with computers since the beginning of the 80's, professionally since 1987. I was unaware that there was any kind of "official" convention on what "b" and "B" mean, and it has always been chaotic and confusing. I have always assumed a "b" in upper or lower case could mean either bits or bytes, and I usually worked it out from context. In fact, when asked how much memory a particular application needs, I will usually write 24 Gb (or whatever), meaning 24×2^30 bytes.

In fact, in the early 80's home computing scene in the UK the big debate was on whether to use "K" or "k" to mean "kilo" and some purists said we shouldn't use "kilo" at all because we were invariably talking about 1024, not 1000. "B" or "b" always meant "bytes". Transfer speeds were pretty much irrelevant to 80's home computing except when dealing with modems and then the unit was "baud" as in "300 baud modem". Once I got to university and started doing "proper" computing, memory was usually measured in words, which of course vary in size according to the particular machine (as can bytes, theoretically).

So, the answer to the question is that they either didn't know or didn't care that there is a difference. You can usually tell from context what is meant anyway.


One thing to remember is that you are reading a printed magazine. Writing 512 MB is a lot louder than 512 Mb. Magazine style guides may have dictated using a lower-case "b".

It's also worth noting that storage sizes (memory, disk, whatever) have always been measured in bytes, so kb and Mb (there really wasn't any Gb in those days) always unambiguously meant bytes. Even when you're working with error-corrected memory (where a byte takes up more than 8 bits), you count in bytes.

The only time you see "b" referring to bits (rather than bytes) is in network speeds. And network speeds measured in bits can be deceptive: if you pass one kB of information down a wire, more than 8 kb will actually travel over it.
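
As an illustration of that overhead, here is a sketch assuming classic asynchronous serial framing (8N1: one start bit, eight data bits, one stop bit; the framing choice is my assumption, the answer doesn't name one):

    # Sending 1 kB of payload over an 8N1 serial link: each data byte
    # costs 10 bits on the wire (start bit + 8 data bits + stop bit).
    payload_bytes = 1000
    bits_on_wire = payload_bytes * (1 + 8 + 1)
    print(bits_on_wire)  # 10000 -- 10 kb on the wire for 8 kb of payload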

  • So "B" is '"b" turned up to 11?
    – Jon Custer
    Commented Jul 19, 2021 at 18:39
