
There was a time in the early 80s when 64k RAM chips had a significant defect rate, such that half-bad ones could be obtained at a discount. Some computer manufacturers such as Sinclair and Tandy took advantage of this.

48K RAM was a fairly common configuration for a number of different computers in those days. It seems to me that it could make sense to supply it with quarter-bad 64k chips.

Did anyone ever do this? Or is it the case that it would've made sense, but no one ever happened to do it? Or is there a technical or business reason why it would not make sense after all?

  • Seems pretty unlikely as the entire lifespan of a DRAM generation in the early 80s (from introduction to obsolescence) was only about 3 years. The 64k chip generation (roughly 1980-1983) allowed for 48KB RAM with 6 chips, so it would only be after that (first 256K chips in 1983, I think) that this might have been relevant, and by this time most designs supported more RAM. So any period where using odd-sized cost-reduced chips made sense would be very narrow.
    – Chris Dodd
    Commented Feb 8 at 21:30
  • @ChrisDodd '64k chip generation (roughly 1980-1983) allowed for 48KB RAM with 6 chips' - not with 64kx1 it didn't! And 16kx4 chips were late and surprisingly expensive. But you are right that the period when even the historically used half-bad chips made sense was quite narrow.
    – rwallace
    Commented Feb 8 at 21:45

2 Answers


You might want to take a look at real chips. Access is organized in rows and columns; any error will disable its whole column (or row, depending on organization), thus there are no 'quarter-good' chips.
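To make the row/column point a bit more concrete, here is a minimal C sketch. It assumes the usual 256 x 256 cell organization of a 64Kx1 DRAM, with the 16-bit cell address multiplexed as 8 row bits (on RAS) and 8 column bits (on CAS); the "defective column" number is made up purely for illustration, not taken from any datasheet:

```
/* Rough sketch of classic 64K x 1 DRAM addressing (4164-style part):
 * 16-bit cell address = 8-bit row (latched on RAS) + 8-bit column
 * (latched on CAS), i.e. a 256 x 256 cell array.
 * The bad column number below is purely illustrative. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t bad_column = 0x3A;   /* hypothetical defective column */
    unsigned affected = 0;

    for (uint32_t addr = 0; addr < 65536; addr++) {
        uint8_t row = (addr >> 8) & 0xFF;  /* row address */
        uint8_t col = addr & 0xFF;         /* column address */
        (void)row;
        if (col == bad_column)
            affected++;                    /* one bad cell in every row */
    }
    /* prints 256 of 65536: a single column fault touches every row,
       so the damaged region is a stripe, not a tidy quarter */
    printf("cells lost to one bad column: %u of 65536\n", affected);
    return 0;
}
```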

In addition, even if such chips existed, it would not be possible to use them in a different configuration together with others of the same kind, as the topmost address bit is only usable 'half' way. Overcoming this would need additional external logic.
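As a rough illustration of what such external logic (or a jumper) did for the historical half-good parts, here is a small C sketch. Which address bit gets forced, and at which level, is board- and vendor-specific, so the names and the choice of the top bit below are assumptions for illustration only:

```
/* Minimal sketch of the half-good 64K x 1 idea: one address bit is
 * forced to a fixed level so only the good 32K half is ever addressed.
 * Which bit and which level are board/vendor specific; names here are
 * illustrative, not from any datasheet. */
#include <stdint.h>
#include <stdio.h>

static int upper_half_good = 1;  /* 1: upper half usable, 0: lower half */

static uint16_t chip_address(uint16_t system_addr) {
    uint16_t a = system_addr & 0x7FFF;  /* only 32K of the array is used */
    if (upper_half_good)
        a |= 0x8000;                    /* force top bit high */
    /* else: top bit stays low, lower half selected */
    return a;
}

int main(void) {
    printf("0x0000 -> 0x%04X\n", (unsigned)chip_address(0x0000));
    printf("0x7FFF -> 0x%04X\n", (unsigned)chip_address(0x7FFF));
    return 0;
}
```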

Besides all technical reasoning, it's important to keep in mind that the use of partially good chips is not really a good business case. Those half-good 32 KiB parts were only sold for a very short time and in rather low numbers. Advancing production quality made them pointless quickly.


Note that this is also related to overprovisioning as done with mainframes and top-end single-chip CPUs, where the amount of money per unit is way outside the consumer range. In fact, the two cases have different reasons: for mainframes it's about extreme reliability, where having a spare CPU ready is what the customer pays for; for micros it's about increasing yield.

  • Most real chips use redundancy to improve yields -- here's a patent and a paper. This does require a little bit of extra circuitry to disable the rows or columns or blocks with defects, but the yield improvement is generally worth it.
    – Chris Dodd
    Commented Feb 8 at 20:55
  • What is the exact technical reasoning for disabling the whole row in case of any error?
    – lvd
    Commented Feb 8 at 21:02
@ChrisDodd yes, as covered in the last paragraph; except it's neither 'most', nor is it about what the question asks. Last but not least, the patent you're citing is about changing the RAM structure to enable partial use. Being registered in 1992 rather proves this :)
    – Raffzahn
    Commented Feb 8 at 21:03
  • @lvd Because it's ONE row connecting all cells? How else to handle a bad cell?
    – Raffzahn
    Commented Feb 8 at 21:05
  • Your last paragraph implies it is only done for "high end" or expensive stuff, which is actually backwards. It is mostly done for cheap commodity stuff (like DRAMs) to improve yields, and high-end stuff just raises the price to cover low yields.
    – Chris Dodd
    Commented Feb 8 at 21:12

Notwithstanding the discussion of whether the term "half-bad" is really appropriate: those chips worked according to their specification, so they were "good, but half-capacity".

With "half-bad" chips you needed two supply chains, two type identifiers, and a PCB layout that can adapt/re-route two address lines (to select which half is the one not used/usable).

With "quarter-bad" ones, you'd have to introduce the factor four, the economical likelihood of that would have been, well, very unlikely. With improving fabs and yield, even the economical viability for half-bad ones vanished very quickly.
