Timeline for Why might advanced digital computers be unusable on my planets?
Current License: CC BY-SA 4.0
9 events
when | what | by | license | comment
---|---|---|---|---
Nov 25, 2022 at 16:43 | comment added | Daniel B | | @Daron Great, so build something underground and cover the roof with a few meters of it.
Nov 25, 2022 at 14:24 | comment added | Daron | | @DanielB The sludge is not magical. It is just deep enough to absorb radiation. Even a few metres of water will do the job. The Arnie GIF does not make it clear.
Nov 24, 2022 at 16:48 | comment added | Daniel B | | @Daron Certainly better than a layer of sludge, though I suppose if it's magical radiation-blocking sludge, it would make sense to make a roof of double-layered polycarbonate glass with a layer of sludge between the inner and outer panes. Above the Faraday cage.
Nov 24, 2022 at 15:08 | comment added | Daron | | @DanielB Hmm... will a Faraday cage protect you against every form of radiation that could possibly harm you? What about neutron showers?
Nov 24, 2022 at 15:08 | comment added | Daron | | @Vesper Ah, you're right. There is no reason to pump the sludge itself around as coolant. Just pump water through sealed tubes from the processor into the sludge and back.
Nov 24, 2022 at 13:43 | comment added | Daniel B | | Faraday-caged computers solve this pretty neatly.
Nov 24, 2022 at 13:14 | comment added | Vesper | | Have you ever heard of a water-cooled radiator, or a geothermal plant? Essentially you describe the problem as not being able to radiate heat into the sludge, with resulting overheating. Wrong: an 80286 PC did not require external cooling at all, and here in 2022 we have immersion-cooled devices with radiators elsewhere. These could be developed to transfer heat to the sludge even if it's solid.
Nov 24, 2022 at 10:36 | history edited | Daron | CC BY-SA 4.0 | added 434 characters in body
Nov 24, 2022 at 10:24 | history answered | Daron | CC BY-SA 4.0 |
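The sealed coolant loop proposed in the comments (water carrying processor heat into the sludge and back) can be sanity-checked with a back-of-envelope heat balance. This is only a sketch; the wattage and temperature rise below are illustrative assumptions, not figures from the thread.

```python
# Back-of-envelope check of the sealed-loop idea: water carries
# processor heat into a cold sink (the sludge) and returns.
# All numbers are illustrative assumptions.

CP_WATER = 4186.0  # J/(kg*K), specific heat capacity of liquid water


def required_flow_rate(heat_watts: float, delta_t_kelvin: float) -> float:
    """Mass flow (kg/s) needed to carry `heat_watts` away while the
    coolant warms by `delta_t_kelvin` over the loop (Q = m_dot * cp * dT)."""
    return heat_watts / (CP_WATER * delta_t_kelvin)


# A 200 W processor, coolant allowed to warm by 5 K before re-entering
# the sludge to dump its heat:
flow = required_flow_rate(200.0, 5.0)
print(f"{flow * 1000:.1f} g/s of water")  # ~9.6 g/s - a trickle
```

The point of the estimate is that the flow rate is tiny, so a small sealed pump suffices; the sludge only needs to be a large enough thermal mass to stay cooler than the returning water.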