$\begingroup$

Say I wanted to heat a room with either resistors or CPUs mining bitcoin. If I give either the same watts, will they both put the same watts of heat into the room?

(Obviously the CPUs additionally produce proof of work.)

$\endgroup$
  • $\begingroup$ Is your question about information or energy? Watts are a measure of power, so if power is consumed by your electronics, it has to come from somewhere, and transistors/resistors always dissipate heat. But information has to do with entropy, which is different from just "watts" or "heat"...although closely connected. $\endgroup$ Commented Feb 6, 2021 at 6:53
  • $\begingroup$ If they use the same amount of watts then the heating will be the same. This doesn't say much about information though. $\endgroup$ Commented Feb 6, 2021 at 8:19

1 Answer

$\begingroup$

Yes. Watts of heat are watts of heat. Creating information in practice costs energy. Some devils in the details:

For current processors the kind of computation done hardly matters, since most of the heating is due to inefficiencies in the microchips, which are essentially resistive heating.

But even perfectly efficient processors would heat the room by discarding information. This is Landauer's principle, usually expressed as "erasing information has an entropy cost". This means that there will be waste heat. In a sense this waste heat is the discarded information being transmitted away as thermal vibrations (in theory it could use other channels, but in practice it will be waste heat), becoming random noise.
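To see the scale of the Landauer cost, here is a back-of-envelope sketch. The Boltzmann constant and room temperature are real; the 100 W / 10^18 erasures per second chip is a purely hypothetical figure for illustration:

```python
import math

# Landauer limit at room temperature vs. a hypothetical chip's energy per bit.
k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 300.0                        # room temperature, K

landauer = k_B * T * math.log(2) # minimum heat to erase one bit, ~2.9e-21 J

# Assume (purely for illustration) a 100 W processor erasing 1e18 bits/s:
actual_per_bit = 100.0 / 1e18    # 1e-16 J per erased bit

print(f"Landauer limit:   {landauer:.3e} J/bit")
print(f"Hypothetical chip: {actual_per_bit:.3e} J/bit")
print(f"Ratio: {actual_per_bit / landauer:.0f}x above the limit")
```

So for any realistic chip almost all of the heat is resistive losses; the irreducible Landauer part is tens of thousands of times smaller.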

One can construct algorithms that are more energy efficient than others.

If you run a computation that does not discard any information you can in theory avoid this waste heat. So the number of operations per joule of energy is not fixed. In the case of mining bitcoin you can in principle do everything reversibly up until the point where you transmit the new block, then run the calculation backwards until you are back in the original state, receive a new block, and pay only for erasing the bits used to store that block over the old one. There are downsides: memory fills up until you "uncalculate" results, reversible computing tends to be very slow, and so on.
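That "compute, copy out the answer, run backwards" pattern (Bennett's trick) can be sketched in ordinary code. The hash-like update rule and the data here are made up for illustration; the point is that every forward step records exactly what is needed to undo it, so after uncomputing, nothing remains to be erased except the copied-out answer:

```python
def step(state, x):
    # one forward step; keep the record needed to reverse it ("garbage")
    new = (state * 31 + x) % (2**32)
    return new, (state, x)

def unstep(record):
    # exactly undo one step using its record
    state, _x = record
    return state

def reversible_hash(data):
    state = 0
    garbage = []
    for x in data:                    # forward pass: compute, accumulate garbage
        state, rec = step(state, x)
        garbage.append(rec)
    answer = state                    # copy out the result (the only bits kept)
    while garbage:                    # backward pass: uncompute, free the garbage
        state = unstep(garbage.pop())
    assert state == 0                 # back to the start: nothing left to erase
    return answer

print(reversible_hash([1, 2, 3]))
```

The memory cost is visible here too: the `garbage` list grows with the length of the computation until the backward pass frees it.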

When we say a program "creates information" we usually mean that it produces a useful result (digits of $\pi$, bitcoins, answers to questions, etc.) out of previously available information and computing resources (operations, time). There is a great deal of difference between semantic information (something that informs us, like "the billionth hexadecimal digit of $\pi$ is a 9") and information in the physical sense here, states that could be different (a bunch of bits, "1001"). It is the latter that carries a thermal cost - changing states around increases entropy unless done reversibly - but it is used to represent the useful semantic information. Hence a correct and an incorrect $\pi$-digit calculating program will both produce the same heat, which in a real sense is discarded information flushed away as thermal excitations.

The amount of semantic information found is nearly unrelated to the heat: a better algorithm can find the same answer using less compute.
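A toy illustration of that last point, using nothing beyond basic arithmetic: both functions below produce the same semantic information (the sum of the first $n$ integers), but the closed form performs far fewer state changes, and hence has a far smaller minimum heat cost:

```python
def sum_loop(n):
    # O(n) operations: n additions, n counter updates
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # O(1) operations: same answer from Gauss's closed form
    return n * (n + 1) // 2

print(sum_loop(100), sum_formula(100))
```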

$\endgroup$
  • $\begingroup$ thank you - that is a fascinating read that will offer me many googles worth of figuring it out =) $\endgroup$
    – Eric
    Commented Feb 6, 2021 at 10:02
