Say I wanted to heat a room with either resistors or cpus mining bitcoin, if I give either the same watts, will they both produce the same watts into heating the room ?
(obviously the cpus additionally create proof of work)
Yes. Watts of heat are watts of heat. Creating information in practice costs energy. Some devils in the details:
For current processors the kind of computation being done hardly matters, since most of the heating is due to inefficiencies in the microchips, which act essentially like resistive heating.
But even perfectly efficient processors would produce heat by discarding information. This is Landauer's principle, usually expressed as "when you erase information you have to pay an entropy cost". This means there will be some waste heat even in the ideal case. In a sense this waste heat *is* the discarded information, carried away as thermal vibrations and becoming random noise (in theory it could leave through other channels, but in practice it will be waste heat).
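To put a number on Landauer's principle: the minimum energy to erase one bit at temperature $T$ is $k_B T \ln 2$. A short sketch of that calculation (the ~$10^{-10}$ J per logic operation figure for a present-day CPU is an order-of-magnitude assumption for comparison, not a measured value):

```python
import math

# Landauer limit: minimum energy dissipated per erased bit at temperature T.
k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value)
T = 300.0           # K, roughly room temperature

landauer_limit = k_B * T * math.log(2)  # joules per erased bit
print(f"Landauer limit at 300 K: {landauer_limit:.3e} J/bit")  # ~2.9e-21 J

# Assumed rough figure for a modern CPU, for scale only:
assumed_cpu_energy_per_op = 1e-10  # J per logic operation (assumption)
print(f"CPU dissipates ~{assumed_cpu_energy_per_op / landauer_limit:.0e}x the bound")
```

So real hardware dissipates many orders of magnitude more than the thermodynamic minimum, which is why the kind of computation makes essentially no difference to the heating today.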
Some algorithms can be made more energy efficient than others. If you run a computation that does not discard any information, you can in theory avoid this waste heat, so the number of operations per joule of energy is not fixed. In the case of mining bitcoin you could in principle do everything reversibly up until the point where you transmit the new block, then run the calculation backwards until you are in the original state, receive a new block, and only pay the erasure cost for overwriting the bits used to store the old block. There are downsides: memory fills up until you "uncompute" results, reversible computing tends to be very slow, and so on.
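The compute-then-uncompute pattern can be sketched with a toy reversible circuit. A CNOT (controlled-NOT) gate is its own inverse, so running the gate sequence in reverse restores the initial state exactly, and no bits ever need to be erased (this is an illustrative sketch, not a mining algorithm):

```python
def cnot(bits, control, target):
    # Controlled-NOT: flip the target bit iff the control bit is 1.
    # This gate is reversible and is its own inverse.
    if bits[control]:
        bits[target] ^= 1

def run_forward(bits, ops):
    for control, target in ops:
        cnot(bits, control, target)

def run_backward(bits, ops):
    # Each CNOT undoes itself, so apply the sequence in reverse order.
    for control, target in reversed(ops):
        cnot(bits, control, target)

state = [1, 0, 1, 0]
ops = [(0, 1), (2, 3), (1, 3)]  # an arbitrary reversible "computation"

run_forward(state, ops)
answer = state.copy()   # copy out the result; only this copy costs erasure later
run_backward(state, ops)

assert state == [1, 0, 1, 0]  # original state recovered, nothing discarded
```

Notice the memory cost in miniature: the copied `answer` has to live somewhere until the original state is restored, which is the "memory fills up until you uncompute" downside at toy scale.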
When we say a program "creates information", it usually means that it produces a useful result (digits of $\pi$, bitcoins, answers to questions, etc.) out of previously available information and computing resources (operations, time). There is a great deal of difference between semantic information (something that informs us, like "the billionth hexadecimal digit of $\pi$ is a 9") and information in the physical sense used here: states that could be different (a bunch of bits, "1001"). It is the latter that carries a thermal cost - changing states around increases entropy unless done reversibly - but those states are used to represent the useful semantic information. Hence a correct and an incorrect $\pi$-digit calculating program will both produce the same heat, which in a real sense is discarded information flushed away as thermal excitations.
The amount of semantic information found is nearly unrelated to the heat: a better algorithm can find the same answer using less compute.
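A minimal illustration of that last point: two programs producing exactly the same semantic information (here the sum $1 + 2 + \dots + n$, standing in for any useful answer) with vastly different amounts of computation, and hence vastly different heat dissipated:

```python
def sum_naive(n):
    # O(n) additions: n operations' worth of dissipation.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # Same answer via the closed form n(n+1)/2: O(1) operations.
    return n * (n + 1) // 2

n = 10**6
assert sum_naive(n) == sum_formula(n)  # identical semantic information
```

The answer (the semantic content) is the same either way; only the number of physical state changes, and so the heat, differs.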