
The Background

Bitcoin tracks chain work by counting the expected minimum number of hashes it would take to create a chain with the same number of blocks and the same difficulty steps[1]. Naturally, accumulated work measured in this fashion (i.e. as an expected number of hashes) increases with each new block added to the chain.

For a given target t, the number of hash attempts needed to find a block that meets the proof-of-work criterion (a hash less than or equal to t) is geometrically distributed[2] with parameter p = t / 2^256.

We can calculate the entropy, w, of such a geometrically distributed random variable[3] by the equation w(p) = (-(1-p)·log2(1-p) - p·log2(p)) / p, and, substituting p = t / 2^256, we can trivially rewrite this as a function of the target instead, w(t).

Using this method we can calculate the minimum work (i.e. entropy) w(t), denominated in bits, necessary to describe the proof-of-work signal. Other components of the blockchain (such as the data in the blocks besides the proof-of-work signal) are not included in this analysis, which is why it is only an estimate of the minimum encoding.

Plugging in some numbers:

  • for block zero (and any block at the lowest possible difficulty, hence p = 1 / 2^32), we get approximately 33.44 bits of work per block.
  • for a more recent block, say block #602784, we get approximately 76.97 bits of work per block.
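The block-zero figure can be checked numerically with a short script (a sketch; the function names are ours, and `w` is the entropy formula given above):

```python
import math

def w(p: float) -> float:
    """Entropy, in bits, of a geometric random variable with
    parameter p: w(p) = (-(1-p)*log2(1-p) - p*log2(p)) / p."""
    return (-(1 - p) * math.log2(1 - p) - p * math.log2(p)) / p

def w_from_target(t: int) -> float:
    """The same quantity expressed as a function of the 256-bit
    target, using p = t / 2**256."""
    return w(t / 2**256)

# Block zero: lowest possible difficulty, p = 1 / 2**32.
print(round(w(2**-32), 2))  # approximately 33.44
```

For small p this is approximately log2(1/p) + log2(e), i.e. 32 + 1.44 ≈ 33.44 at the minimum difficulty.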

The Question

Can we complete the above analysis for every difficulty period and sum the entropies to obtain a lower bound on the minimum complexity, in bits, of the (proof-of-work aspects of the) Bitcoin blockchain? Or is there some assumption (such as that the entropies can be summed in this context) that is unsafe/wrong?

An Attempted Answer & Some Motivation

Performing such an analysis[4] gives a result, as of block 602784, of approximately 36.9 million bits, or 4.6 megabytes, and growing.
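The summation in question can be sketched as follows (toy data only: the targets and period lengths here are hypothetical, not real chain data, and the helper names are ours):

```python
import math

def w(p: float) -> float:
    # Entropy (bits) of a geometric r.v. with parameter p = t / 2**256.
    return (-(1 - p) * math.log2(1 - p) - p * math.log2(p)) / p

def total_entropy_bits(periods) -> float:
    """periods: iterable of (target, n_blocks) pairs, one per
    difficulty period; returns the summed per-block entropies."""
    return sum(n * w(t / 2**256) for t, n in periods)

# Toy illustration: two full difficulty periods (2016 blocks each)
# at the minimum difficulty (difficulty-1 target = 0xFFFF * 2**208).
genesis_target = 0xFFFF * 2**208
total = total_entropy_bits([(genesis_target, 2016), (genesis_target, 2016)])
print(total)  # roughly 4032 * 33.44 bits
```

A real computation would iterate over every historical difficulty period's actual target, as the linked script[4] does.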

However, one reason why the method seems interesting is that, if it is reasonable, then it seems that all the math the Bitcoin protocol currently does with "time" (in the Unix-timestamp sense, such as difficulty adjustments and the rule that a block's timestamp must advance past the median of the last n blocks' timestamps) could instead be done in "bits." As such, this could be helpful if/when the protocol is extended to operate across vast distances of spacetime?

Then again, there is probably something terribly wrong with this analysis which is why the question is being asked here.

References

[1] Strongest vs Longest chain and orphaned blocks

[2] How is it that concurrent miners do not subvert each other's work?

[3] https://math.stackexchange.com/questions/490559/entropy-of-geometric-random-variable

[4] https://github.com/philbertw4/mathematicalbitcoin/blob/master/scala/SatsPerBit.sc

  • "it seems that all the math which the bitcoin protocol currently does with 'time' (...) could instead be done in 'bits.'" I'm not sure I follow the reasoning here. 'Time' is used to standardize the blocktime to an expected time interval (an external measure). 'Bits of accumulated work' is a measure of something internal to the system, so it can't be used to standardize the block time in the same way (since we can't perfectly predict the future rate of entropy change). But anyways, interesting question/way of thinking about it.
    – chytrik
    Commented Nov 27, 2019 at 9:13
  • @chytrik, yes I'm glad you picked up on that too (and glad you find it an interesting question!). I'm not sure whether/if there is a "natural" block "bitrate" to target. This is actually why I tagged this question with the #blockspace-market tag, as perhaps the two concepts begin intersecting?
    – philbw4
    Commented Nov 27, 2019 at 20:46
  • ‘blockspace-market’ refers to the competition between unconfirmed transactions to be confirmed in a block, according to the fees they’ve paid. So it doesn’t directly relate to the block difficulty measure (‘bit of accumulated work’) in this case.
    – chytrik
    Commented Nov 27, 2019 at 20:51
  • Ah, ok, wrong tag then, sorry. It's a very interesting problem to think about. Choosing a target rate of "zero bits per block" is tempting, as it then ends up being an almost self-defeating chain in the sense that, since it is impossible to produce a proof of work of zero bits, every difficulty adjustment will simply increase the difficulty in an attempt to "never receive another block from these creatures which keep submitting blocks with more work!" :-) I'm not sure how/if a decrease in difficulty could be accomplished, though.
    – philbw4
    Commented Nov 27, 2019 at 20:56
  • I've been thinking about this a "bit" more, and likely quite naively. However, if in fact this analysis holds up at all, then perhaps we could cross the bridge into physics and use Landauer's principle in a manner that would allow us to reinterpret the bits of work directly as a temperature, and then difficulty adjustments would be set to target a specific constant temperature. So, in this context, "constant targeted temperature" would play the role that "constant targeted block rate" plays today.
    – philbw4
    Commented Nov 30, 2019 at 4:13

1 Answer


While it's possible to define this "entropy" metric for the blockchain, I do not believe it measures the same thing as difficulty or chainwork, or could be used as a substitute for it.

Specifically, entropy measures how much data is necessary to convey information. The entropy of a Bernoulli trial with probability target/2^256 is how many bits you need to convey whether a particular block attempt was successful. It does not make sense to sum that up over all blocks, because we already know those were successful. It might make sense to sum it over all block attempts, yielding how many bits are needed to convey exactly which attempts were successful. Of course, that's something very different.
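This distinction can be made concrete: the geometric entropy used in the question is exactly the per-attempt Bernoulli entropy multiplied by the expected number of attempts, 1/p, so it already accounts for every attempt, not just the successful one. A small numerical check (function names ours; p chosen unrealistically large so the series converges quickly):

```python
import math

def bernoulli_entropy(p: float) -> float:
    # Bits needed to convey the outcome of a single hash attempt.
    return -(1 - p) * math.log2(1 - p) - p * math.log2(p)

def geometric_entropy_series(p: float, terms: int = 1000) -> float:
    # Entropy of the attempt count, computed directly from the
    # geometric pmf P(k) = p * (1-p)**(k-1).
    total = 0.0
    for k in range(1, terms + 1):
        pk = p * (1 - p) ** (k - 1)
        total += -pk * math.log2(pk)
    return total

p = 0.3  # illustrative success probability
# Geometric entropy = per-attempt entropy * expected attempts (1/p).
assert abs(geometric_entropy_series(p) - bernoulli_entropy(p) / p) < 1e-9
```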

The goal of chainwork is giving a best guess for how much work was performed to create the blockchain, and I believe it is as good as a single number metric can be in that regard.
