Someone I know posited an idea that I instinctively think (know?) cannot be true, but I can't seem to figure out how to explain why not. The idea is based on the assumption that more electricity must be used to store a bit value of `1` in RAM than a `0`. If that assumption is true (I'm not sure that it is), then the idea is that we could lessen the electricity drawn by systems if we hyper-optimized every encoding system such that less commonly stored values are assigned the heavier byte values (heavier in the sense of having more bits set to `1`).
I'm not even sure that's a sense-making endeavor, even if the initial assumption were true. This would all certainly be moot, though, if someone could explain that the byte value `0x00` does not require less electricity to store in RAM than `0xFF`.
`0x7F` has 7 bits set; `0x80` is greater but has only one bit set. Hence an optimization would get really complicated.

A `1` vs a `0` (?)... scale that up to 16-32 GB of RAM and you might make a measurable difference, perhaps on the order of millijoules (note: all figures are guesses).
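The comment's point that numeric order doesn't track bit weight is easy to verify with a quick Hamming-weight (popcount) check, shown here in Python for illustration:

```python
def popcount(v: int) -> int:
    """Number of bits set to 1 (Hamming weight) in v."""
    return bin(v).count("1")

# A larger byte value can have far fewer 1 bits than a smaller one:
assert popcount(0x7F) == 7   # 0b01111111: smaller value, seven 1 bits
assert popcount(0x80) == 1   # 0b10000000: larger value, single 1 bit

# The extremes from the question:
assert popcount(0x00) == 0
assert popcount(0xFF) == 8
```

So any frequency-to-weight remapping would have to ignore numeric ordering entirely and work purely on bit counts, which is why the optimization gets complicated.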