Questions tagged [information-theory]

The tag has no usage guidance.

13 votes · 2 answers · 5k views

Who coined the term "signal-to-noise ratio" and when did statisticians start using the term "noise" to describe randomness?

I'm writing about the history of the concept of noise and am having trouble tracking down references from when the term "noise" started being associated with statistical noise such as ...
asked by vy32
1 vote · 0 answers · 114 views

Bit as eighth of dollar vs Shannon's Binary Digit?

https://en.wikipedia.org/wiki/Bit#History does not mention the common usage of bit being an eighth of a dollar -- surely Shannon was inspired by this? This question was asked in 2019 in English ...
asked by releseabe
2 votes · 0 answers · 85 views

What contributions from Glavieux & Thitimajshima were "merged" into Claude Berrou's now-famous 1993 talk "Near Shannon limit error-correcting coding"?

At 01:15:13 in the delightful video The Bit Player | Claude Shannon | Father of Information Theory | IEEE Information Theory Society, Andrea Goldsmith recounts ...
asked by uhoh
3 votes · 0 answers · 107 views

Pre-20th century sources on information theory?

What are some pre-19th or pre-20th century sources on information theory? Do they exist?
asked by Geremia
4 votes · 1 answer · 350 views

Why was the 'differential entropy' from information theory so named?

The entropy of a distribution $p$ on a discrete set $\mathcal{X}$ is defined as $$H(p) = -\sum_{x \in \mathcal{X}} p_x \log p_x.$$ Shannon in his classic paper [1] defines the analogue for continuous ...
asked by Mr. G-Man
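For readers skimming this entry, the discrete entropy in the excerpt is easy to check numerically. A minimal sketch in Python (the function name and example distributions are illustrative, not from the question):

```python
import math

def entropy(p, base=2):
    """Shannon entropy H(p) = -sum_x p_x log p_x of a discrete distribution.

    Zero-probability outcomes contribute nothing, since 0 log 0 is taken as 0.
    """
    return -sum(px * math.log(px, base) for px in p if px > 0)

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))   # ~0.469
```

Shannon's differential entropy, the subject of the question, replaces the sum with an integral over a density; unlike the discrete case it can be negative, which is part of why its naming is asked about.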
6 votes · 2 answers · 331 views

Why doesn't Morse code try to achieve more compression?

I was working on the Morse code Wikipedia article, when I noticed something strange. There might be a reason, but I can't find it. As you can see, the code symbol ...