Money is typically denominated in a way that allows for a greedy algorithm when computing a given amount $s$ as a sum of denominations $d_i$ of coins or bills:
$$ s = \sum_{i=1}^k n_i d_i\quad\text{with } n_i \ge 0,\; d_1 < d_2 < \dots < d_k $$
We start with the highest denomination $d_i$ not exceeding the remaining amount and subtract it, and we repeat this until $s$ is completely represented. As long as every denomination is available in unlimited supply, this simple algorithm guarantees the minimum number of coins or bills. (This is a little sketchy, but I guess we are all familiar with it, and it's not central to my question.)
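The procedure above can be sketched in Python (a minimal illustration, assuming unlimited supply of each denomination and a 1-coin so that every amount is representable):

```python
def greedy_change(s, denominations):
    """Repeatedly take the largest denomination not exceeding
    the remaining amount; return (denomination, count) pairs."""
    result = []
    for d in sorted(denominations, reverse=True):
        n, s = divmod(s, d)  # n coins of value d, remainder s
        if n > 0:
            result.append((d, n))
    assert s == 0, "amount not representable without a 1-coin"
    return result

# 48 = 2*20 + 5 + 2 + 1, i.e. 5 coins
print(greedy_change(48, [1, 2, 5, 10, 20, 50]))
```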
My question is: what property of the $d_i$ makes this work? I once heard that, for the greedy algorithm to work, the value of each coin must exceed the sum of all coins of lower denominations. This holds for the typical denominations:
$$ 1,2,5,10,20,50,100 $$
However, I wasn't able to come up with a counterexample or even a proof. Surely this problem has been studied, and I would be glad to know more about it. In fact, I'm not even sure about the right tags -- feel free to edit them. I assume that finding the sum with the minimum number of coins becomes tricky once we can no longer assume that all denominations are available.
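For what it's worth, one can hunt for counterexamples mechanically by comparing the greedy count against an optimum computed by dynamic programming (a sketch, again assuming unlimited supply of each denomination):

```python
def min_coins(s, denominations):
    """Minimum number of coins summing to s, computed by
    dynamic programming over the amounts 0..s."""
    INF = float("inf")
    best = [0] + [INF] * s
    for amount in range(1, s + 1):
        for d in denominations:
            if d <= amount and best[amount - d] + 1 < best[amount]:
                best[amount] = best[amount - d] + 1
    return best[s]

def greedy_coins(s, denominations):
    """Number of coins the greedy algorithm uses for s."""
    count = 0
    for d in sorted(denominations, reverse=True):
        count += s // d
        s %= d
    return count

# For 1, 2, 5, 10, 20, 50 the two counts agree on every
# amount up to 200 -- greedy is optimal for this system.
d = [1, 2, 5, 10, 20, 50]
assert all(greedy_coins(s, d) == min_coins(s, d) for s in range(1, 201))
```

Running this over candidate denomination systems would flag any amount where greedy is suboptimal.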