I've been thinking about ways to compress the output from a (supposedly) random number generator. Let's assume for a moment that my computer can produce high-quality random numbers. I'm certainly not an expert in this field, so please correct me wherever I'm wrong.
Let's say I need random numbers between zero and 199 inclusive; however, I can only read a minimum of a byte at a time from the RNG, so I need some compression function to reduce the 256 possible byte values to 200 different values. I'm considering using the modulo operation as that compression function, since modulo will wrap any value that exceeds 199 back into range.
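To make what I mean concrete, here's a minimal sketch of the approach, using `os.urandom` as a stand-in for my RNG (the function name `random_below_200` is just mine for illustration):

```python
import os

def random_below_200():
    # Read a single byte (0-255) from the system RNG and
    # "compress" it into the range 0-199 via modulo.
    byte = os.urandom(1)[0]
    return byte % 200

value = random_below_200()
print(value)  # some number in 0-199
```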
I think I've immediately spotted a flaw with using modulo: the values 0-55 are twice as likely to occur as the rest, because each value n in that range is produced by two bytes (n and n+200), while the values 56-199 are each produced by only one. Would you say this assessment of the modulo operation is correct, or is there something about the properties of random numbers (entropy or whatever) that means this doesn't matter? Also, if not modulo, could you suggest a good method of reducing the number of possible values of an RNG which effectively preserves their 'randomness'?
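This is how I convinced myself of the bias: counting how many of the 256 byte values land on each output under mod 200.

```python
# Count the number of byte values (0-255) mapping to each output of x % 200.
counts = [0] * 200
for byte in range(256):
    counts[byte % 200] += 1

# Outputs 0-55 each have two preimages (n and n + 200);
# outputs 56-199 each have only one.
print(counts[0], counts[55], counts[56], counts[199])  # 2 2 1 1
```

So out of 256 equally likely bytes, 56 outputs get double weight, which is what makes me doubt plain modulo here.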