I have been working with some histograms (using frequency rather than frequency density) and I noticed that people sometimes make the first interval longer than the rest of the intervals.
For an example, see: Site here
The first interval of the frequency table is $10 - 0 = 10$ units long, while the second is $20 - 11 = 9$ units long. Yet the histogram is drawn as if all the intervals were the same length. Isn't this a bit misleading, especially when plotting frequency instead of frequency density?
This also makes the intervals ambiguous if we are only presented with the histogram: we might read them as $0\leq x < 10$, $10\leq x < 20$, $20\leq x < 30$, but this is not the case, since the first interval must be $0\leq x \leq 10$.
Need help
I would like some advice on why it is acceptable to have the first interval longer than the rest, as this surely misrepresents the data.
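To make my concern concrete, here is a minimal sketch (with hypothetical counts; the interval endpoints follow the table above, and widths are computed by endpoint subtraction as in my question). If the underlying data were equally dense everywhere, the wider first class would still collect more observations, so equal-width bars showing raw frequency would make it look taller than it should:

```python
# Hypothetical frequency table: classes 0-10, 11-20, 21-30
# with widths 10, 9, 9 (endpoint subtraction, as in the question).
intervals = [(0, 10), (11, 20), (21, 30)]
frequencies = [20, 18, 18]  # hypothetical counts, NOT from the linked site

# Frequency density = frequency / class width.
densities = [f / (hi - lo) for (lo, hi), f in zip(intervals, frequencies)]

print(densities)  # -> [2.0, 2.0, 2.0]
```

Here all three classes have the same density, yet the first class has a higher raw frequency purely because it is wider, which is exactly the distortion I am worried about when frequency is plotted with equal-width bars.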