Just as an example, let's say we have a uniformly distributed random variable $X$. Then $X$ depends on two parameters; max and min, or equivalently mean and span, are the usual choices. If you have observed $X$, you can say something about those parameters just from the values of $X$ you have observed.
Say we have observed the following values of the uniformly distributed integer random variable $X$:
$$
0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0
$$
Wouldn't you agree that we can with some certainty conclude that $\max(X)=1$ and $\min(X) = 0$?
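As a quick sanity check, the sample minimum and maximum are natural point estimates of the endpoints (a sketch, not a formal estimator; the variable name `sample` is just for illustration):

```python
# The observed sample from above; min/max estimate the endpoints
# of the uniform integer distribution.
sample = [0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0]
print(min(sample), max(sample))  # 0 1
```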
To show an actual Fisher information example, let's instead say that the random variable $X$ is either $0$ with probability $\theta$ or $1$ with probability $1-\theta$. Thus $f_X(0;\theta) = \theta$ and $f_X(1;\theta) = 1-\theta$. The Fisher information of $\theta$ is the value
$$
\mathcal I(\theta) = E\left[\left(\frac{\partial}{\partial \theta}\ln f_X(X;\theta)\right)^2\,\Bigg|\,\theta\right] = \left(\frac{\partial}{\partial \theta}\ln f_X(0;\theta)\right)^2 f_X(0;\theta) + \left(\frac{\partial}{\partial \theta}\ln f_X(1;\theta)\right)^2 f_X(1;\theta) \\\\
= \frac{1}{\theta^2}\cdot \theta + \frac{1}{(1-\theta)^2}(1-\theta) = \frac{1}{\theta(1-\theta)}
$$
and this function measures how much information observations of $X$ give about $\theta$. According to Wikipedia, large values mean that observations carry much information about the parameter. The sample above contains eight $0$s and eight $1$s, so the maximum likelihood estimate is $\hat\theta = 0.5$, which gives $\mathcal I(0.5) = 4$. I do not have enough experience with Fisher information to tell you whether this specific value is "large".
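The closed form can be checked directly against the definition: sum $(\partial_\theta \ln f)^2 f$ over the two outcomes and compare with $1/(\theta(1-\theta))$. A minimal sketch (the function name `fisher_info` is my own):

```python
# Fisher information of theta for the two-point distribution
# f(0;theta) = theta, f(1;theta) = 1 - theta, computed from the
# definition: E[(d/dtheta ln f(X;theta))^2].
def fisher_info(theta):
    # d/dtheta ln f(0;theta) = 1/theta
    # d/dtheta ln f(1;theta) = -1/(1-theta)
    return (1 / theta) ** 2 * theta + (1 / (1 - theta)) ** 2 * (1 - theta)

theta = 0.5
# Agrees with the closed form 1/(theta*(1-theta))
assert abs(fisher_info(theta) - 1 / (theta * (1 - theta))) < 1e-12
print(fisher_info(theta))  # 4.0
```

Note that $\mathcal I(\theta)$ blows up as $\theta$ approaches $0$ or $1$: near the endpoints, a single observation of the rare outcome pins down $\theta$ very sharply.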