I know that there is a strong relationship between Shannon entropy and thermodynamic entropy -- thermodynamic entropy is just Shannon entropy multiplied by a constant (Boltzmann's constant, which supplies the thermodynamic units), so the two are interconvertible. This suggests that they both intrinsically describe the same fundamental concept.
Wikipedia says that there is a strong relationship between Fisher information and relative entropy (also called Kullback-Leibler divergence), as does an answer to a previous question on Math.SE.
However, looking at the relevant formulas, it does not look like Fisher information is measured in the same units as relative entropy. This suggests that they measure fundamentally distinct, albeit related, physical concepts.
The formula for the Shannon entropy can be written as follows: $$\int [ - \log p(x) ]\ p(x) \, dx $$ This is usually measured in bits (taking the logarithm to base 2).
Fisher information can be written as: $$\int \left(\frac{\partial}{\partial \theta} \log p(x; \theta) \right)^2 p(x;\theta) \, dx $$
What are the units of Fisher information, given that Shannon entropy can be measured in bits?
My guess, based on comparing the definitions of Shannon entropy and Fisher information, is that the latter would be measured in units of something like $$\frac{\text{bit}^2}{\Theta^2} $$ where $\Theta$ is the unit of measurement of the parameter $\theta$ to be estimated.
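To spell out the dimensional bookkeeping behind this guess, writing $[\,\cdot\,]$ for "the units of" and assuming (as part of the guess) that $\log p$ carries the unit bit: differentiation with respect to $\theta$ divides by $\Theta$, and squaring squares everything, $$\left[\frac{\partial}{\partial \theta} \log p(x; \theta)\right] = \frac{\text{bit}}{\Theta}, \qquad \left[\left(\frac{\partial}{\partial \theta} \log p(x; \theta)\right)^2\right] = \frac{\text{bit}^2}{\Theta^2}.$$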
I am not quite sure how to account for the extra partial differentiation relative to the definition of Shannon entropy. My suspicion is that the expectation operation $\int ( \cdot )\, p(x) \, dx$ leaves units unchanged, since the weight $p(x)\,dx$ is a dimensionless probability, but I don't know how to turn that intuition into a rigorous argument.
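To make that suspicion concrete (again, just my own dimensional bookkeeping): since $\int p(x)\,dx = 1$ is a pure number, $p(x)\,dx$ must be dimensionless, and so for any quantity $f$, $$\left[\int f(x)\, p(x)\, dx\right] = [f]\cdot[p(x)\,dx] = [f].$$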
Since the Fisher information is the variance of the score, this question might be answered by first deriving the units of the score.
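In that direction, a sketch of the same bookkeeping applied to the score $u(\theta) = \frac{\partial}{\partial \theta} \log p(x; \theta)$: whatever units $\log p$ turns out to carry, $$[u] = \frac{[\log p]}{\Theta}, \qquad [\operatorname{Var}(u)] = [u]^2 = \frac{[\log p]^2}{\Theta^2},$$ since a variance carries the square of the units of its argument. This matches my guess above exactly when $[\log p] = \text{bit}$.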
This question might be related, although it was unanswered.