I am trying to test whether the covariance matrix of the maximum likelihood estimates for a Gaussian general linear model approaches the inverse Fisher information matrix (times 1/n, where n is the sample size).
However, I am not sure how to compare two matrices. One approach would be to compare the eigenvalues, but I am finding in my runs that the eigenvalues of my covariance matrix vary quite wildly. I have tried using the spectral norm (i.e. the largest eigenvalue), but it is often the case that I have a single very large eigenvalue (say, of order unity) while the others are much smaller (of order 10^-15). The eigenvalues of the Fisher information matrix tend to be quite consistent (about 10^-6 in my example, using the same number of samples, observations and covariates for comparison).
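One thing I have considered (sketched below with random placeholder matrices, not my actual MLE output) is comparing the two matrices directly via the norm of their difference, relative to the size of one of them, rather than comparing eigenvalue lists separately:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder stand-ins for the two matrices being compared:
# cov_mle    -- empirical covariance of the MLEs across simulation runs
# fisher_inv -- inverse Fisher information (times 1/n)
A = rng.normal(size=(4, 4))
cov_mle = A @ A.T / 4                     # random symmetric PSD placeholder
fisher_inv = cov_mle + 1e-3 * np.eye(4)   # a nearby matrix as a placeholder

# Relative difference in the spectral (2-) norm and the Frobenius norm.
# A small value means the matrices agree up to that relative tolerance.
rel_spec = np.linalg.norm(cov_mle - fisher_inv, 2) / np.linalg.norm(fisher_inv, 2)
rel_frob = np.linalg.norm(cov_mle - fisher_inv, "fro") / np.linalg.norm(fisher_inv, "fro")
print(rel_spec, rel_frob)
```

This sidesteps the wildly varying individual eigenvalues, but of course it collapses everything into a single number.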
So how would one compare two matrices, and relatively easily? Is there a statistical test for this?
My thoughts are that:
- If we work in a given basis, then the features defining a matrix are:
  - its rank
  - its eigenvectors (and thus the subspace spanned by them)
  - its eigenvalues
The simplest case for comparison would be if the rank and eigenvectors were the same, so that only the eigenvalues had to be compared. Otherwise, depending on what one means by 'how different the matrices are', one would need a metric that accounts for the difference in the eigenvectors (perhaps a measure of the overlap of the subspaces spanned by them? But this does not capture all of the information!) as well as the eigenvalues, and perhaps also whether corresponding eigenvalues belong to the same eigenvectors.
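For the subspace-overlap idea, one concrete option I know of is the principal angles between the leading eigenspaces of the two matrices. A rough sketch, again with random placeholder matrices rather than my actual ones:

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(1)
M = rng.normal(size=(5, 5))
A = M @ M.T                       # placeholder symmetric matrix
P = 0.01 * rng.normal(size=(5, 5))
B = A + (P + P.T) / 2             # a small symmetric perturbation of A

# Orthonormal eigenvectors of the k largest eigenvalues of each matrix
# (np.linalg.eigh returns eigenvalues in ascending order).
k = 2
_, va = np.linalg.eigh(A)
_, vb = np.linalg.eigh(B)
Ua = va[:, -k:]   # columns spanning the leading k-dim eigenspace of A
Ub = vb[:, -k:]

# Principal angles (radians) between the two subspaces: all angles
# near zero means the leading eigenspaces nearly coincide.
angles = subspace_angles(Ua, Ub)
print(angles)
```

As noted above, this only measures eigenvector agreement; it says nothing about whether the eigenvalues themselves match.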
So I see that this is not straightforward and probably depends on what you want to do with the information. But in that case, how does one meaningfully compare matrices? I suppose that in this MLE case, what we really mean is an element-wise comparison.
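If element-wise comparison is indeed what is wanted, the simplest version I can think of is the largest absolute entry of the difference, or an all-entries-within-tolerance check (the matrices here are small made-up examples, not my actual output):

```python
import numpy as np

# Made-up 2x2 matrices standing in for the empirical covariance
# and the scaled inverse Fisher information.
cov_mle = np.array([[2.00, 0.30],
                    [0.30, 1.00]])
fisher_inv = np.array([[2.01, 0.29],
                       [0.29, 1.02]])

max_abs = np.max(np.abs(cov_mle - fisher_inv))       # worst single entry
close = np.allclose(cov_mle, fisher_inv, atol=0.05)  # every entry within atol
print(max_abs, close)
```

One could then watch max_abs shrink as n grows, which seems closer to what "approaches" means here than any eigenvalue-based summary.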