[Building on a comment by @user619894 and especially on @P.Quinton's answer].
TL;DR: it is a well-defined estimate in the sense that it correlates with the true answer in an understandable way. It never over-estimates the result, and its error will be large when the vector has significant components along eigenvectors with widely different eigenvalues.
Derivation
Because $H$ is symmetric and positive definite, we know that it decomposes in an orthonormal basis $v_i$ with eigenvalues $\lambda_i>0$; i.e. $Hv_i=\lambda_iv_i$, $H^{-1}v_i=\frac{1}{\lambda_i}v_i$, $v_i^Tv_j=\delta_{ij}$.
Let's express $v=\sum_ia_iv_i$. Then the true answer is:
$$\text{Answer} = v^TH^{-1}v = \sum_{i,j}a_ia_jv_j^TH^{-1}v_i = \sum_{i,j} \frac{a_ia_j}{\lambda_i} v_j^Tv_i = \sum_i \frac{a_i^2}{\lambda_i}.$$
Of course, we don't know the true answer because we know neither $\lambda_i$ nor $a_i$. But we can write the estimate as a function of both and see how the two quantities differ:
$$\text{Estimate} = \frac{|v|^4}{v^THv} = \frac{\left(\sum_ia_i^2\right)^2}{\sum_ia_i^2\lambda_i}.$$
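A quick numerical sanity check of these two expressions (a sketch using NumPy; the matrix `H` and vector `v` below are arbitrary test data, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Build a random symmetric positive definite H (hypothetical test matrix)
A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)
v = rng.standard_normal(n)

answer = v @ np.linalg.solve(H, v)          # true value v^T H^{-1} v
estimate = np.dot(v, v) ** 2 / (v @ H @ v)  # |v|^4 / (v^T H v)

print(answer, estimate)
# Cauchy-Schwarz on H^{1/2}v and H^{-1/2}v gives |v|^4 <= (v^T H v)(v^T H^{-1} v),
# so the estimate never exceeds the answer (up to floating-point error):
assert estimate <= answer + 1e-12
```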
Now we can look at the ratio:
$$\frac{\text{Estimate}}{\text{Answer}} = \frac{\left(\sum_ia_i^2\right)^2}{\left(\sum_ia_i^2\lambda_i\right)\left(\sum_i \frac{a_i^2}{\lambda_i}\right)} = \frac{\sum_i a_i^2}{\sum_ia_i^2\lambda_i}\cdot\frac{\sum_i a_i^2}{\sum_i a_i^2\frac{1}{\lambda_i}} = \frac{\text{ Harmonic average of $\lambda_i$ weighted by }a_i^2}{\text{Arithmetic average of $\lambda_i$ weighted by }a_i^2}.$$
It is well-known that for $\lambda_i>0$ the harmonic mean is always less than or equal to the arithmetic mean. Therefore, the estimate is always an under-estimate. Furthermore, the approximation gets worse the more spread out the $\lambda_i$ are (among those with significant weight $a_i^2$ in $v$). Intuitively, the inverse cares more about the smallest eigenvalues.
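The degradation with eigenvalue spread can be illustrated numerically (a sketch using NumPy; the diagonal matrices and the `ratio` helper are my own illustrative constructions):

```python
import numpy as np

def ratio(lams):
    """Estimate/Answer for H = diag(lams) and v = (1, ..., 1),
    i.e. equal weights a_i^2 = 1 on every eigenvector."""
    H = np.diag(lams)
    v = np.ones(len(lams))
    answer = v @ np.linalg.solve(H, v)          # v^T H^{-1} v
    estimate = np.dot(v, v) ** 2 / (v @ H @ v)  # |v|^4 / (v^T H v)
    return estimate / answer

print(ratio(np.array([1.0, 1.0, 1.0])))     # identical eigenvalues: ratio is exactly 1
print(ratio(np.array([1.0, 2.0, 4.0])))     # mild spread: ratio drops below 1
print(ratio(np.array([0.01, 1.0, 100.0])))  # wide spread: ratio collapses toward 0
```

With equal weights the ratio is just the (unweighted) harmonic mean of the eigenvalues divided by their arithmetic mean, so it equals 1 only when all eigenvalues coincide.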