I have 3D gridded oceanographic data for a variety of properties and I'm computing depth weighted averages (the gridding is irregular, but the dataset comes with the gridding data which can be called easily).
Let's say that I have a segment of ocean (for simplicity, say I've already averaged in longitude and now have latitude, X, versus depth, Z), and I want to find the depth-weighted average temperature, for example. If I'm not mistaken, this should be each Z level of my X-by-Z grid multiplied by the corresponding Z cell thickness. Then I would sum over the Z dimension and divide by the total depth (the sum of the Z cell thicknesses).
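In code, here's roughly what I mean (a minimal NumPy sketch with made-up numbers; `temp` and `dz` are hypothetical stand-ins for my temperature grid and the cell-thickness data that ships with the dataset):

```python
import numpy as np

# Hypothetical example: temp has shape (n_x, n_z) -- latitude by depth --
# and dz holds the (irregular) thickness of each depth cell, in meters.
n_x, n_z = 4, 6
rng = np.random.default_rng(0)
temp = rng.uniform(2.0, 20.0, size=(n_x, n_z))   # temperature, degC
dz = np.array([5., 10., 25., 50., 100., 200.])   # cell thicknesses, m

# Depth-weighted average at each latitude:
# sum over Z of (value * thickness), divided by total depth.
weighted_avg = temp @ dz / dz.sum()              # shape (n_x,)

# Equivalent built-in form:
weighted_avg2 = np.average(temp, axis=1, weights=dz)
```

(With this latitude-by-depth orientation the product is `temp @ dz`; with the transposed, depth-by-latitude orientation it would be the $X^T\vec{z}$ written below.)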
This made me think that if I thought of X vs Z as a matrix, I could think of this as
$\frac{X^T\vec{z}}{\sum_i z_i}$ (Right?)
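In components (assuming the matrix is stored depth-by-latitude, so each column is one depth profile and $z_i$ is the thickness of the $i$-th depth cell), the $j$-th entry of this vector is

$$\left(\frac{X^T\vec{z}}{\sum_i z_i}\right)_j = \frac{\sum_i z_i\, X_{ij}}{\sum_i z_i},$$

i.e. an ordinary weighted average of each column with weights $z_i / \sum_k z_k$.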
This looks a lot like a projection of the columns of $X$ onto $\vec{z}$, except with $\sum_i z_i$ in the denominator instead of $\sum_i z_i^2$. Is there a way to think of this weighted average as a projection? Is there a different norm I could use? This doesn't change my answer (unless my answer is wrong :/), but I would like a better way to relate my linear algebra knowledge to practical applications. Feel free to wax poetic about this.