I've come across the following linear algebra problem while trying to derive something in information theory. I'm looking both for numerical ways to solve this type of problem and for anything analytical that can be said about it. If there's a closed-form solution (e.g. in terms of matrix algebra) that would be great.
I have a matrix $C$ (which might be singular) and vectors $a$ and $b$ such that $\sum_i a_i = \sum_j b_j$. I'm looking for vectors $\alpha$ and $\beta$ such that the following conditions hold simultaneously: $$ \sum_i \alpha_i C_{ij} \beta_j = b_j\qquad\text{(for every $j$)} $$ and $$ \sum_j \alpha_i C_{ij} \beta_j = a_i\qquad\text{(for every $i$).} $$
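To make the two conditions concrete, here is a small numerical sketch. The function name `satisfies_conditions` and the uniform-$C$ example are my own illustrations, not part of the problem; the uniform case just happens to have an obvious solution.

```python
import numpy as np

# Illustrative helper: check whether candidate vectors alpha, beta satisfy
# both marginal conditions for given C, a, b.
def satisfies_conditions(C, a, b, alpha, beta, tol=1e-9):
    M = alpha[:, None] * C * beta[None, :]   # M_ij = alpha_i * C_ij * beta_j
    return (np.allclose(M.sum(axis=0), b, atol=tol) and   # column sums = b
            np.allclose(M.sum(axis=1), a, atol=tol))      # row sums = a

# Example: for uniform C (C_ij = 1/(n*m)), taking alpha_i = n*a_i and
# beta_j = m*b_j gives alpha_i C_ij beta_j = a_i * b_j, which has the
# required row and column sums.
n, m = 3, 4
C = np.full((n, m), 1.0 / (n * m))
a = np.array([0.2, 0.3, 0.5])
b = np.array([0.1, 0.2, 0.3, 0.4])
print(satisfies_conditions(C, a, b, n * a, m * b))  # → True
```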
Clearly there are cases where no solution exists: for instance, if $C$ is diagonal then the matrix with entries $\alpha_i C_{ij} \beta_j$ is diagonal too, so its row sums and column sums coincide and there can be no solution unless $a = b$. But I suspect that in my case a solution will always exist. It looks like this should be easy to solve, but I can't quite see how to go about it.
The following may or may not be relevant for answering the question, but in my case the numbers all relate to a joint probability distribution over variables $A$, $B$ and $X$:
- The elements of $C$ form the joint marginal distribution of $A$ and $B$ (marginalized over $X$). That is, $C_{ij} = p(A=i,B=j)$;
- $a$ and $b$ are conditional marginal distributions: $a_i = p(A=i \mathop{|} X=k)$ and $b_j = p(B=j \mathop{|} X=k)$, for some particular $k$;
- The numbers $\alpha_i C_{ij} \beta_j$ are the elements of an (unknown) conditional distribution $p(A=i,B=j\mathop{|}X=k)$, which is what I'm attempting to find. (Actually it's not the true conditional distribution but a minimum-information estimate of it, which is why it has this particular form.)
In terms of the linear algebra problem, this means that $a$, $b$ and $C$ satisfy the following constraints:
- all the elements of $C$, $a$ and $b$ are real and between 0 and 1.
- $\sum_{ij}C_{ij} = 1$
- $\sum_{i}a_{i} = \sum_j b_j = 1$.
- $a$ and $b$ can be expressed as the row and column sums of a matrix $D$ with real entries in $[0,1]$ such that $D_{ij}=0$ whenever $C_{ij}=0$, and $\sum_{ij}D_{ij}=1$. (The elements of $D$ are the "true" conditional distribution $p(A=i,B=j\mathop{|}X=k)$.)
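The setup above can be sketched numerically. The shapes, the random seed, and the variable names below are illustrative assumptions on my part; the point is that when $C$, $a$, $b$ and $D$ are built from a joint distribution $p(A,B,X)$ as described, all of the listed constraints hold by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative joint distribution p(A=i, B=j, X=k) over shapes (n, m, K).
n, m, K = 3, 4, 2
p = rng.random((n, m, K))
p /= p.sum()                         # normalize so the joint sums to 1

C = p.sum(axis=2)                    # C_ij = p(A=i, B=j), marginalized over X
k = 0
D = p[:, :, k] / p[:, :, k].sum()    # D_ij = p(A=i, B=j | X=k)
a = D.sum(axis=1)                    # a_i = p(A=i | X=k), row sums of D
b = D.sum(axis=0)                    # b_j = p(B=j | X=k), column sums of D

# The stated constraints hold by construction:
assert np.isclose(C.sum(), 1.0)                    # C sums to 1
assert np.isclose(a.sum(), 1.0)                    # a sums to 1
assert np.isclose(b.sum(), 1.0)                    # b sums to 1
assert np.all(D[C == 0] == 0)                      # D vanishes wherever C does
```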