I would like to compute the maximum likelihood estimate of a general two-dimensional discrete probability distribution $p_{ij}$, with $i=1,\ldots,N$ and $j=1,\ldots,N$, from a set of counts $c_{ij}$, subject to prescribed marginals $\sum_j p_{ij} = p^{(x)}_i$ and $\sum_i p_{ij} = p^{(y)}_j$. The counts $c_{ij}$ and the marginals $p^{(x)}_i$, $p^{(y)}_j$ are the inputs.
I would like to know whether there is a closed-form solution to this problem.
I have worked on the problem using Lagrange multipliers. The Lagrangian is $$ F\left(\{p_{ij}\}, \{\lambda^{(x)}_i\}, \{\lambda^{(y)}_j\}\right) = \sum_{ij} c_{ij} \log p_{ij} - \sum_{i} \lambda^{(x)}_i \left(\sum_j p_{ij} - p^{(x)}_i\right) - \sum_{j} \lambda^{(y)}_j \left(\sum_i p_{ij} - p^{(y)}_j\right). $$ Setting the $p_{ij}$ derivatives to zero yields $p_{ij} = \frac{c_{ij}}{\lambda^{(x)}_i+\lambda^{(y)}_j}$ for all $i,j$. Substituting these expressions into the $\lambda$ derivative conditions (i.e., the constraints) gives $$ \sum_j \frac{c_{ij}}{\lambda^{(x)}_i+\lambda^{(y)}_j} = p^{(x)}_i \text{ for all $i$} $$ and $$ \sum_i \frac{c_{ij}}{\lambda^{(x)}_i+\lambda^{(y)}_j} = p^{(y)}_j \text{ for all $j$}. $$ Note that the multipliers are determined only up to the shift $\lambda^{(x)}_i \to \lambda^{(x)}_i + t$, $\lambda^{(y)}_j \to \lambda^{(y)}_j - t$, which leaves every sum $\lambda^{(x)}_i+\lambda^{(y)}_j$ (and hence every $p_{ij}$) unchanged; correspondingly, one of the $2N$ constraint equations is redundant since $\sum_i p^{(x)}_i = \sum_j p^{(y)}_j$.
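The stationarity condition above can be checked symbolically for a single $(i,j)$ term of the Lagrangian (the constraint targets are constants and drop out of the $p_{ij}$ derivative). A minimal sketch using sympy, with generic symbols standing in for $c_{ij}$, $\lambda^{(x)}_i$, $\lambda^{(y)}_j$:

```python
import sympy as sp

# Symbols for one (i, j) entry: the count and the two multipliers.
p, c, lx, ly = sp.symbols('p c lambda_x lambda_y', positive=True)

# The part of the Lagrangian that depends on this particular p_{ij}.
F_term = c * sp.log(p) - lx * p - ly * p

# Solve dF/dp = 0 for p.
stationary = sp.solve(sp.diff(F_term, p), p)
print(stationary)  # [c/(lambda_x + lambda_y)]
```

This reproduces $p_{ij} = c_{ij}/(\lambda^{(x)}_i+\lambda^{(y)}_j)$ for each entry independently.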
I do not know how to solve the above equations except by numerical methods; I would like a closed-form solution if one exists.
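For reference, here is a minimal numerical sketch (not a closed form) of solving the multiplier equations with `scipy.optimize.root`. The gauge freedom is fixed by setting $\lambda^{(y)}_N = 0$ and dropping the redundant last column equation; the function name `solve_multipliers` and the example numbers are my own illustration, not part of the original problem.

```python
import numpy as np
from scipy.optimize import root

def solve_multipliers(c, px, py):
    """Solve sum_j c_ij/(lx_i + ly_j) = px_i and sum_i c_ij/(lx_i + ly_j) = py_j
    numerically, then return p_ij = c_ij / (lx_i + ly_j)."""
    N = c.shape[0]

    def residuals(z):
        lx = z[:N]
        ly = np.append(z[N:], 0.0)            # gauge: ly[N-1] = 0
        p = c / (lx[:, None] + ly[None, :])
        # N row equations plus N-1 column equations (the last column
        # equation is implied by the others since the marginals sum to 1).
        return np.concatenate([p.sum(axis=1) - px,
                               p.sum(axis=0)[:-1] - py[:-1]])

    # Initial guess: ly = 0 and lx_i = (row sum of c)/px_i, which already
    # satisfies the row equations exactly.
    z0 = np.concatenate([c.sum(axis=1) / px, np.zeros(N - 1)])
    sol = root(residuals, z0)
    lx = sol.x[:N]
    ly = np.append(sol.x[N:], 0.0)
    return c / (lx[:, None] + ly[None, :])

# Illustrative inputs (made up for the example).
c = np.array([[5., 1., 2.],
              [2., 7., 1.],
              [1., 2., 9.]])
px = np.array([0.3, 0.3, 0.4])
py = np.array([0.25, 0.35, 0.4])
p = solve_multipliers(c, px, py)
```

Since the log-likelihood is concave and the constraints are linear, an interior stationary point found this way is the unique constrained maximizer.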