Because this is really the same as your previous question, with only the details changed, permit me to outline a general approach to finding moments of the maximum and minimum of a bivariate random variable.
Generally, then, let $(X,Y)$ be any bivariate random variable and write $M=\max(X,Y)$ and $m=\min(X,Y)$ for its extrema. For any numbers $i$ and $j$ (usually natural numbers), the (raw) $(i,j)$ moment of $(M,m)$ is
$$\mu_{i,j}(M,m) = E[M^i m^j].$$
These are relevant because correlation is a function of some of these moments. One formula for it is
$$\rho(M,m) = \frac{\mu_{1,1}(\bar M, \bar m)}{
\sqrt{\mu_{2,0}(\bar M, \bar m)\ \mu_{0,2}(\bar M, \bar m)}}$$
where $\bar M = M - \mu_{1,0}(M,m)$ and $\bar m = m - \mu_{0,1}(M,m)$ are the centered versions of $M$ and $m.$
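Consequently only five raw moments are needed, because centered moments reduce to raw moments in the familiar way:
$$\mu_{1,1}(\bar M, \bar m) = \mu_{1,1}(M,m) - \mu_{1,0}(M,m)\,\mu_{0,1}(M,m), \qquad \mu_{2,0}(\bar M, \bar m) = \mu_{2,0}(M,m) - \mu_{1,0}(M,m)^2,$$
and similarly for $\mu_{0,2}(\bar M, \bar m).$ Thus $\rho(M,m)$ is determined by $\mu_{1,0},$ $\mu_{0,1},$ $\mu_{1,1},$ $\mu_{2,0},$ and $\mu_{0,2}$ of $(M,m).$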
The idea is to condition the expectation on which of $X$ and $Y$ plays the role of $M$ and which plays the role of $m.$ There are three possibilities:
- When $X \gt Y,$ $X=M$ and $Y=m.$
- When $X \lt Y,$ $X=m$ and $Y=M.$
- When $X = Y,$ $X=Y=M=m.$
Thus, according to the basic properties of conditional expectation,
$$\begin{aligned}
\mu_{i,j}(M,m) &= E[X^i Y^j\mid X\gt Y]\Pr(X\gt Y)\\
& + E[Y^i X^j\mid X\lt Y]\Pr(X \lt Y)\\
& + E[X^{i+j}\mid X=Y]\Pr(X=Y) \\
\\
& = E\left[X^i Y^j\,\mathcal I(X\gt Y)\right] + E\left[Y^i X^j\,\mathcal I(X\lt Y)\right] +E\left[X^{i+j}\,\mathcal I(X = Y)\right].
\end{aligned}$$
When $\Pr(X=Y)=0,$ as when $(X,Y)$ has a continuous distribution, the third term drops out.
When $(X,Y)$ has a density function $f,$ its distribution is continuous, so this is the case. Fubini's Theorem usually applies and reduces the foregoing to
$$\begin{aligned}
&\mu_{i,j}(M,m) \\ &= \int_{-\infty}^\infty y^j \left(\int_y^\infty x^if(x,y)\mathrm dx\right)\mathrm dy + \int_{-\infty}^\infty y^i \left(\int_{-\infty}^yx^j f(x,y)\mathrm dx\right)\mathrm dy.
\end{aligned}\tag{*}$$
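As a quick check on $(*),$ set $i=j:$ the two integrals then combine into a single integral over the whole plane, giving
$$\mu_{i,i}(M,m) = E[X^i Y^i],$$
as it must, because $Mm = XY$ identically (the maximum and minimum are just $X$ and $Y$ in some order).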
## Example
Suppose, as in the present question,
$$f(x,y) = C xy(x+y)\ \mathcal{I}_{[0,1]\times [0,1]}.$$
The indicator function $\mathcal I$ explicitly states the density is zero unless $0\le x \le 1$ and $0 \le y \le 1.$
We plug this $f$ into $(*)$ and compute
$$\begin{aligned}
\mu_{i,j}(M,m) &= \int_{-\infty}^\infty\int_y^\infty x^i y^j C xy(x+y)\mathcal{I}_{[0,1]\times [0,1]}\,\mathrm dx\mathrm dy \\
&+ \int_{-\infty}^\infty\int_{-\infty}^y y^i x^j C xy(x+y)\mathcal{I}_{[0,1]\times [0,1]}\,\mathrm dx\mathrm dy\\
&= C\int_0^1 \int_y^1 x^i y^j xy(x+y)\,\mathrm dx\mathrm dy
+ C \int_0^1\int_{0}^y y^i x^j xy(x+y)\,\mathrm dx\mathrm dy\\
&= C\int_0^1\left\{\int_y^1 x^{i+2}y^{j+1} + x^{i+1}y^{j+2}\,\mathrm dx+ \int_0^y x^{j+2}y^{i+1} + x^{j+1}y^{i+2}\,\mathrm dx\right\}\mathrm dy
\end{aligned}$$
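Every inner integral here is an instance of one elementary form: for exponents $a$ and $b$ (standing for the various values $i+1,$ $i+2,$ $j+1,$ $j+2$ appearing above),
$$\int_y^1 x^{a}y^{b}\,\mathrm dx = y^{b}\,\frac{1-y^{a+1}}{a+1}, \qquad \int_0^y x^{a}y^{b}\,\mathrm dx = \frac{y^{a+b+1}}{a+1}.$$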
The rest is just calculation. For the values of $i$ and $j$ needed in the question (natural numbers), the formula is a sum of four univariate integrals over $x$ of this same elementary form; evaluate one, apply the result in all four cases, integrate with respect to $y,$ and plug the answers into the formula for $\rho(M,m).$ (The constant $C$ is determined by the requirement that $\mu_{0,0}(M,m) = E[1] = 1.$)
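Finally, if you would like to check your hand computation, here is a minimal SymPy sketch (the names `g`, `mu`, and so on are mine, introduced only for illustration) that applies $(*)$ to this density, finds $C$ from $\mu_{0,0}(M,m)=1,$ and evaluates $\rho(M,m)$ exactly:

```python
from sympy import symbols, integrate, sqrt, simplify

x, y = symbols("x y", positive=True)

# Unnormalized density from the question; the indicator is handled by
# integrating over [0, 1] x [0, 1] throughout.
g = x * y * (x + y)

# mu_{0,0} = 1 determines the normalizing constant C.
C = 1 / integrate(g, (x, 0, 1), (y, 0, 1))
f = C * g

def mu(i, j):
    """Raw (i, j) moment of (M, m) via formula (*), splitting at x = y."""
    upper = integrate(x**i * y**j * f, (x, y, 1))   # x > y: M = X, m = Y
    lower = integrate(x**j * y**i * f, (x, 0, y))   # x < y: M = Y, m = X
    return integrate(upper + lower, (y, 0, 1))

cov   = mu(1, 1) - mu(1, 0) * mu(0, 1)
var_M = mu(2, 0) - mu(1, 0) ** 2
var_m = mu(0, 2) - mu(0, 1) ** 2
rho   = simplify(cov / sqrt(var_M * var_m))
print(C, rho)
```

Running it prints $C$ (which works out to $3$ for this density) together with the exact correlation of the maximum and minimum.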