Cumulative distribution function (CDF)
Based on @whuber's comments, their formula represents the probability that the
final $n-k$ observations are all non-successes, which is the same as the
probability that the first $k$ observations contain every existing success
(i.e., the CDF):
$$
P(k)=(1-p)^{n-k}
$$
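As a sanity check, here is a small Monte Carlo sketch of mine; it assumes the setup is $n$ independent Bernoulli($p$) observations, with $k$ the (1-based) position of the last success and $k=0$ when there are no successes, and compares the empirical CDF with $(1-p)^{n-k}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 5, 0.3, 200_000

# Simulate `trials` sequences of n Bernoulli(p) observations.
obs = rng.random((trials, n)) < p

# 1-based position of the last success in each sequence (0 if there is none).
last_success = np.where(obs.any(axis=1), n - np.argmax(obs[:, ::-1], axis=1), 0)

# Empirical CDF versus (1 - p)**(n - k) for each k.
for k in range(n + 1):
    print(k, np.mean(last_success <= k), (1 - p) ** (n - k))
```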
Probability mass function (PMF)
The PMF gives the probability that exactly $k$ observations are needed to find
all existing successes, i.e., that the last success occurs at observation $k$.
We can derive it from the CDF by differencing:
$$
\begin{align*}
p(k)&=P(k)-P(k-1)\\
&=(1-p)^{n-k}-(1-p)^{n-(k-1)}\\
&=(1-p)^{n-k}\left[1-(1-p)\right]\\
&=p(1-p)^{n-k}
\end{align*}
$$
i.e.
$$
p(k)=p(1-p)^{n-k},\qquad k\ge 1
$$
(For $k=0$ the differencing does not apply and the mass is simply
$P(0)=(1-p)^n$, the probability that there are no successes at all; since that
term is multiplied by $k=0$, it does not affect the expectation below.)
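As a quick numeric sketch (not part of the original derivation), one can check that the $k=0$ mass plus the masses for $k=1,\dots,n$ sum to one:

```python
n, p = 5, 0.3

# Mass at k = 0 (no successes at all) plus p*(1-p)**(n-k) for k = 1..n.
masses = [(1 - p) ** n] + [p * (1 - p) ** (n - k) for k in range(1, n + 1)]
print(sum(masses))  # ~1.0 up to floating-point error
```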
Expected value
To determine the expected value (mean) of our random variable, we use the
general expression for the expected value:
$$
E[X]=\sum_i x_i\,p(X=x_i)
$$
where:
- $E[X]$ is the expected value of the random variable $X$
- $x_i$ represents the possible values of the random variable $X$
- $p(X=x_i)$ is the probability mass function of the random variable $X$, which
gives the probability that $X$ takes on the value $x_i$
Substituting the terms from our problem, we get:
$$
\begin{align*}
E[k]&=\sum_{k=0}^{n}kp(1-p)^{n-k}\\
&=p(1-p)^n\sum_{k=0}^{n}k(1-p)^{-k}
\end{align*}
$$
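For a concrete reference value, here is a direct term-by-term evaluation of this sum (my own sketch); the closed form derived below should reproduce it:

```python
n, p = 5, 0.3

# E[k] evaluated term by term (the k = 0 term contributes nothing).
expected_k = sum(k * p * (1 - p) ** (n - k) for k in range(n + 1))
print(expected_k)
```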
We need to analytically simplify the summation $\sum_{k=0}^{n}k(1-p)^{-k}$. If
we define $r=(1-p)^{-1}$, the summation becomes:
$$
\sum_{k=0}^{n} kr^k
$$
To simplify our summation, note that the $k=0$ term vanishes, so it equals
$\sum_{k=1}^{n}kr^k$; we can now use the sum of the first $n$ terms of the
arithmetico-geometric sequence, which has the form:
$$
\sum_{k=1}^{n}\left[a+(k-1)d\right]br^{k-1}=\frac{ab-(a+nd)br^n}{1-r}+\frac{dbr(1-r^n)}{(1-r)^2}
$$
If we set $a=d=1$ and $b=r$, the expression becomes:
$$
\sum_{k=1}^{n}kr^k=\frac{r-(1+n)r^{n+1}}{1-r}+\frac{r^2(1-r^n)}{(1-r)^2}
$$
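A brute-force check of this closed form (a sketch of mine; any $r\neq 1$ works, including $r=(1-p)^{-1}>1$ as in our problem):

```python
n, r = 7, 1 / 0.7  # e.g. r = (1 - p)**(-1) with p = 0.3

# Direct sum versus the arithmetico-geometric closed form.
brute = sum(k * r**k for k in range(1, n + 1))
closed = (r - (1 + n) * r ** (n + 1)) / (1 - r) + r**2 * (1 - r**n) / (1 - r) ** 2
print(brute, closed)  # should agree up to floating-point error
```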
Substituting this into our $E[k]$ expression, and remembering that $r=(1-p)^{-1}$, we get:
$$
E[k]=p(1-p)^n\left[\frac{(1-p)^{-1}-(1+n)(1-p)^{-n-1}}{1-(1-p)^{-1}}+\frac{(1-p)^{-2}(1-(1-p)^{-n})}{(1-(1-p)^{-1})^2}\right]
$$
Using SymPy to simplify the previous expression, we obtain:
$$
E[k]=\frac{p \left(n - \left(1 - p\right)^{n} + 1\right) + \left(1 - p\right)^{n} - 1}{p}
$$
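For completeness, here is roughly what that SymPy step looks like (the symbol names and the exact-rational spot check are mine; the printed output of `simplify` may be arranged differently from the expression quoted above):

```python
import sympy as sp

n, p = sp.symbols("n p", positive=True)
q = 1 - p  # failure probability, so r = q**(-1)

# E[k] as written above, before simplification.
expr = p * q**n * (
    (q**(-1) - (1 + n) * q**(-n - 1)) / (1 - q**(-1))
    + q**(-2) * (1 - q**(-n)) / (1 - q**(-1)) ** 2
)

print(sp.simplify(expr))

# Spot-check the quoted simplification with exact rational numbers.
quoted = (p * (n - q**n + 1) + q**n - 1) / p
values = {n: 7, p: sp.Rational(3, 10)}
assert expr.subs(values) == quoted.subs(values)
```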
This can be further rewritten to arrive at @Hunaphu's formula:
$$
\begin{align*}
E[k]&=\frac{p \left(n - \left(1 - p\right)^{n} + 1\right) + \left(1 - p\right)^{n} - 1}{p}\\
&=n-(1-p)^n+1+\frac{(1-p)^n-1}{p}\\
&=n+\frac{p-p(1-p)^n+(1-p)^n-1}{p}\\
&=n+\frac{(1-p)^n(1-p)-(1-p)}{p}\\
&=n+\frac{(1-p)^{n+1}-(1-p)}{p}
\end{align*}
$$
Therefore, the answer to my question is:
$$
E[k]=n+\frac{(1-p)^{n+1}-(1-p)}{p}
$$
This is exactly @Hunaphu's formula, although it doesn't seem related to the
hypergeometric distribution, as they suggested.
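As a final sanity check (same assumed setup as in my earlier sketches), the closed form can be compared with the direct evaluation of $\sum_k k\,p(k)$:

```python
n, p = 5, 0.3

# Direct sum of k * p(k) versus the closed-form expression for E[k].
direct = sum(k * p * (1 - p) ** (n - k) for k in range(1, n + 1))
closed = n + ((1 - p) ** (n + 1) - (1 - p)) / p
print(direct, closed)  # should agree up to floating-point error
```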
In any case, thanks to @whuber for providing an idea of how to start solving
the problem, and to @Hunaphu for presenting the final answer, although without
an explanation of how they arrived at it.
If there's a shorter path to this result, please post another answer; I'd like
to know about it.