Suppose we play a game where we start with an initial amount of money $S$ and we wish to reach a larger target amount of money $T$. The game is played in rounds as follows. There is a coin that has probability $p$ of coming up heads. Each round we decide how much to wager (at most our current amount of money); if the coin flip is heads we win that amount, otherwise we lose that amount. Coin flips in different rounds are independent. Furthermore, if our current amount of money ever drops below $S$, we are replenished back to $S$.
We may as well assume $S = 1$. Then, in terms of $T$, $p$, and our current amount of money in a particular round, how much should we wager that round? The goal is to minimize the expected number of rounds until our current amount of money is at least $T$. I have a feeling that for $p$ around $0.75$ or so, the optimal strategy might not be to wager everything each round.
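One natural way to formalize this (my own phrasing of the objective, writing $V(m)$ for the minimal expected number of remaining rounds when our current amount of money is $m$) is as a Bellman equation:

$$V(m) \;=\; 1 \;+\; \min_{0 \le w \le m}\Bigl[\,p\,V(m+w) \;+\; (1-p)\,V\bigl(\max(m-w,\,1)\bigr)\Bigr], \qquad V(m) = 0 \ \text{for } m \ge T,$$

where the $\max(m-w, 1)$ encodes the replenishment rule, and I am assuming the minimum is attained. The question is then what $w$ achieves the minimum as a function of $m$, $p$, and $T$.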
UPDATE: I ran some simulations for various $T$ with $p = 0.75$, assuming we wager a fixed fraction $r$ of our current amount of money each round (so betting everything every time means $r = 1$). For $T = 100$, setting $r = 1$ nearly minimized the expected number of rounds, but not quite: $r = 0.95$ was slightly better and seemed close to optimal. For $T = 500$, the optimum seemed to be closer to $r = 0.65$. Of course, a fixed $r < 1$ cannot be optimal in general, precisely because the optimal fixed $r$ depends on $T$: if our current amount of money is $M$, the state of the game is effectively a fresh start with $S' = 1$ and $T' = T / M$, so as we win or lose and get closer to or farther from $T$, we should vary $r$ accordingly. To me this makes the question even more interesting, because what matters is our current amount of money relative to $T$, and what fraction of it we should wager.
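For concreteness, here is a minimal sketch of the kind of simulation I mean (Python; the function name, the trial count, and the use of `max` for replenishment are my own illustrative choices):

```python
import random

def expected_rounds(p, T, r, trials=20_000):
    """Monte Carlo estimate of the expected number of rounds to reach T,
    wagering a fixed fraction r of the current amount of money each round,
    starting from S = 1 and replenishing back to 1 whenever we drop below 1."""
    total = 0
    for _ in range(trials):
        money, rounds = 1.0, 0
        while money < T:
            wager = r * money
            if random.random() < p:   # heads: win the wager
                money += wager
            else:                     # tails: lose the wager
                money -= wager
            money = max(money, 1.0)   # replenish back to S = 1 if needed
            rounds += 1
        total += rounds
    return total / trials

if __name__ == "__main__":
    for r in (0.65, 0.95, 1.0):
        print(f"r = {r:.2f}: ~{expected_rounds(0.75, 100, r):.1f} rounds")
```

Sweeping $r$ over a grid with this kind of estimator is how I arrived at the rough optima quoted above.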