Simple math optimization problem
I ran into a problem that can be formulated / simplified as the following game:
You have two numbers - X and Y. The game consists of N rounds; in each round
you must *use* one of the numbers. The rules are: you start by using one of the
numbers, and on some round P you can switch to using the second number. You can't
switch back. Each round, the number that's being *used* is multiplied by a factor
F (with 0 < F < 1), so it becomes a bit smaller. Your goal is to maximize the total
value - the sum of all the numbers you *used*.
For example, say N is 90 and you decide to use X for P = 10 rounds, and then
use Y for the rest. You start using X, so after the first round it's no longer
X, but X*F. After the second round it's X*F*F, etc. After round P, you switch to Y,
which then starts decreasing in the same way.
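To make this concrete, here's a minimal Python sketch (the function name and the
example parameters are mine, just for illustration) that computes the total for a
given switch round P:

```python
def total_value(x, y, f, n, p):
    """Total of all numbers used over n rounds, switching from x to y after p rounds."""
    total = 0.0
    for i in range(p):          # p rounds of using x: x, x*f, x*f^2, ...
        total += x * f**i
    for i in range(n - p):      # the remaining n - p rounds use y: y, y*f, ...
        total += y * f**i
    return total

# For example: X=100, Y=80, F=0.95, N=90, switch after 10 rounds.
print(total_value(100.0, 80.0, 0.95, 90, 10))
```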
Since N in my case is 90, this problem isn't hard to solve with brute force; in
fact, that's how I started. But then I got interested - can we solve this
problem analytically? It turns out we indeed can!
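For reference, the brute force I started with is just a loop over every possible
switch round; a rough, self-contained sketch:

```python
def best_switch_round(x, y, f, n):
    """Brute force: try every possible switch round P and return the best one."""
    def value(p):
        return sum(x * f**i for i in range(p)) + sum(y * f**i for i in range(n - p))
    return max(range(n + 1), key=value)

# For example: X=100, Y=80, F=0.95, N=90.
print(best_switch_round(100.0, 80.0, 0.95, 90))
```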
The total value can be expressed as follows:
Value = X*SUM(i=0..P-1)F^i + Y*SUM(i=0..N-P-1)F^i
Using the formula for the sum of a geometric series:
Value = X*(1-F^P)/(1-F) + Y*(1-F^(N-P))/(1-F)
      ==> Value = (X+Y)/(1-F) - X*F^P/(1-F) - Y*F^(N-P)/(1-F)
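As a quick sanity check on the algebra, the closed form can be coded up and compared
against a direct summation (assuming 0 < F < 1, so that 1 - F isn't zero); a sketch:

```python
def total_value_closed_form(x, y, f, n, p):
    """Closed-form total via the geometric series sum formula (assumes f != 1)."""
    return x * (1 - f**p) / (1 - f) + y * (1 - f**(n - p)) / (1 - f)

# Quick check against a direct summation for one arbitrary parameter setting.
x, y, f, n, p = 100.0, 80.0, 0.95, 90, 10
direct = sum(x * f**i for i in range(p)) + sum(y * f**i for i in range(n - p))
assert abs(direct - total_value_closed_form(x, y, f, n, p)) < 1e-6
```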
So, differentiate with respect to P and set the derivative to 0:
-X*ln(F)*F^P/(1-F) + Y*ln(F)*F^(N-P)/(1-F) = 0 ==> X*F^(2P) = Y*F^N
Taking log base F of both sides gives 2P = N + log{base F}(Y / X), and finally:
P = (N + log{base F}(Y / X)) / 2
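In code this is a one-liner (my own naming again; note that this P is a real number,
while rounds are whole, so in practice it needs rounding and clamping to [0, N]):

```python
import math

def analytic_switch_round(x, y, f, n):
    """P = (N + log_F(Y/X)) / 2 -- round (and clamp to [0, N]) before using it
    as an actual round count."""
    return (n + math.log(y / x, f)) / 2

print(analytic_switch_round(100.0, 80.0, 0.95, 90))
```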
Checking it out, it gives sensible results (for instance, if X = Y then P = N/2, as
expected). Also, comparing it to the brute force approach confirms that it is
indeed correct.
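That comparison looks roughly like this, reusing the best_switch_round and
analytic_switch_round sketches from above (the parameter values are arbitrary):

```python
for x, y, f in [(100.0, 80.0, 0.95), (50.0, 200.0, 0.9), (70.0, 70.0, 0.98)]:
    n = 90
    p_brute = best_switch_round(x, y, f, n)
    p_real = analytic_switch_round(x, y, f, n)
    # The brute-force winner should be either the floor or the ceiling of the
    # analytic optimum (the total is concave in P).
    print(f"X={x}, Y={y}, F={f}: brute force P={p_brute}, analytic P={p_real:.2f}")
```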