Hello (not sure whether this is the right forum; differential equations, hmm, I doubt it belongs there, but it is an equation at least...)
A pool has a leak that drains 3% of the water currently in the pool every second. When full, the pool contains 500,000 liters of water. How long does it take until the pool contains only 1/10 of that water, assuming the pool was 100% full before the leak began?
My calculations:
100% of the water in the pool = 500,000
1/10 of the water = 50,000
3% drained per second means 97% remains, i.e. a factor of 0.97 each second
Let's use those numbers for the calculation:
50,000 = 500,000 * 0.97^t
0.1 = 0.97^t
lg(0.1) = t * lg(0.97)
t = lg(0.1) / lg(0.97)
t = -0.0132...
What is this? Where is my mistake?
t is in seconds (t for time...).
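As a sanity check, the division in the last step can be evaluated numerically (a minimal sketch in Python; `math.log10` is the base-10 log written above as lg). Note that lg(0.97) on its own is about -0.0132, which matches the posted result, so it looks like the division by lg(0.1) never happened on the calculator:

```python
import math

# Solve 0.1 = 0.97**t for t by taking base-10 logs:
# lg(0.1) = t * lg(0.97)  =>  t = lg(0.1) / lg(0.97)
numerator = math.log10(0.1)    # exactly -1
denominator = math.log10(0.97) # approximately -0.0132

t = numerator / denominator
print(round(t, 1))  # roughly 75.6 seconds
```

Dividing -1 by -0.0132... gives a positive t of roughly 75.6 seconds, which is the expected order of magnitude for losing 90% of the water at 3% per second.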