How long does it take before X is equal to 1/10 of X...

ManyTimes

New member
Joined
Feb 8, 2010
Messages
14
Hello (no idea if this is the wrong or right forum... differential equations, hmm, I doubt it belongs here? It is an equation at least...)

A pool has a leak that drains 3% of the water currently in it each second! When full, the pool contains 500.000 liters of water. How long does it take until the pool contains only 1/10 of that, assuming the pool was 100% full before the leak began?

My calculations:
100% of water in the pool = 500.000
1/10 of water = 50.000
losing 3% each second = multiplying by 0.97 each second
Let's use those numbers for further calculations:
50.000 = 500.000 *0.97^t
0.1 = 0.97^t
lg(0.1) = t* lg(0.97)
t = lg(0.1) / lg(0.97)
t = -0.0132...

What is this, where is my mistake?
T is in "seconds" (t, time...). :)
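The algebra above can be checked numerically (a quick sketch, not part of the original post). Note that -0.0132... is the value of lg(0.97) by itself; the quotient lg(0.1)/lg(0.97) comes out near 76:

```python
import math

# Last step of the derivation: t = lg(0.1) / lg(0.97)
t = math.log10(0.1) / math.log10(0.97)
print(t)                 # ≈ 75.6 seconds, which rounds up to 76
print(math.log10(0.97))  # ≈ -0.0132, the denominator on its own
```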
 
Try this...

Find out how many liters of water have to leak out to leave the necessary 10% in the pool. Now find out how much water leaves the pool every second. Now you can find out how many seconds it will take to get rid of the necessary amount of water.
 
Since you posted in the Differential Equations forum, I assume you need to use calculus to solve your problem, correct?
 
>>Find out how many liters of water have to leak out to leave the necessary 10% in the pool. Now find out how much water leaves the pool every second. Now you can find out how many seconds it will take to get rid of the necessary amount of water.

Not bad... Off topic:
450.000 / (3% of 450.000) = 450.000 / 13.500 = 33,33..., but the answer is 76, because I programmed a loop to calculate it for me, to check my answer... I just want the formula

On topic:
But I want to use my method, the equation... I just want to find the time, T in my equation... And it should be around 76... :)

>>I assume you need to use calculus to solve your problem, correct?
I have no idea what you mean by "use calculus"... So, no idea! :)
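The loop mentioned above wasn't posted, but a check like it might look as follows (a sketch, assuming the pool keeps 97% of its water each second and we count whole seconds until 1/10 remains):

```python
# Simulate the leak second by second: each second the pool keeps
# 97% of its current water. Count the seconds until only 1/10
# (50.000 liters) of the original 500.000 liters remains.
water = 500_000.0
target = 50_000.0
seconds = 0
while water > target:
    water *= 0.97
    seconds += 1
print(seconds)  # 76
```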
 
If the water loss were constant...meaning the pool lost 3% of the original 500,000 L every second, it would be...

500,000 * (.03) = 15,000 L

Water to be drained: 90% of 500,000 = 450,000 L, therefore...

15,000(t) = 450,000

t = 450,000/15,000 = 30 sec.

But that is wrong...we have to take into account that during most of the seconds, there wasn't 500,000 L in the pool...there was less.

It sounds like a compound interest problem in disguise.

The equation is...

A(1+i)^t = E, where

A=the initial amount of "money" or water in our case

i=the "interest rate" (in our case it is negative, unlike most compound interest problems. And instead of an interest rate %, it is a percent water loss %)

t = the number of time intervals (what do you think this is?)

E = the new principal after all the time intervals have passed.

Try this out; let me know if you need help.
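Plugging the pool's numbers into that compound-interest formula and solving for t with logarithms (a sketch using the variable names above, not the thread's own worked solution):

```python
import math

A = 500_000   # initial amount of "money": water in the pool (liters)
i = -0.03     # "interest rate": 3% water loss per second, so negative
E = 50_000    # new principal: 1/10 of the pool

# A * (1 + i)**t = E  =>  t = log(E / A) / log(1 + i)
t = math.log(E / A) / math.log(1 + i)
print(t)  # ≈ 75.6; counting whole seconds, the pool reaches 1/10 at t = 76
```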
 
ManyTimes said:
Hello, (no idea if this is the wrong/right forum, differential equations, hmm, I doubt it belongs here? It is an equation at least...)

A pool has a leak that drains 3% of the water currently in it each second! When full, the pool contains 500.000 liters of water. How long does it take until the pool contains only 1/10 of that, assuming the pool was 100% full before the leak began?

My calculations:
100% of water in the pool = 500.000
1/10 of water = 50.000
losing 3% each second = multiplying by 0.97 each second
Let's use those numbers for further calculations:
50.000 = 500.000 *0.97^t
0.1 = 0.97^t
lg(0.1) = t* lg(0.97)
t = lg(0.1) / lg(0.97)
t = -0.0132...

What is this, where is my mistake?
T is in "seconds" (t, time...). :)

Let the amount of water in the pool at a time (t) = W(t)

then

dW/dt = - 0.03 * W

dW/W = - 0.03 * dt

Integrating both sides

ln(W) = -0.03*t + C

W = W_o * e^(-0.03*t)

W/W_o = e^(-0.03*t)

1/10 = e^(-0.03*t)

ln(10) = 0.03 * t

t = ln(10)/0.03 ≈ 76.75 seconds
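A quick numeric check of this continuous model (a sketch; note it gives a slightly larger t than the per-second 0.97^t model, because continuous decay at rate 0.03 is a bit slower than losing a full 3% each whole second):

```python
import math

# Continuous model: W(t) = W0 * exp(-0.03 * t); solve W(t) = W0 / 10
t = math.log(10) / 0.03
print(t)  # ≈ 76.75 seconds

# Sanity check: at that t the pool is down to one tenth
W0 = 500_000
W = W0 * math.exp(-0.03 * t)
print(W)  # ≈ 50000
```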
 