Ok, I'm given y = x - x^2 (or y = -(x-1)^2 + 1). I'm also told that there is a line through the origin that divides the area between that curve and the x-axis into two equal regions, and I'm asked to find the slope of that line.
Just checking my work here. Basically I get down to solving for the point of intersection of y = mx and y = x - x^2, so I set them equal and get x = 1 - m. Since the total area under the parabola from 0 to 1 is 1/6, each region should have area 1/12. So I set up the integral from 0 to 1-m of [(x - x^2) - mx] dx = 1/12, and I get something like this...
[x^2/2 - x^3/3 - mx^2/2] evaluated from 0 to 1-m = 1/12, and I plug in 1 - m for x. I end up with 1 - 3m + 9m^2 + 6m^3 = 1/12. I may have messed up the algebra or the whole idea, but I can't seem to find a correct value. When I solve that equation, I get a negative number (about -1.82), and I'm pretty sure that's not right. I assume I integrate with m treated as a constant? Help would be much appreciated!
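In case it helps to see where my hand algebra goes wrong, here's a quick SymPy sketch of the same setup (assuming SymPy is available; this isn't part of my hand work) that integrates with m kept as a symbolic constant and then solves the half-area equation for m:

import sympy as sp

x, m = sp.symbols('x m', real=True)

# Integrate the gap between the parabola and the line from 0 to 1 - m,
# treating m as a constant, exactly as in the setup above.
area = sp.integrate((x - x**2) - m*x, (x, 0, 1 - m))
print(sp.factor(area))  # should be equivalent to (1 - m)**3 / 6

# The total area under y = x - x^2 on [0, 1] is 1/6, so the piece above
# the line should be 1/12.
sols = sp.solve(sp.Eq(area, sp.Rational(1, 12)), m)
real_sols = [s for s in sols if s.is_real]
print(real_sols, [sp.N(s) for s in real_sols])  # the real root in (0, 1) is the slope

Comparing the factored integral it prints against my expansion after plugging in 1 - m should show where my algebra went off.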