Suppose that $f$ is a function such that $f(0) = y$ and $f(\pi) = 2$, and suppose that $\int_0^\pi \big(f(x) + f''(x)\big)\sin x\,dx$ exists. Find the value of this integral.
First I expand the integrand:

$$\int_0^\pi \big(f(x)\sin x + f''(x)\sin x\big)\,dx.$$

Then I subtract $f'(x)\cos x$ and add $f'(x)\cos x$ inside the integrand to obtain

$$\int_0^\pi \big(f(x)\sin x - f'(x)\cos x + f'(x)\cos x + f''(x)\sin x\big)\,dx$$
$$= \int_0^\pi \big(f(x)\sin x - f'(x)\cos x\big)\,dx + \int_0^\pi \big(f'(x)\cos x + f''(x)\sin x\big)\,dx$$
$$= \int_0^\pi \big({-f(x)\cos x}\big)'\,dx + \int_0^\pi \big(f'(x)\sin x\big)'\,dx,$$

using the product rule backwards.

But then I am stuck: I can integrate both terms, but I am not given a value for $f'(x)$, so how do I finish?
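(Sketch of where I think this might be going, though I am not sure: since $\sin 0 = \sin \pi = 0$, perhaps the $f'$ terms simply vanish at both endpoints, so no value of $f'$ is needed.)

```latex
\int_0^\pi \bigl(-f(x)\cos x\bigr)'\,dx + \int_0^\pi \bigl(f'(x)\sin x\bigr)'\,dx
  = \Bigl[\,-f(x)\cos x + f'(x)\sin x\,\Bigr]_0^\pi
% sin(0) = sin(pi) = 0, so the f'(x) sin(x) term vanishes at both endpoints
  = -f(\pi)\cos\pi + f(0)\cos 0
  = f(\pi) + f(0)
  = 2 + y.
```

If that endpoint evaluation is right, the answer would be $2 + y$, with no information about $f'$ required.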