Could anyone help me with this integral? Thanks in advance!

Hi!

So I am having trouble with the integral from -∞ to ∞ of sin^2(x)/x^6 dx. I don't know of any way to find a primitive function of the integrand. I think that the value of the integral is supposed to be divergent (but I could be wrong), so if you know of any way to show that it is without computing the integral, that is also appreciated!
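For what it's worth, a quick numerical probe along these lines (a minimal sketch in Python, assuming NumPy and SciPy are available; it only hints at the behaviour near x = 0 and proves nothing) would be:

# Rough numerical check (not a proof): integrate sin(x)^2 / x^6 from eps to 1
# for shrinking eps and see whether the values settle down or blow up.
import numpy as np
from scipy.integrate import quad

def integrand(x):
    return np.sin(x)**2 / x**6

for eps in (1e-1, 1e-2, 1e-3, 1e-4):
    # limit raised because the integrand varies over many orders of magnitude
    value, _ = quad(integrand, eps, 1.0, limit=200)
    print(f"integral from {eps:g} to 1 is roughly {value:.3g}")

Since the integrand is positive and very large near 0, the printed values should keep growing as eps shrinks rather than settling down, which is part of why I suspect divergence.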

Thanks!
 
The value is divergent? What does that mean?

What causes you to lean toward failure to converge? Is there a finite limit at x = 0?

Is (0,∞) ∫ sin(x)/x^3 dx any easier or even helpful?

Please give us more than hunches.
 
Is the function even or odd? After you answer this we will talk more.
It's an even function, since the exponents are even, so the integral can be written as 2 times the integral from 0 to ∞, but I'm still having trouble getting anything meaningful from that expression.
 
The value is divergent? What does that mean?

What causes you to lean toward failure to converge? Is there a finite limit at x = 0?

Is (0,∞) ∫ sin(x)/x^3 dx any easier or even helpful?

Please give us more than hunches.
My thinking is that the integral for 1/x^6 diverges, and since 1 = sin^2(x) + cos^2(x) it can be written as the integral for sin^2(x)/x^6 + the integral for cos^2(x)/x^6, which itself has to diverge. I would expect sin^2(x)/x^6 and cos^2(x)/x^6 to behave relatively similarly and therefore would assume that they both diverge. Thus the integral for sin^2(x)/x^6 diverges. But I don't know how to prove it.

I might be overlooking something, but I don't see how sin(x)/x^3 makes anything easier.
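In symbols, restricting to (0, 1) where I think the trouble is, the splitting step I mean is

\[
\int_0^1 \frac{dx}{x^6} \;=\; \int_0^1 \frac{\sin^2(x)}{x^6}\,dx \;+\; \int_0^1 \frac{\cos^2(x)}{x^6}\,dx,
\]

and since the left-hand side diverges, at least one of the two integrals on the right must diverge; what I can't justify is that it is specifically the sin^2 one.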
 
It's an even function, since the exponents are even, so the integral can be written as 2 times the integral from 0 to ∞, but I'm still having trouble getting anything meaningful from that expression.
What if sin(x) and sin(-x) were not the same value and did not just differ in sign? In that case squaring sin(x) and sin(-x) would NOT give the same value. You need to be careful here. You were lucky, because sin(x) and sin(-x) do differ only in sign.
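To spell the check out:

\[
\frac{\sin^2(-x)}{(-x)^6} \;=\; \frac{(-\sin x)^2}{x^6} \;=\; \frac{\sin^2(x)}{x^6},
\]

so the integrand really is even, and rewriting the integral as 2 times the integral from 0 to ∞ is justified.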
 
My thinking is that the integral for 1/x^6 diverges, and since 1 = sin^2(x) + cos^2(x) it can be written as the integral for sin^2(x)/x^6 + the integral for cos^2(x)/x^6, which itself has to diverge. I would expect sin^2(x)/x^6 and cos^2(x)/x^6 to behave relatively similarly and therefore would assume that they both diverge. Thus the integral for sin^2(x)/x^6 diverges. But I don't know how to prove it.

I might be overlooking something, but I don't see how sin(x)/x^3 makes anything easier.
"the integral for cos^2(x)/x^6, which itself has to diverge." --- Why?

The sin curve and the cos curve are just a phase shift apart. Maybe after you show that the integral for cos^2(x)/x^6 diverges, you can write sin(x) in terms of cosine and show that the original integral also diverges. This all rests on the fact that you seem positive that the integral for cos^2(x)/x^6 diverges.
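For reference, the identity I have in mind is sin(x) = cos(x - π/2), so

\[
\frac{\sin^2(x)}{x^6} \;=\; \frac{\cos^2\!\left(x - \tfrac{\pi}{2}\right)}{x^6}.
\]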
 
It's an even function, since the exponents are even, so the integral can be written as 2 times the integral from 0 to ∞, but I'm still having trouble getting anything meaningful from that expression.

If you could find a simple function f(x) that fulfills ALL of the following:

1) ∫ f(x)/x^6 dx is easy to do
2) In the limit as x -> 0, f(x)/x^6 stays smaller than sin^2(x)/x^6
3) The integral (0,∞)∫ f(x)/x^6 dx diverges (at x=0)

then what would this prove?
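(For what it's worth, one possible candidate, certainly not the only one, would be f(x) = x^2/4, using the bound sin(x) ≥ x/2 on (0, π/2]:

\[
\frac{f(x)}{x^6} \;=\; \frac{1}{4x^4} \;\le\; \frac{\sin^2(x)}{x^6} \quad \text{on } \left(0, \tfrac{\pi}{2}\right],
\qquad\text{and}\qquad
\int_0^{\pi/2} \frac{dx}{4x^4} \;=\; \infty,
\]

but the question of what conditions 1) to 3) together would prove is still the step to think about.)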
 