As a simple example, if you define the radical as just "take the positive root", then [imath]\sqrt{-1}\cdot\sqrt{-1}=i\cdot i=-1[/imath], while [imath]\sqrt{(-1)\cdot(-1)}=\sqrt{1}=1[/imath].
If you do define a principal square root (which is no longer as simple as "take the positive root", since complex numbers are neither positive nor negative), then you can't assume that the root of a product is the product of the roots; that is, [imath]\sqrt{ab}=\sqrt{a}\cdot\sqrt{b}[/imath] can fail. As a result, in evaluating an expression with such roots, you have to evaluate the radicals first, and not take shortcuts.
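If you want to check this numerically, Python's cmath module implements a principal complex square root, so it reproduces the example above; here's a minimal sketch (not part of the original exchange):

[code]
import cmath

# Product of the principal roots: i * i = -1 ...
print(cmath.sqrt(-1) * cmath.sqrt(-1))  # (-1+0j)

# ... but the principal root of the product: sqrt((-1)*(-1)) = sqrt(1) = 1.
print(cmath.sqrt(-1 * -1))              # (1+0j)
[/code]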
So I was doing a problem which had to do with [imath]\sqrt{i}[/imath] and [imath]\sqrt{-i}[/imath]. I found that [imath]\sqrt{i}=\frac{\sqrt{2}}{2}(1+i)[/imath], and so far as I can tell this is correct.
I then figured that [imath]\sqrt{-i}=i\sqrt{i}[/imath], and this too looks correct.
The problem I get is that, given this, [imath]\sqrt{-i}=i\cdot\frac{\sqrt{2}}{2}(1+i)[/imath]. But I've checked and this doesn't seem to be true. So where is the mistake?
Following through on the details, [imath]i[/imath] actually has two square roots, [imath]\pm\frac{\sqrt{2}}{2}(1+i)[/imath]; the one with positive real part is conventionally taken as the
principal root in appropriate contexts. Given this, [imath]\sqrt{i}=\frac{\sqrt{2}}{2}(1+i)[/imath] and [imath]\sqrt{-i}=\frac{\sqrt{2}}{2}({\color{red}{1-i}})[/imath].
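To see where these values come from, write [imath]i=e^{i\pi/2}[/imath] and [imath]-i=e^{-i\pi/2}[/imath]; the principal square root halves the argument, so [imath]\sqrt{i}=e^{i\pi/4}=\cos\frac{\pi}{4}+i\sin\frac{\pi}{4}=\frac{\sqrt{2}}{2}(1+i)[/imath] and [imath]\sqrt{-i}=e^{-i\pi/4}=\frac{\sqrt{2}}{2}(1-i)[/imath].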
Note that this is not equal to your [imath]\sqrt{-i}=i\sqrt{i}=i\cdot\frac{\sqrt{2}}{2}(1+i)=\frac{\sqrt{2}}{2}({\color{red}{i-1}})={\color{red}{-}}\frac{\sqrt{2}}{2}({\color{red}{1-i}})[/imath]. Squaring shows that [imath]i\sqrt{i}[/imath] really is a square root of [imath]-i[/imath], since [imath](i\sqrt{i})^2=i^2\cdot i=-i[/imath]; it's just the other root, not the principal one, which is why the sign comes out wrong.
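As a numeric sanity check, here's the same computation in Python (a minimal sketch; cmath.sqrt uses the same principal-root convention, with a branch cut along the negative real axis):

[code]
import cmath

# Principal square roots of i and -i:
print(cmath.sqrt(1j))       # (0.7071...+0.7071...j)  =  (sqrt(2)/2)(1+i)
print(cmath.sqrt(-1j))      # (0.7071...-0.7071...j)  =  (sqrt(2)/2)(1-i)

# i*sqrt(i) squares to -i, but it is the other (non-principal) root:
print(1j * cmath.sqrt(1j))  # (-0.7071...+0.7071...j) = -(sqrt(2)/2)(1-i)
[/code]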