I have a series of 24 raw scores which I have standardized by converting each one to a z-score. My assignment asks me to calculate the mean and standard deviation of these new scores (presumably to demonstrate that the mean of a series of z-scores is always 0 and the standard deviation is always 1).
As expected, the mean of the scores (which ranged from -1.79 to 1.33) = 0. However, upon calculating the standard deviation for the series of scores, I am getting 0.84 as the value (which is clearly incorrect). I have run through my math thrice to ensure that I didn't make a simple calculation error, so I cannot understand why my standard deviation would not come out as = 1.
I am using the formula:
SD = √( [ Σz² − (Σz)²/N ] / N )
where (Σz)² = 0 and N = 24
Reducing the formula to:
SD = √( Σz² / N )
To calculate Σz², I added the squares of the individual z-scores, which gave a value of ~16.74, so SD = √(16.74 / 24) ≈ 0.84.
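To make the procedure concrete, here is the same calculation written as a quick Python sketch. The raw scores in the list are placeholders rather than my actual data, and it assumes division by N (not N − 1) throughout, matching the formula above:

    import math

    # Placeholder raw scores -- NOT my actual data, just to illustrate the procedure.
    raw = [12, 15, 9, 22, 18, 14, 11, 20, 17, 13, 16, 19,
           10, 21, 15, 14, 18, 12, 23, 16, 13, 17, 19, 11]
    N = len(raw)

    # Step 1: standardize each raw score to a z-score (population SD, divide by N).
    mean = sum(raw) / N
    sd_raw = math.sqrt(sum((x - mean) ** 2 for x in raw) / N)
    zs = [(x - mean) / sd_raw for x in raw]

    # Step 2: apply the same SD formula used by hand:
    # SD = sqrt( [ sum(z^2) - (sum(z))^2 / N ] / N )
    sum_z = sum(zs)
    sum_z_sq = sum(z ** 2 for z in zs)
    sd_z = math.sqrt((sum_z_sq - sum_z ** 2 / N) / N)

    print(f"mean of z-scores: {sum_z / N:.6f}")  # ~0
    print(f"SD of z-scores:   {sd_z:.6f}")       # 1.0 when the same N is used in both steps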
If anyone could possibly give me a heads up as to where I went wrong (or perhaps suggest other reasons why the SD wouldn't equal 1), it would be GREATLY appreciated!