I'm comparing a variable that ranges from 100 to 500 against one that ranges from 0.001 to 1. The Pearson correlation coefficient changes only in the thousandths place depending on whether I divide the larger set of numbers by 1000. Is that just some sort of rounding error, or is it actually important?
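For what it's worth, here's a quick sanity check you could run (a minimal sketch, assuming Python with numpy is available; the data here is made up just to match the ranges described). Pearson's r is invariant under positive linear rescaling of either variable, so dividing one column by 1000 should only move the result by floating-point noise:

```python
import numpy as np

# Hypothetical data matching the ranges above:
# x spans roughly 100-500, y spans roughly 0.001-1.
rng = np.random.default_rng(0)
x = rng.uniform(100, 500, size=50)
y = 0.001 + (x - 100) / 400 * 0.999 + rng.normal(0, 0.05, size=50)

r_original = np.corrcoef(x, y)[0, 1]
r_rescaled = np.corrcoef(x / 1000, y)[0, 1]

# Pearson's r is scale-invariant, so these should agree to machine precision.
print(r_original, r_rescaled)
```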
Lastly, if I have to normalize the data to use the Pearson correlation coefficient correctly, does anyone know of an easy online calculator for that? I'm loath to paste in each data point one at a time. I know how to do it in Excel in three steps, but I'm hoping for a simple paste-and-click (yes, I'm embarrassingly lazy). Thanks for the math help, and thanks for any laziness help you know of.
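If normalization does turn out to be wanted, a few lines of Python (again assuming numpy; the lists here are just placeholder data) can stand in for the three Excel steps, and you can paste a whole column in at once rather than point by point:

```python
import numpy as np

def zscore(values):
    """Standardize a sequence to mean 0 and standard deviation 1."""
    a = np.asarray(values, dtype=float)
    return (a - a.mean()) / a.std(ddof=1)

# Paste each column in as a plain list and print the normalized values.
x = [120, 250, 310, 480]      # hypothetical data in the 100-500 range
y = [0.01, 0.3, 0.45, 0.9]    # hypothetical data in the 0.001-1 range
print(zscore(x))
print(zscore(y))
```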