So-called standardized regression coefficients are the unstandardized coefficients multiplied by the standard deviation of the predictor and divided by the standard deviation of the response. In the simplest case of a single predictor, the standardized coefficient equals the correlation between predictor and response. In the more general case, partial correlations come into play. A partial correlation can be computed from the multiple correlations of two regressions: one containing all the predictors, and one containing only the variables held constant. The squared partial correlation is (R²_full − R²_reduced) / (1 − R²_reduced).
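As a quick numerical check, here is a sketch in NumPy (the simulated data and the helper functions are purely illustrative, not part of any particular library's API):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # x2 correlated with x1
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

def ols_coefs(X, y):
    """Least-squares coefficients, intercept first."""
    A = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def r_squared(X, y):
    """Squared multiple correlation of y with the columns of X."""
    A = np.column_stack([np.ones(len(y)), X])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

# One predictor: the standardized coefficient equals the correlation.
b = ols_coefs(x1, y)[1]
beta = b * x1.std() / y.std()
r = np.corrcoef(x1, y)[0, 1]
assert np.isclose(beta, r)

# Squared partial correlation of y with x1, holding x2 constant,
# from the R^2 of the full model and the model with only x2:
R2_full = r_squared(np.column_stack([x1, x2]), y)
R2_red = r_squared(x2, y)
partial_sq = (R2_full - R2_red) / (1 - R2_red)

# Cross-check against the direct definition: correlate the residuals
# of y-on-x2 with the residuals of x1-on-x2.
ry = y - np.column_stack([np.ones(n), x2]) @ ols_coefs(x2, y)
rx = x1 - np.column_stack([np.ones(n), x2]) @ ols_coefs(x2, x1)
assert np.isclose(partial_sq, np.corrcoef(ry, rx)[0, 1] ** 2)
```

Both assertions are algebraic identities of least squares, so they hold exactly (up to floating point) for any data, not just this simulation.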
The "beta weights", or standardized coefficients, do provide a scale-free interpretation, but the multiple correlation needs to be considered as well, since it is the correlation between the fitted values and the response.
Each beta weight (or, equivalently, each unstandardized coefficient) is proportional to the corresponding partial correlation, but the constant of proportionality differs from one predictor to another; so across predictors the beta weights and the partial correlations need not agree, and their correlation cannot in general be unity.
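The per-predictor constant of proportionality can be made explicit: for predictor j, the beta weight equals the partial correlation times sqrt((1 − R²_y·others) / (1 − R²_j·others)), where "others" means the remaining predictors. A sketch verifying this numerically (simulated data and helper functions are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)          # correlated with x1
x3 = rng.normal(size=n)                     # independent of the rest
X = np.column_stack([x1, x2, x3])
y = 2 * x1 - x2 + 0.5 * x3 + rng.normal(size=n)

def fit(X, y):
    A = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def resid(X, y):
    """Residuals of y regressed on X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    return y - A @ fit(X, y)

# Beta weights: unstandardized coefficients times sd(x_j)/sd(y).
betas = fit(X, y)[1:] * X.std(axis=0) / y.std()

ratios = []
for j in range(X.shape[1]):
    others = np.delete(X, j, axis=1)
    # Partial correlation of y and x_j given the other predictors.
    r_part = np.corrcoef(resid(others, y), resid(others, X[:, j]))[0, 1]
    c_j = betas[j] / r_part                 # proportionality constant
    ratios.append(c_j)
    # It matches sqrt((1 - R2 of y on others) / (1 - R2 of x_j on others)).
    R2_y = 1 - resid(others, y).var() / y.var()
    R2_j = 1 - resid(others, X[:, j]).var() / X[:, j].var()
    assert np.isclose(c_j, np.sqrt((1 - R2_y) / (1 - R2_j)))
    print(f"beta = {betas[j]:+.3f}, partial r = {r_part:+.3f}, ratio = {c_j:.3f}")
```

Running this shows the ratio beta/partial differing across the three predictors, which is exactly why the beta weights, taken together, cannot line up perfectly with the partial correlations.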