Correlation_XY = ρ_XY = σ_XY / (σ_X σ_Y)

The correlation can never be greater than 1 or less than minus 1. A correlation close to zero indicates that the two variables are unrelated. A positive correlation indicates that the two variables move together, and the relationship is stronger the closer the correlation gets to 1. A negative correlation indicates that the two variables move in opposite directions, and that relationship also gets stronger the closer the correlation gets to minus 1. Two variables that are perfectly positively correlated (r = 1) essentially move in perfect proportion in the same direction, while two assets that are perfectly negatively correlated move in perfect proportion in opposite directions.
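The bounds described above can be checked directly. A minimal sketch in plain Python (the function name `correlation` is illustrative, not from the text) computing the Pearson correlation as the covariance divided by the product of the standard deviations:

```python
from statistics import mean

def correlation(x, y):
    # Pearson correlation: covariance over the product of standard deviations
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Perfectly positively correlated: y moves in perfect proportion to x
print(correlation([1, 2, 3, 4], [2, 4, 6, 8]))   # 1.0
# Perfectly negatively correlated: y moves in perfect proportion, opposite direction
print(correlation([1, 2, 3, 4], [8, 6, 4, 2]))   # -1.0
```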

A simple regression is an extension of the correlation/covariance concept that goes one step further. It attempts to explain one variable, called the dependent variable, using the other variable, called the independent variable. Keeping with statistical tradition, let Y be the dependent variable and X be the independent variable. If the two variables are plotted against each other on a scatter plot, with Y on the vertical axis and X on the horizontal axis, the regression attempts to fit a straight line through the points in such a way as to minimize the sum of the squared deviations of the points from the line. Consequently, it is called ordinary least squares (OLS) regression. When such a line is fit, two parameters emerge - one is the point at which the line cuts through the Y axis, called the intercept of the regression, and the other is the slope of the regression line.

OLS Regression: Y = a + bX

The slope (b) of the regression measures both the direction and the magnitude of the relationship. When the two variables are positively correlated, the slope will also be positive, whereas when the two variables are negatively correlated, the slope will be negative. The magnitude of the slope of the regression can be read as follows - for every unit increase in the independent variable (X), the dependent variable (Y) will change by b (the slope). The close linkage between the slope of the regression and the correlation/covariance should not be surprising, since the slope is estimated using the covariance -

Slope of the Regression = b = σ_YX / σ²_X

The intercept (a) of the regression can be read in a number of ways. One interpretation is that it is the value that Y will have when X is zero. Another is more straightforward, and is based upon how it is calculated: it is the difference between the average value of Y and the slope-adjusted average value of X -

Intercept of the Regression = a = Average(Y) - b × Average(X)
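Both parameters can be computed directly from the sample moments, as described above: the slope as the covariance over the variance of X, and the intercept from the means. A minimal sketch in plain Python (the function name `ols_fit` is illustrative, not from the text):

```python
from statistics import mean

def ols_fit(x, y):
    # Least-squares line Y = a + bX:
    # slope b = covariance(X, Y) / variance(X)
    # intercept a = mean(Y) - b * mean(X)
    mx, my = mean(x), mean(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Points lying exactly on Y = 1 + 2X recover intercept 1 and slope 2
a, b = ols_fit([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)  # 1.0 2.0
```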

Regression parameters are always estimated with some noise, partly because the data is measured with error and partly because we estimate them from samples of data. This noise is captured in a couple of statistics. One is the R-squared of the regression, which measures the proportion of the variability in Y that is explained by X. It is a direct function of the correlation between the variables -

R-squared of the Regression = (Correlation_YX)² = ρ²_YX
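The identity between R-squared and the squared correlation can be verified numerically. A minimal sketch in plain Python (the function name `r_squared` and the sample data are illustrative, not from the text), computing R-squared as the proportion of the variability in Y explained by the fitted line:

```python
from statistics import mean

def r_squared(x, y):
    # R-squared: share of Y's variability explained by the regression,
    # computed as 1 - (sum of squared residuals) / (total sum of squares)
    mx, my = mean(x), mean(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    sst = sum((yi - my) ** 2 for yi in y)
    return 1 - sse / sst

x, y = [1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 8.0, 9.8]
mx, my = mean(x), mean(y)
corr = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        / (sum((xi - mx) ** 2 for xi in x)
           * sum((yi - my) ** 2 for yi in y)) ** 0.5)
# For a simple regression, R-squared equals the squared correlation
print(abs(r_squared(x, y) - corr ** 2) < 1e-9)  # True
```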
