Regression Assumptions and Constraints

Both the simple and multiple regressions described in this section assume linear relationships between the dependent and independent variables. If the relationship is not linear, we have two choices. One is to transform the variables, by taking the square, square root, or natural log of the values (for example), and hope that the relationship between the transformed variables is more linear. The other is to run non-linear regressions that attempt to fit a curve through the data.
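A minimal sketch of the first choice, using simulated data (the exponential relationship and noise levels below are assumptions for illustration): a linear regression fits the raw values poorly, but fits the log-transformed values much better.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with a nonlinear (exponential) relationship
x = np.linspace(1, 10, 200)
y = np.exp(0.5 + 0.4 * x) * rng.lognormal(0, 0.1, x.size)

def r_squared(x, y):
    """R^2 from a simple OLS regression of y on x (with an intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_raw = r_squared(x, y)          # linear fit on the raw values
r2_log = r_squared(x, np.log(y))  # linear fit after a log transform

print(f"R^2 raw: {r2_raw:.3f}, R^2 after log transform: {r2_log:.3f}")
```

The log transform works here because the simulated relationship is multiplicative; for other shapes of non-linearity, a square or square-root transform may be the better fit.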

There are implicit statistical assumptions behind every multiple regression that we ignore at our own peril. For the coefficients on the individual independent variables to make sense, the independent variables need to be uncorrelated with each other, a condition that is often very difficult to meet. When independent variables are correlated with each other, the statistical hazard that is created is called multicollinearity. In its presence, the coefficients on independent variables can take on unexpected signs (positive instead of negative, for instance) and unpredictable values.
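The instability can be demonstrated with a small simulation (the data below is assumed for illustration): when two predictors are nearly identical, their individual coefficients become erratic, even though their combined effect is still estimated reliably.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)       # x2 is almost identical to x1
y = 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)   # true effect of each is +2

# OLS with an intercept
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print("individual coefficients:", beta[1], beta[2])  # can be wildly off, even negative
print("their sum:", beta[1] + beta[2])               # close to the true total of 4
```

This is exactly the pattern the text describes: the regression can still predict well, but the individual coefficients lose their interpretation.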

There are simple diagnostic statistics that measure how far the data we are using in a regression deviates from this ideal. When these statistics send out warning signals, they should not be ignored.
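One such diagnostic is the variance inflation factor (VIF), which flags multicollinearity: for each independent variable, VIF = 1 / (1 - R^2), where R^2 comes from regressing that variable on the other independent variables. A sketch, with simulated data and the commonly cited (but not universal) rule of thumb that a VIF above 10 signals trouble:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (no intercept column):
    VIF_j = 1 / (1 - R^2_j), where R^2_j is from regressing column j
    on the remaining columns."""
    n = X.shape[0]
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ beta
        r2 = 1 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                   # independent of x1
x3 = x1 + rng.normal(scale=0.1, size=n)   # strongly correlated with x1

vifs = vif(np.column_stack([x1, x2, x3]))
print(vifs)  # x1 and x3 show large VIFs; x2 stays near 1
```

A large VIF on a variable suggests dropping it, combining it with the variable it tracks, or collecting data in which the two move more independently.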
