Table 1: Definition of some basic terms.

Regression analysis is a process used to estimate a function that predicts the value of a response (dependent) variable from the values of one or more independent variables. A regression model often relies heavily on its underlying assumptions being satisfied. There are three types of regression, namely:

-         Simple linear regression studies the linear relationship between two variables, i.e., one dependent and one independent variable.

-         Multiple linear regression studies the linear relationship between one dependent variable and more than one independent variable.

-         Nonlinear regression assumes that the relationship between the dependent variable and the independent variables is not linear in the regression parameters [18].
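As an illustration of the first two cases, a simple linear regression can be fitted by ordinary least squares with NumPy; the data below are synthetic and chosen only for demonstration, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (synthetic data): fit y = b0 + b1*x by ordinary
# least squares. Adding more columns to X would give multiple regression.
rng = np.random.default_rng(0)
x = np.arange(10, dtype=float)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=10)

X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimates [b0, b1]
print(beta)  # close to the true coefficients [2, 3]
```

The same call handles multiple linear regression once `X` contains one column per independent variable.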

Ridge regression is one of the remedial measures for handling severe multicollinearity in least-squares estimation. Ridge regression is a constrained version of least squares: it tackles the estimation problem by producing a biased estimator with smaller variances [18].

Ridge regression thus provides an alternative estimation method that can be used to advantage when the predictor variables are highly collinear [19]. The estimators it produces are biased but tend to have a smaller mean squared error than the OLS estimators [4].
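The ridge estimator can be written as (X'X + kI)⁻¹X'y for a ridge parameter k > 0. The following sketch, with an assumed k and synthetic near-collinear data (none of it from the paper), shows the shrinkage effect relative to OLS.

```python
import numpy as np

# Hedged sketch: ridge estimator beta_hat = (X'X + k I)^{-1} X'y.
# k, X, and y are illustrative assumptions, not the paper's data.
rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=n)   # two nearly collinear columns
y = X @ np.array([1.0, 2.0, 1.0]) + rng.normal(scale=0.1, size=n)

k = 0.5                                          # assumed ridge parameter
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge shrinks the coefficient vector relative to OLS,
# trading a little bias for a reduction in variance.
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))  # True
```

Choosing k is a separate problem; larger k means more shrinkage (more bias, less variance).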

The mean squared error (MSE) measures the average of the squared errors, i.e., the average squared difference between the estimated values and the actual values.
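This definition translates directly into code; the values below are arbitrary illustrative numbers.

```python
import numpy as np

# Hedged sketch: scalar MSE between fitted and actual values
# (illustrative numbers, not data from the paper).
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])
mse = np.mean((y_true - y_pred) ** 2)
print(mse)  # 0.375
```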

A dispersion matrix, also known as a covariance matrix, is a square matrix giving the covariance between each pair of elements of a given random vector.
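A dispersion matrix can be estimated from sample draws of the random vector; the sketch below uses synthetic data and NumPy's built-in estimator.

```python
import numpy as np

# Hedged sketch: estimating the dispersion (covariance) matrix of a
# 2-dimensional random vector from samples. Entry (i, j) is Cov(X_i, X_j).
rng = np.random.default_rng(2)
samples = rng.normal(size=(1000, 2))   # 1000 draws of a 2-vector
cov = np.cov(samples, rowvar=False)    # 2x2 covariance matrix
print(cov.shape)  # (2, 2)
# Diagonal entries are variances; off-diagonal entries are covariances,
# so the matrix is square and symmetric, as the definition states.
```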