This paper proposes a new biased estimator of the regression parameters of the multiple linear regression model when the regressors are correlated. Theoretical comparisons and simulation results show that, under some conditions, the proposed estimator performs better than other existing estimators in the smaller mean squared error sense. A real-life dataset is analyzed to illustrate the findings of the paper.
Biased estimator, Liu estimator, Monte Carlo simulation, Multicollinearity, MSE, Ridge regression estimator
The ordinary least squares (OLS) estimator is the best linear unbiased estimator and has been used to estimate the parameters of the linear regression model since its inception. One of the important assumptions of the linear regression model is that the regressors (independent variables) are independent of one another. In practice, however, the regressors are often correlated, which causes the problem of multicollinearity. In the presence of multicollinearity, the OLS estimator is inefficient and may give the wrong signs of the parameters in the multiple linear regression model [1]. To handle these problems, many authors have proposed different types of estimators: for one-parameter biased estimators see, to mention a few, [1-7] and recently [8]; for two-parameter estimators, [9-15], among others, are notable.
The main objective of this paper is to propose a new two-parameter biased estimator of the regression coefficients and to compare the performance of the new estimator with the OLS estimator, the ordinary ridge regression (ORR) estimator of Hoerl and Kennard [1], the Liu estimator of Liu [6], and the Kibria-Lukman (KL) estimator of Kibria and Lukman [8]. The rest of the paper is organized as follows: Some estimators and their statistical properties are given in Section 2. In Section 3, theoretical comparisons between the proposed estimator and the existing estimators are made, and the biasing parameters k and d are derived. A Monte Carlo simulation study is performed in Section 4. A real-life dataset is analyzed in Section 5. Finally, some conclusions are given in Section 6.
Consider the following linear regression model:

$y = X\beta + \varepsilon$, (2.1)

where y is an n×1 vector of the response variable, X is a known n×p full rank matrix of the explanatory variables, β is a p×1 vector of unknown regression coefficients, ε is an n×1 vector of disturbances assumed to be distributed with mean vector 0 and variance-covariance matrix $\sigma^2 I$, and I is an identity matrix of order n×n. To define the various estimators, the canonical form of model (2.1) is given by:

$y = Z\alpha + \varepsilon$, (2.2)

where $Z = XC$, $\alpha = C'\beta$, and C is an orthogonal matrix such that $Z'Z = C'X'XC = \Lambda = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_p)$. Then, the OLS estimator of α is given as:

$\hat{\alpha} = \Lambda^{-1} Z'y$,

and the mean squared error matrix (MSEM) of $\hat{\alpha}$ is given by

$\mathrm{MSEM}(\hat{\alpha}) = \sigma^2 \Lambda^{-1}$.
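To make the canonical transformation concrete, the following minimal sketch computes Λ, Z and the OLS estimator $\hat{\alpha}$ with numpy; the function name and arrays are illustrative, not part of the paper.

```python
import numpy as np

def canonical_ols(X, y):
    """Canonical form of y = X beta + e: X'X = C Lambda C', Z = XC, alpha = C'beta."""
    lam, C = np.linalg.eigh(X.T @ X)      # eigenvalues lambda_i and orthogonal C
    Z = X @ C                             # canonical regressors, Z'Z = diag(lam)
    alpha_ols = (Z.T @ y) / lam           # OLS in canonical form: Lambda^{-1} Z'y
    sigma2 = np.sum((y - Z @ alpha_ols) ** 2) / (X.shape[0] - X.shape[1])
    return lam, C, Z, alpha_ols, sigma2
```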
The ORR estimator of α [1] is

$\hat{\alpha}(k) = [\Lambda + kI]^{-1} Z'y = N\Lambda\hat{\alpha}, \quad k > 0,$

where $N = [\Lambda + kI]^{-1}$, and the MSEM of $\hat{\alpha}(k)$ is

$\mathrm{MSEM}(\hat{\alpha}(k)) = \sigma^2 N\Lambda N + k^2 N\alpha\alpha' N.$

Hoerl, et al. [16] defined the biasing parameter k for $\hat{\alpha}(k)$ as follows:

$\hat{k} = \frac{p\hat{\sigma}^2}{\sum_{i=1}^{p}\hat{\alpha}_i^2},$

where $\hat{\sigma}^2 = (y - X\hat{\beta})'(y - X\hat{\beta})/(n - p)$.
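A minimal sketch of the ORR estimator and this choice of k, reusing the canonical quantities from the previous snippet (names are illustrative):

```python
import numpy as np

def ridge_alpha(lam, Z, y, k):
    """ORR estimator in canonical form: (Lambda + kI)^{-1} Z'y (diagonal system)."""
    return (Z.T @ y) / (lam + k)

def k_hkb(alpha_ols, sigma2):
    """Hoerl, et al. [16] biasing parameter: p * sigma^2 / sum(alpha_i^2)."""
    return len(alpha_ols) * sigma2 / np.sum(alpha_ols ** 2)
```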
The Liu estimator of α [6] is

$\hat{\alpha}_d = [\Lambda + I]^{-1}[\Lambda + dI]\hat{\alpha} = F_d\hat{\alpha}, \quad 0 < d < 1,$

where $F_d = [\Lambda + I]^{-1}[\Lambda + dI]$, and the biasing parameter d of $\hat{\alpha}_d$ is given by

$\hat{d}_{opt} = 1 - \hat{\sigma}^2\left[\sum_{i=1}^{p}\frac{1}{\lambda_i(\lambda_i + 1)}\Big/\sum_{i=1}^{p}\frac{\hat{\alpha}_i^2}{(\lambda_i + 1)^2}\right].$

If $\hat{d}_{opt}$ is negative, Özkale and Kaçıranlar [9] adopt the alternative biasing parameter below:

$\hat{d}_{alt} = \min_i\left(\frac{\hat{\alpha}_i^2}{\hat{\sigma}^2/\lambda_i + \hat{\alpha}_i^2}\right).$
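Similarly, a sketch of the Liu estimator with its biasing parameter, including the Özkale and Kaçıranlar [9] fallback (assuming the canonical quantities computed earlier):

```python
import numpy as np

def liu_alpha(lam, alpha_ols, d):
    """Liu estimator: (Lambda + I)^{-1}(Lambda + dI) alpha_hat, applied elementwise."""
    return (lam + d) / (lam + 1.0) * alpha_ols

def d_liu(lam, alpha_ols, sigma2):
    """Optimal d; if negative, the alternative estimator of [9] is used instead."""
    d = 1.0 - sigma2 * (np.sum(1.0 / (lam * (lam + 1.0)))
                        / np.sum(alpha_ols ** 2 / (lam + 1.0) ** 2))
    if d < 0:
        d = np.min(alpha_ols ** 2 / (sigma2 / lam + alpha_ols ** 2))
    return d
```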
The recently proposed KL estimator of Kibria and Lukman [8] is defined as

$\hat{\alpha}_{KL} = [\Lambda + kI]^{-1}[\Lambda - kI]\hat{\alpha},$

and the biasing parameter k of the KL estimator is defined by

$\hat{k}_{KL} = \min_i\left(\frac{\hat{\sigma}^2}{2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i}\right).$
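And a sketch of the KL estimator with its biasing parameter (same illustrative conventions):

```python
import numpy as np

def kl_alpha(lam, alpha_ols, k):
    """KL estimator: (Lambda + kI)^{-1}(Lambda - kI) alpha_hat, applied elementwise."""
    return (lam - k) / (lam + k) * alpha_ols

def k_kl(lam, alpha_ols, sigma2):
    """KL biasing parameter: min_i sigma^2 / (2 alpha_i^2 + sigma^2 / lambda_i)."""
    return np.min(sigma2 / (2.0 * alpha_ols ** 2 + sigma2 / lam))
```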
The new biased (NB) estimator of α is obtained by minimizing $(y - Z\alpha)'(y - Z\alpha)$ subject to $(\alpha + \hat{\alpha})'(\alpha + \hat{\alpha}) = c$, where c is a constant; that is, by minimizing

$(y - Z\alpha)'(y - Z\alpha) + k(1 + d)\left[(\alpha + \hat{\alpha})'(\alpha + \hat{\alpha}) - c\right]. \quad (2.15)$

Here, k and d are the Lagrangian multipliers. The solution to (2.15) gives the new estimator as follows:

$\hat{\alpha}_{NB}(k, d) = [\Lambda + k(1 + d)I]^{-1}[\Lambda - k(1 + d)I]\hat{\alpha} = A_{k,d}\,\hat{\alpha}, \quad k > 0,\; 0 < d < 1,$

where $A_{k,d} = T_{k,d}W_{k,d}$, $T_{k,d} = [\Lambda + k(1 + d)I]^{-1}$ and $W_{k,d} = [\Lambda - k(1 + d)I]$.

Moreover, the proposed NB estimator is also obtained by augmenting $-\sqrt{k(1+d)}\,\hat{\alpha} = \sqrt{k(1+d)}\,\alpha + \varepsilon'$ to equation (2.2) and then using the OLS estimate.
The proposed NB estimator is a general estimator that includes the OLS and the KL estimators as special cases:
1. If $k = 0$, $\hat{\alpha}_{NB}(k, d)$ becomes the OLS estimator $\hat{\alpha}$.
2. If $d = 0$, $\hat{\alpha}_{NB}(k, d)$ becomes the KL estimator $\hat{\alpha}_{KL}$.
The MSEM of the proposed NB estimator of α is given by

$\mathrm{MSEM}(\hat{\alpha}_{NB}(k, d)) = \sigma^2 A_{k,d}\Lambda^{-1}A_{k,d}' + b\,b',$

where $b = \mathrm{Bias}(\hat{\alpha}_{NB}(k, d)) = -2k(1 + d)[\Lambda + k(1 + d)I]^{-1}\alpha$.
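Under the reconstructed form above, the NB estimator simply replaces k in the KL form with k(1+d); a sketch of the estimator and its MSEM (using the fact that $A_{k,d}$ is diagonal), with illustrative names:

```python
import numpy as np

def nb_alpha(lam, alpha_ols, k, d):
    """Proposed NB estimator: (Lambda + k(1+d)I)^{-1}(Lambda - k(1+d)I) alpha_hat."""
    t = k * (1.0 + d)
    return (lam - t) / (lam + t) * alpha_ols

def nb_msem(lam, alpha, sigma2, k, d):
    """MSEM of the NB estimator: sigma^2 A Lambda^{-1} A' + b b'."""
    t = k * (1.0 + d)
    a = (lam - t) / (lam + t)              # diagonal entries of A_{k,d}
    b = -2.0 * t * alpha / (lam + t)       # bias vector
    return sigma2 * np.diag(a ** 2 / lam) + np.outer(b, b)
```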
The lemmas below will be used for theoretical comparisons among estimators in the next section.
Lemma 2.1 [17]: Let E be an n×n positive definite matrix, that is $E > 0$, and let α be some vector; then $E - \alpha\alpha' \ge 0$ if and only if $\alpha' E^{-1}\alpha \le 1$.
Lemma 2.2 [18]: Let $\hat{\alpha}_j = A_j y$, j = 1, 2, be two linear estimators of α. Suppose that $D = \mathrm{Cov}(\hat{\alpha}_1) - \mathrm{Cov}(\hat{\alpha}_2) > 0$, where $\mathrm{Cov}(\hat{\alpha}_j)$ is the covariance matrix of $\hat{\alpha}_j$, and let $b_j = \mathrm{Bias}(\hat{\alpha}_j)$, j = 1, 2. Consequently,

$\Delta(\hat{\alpha}_1, \hat{\alpha}_2) = \mathrm{MSEM}(\hat{\alpha}_1) - \mathrm{MSEM}(\hat{\alpha}_2) = D + b_1 b_1' - b_2 b_2' > 0$

if and only if $b_2'\left[D + b_1 b_1'\right]^{-1} b_2 \le 1$, where $\mathrm{MSEM}(\hat{\alpha}_j) = \mathrm{Cov}(\hat{\alpha}_j) + b_j b_j'$.
Theorem 3.1: $\mathrm{MSEM}(\hat{\alpha}) - \mathrm{MSEM}(\hat{\alpha}_{NB}(k, d)) > 0$ if and only if

$b'\left[\sigma^2\left(\Lambda^{-1} - A_{k,d}\Lambda^{-1}A_{k,d}'\right)\right]^{-1} b \le 1,$

where $b = -2k(1 + d)[\Lambda + k(1 + d)I]^{-1}\alpha$.

Proof:

$\mathrm{Cov}(\hat{\alpha}) - \mathrm{Cov}(\hat{\alpha}_{NB}(k, d)) = \sigma^2\,\mathrm{diag}\left\{\frac{1}{\lambda_i} - \frac{(\lambda_i - k(1 + d))^2}{\lambda_i(\lambda_i + k(1 + d))^2}\right\}_{i=1}^{p},$

which will be positive definite (pd) if and only if $(\lambda_i + k(1 + d))^2 - (\lambda_i - k(1 + d))^2 > 0$. Clearly, for $k > 0$ and $d > 0$, $(\lambda_i + k(1 + d))^2 > (\lambda_i - k(1 + d))^2$. By Lemma 2.2, the proof is completed.
Theorem 3.2: $\mathrm{MSEM}(\hat{\alpha}(k)) - \mathrm{MSEM}(\hat{\alpha}_{NB}(k, d)) > 0$ if and only if

$b_2'\left[\sigma^2\left(N\Lambda N - A_{k,d}\Lambda^{-1}A_{k,d}'\right) + b_1 b_1'\right]^{-1} b_2 \le 1,$

where $b_1 = \mathrm{Bias}(\hat{\alpha}(k)) = -kN\alpha$ and $b_2 = -2k(1 + d)T_{k,d}\,\alpha$.

Proof:

$\mathrm{Cov}(\hat{\alpha}(k)) - \mathrm{Cov}(\hat{\alpha}_{NB}(k, d)) = \sigma^2\,\mathrm{diag}\left\{\frac{\lambda_i}{(\lambda_i + k)^2} - \frac{(\lambda_i - k(1 + d))^2}{\lambda_i(\lambda_i + k(1 + d))^2}\right\}_{i=1}^{p},$

which will be pd if and only if $\lambda_i^2(\lambda_i + k(1 + d))^2 - (\lambda_i + k)^2(\lambda_i - k(1 + d))^2 > 0$. Clearly, for $k > 0$ and $0 < d < 1$, $\lambda_i(\lambda_i + k(1 + d)) - (\lambda_i + k)(\lambda_i - k(1 + d)) = k\lambda_i(1 + 2d) + k^2(1 + d) > 0$. By Lemma 2.2, the proof is completed.
Theorem 3.3: $\mathrm{MSEM}(\hat{\alpha}_d) - \mathrm{MSEM}(\hat{\alpha}_{NB}(k, d)) > 0$ if and only if

$b_2'\left[\sigma^2\left(F_d\Lambda^{-1}F_d' - A_{k,d}\Lambda^{-1}A_{k,d}'\right) + b_1 b_1'\right]^{-1} b_2 \le 1,$

where $b_1 = \mathrm{Bias}(\hat{\alpha}_d) = (d - 1)[\Lambda + I]^{-1}\alpha$ and $b_2 = -2k(1 + d)T_{k,d}\,\alpha$.

Proof:

$\mathrm{Cov}(\hat{\alpha}_d) - \mathrm{Cov}(\hat{\alpha}_{NB}(k, d)) = \sigma^2\,\mathrm{diag}\left\{\frac{(\lambda_i + d)^2}{\lambda_i(\lambda_i + 1)^2} - \frac{(\lambda_i - k(1 + d))^2}{\lambda_i(\lambda_i + k(1 + d))^2}\right\}_{i=1}^{p},$

which will be pd if and only if $(\lambda_i + d)^2(\lambda_i + k(1 + d))^2 - (\lambda_i + 1)^2(\lambda_i - k(1 + d))^2 > 0$. Clearly, for $k > 0$ and $0 < d < 1$, this holds whenever $(\lambda_i + d)(\lambda_i + k(1 + d)) > (\lambda_i + 1)\left|\lambda_i - k(1 + d)\right|$. By Lemma 2.2, the proof is completed.
Theorem 3.4: $\mathrm{MSEM}(\hat{\alpha}_{KL}) - \mathrm{MSEM}(\hat{\alpha}_{NB}(k, d)) > 0$ if and only if

$b_2'\left[\sigma^2\left(A_k\Lambda^{-1}A_k' - A_{k,d}\Lambda^{-1}A_{k,d}'\right) + b_1 b_1'\right]^{-1} b_2 \le 1,$

where $A_k = [\Lambda + kI]^{-1}[\Lambda - kI]$, $b_1 = \mathrm{Bias}(\hat{\alpha}_{KL}) = -2k[\Lambda + kI]^{-1}\alpha$ and $b_2 = -2k(1 + d)T_{k,d}\,\alpha$.

Proof:

$\mathrm{Cov}(\hat{\alpha}_{KL}) - \mathrm{Cov}(\hat{\alpha}_{NB}(k, d)) = \sigma^2\,\mathrm{diag}\left\{\frac{(\lambda_i - k)^2}{\lambda_i(\lambda_i + k)^2} - \frac{(\lambda_i - k(1 + d))^2}{\lambda_i(\lambda_i + k(1 + d))^2}\right\}_{i=1}^{p},$

which will be pd if and only if $(\lambda_i - k)^2(\lambda_i + k(1 + d))^2 - (\lambda_i + k)^2(\lambda_i - k(1 + d))^2 > 0$. Clearly, for $k > 0$ and $0 < d < 1$ with $\lambda_i > k(1 + d)$, the ratio $(\lambda_i - t)/(\lambda_i + t)$ decreases in $t$, so the inequality holds. By Lemma 2.2, the proof is completed.
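To illustrate how the conditions of these theorems can be checked in practice, the sketch below evaluates the Lemma 2.2 condition of Theorem 3.1 numerically for given λ, α, σ², k and d; all input values are arbitrary toy numbers, not results from the paper:

```python
import numpy as np

def theorem_3_1_holds(lam, alpha, sigma2, k, d):
    """Check b'[sigma^2(Lambda^{-1} - A Lambda^{-1} A')]^{-1} b <= 1 (Theorem 3.1)."""
    t = k * (1.0 + d)
    a = (lam - t) / (lam + t)              # diagonal entries of A_{k,d}
    D = sigma2 * (1.0 - a ** 2) / lam      # covariance difference (pd for k, d > 0)
    b = -2.0 * t * alpha / (lam + t)       # bias of the NB estimator
    return np.sum(b ** 2 / D) <= 1.0       # D is diagonal, so inversion is elementwise

# Toy illustration:
print(theorem_3_1_holds(np.array([10.0, 1.0, 0.1]),
                        np.array([0.5, 0.5, 0.7]), 1.0, 0.2, 0.3))
```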
Different estimators of the biasing parameters k and d have been proposed in different studies; see, for example, Hoerl and Kennard [1], Liu [6], and [16,19-23], among others.
The optimal values of k and d for the proposed NB estimator are now derived. Writing the scalar mean squared error of $\hat{\alpha}_{NB}(k, d)$ as

$m = \sigma^2\sum_{i=1}^{p}\frac{(\lambda_i - k(1 + d))^2}{\lambda_i(\lambda_i + k(1 + d))^2} + 4k^2(1 + d)^2\sum_{i=1}^{p}\frac{\alpha_i^2}{(\lambda_i + k(1 + d))^2},$

we first obtain the optimal value of k when d is fixed. Differentiating m with respect to k and setting $\partial m/\partial k = 0$, we get

$k = \frac{\sigma^2}{(1 + d)\left(2\alpha_i^2 + \sigma^2/\lambda_i\right)}.$

Then, the estimated optimal value of k is given as follows:

$\hat{k} = \min_i\left[\frac{\hat{\sigma}^2}{(1 + d)\left(2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i\right)}\right]. \quad (3.12)$

Also, the optimal value of d is found by differentiating m with respect to d when k is fixed; setting $\partial m/\partial d = 0$, we get

$d = \frac{\sigma^2}{k\left(2\alpha_i^2 + \sigma^2/\lambda_i\right)} - 1.$

Then, the estimated optimal d with the unbiased estimators $\hat{\alpha}_i$ and $\hat{\sigma}^2$ is given by

$\hat{d} = \min_i\left[\frac{\hat{\sigma}^2}{\hat{k}\left(2\hat{\alpha}_i^2 + \hat{\sigma}^2/\lambda_i\right)} - 1\right]. \quad (3.15)$
The estimators of the parameters k and d in $\hat{\alpha}_{NB}(k, d)$ are found iteratively as follows (a code sketch follows the list):
1. Find an initial estimate of d using $\hat{d} = \min_i\left(\hat{\alpha}_i^2\big/(\hat{\sigma}^2/\lambda_i + \hat{\alpha}_i^2)\right)$.
2. Determine $\hat{k}$ from (3.12) using $\hat{d}$ in step 1.
3. Estimate $\hat{d}$ in (3.15) by using $\hat{k}$ in step 2.
4. If $\hat{d} < 0$ or $\hat{d} > 1$, use $\hat{d}$ from step 1.
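A minimal sketch of this iterative selection, using the reconstructed formulas (3.12) and (3.15) and an alternative Liu-type initial d; since the original equations are not fully legible in the source, the exact formulas here should be treated as assumptions:

```python
import numpy as np

def select_k_d(lam, alpha_ols, sigma2):
    """Iterative selection of (k, d) for the NB estimator, steps 1-4 above."""
    # Step 1: initial estimate of d (alternative Liu-type estimator; assumed here).
    d0 = np.min(alpha_ols ** 2 / (sigma2 / lam + alpha_ols ** 2))
    # Step 2: k from (3.12) with d fixed at the initial estimate.
    k = np.min(sigma2 / ((1.0 + d0) * (2.0 * alpha_ols ** 2 + sigma2 / lam)))
    # Step 3: updated d from (3.15) with k fixed.
    d = np.min(sigma2 / (k * (2.0 * alpha_ols ** 2 + sigma2 / lam)) - 1.0)
    # Step 4: fall back to the initial d when the update leaves (0, 1) (assumed rule).
    if d < 0.0 or d > 1.0:
        d = d0
    return k, d
```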
A Monte Carlo simulation study is performed to show the performance of the NB estimator relative to some existing estimators. It contains two parts: (i) The simulation technique and (ii) Discussion of the results.
Using the equation below, we generate the explanatory variables (see Gibbons [24] and Kibria [19]):

$x_{ij} = (1 - \rho^2)^{1/2} z_{ij} + \rho z_{i,p+1}, \quad i = 1, 2, \ldots, n,\; j = 1, 2, \ldots, p,$

where $z_{ij}$ are independent standard normal pseudo-random numbers and ρ is the correlation between any two explanatory variables; here ρ takes the two values 0.9 and 0.99. The n observations for the response variable y are obtained from the following equation:

$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + \varepsilon_i, \quad i = 1, 2, \ldots, n,$

where the $\varepsilon_i$ are independent $N(0, \sigma^2)$. The values of β are chosen such that $\beta'\beta = 1$ [25]. Also, following Wichern and Churchill [26] and Kan, et al. [27], who found that the ORR estimator performs better when k lies between 0 and 1, we choose the values of the biasing parameters of the estimators in the interval (0, 1). The Monte Carlo simulation is replicated 1000 times for n = 50 and 100 and $\sigma^2$ = 1, 25, and 100. For each replicate, we calculate the mean squared error (MSE) of the estimators using the following equation:

$\mathrm{MSE}(\hat{\alpha}^{*}) = \frac{1}{1000}\sum_{j=1}^{1000}\left(\hat{\alpha}_j^{*} - \alpha\right)'\left(\hat{\alpha}_j^{*} - \alpha\right),$

where $\hat{\alpha}_j^{*}$ is the estimator under consideration in the j-th replication and α is the true parameter vector. The estimated MSEs of the estimators are shown in Table 1, Table 2, Table 3 and Table 4 for (ρ = 0.90 and n = 50), (ρ = 0.99 and n = 50), (ρ = 0.90 and n = 100), and (ρ = 0.99 and n = 100), respectively.
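A compact sketch of the whole simulation for one (n, ρ, σ, k, d) configuration; the particular β with β'β = 1 and the seed are illustrative choices:

```python
import numpy as np

def mc_mse_nb(n=50, p=4, rho=0.9, sigma=1.0, k=0.5, d=0.5, reps=1000, seed=0):
    """Monte Carlo estimate of the MSE of the NB estimator under the design above."""
    rng = np.random.default_rng(seed)
    beta = np.ones(p) / np.sqrt(p)                 # beta'beta = 1, as in [25]
    total = 0.0
    for _ in range(reps):
        z = rng.standard_normal((n, p + 1))
        X = np.sqrt(1.0 - rho ** 2) * z[:, :p] + rho * z[:, [p]]  # correlated regressors
        y = X @ beta + sigma * rng.standard_normal(n)
        lam, C = np.linalg.eigh(X.T @ X)
        alpha = C.T @ beta                          # true canonical parameter
        alpha_ols = (X @ C).T @ y / lam             # OLS in canonical form
        t = k * (1.0 + d)
        alpha_nb = (lam - t) / (lam + t) * alpha_ols
        total += np.sum((alpha_nb - alpha) ** 2)
    return total / reps

print(mc_mse_nb())   # estimated MSE for one configuration
```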
Table 1: Estimated MSE for OLS, ORR, Liu, KL and the proposed NB.
From Table 1, Table 2, Table 3 and Table 4, we observe that as ρ and σ increase, the estimated MSE values also increase, while as n increases, the estimated MSE values decrease. The OLS estimator performs the worst in all cases in the presence of multicollinearity. Moreover, the simulation results show that the proposed NB estimator performs better than the other estimators in most cases. The Liu estimator gives better results in terms of MSE when the biasing parameters are near zero. For ρ = 0.9, the condition number (CN) is approximately 5 and the variance inflation factors (VIFs) are around 4 to 6; here the proposed NB estimator agrees closely with the KL estimator (although with better results) for low values of k = d, and performs increasingly better as k = d increases for a fixed value of σ. This improvement grows with σ, and an even greater improvement is observed for ρ = 0.99, where the CN and the VIFs become larger, approximately 15 and around 36 to 61, respectively. Hence, the proposed NB estimator works better for strongly correlated explanatory variables. The performance of the proposed NB estimator therefore depends mainly on the values of ρ, σ, the biasing parameters k and d, and the true parameter. Thus, the simulation results are consistent with the theoretical results.
Table 2: Estimated MSE for OLS, ORR, Liu, KL and the proposed NB.
Table 3: Estimated MSE for OLS, ORR, Liu, KL and the proposed NB.
Table 4: Estimated MSE for OLS, ORR, Liu, KL and the proposed NB.
To illustrate the theoretical and simulation results of this paper, we consider a real-life dataset in this section. The Portland cement data were originally adopted by Woods, et al. [28]. These data have also been analyzed by many researchers, for example [29,30], Lukman, et al. [14] and, recently, Kibria and Lukman [8], among others. The data are analyzed here to examine the performance of the proposed NB estimator against the other existing estimators. The regression model for these data is given by

$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \beta_3 x_{i3} + \beta_4 x_{i4} + \varepsilon_i.$

For more details about these data, we refer the reader to Woods, et al. [28].
To show the existence of multicollinearity, different measures are calculated: the VIFs, which are 38.50, 254.42, 46.87 and 282.51; the eigenvalues of $X'X$, which are 44676.206, 5965.422, 809.952 and 105.419; and the CN of $X'X$, which approximately equals 20.58. The VIFs, the eigenvalues and the CN tell us that there is severe multicollinearity in the data. Also, $\hat{\sigma}$ is equal to 2.446. The correlation coefficient matrix of the explanatory variables is presented in Table 5; it shows a significant and strong relationship between the following explanatory variables: $x_1$ and $x_3$, and $x_2$ and $x_4$. The estimated parameters and the MSE values of the estimators are presented in Table 6. It appears from Table 6 that the proposed NB estimator performs best, giving an obvious improvement over all the existing estimators and a small improvement over the KL estimator. This is consistent with the simulation results, because $\hat{\sigma}$ here is small and not all the explanatory variables are correlated to the same degree, even though the data have a high CN and high VIFs.
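The diagnostics quoted above can be reproduced with a short sketch like the following, where the VIFs are the diagonal of the inverse correlation matrix and the CN is $\sqrt{\lambda_{\max}/\lambda_{\min}}$ (the data matrix X is assumed to hold the four cement regressors as columns):

```python
import numpy as np

def collinearity_diagnostics(X):
    """VIFs, eigenvalues of X'X and the condition number of X."""
    R = np.corrcoef(X, rowvar=False)       # correlation matrix (Table 5)
    vifs = np.diag(np.linalg.inv(R))       # VIF_j = jth diagonal element of R^{-1}
    lam = np.linalg.eigvalsh(X.T @ X)      # eigenvalues of X'X
    cn = np.sqrt(lam.max() / lam.min())    # condition number
    return vifs, lam, cn
```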
Table 5: The correlation coefficient matrix of the explanatory variables.
Table 6: The results of the regression coefficients and the corresponding MSE values.
In this paper, we proposed a new biased (NB) estimator for handling the multicollinearity problem in the multiple linear regression model. Some existing estimators are special cases of the proposed estimator. The proposed NB estimator is compared theoretically with the ordinary least squares (OLS) estimator, the ordinary ridge regression (ORR) estimator, the Liu estimator and the Kibria-Lukman (KL) estimator, and the biasing parameters k and d of the NB estimator are derived. A Monte Carlo simulation study is performed to compare the performance of the OLS, ORR, Liu, KL and the proposed NB estimators. The main finding of the simulation is that the proposed NB estimator performs better than the above-mentioned estimators under some conditions. A real-life dataset is analyzed to support the findings of the paper.
The authors are grateful to the referees for their valuable comments and suggestions, which certainly improved the quality and presentation of the paper. Author B. M. Golam Kibria would like to dedicate this paper to his Ph.D. supervisor, Late Professor M. Safiul Haq, for his guidance and constant inspiration during his stay at The University of Western Ontario, which finally led to his present position.