Shrinkage estimators under generalized garrote and LINEX loss functions for regression analysis

dc.contributor.authorMunaweera Arachchilage, Inesh Prabuddha
dc.contributor.examiningcommitteeWang, Liqun (Statistics) Arino, Julien (Mathematics)en_US
dc.contributor.supervisorJafari Jozani, Mohammad (Statistics) Muthukumarana, Saman (Statistics)en_US
dc.date.accessioned2018-09-14T16:13:08Z
dc.date.available2018-09-14T16:13:08Z
dc.date.issued2018-06en_US
dc.date.submitted2018-06-20T18:39:38Zen
dc.degree.disciplineStatisticsen_US
dc.degree.levelMaster of Science (M.Sc.)en_US
dc.description.abstractShrinkage methods are widely used in multiple linear regression analysis to address multicollinearity and other issues that arise in practice. Most commonly used shrinkage methods, such as ridge regression and the lasso, do not account for the importance of individual variables when applying shrinkage: all model coefficients are shrunk towards zero at a similar rate, so the least important and most important coefficients are treated alike. When subset selection with asymmetric shrinkage of the coefficients is required, the adaptive lasso can be used, and when all variables are to be retained in the model, generalized ridge regression is available. However, because generalized ridge regression is defined on a transformed design matrix and uses as many tuning parameters as there are predictors, the user has no direct control over the amount of shrinkage applied to any individual coefficient. This thesis addresses this issue by proposing new regularized methods, based on the idea of the non-negative garrote, that can serve as alternatives to ridge regression. First, we develop the quadratic garrote, which shrinks coefficients unequally while retaining all variables in the model. We show that this approach can shrink small coefficients even faster than the adaptive lasso while leaving large coefficients almost untouched. We also generalize the quadratic garrote so that the user can specify the level of shrinkage on each variable directly, based on experience or prior knowledge. We derive a closed-form solution for the quadratic garrote problem and study the theoretical properties of the proposed estimator, including its expectation, bias, and variance. Furthermore, we explore the use of different loss functions as the penalty in the non-negative garrote and develop the LINEX regression method as a novel shrinkage approach, estimating its coefficients by numerical optimization. Finally, through simulation studies under various settings and real-world applications, we show that the proposed shrinkage methods can serve as better substitutes for ridge regression in terms of prediction error and practical use.en_US
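For orientation, the classical non-negative garrote of Breiman (1995), which the abstract names as the starting point, rescales the ordinary least-squares estimates by non-negative shrinkage factors; the quadratic garrote and LINEX regression developed in the thesis modify this construction, and their exact objectives are stated in the thesis itself. As a rough sketch only, the standard non-negative garrote solves

\[
\hat{c} = \operatorname*{arg\,min}_{c_1,\dots,c_p \ge 0} \; \sum_{i=1}^{n} \Bigl( y_i - \sum_{j=1}^{p} c_j\, \hat{\beta}^{\mathrm{OLS}}_j x_{ij} \Bigr)^{2}
\quad \text{subject to} \quad \sum_{j=1}^{p} c_j \le s,
\qquad \tilde{\beta}_j = c_j\, \hat{\beta}^{\mathrm{OLS}}_j ,
\]

so a small tuning value s drives the factors on weak predictors towards zero while leaving strong predictors nearly unshrunk. The LINEX loss referred to in the abstract is the standard asymmetric loss

\[
L(\Delta) = b\bigl(e^{a\Delta} - a\Delta - 1\bigr), \qquad a \ne 0,\; b > 0,
\]

which penalizes over- and under-estimation at different rates depending on the sign of a.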
dc.description.noteOctober 2018en_US
dc.identifier.urihttp://hdl.handle.net/1993/33380
dc.language.isoengen_US
dc.rightsopen accessen_US
dc.subjectLINEX regression, Multiple linear regression, Non-negative garrote, Shrinkage methodsen_US
dc.titleShrinkage estimators under generalized garrote and LINEX loss functions for regression analysisen_US
dc.typemaster thesisen_US
Files
Original bundle: munaweera_inesh.pdf (11.91 MB, Adobe Portable Document Format)
License bundle: license.txt (2.2 KB, item-specific license agreed to upon submission)