Mixture of linear experts model for censored data: a novel approach with scale-mixture of normal distributions

dc.contributor.author: Mirfarah, Elham
dc.contributor.author: Naderi, Mehrdad
dc.contributor.author: Chen, Ding-Geng (Din)
dc.date.accessioned: 2022-08-26T05:22:29Z
dc.date.issued: 2021-06
dc.description.abstract: The mixture of linear experts (MoE) model is one of the most widespread statistical frameworks for the modeling, classification, and clustering of data. Built on the normality assumption of the error terms for mathematical and computational convenience, the classical MoE model has two challenges: (1) it is sensitive to atypical observations and outliers, and (2) it might produce misleading inferential results for censored data. The aim is then to resolve these two challenges simultaneously by proposing a robust MoE model for model-based clustering and discriminant analysis of censored data, with the scale-mixture of normal (SMN) class of distributions for the unobserved error terms. An analytical expectation–maximization (EM) type algorithm is developed in order to obtain the maximum likelihood parameter estimates. Simulation studies are carried out to examine the performance, effectiveness, and robustness of the proposed methodology. Finally, a real dataset is used to illustrate the superiority of the new model.
dc.description.department: Statistics
dc.description.embargo: 2023-02-05
dc.description.librarian: hj2022
dc.description.sponsorship: The National Research Foundation, South Africa, and the South African Medical Research Council.
dc.description.uri: http://www.elsevier.com/locate/csda
dc.identifier.citation: Mirfarah, E., Naderi, M. & Chen, D.-G. 2021, 'Mixture of linear experts model for censored data: A novel approach with scale-mixture of normal distributions', Computational Statistics & Data Analysis, vol. 158, art. 107182, pp. 1-19, doi: 10.1016/j.csda.2021.107182.
dc.identifier.issn: 0167-9473 (print)
dc.identifier.issn: 1872-7352 (online)
dc.identifier.other: 10.1016/j.csda.2021.107182
dc.identifier.uri: https://repository.up.ac.za/handle/2263/86969
dc.language.iso: en
dc.publisher: Elsevier
dc.rights: © 2021 Elsevier B.V. All rights reserved. Notice: this is the author's version of a work that was accepted for publication in Computational Statistics and Data Analysis. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. A definitive version was subsequently published in Computational Statistics and Data Analysis, vol. 158, art. 107182, pp. 1-19, 2021, doi: 10.1016/j.csda.2021.107182.
dc.subject: Mixture of linear experts (MoE)
dc.subject: Scale-mixture of normal (SMN)
dc.subject: Scale-mixture of normal class of distributions
dc.subject: EM-type algorithm
dc.subject: Censored data
dc.title: Mixture of linear experts model for censored data: a novel approach with scale-mixture of normal distributions
dc.type: Postprint Article
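
Note: the abstract above describes an EM-type algorithm for a mixture of linear experts with SMN error distributions and censored responses. The snippet below is only a minimal illustrative sketch of the general framework, not the authors' method: it fits the much simpler special case of a Gaussian-error, fully observed (uncensored) mixture of linear experts by a generalized EM algorithm, with a softmax gating network updated by a few gradient steps. All function names, parameter choices, and the toy data are hypothetical.

```python
# Minimal generalized-EM sketch for a Gaussian mixture of linear experts
# (simplified: normal errors, no censoring; purely illustrative).
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_moe_em(X, y, n_experts=2, n_iter=100, gate_lr=0.1, gate_steps=20, seed=0):
    """Fit a Gaussian mixture-of-linear-experts model by a generalized EM algorithm."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = rng.normal(scale=0.1, size=(n_experts, p))   # expert regression coefficients
    sigma2 = np.full(n_experts, y.var())                 # expert noise variances
    alpha = np.zeros((n_experts, p))                     # gating (softmax) coefficients

    for _ in range(n_iter):
        # E-step: posterior responsibility of each expert for each observation
        gate = softmax(X @ alpha.T)                                  # (n, K) gating probs
        resid = y[:, None] - X @ beta.T                              # (n, K) residuals
        dens = np.exp(-0.5 * resid**2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
        w = gate * dens
        w /= w.sum(axis=1, keepdims=True) + 1e-300                   # responsibilities

        # M-step (experts): weighted least squares and weighted variance per expert
        for k in range(n_experts):
            W = w[:, k]
            XtWX = X.T @ (W[:, None] * X)
            beta[k] = np.linalg.solve(XtWX + 1e-8 * np.eye(p), X.T @ (W * y))
            sigma2[k] = (W * (y - X @ beta[k])**2).sum() / W.sum()

        # M-step (gating): a few gradient-ascent steps on the weighted multinomial
        # log-likelihood, i.e. a generalized (conditional-maximization) update
        for _ in range(gate_steps):
            gate = softmax(X @ alpha.T)
            alpha += gate_lr * (w - gate).T @ X / n

    return beta, sigma2, alpha

if __name__ == "__main__":
    # Toy usage: two linear regimes whose mixing proportion depends on the covariate.
    rng = np.random.default_rng(1)
    x = rng.uniform(-3, 3, size=400)
    X = np.column_stack([np.ones_like(x), x])
    z = rng.binomial(1, 1 / (1 + np.exp(-2 * x)))        # gating depends on x
    y = np.where(z == 1, 1 + 2 * x, -1 - x) + rng.normal(scale=0.3, size=x.size)
    beta, sigma2, alpha = fit_moe_em(X, y, n_experts=2)
    print("expert coefficients:\n", beta)
```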

Files

Original bundle

Name: Mirfarah_Mixture_2021.pdf
Size: 685.74 KB
Format: Adobe Portable Document Format
Description: Postprint Article

License bundle

Name: license.txt
Size: 1.75 KB
Description: Item-specific license agreed upon to submission