Detecting Influential Observations in Principal Components and Common Principal Components
Boente, Graciela; Pires, Ana M.; Rodrigues, Isabel M.
Computational Statistics & Data Analysis, 54(12) (2010), 2967-2975
http://dx.doi.org/10.1016/j.csda.2010.01.001
Detecting outlying observations is an important step in any analysis, even when robust estimates are used. In particular, the robustified Mahalanobis distance is a natural measure of outlyingness if one focuses on ellipsoidal distributions. However, it is well known that the asymptotic chi-square approximation for the cutoff value of the Mahalanobis distance based on several robust estimates (such as the minimum volume ellipsoid, the minimum covariance determinant and the S-estimators) is not adequate for detecting atypical observations in small samples from the normal distribution. In the multi-population setting and under a common principal components model, aggregated measures based on standardized empirical influence functions are used to detect observations with a significant impact on the estimators. As in the one-population setting, the cutoff values obtained from the asymptotic distribution of those aggregated measures are not adequate for small samples. More appropriate cutoff values, adapted to the sample sizes, can be computed by using a cross-validation approach. Cutoff values obtained from a Monte Carlo study using S-estimators are provided for illustration. A real data set is also analyzed.
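As a minimal sketch of the one-population idea described above (not the paper's common principal components method, and using the MCD estimator from scikit-learn as an assumed stand-in for the robust estimators mentioned in the abstract), the snippet below computes robustified squared Mahalanobis distances for a small normal sample and flags observations against the asymptotic chi-square cutoff; in samples this small, that cutoff tends to flag too many clean points, which is the inadequacy the paper addresses.

```python
# Sketch only: robust Mahalanobis distances with the asymptotic chi-square cutoff.
# MCD (via scikit-learn) is an assumed choice; the paper's Monte Carlo study uses
# S-estimators and works under a common principal components model.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
n, p = 30, 4                                  # small clean normal sample
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)

mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)                       # squared robust Mahalanobis distances

cutoff = chi2.ppf(0.975, df=p)                # asymptotic chi-square(p) 97.5% cutoff
flagged = np.where(d2 > cutoff)[0]
print(f"{len(flagged)} of {n} clean observations flagged at the chi-square cutoff")
```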