Abstract

Sufficient Dimension Reduction (SDR), a class of dimension reduction methods for regression problems, has achieved great success in recent years. However, existing SDR methods have limitations, such as strict conditions on the predictor X, data sparseness, or performance that depends on the form of the model. Motivated by these problems, we propose new SDR methods that do not require smoothing techniques and perform well under different kinds of models and predictors, especially discrete or categorical predictors. Since the proposed SDR methods only work when n > p, we further develop two variable selection methods for large-p, small-n problems. This dissertation comprises three projects. In the first, we propose a new method to estimate the direction in single-index models via distance covariance (dcov) and study the asymptotic properties of the estimator; the second extends the methodology to multiple-index models; the third proposes two variable selection methods: one combines the dcov-based SDR method with a penalty term, and the other screens variables by ranking the standardized distance covariance.
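
The screening idea in the third project can be illustrated with a short sketch. The Python code below is not taken from the dissertation; the function names, the toy single-index model, and the cutoff d are assumptions for illustration only. It computes the sample distance covariance, standardizes it to a distance correlation, and ranks predictors by that score.

# Minimal sketch (illustrative, not the dissertation's code): screen predictors
# by ranking a standardized distance covariance (distance correlation) between
# each predictor and the response. The toy data and cutoff d are assumptions.
import numpy as np

def dcov2(x, y):
    """Squared sample distance covariance between two 1-D samples."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    a = np.abs(x - x.T)                                  # pairwise distances of x
    b = np.abs(y - y.T)                                  # pairwise distances of y
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()    # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    return (A * B).mean()

def dcor(x, y):
    """Standardized distance covariance (sample distance correlation)."""
    dxy, dxx, dyy = dcov2(x, y), dcov2(x, x), dcov2(y, y)
    denom = np.sqrt(dxx * dyy)
    return np.sqrt(dxy / denom) if denom > 0 else 0.0

def dcor_screen(X, y, d):
    """Rank the columns of X by distance correlation with y; keep the top d."""
    scores = np.array([dcor(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:d], scores

# Toy single-index example: y depends on X only through one linear combination.
rng = np.random.default_rng(0)
n, p = 100, 200                                          # large p, small n
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[:3] = [1.0, -1.0, 0.5]          # active predictors 0, 1, 2
y = np.sin(X @ beta) + 0.1 * rng.standard_normal(n)
keep, scores = dcor_screen(X, y, d=10)
print("top-ranked predictors:", keep)

In this sketch the active predictors should appear near the top of the ranking, which is the behavior a dcov-based screening rule relies on when n is much smaller than p.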
