Abstract
In this dissertation, we develop an R package, NlcOptim, for solving optimization problems with nonlinear objective functions and nonlinear constraints. Because the package accepts constrained matrices as input parameters, it can be used to solve problems in sufficient dimension reduction and variable selection.

We propose a framework for dimension reduction via distance covariance (DCOV) in which both the response and the predictor are vectors. In this framework, the distance covariance method is employed to estimate the central subspace effectively, and we propose two methods based on the projective resampling technique to convert a multivariate response into a univariate one. This approach retains the model-free advantage and can fully recover the central subspace even when many of the predictors are discrete.

We then extend the DCOV method to canonical analysis, termed Canonical Distance Covariance Analysis (CDCA), in which we explore the relationships between two multivariate sets of variables. In addition, we extend DCOV to estimate the dual central subspace (DCS), that is, to find a basis that spans the subspace of Y as well as a basis that spans the subspace of X. Finally, we introduce a new concept, termed Dual Variable Selection (DVS), and propose a method for simultaneously selecting subsets of each of the two random vectors by combining the DCOV method with a LASSO penalty.
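To illustrate the kind of problem NlcOptim targets, the sketch below minimizes a nonlinear objective subject to nonlinear equality constraints. It assumes the package exposes its solver as solnl(), with the constraint function returning a list of equality (ceq) and inequality (c) constraints; this interface is stated here as an assumption rather than taken from the dissertation text.

# Minimal sketch (assumed interface): minimize exp(x1*x2*x3*x4*x5)
# subject to three nonlinear equality constraints via NlcOptim::solnl().
library(NlcOptim)

# Objective function
objfun <- function(x) {
  exp(x[1] * x[2] * x[3] * x[4] * x[5])
}

# Constraint function: 'ceq' collects equality constraints (= 0),
# 'c' collects inequality constraints (<= 0); none are used here.
confun <- function(x) {
  ceq <- rbind(
    sum(x^2) - 10,
    x[2] * x[3] - 5 * x[4] * x[5],
    x[1]^3 + x[2]^3 + 1
  )
  list(ceq = ceq, c = NULL)
}

# Starting value and call to the solver
x0 <- c(-2, 2, 2, -1, -1)
solnl(X = x0, objfun = objfun, confun = confun)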