Abstract

The sparse solution technique, rooted in the principles of compressive sensing, has myriad applications in applied mathematics and data science. This dissertation studies its applications in two directions: local clustering and function approximation. Local clustering aims to extract a local structure inside a graph without knowledge of the entire graph. Because this local structure is usually small relative to the whole graph, the problem can be cast as one of compressive sensing: the indices of the target cluster can be viewed as a sparse solution to a linear system associated with the graph Laplacian. Inspired by this idea, we develop two algorithms that find the cluster of interest efficiently and effectively.

For function approximation in $C([0,1]^d)$, we propose a new approach via the Kolmogorov superposition theorem (KST), based on linear spline approximation of the K-outer function in the Kolmogorov superposition representation. We achieve the optimal approximation rate $O(1/n^2)$, where $n$ is the number of knots of the linear spline functions over $[0,1]$, with an approximation constant that grows only linearly in the dimension $d$. We show that there is a dense subclass of $C([0,1]^d)$ whose members can be approximated at this optimal rate, and that the number of parameters needed in such an approximation is at most $O(nd)$. The approach also extends to the numerical solution of partial differential equations such as the Poisson equation.
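
To make the compressive-sensing view of local clustering concrete, here is a minimal, self-contained sketch; it is not the dissertation's actual algorithms, and the graph model, candidate region, and solver are all illustrative assumptions. The idea it demonstrates: the Laplacian $L$ nearly annihilates the cluster indicator $\mathbf{1}_C$, so given a candidate region $\Omega \supset C$ (which in practice might come from a short random walk from a seed vertex), the indicator of the small difference set $\Omega \setminus C$ is an approximate sparse solution of $L_\Omega x = L_\Omega \mathbf{1}$, and recovering it exposes the cluster as the complement of its support.

```python
import numpy as np

rng = np.random.default_rng(1)

# Planted two-cluster graph (stochastic block model; illustrative only).
n, k = 200, 25                        # graph size, size of each planted cluster
p_in, p_out = 0.6, 0.01               # edge densities inside / between clusters
A = (rng.random((n, n)) < p_out).astype(float)
A[:k, :k] = rng.random((k, k)) < p_in           # target cluster C = {0,...,24}
A[k:2*k, k:2*k] = rng.random((k, k)) < p_in     # a second, distractor cluster
A = np.triu(A, 1); A = A + A.T        # symmetric adjacency, no self-loops
L = np.diag(A.sum(1)) - A             # graph Laplacian

# Candidate region Omega: the target cluster plus 10 stray vertices from the
# distractor cluster. Since L @ 1_C is nearly zero, the indicator of the small
# set Omega \ C is an approximate sparse solution of L_Omega x = L_Omega 1.
extra = 10
omega = np.arange(k + extra)
L_om = L[:, omega]                    # columns of L restricted to Omega
y = L_om @ np.ones(k + extra)         # right-hand side L_Omega @ 1_Omega

def omp(M, y, sparsity):
    """Plain orthogonal matching pursuit: greedily select `sparsity` columns."""
    support, r = [], y.copy()
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(M.T @ r))))
        coef, *_ = np.linalg.lstsq(M[:, support], y, rcond=None)
        r = y - M[:, support] @ coef
    return support

strays = omp(L_om, y, extra)          # support of the sparse difference set
cluster = sorted(set(omega) - set(omega[strays]))
print("recovered cluster:", cluster)
print("matches planted cluster:", cluster == list(range(k)))
```

On this synthetic graph the recovered set coincides with the planted cluster; the point of the sketch is only that cluster extraction reduces to a standard sparse-recovery problem for the Laplacian system.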

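The $O(1/n^2)$ rate for the K-outer function rests on a classical fact: linear spline interpolation of a $C^2$ function $g$ on $[0,1]$ at $n$ equally spaced knots has sup-norm error of order $\|g''\|_\infty / (8n^2)$. The snippet below only checks this univariate rate numerically; the test function and grids are my own choices, and it does not construct the KST inner functions.

```python
import numpy as np

g = lambda t: np.sin(2 * np.pi * t)          # a stand-in smooth outer function
tt = np.linspace(0.0, 1.0, 100_001)          # fine grid for measuring the error

for n in (10, 20, 40, 80, 160):
    knots = np.linspace(0.0, 1.0, n + 1)     # n + 1 equally spaced knots
    spline = np.interp(tt, knots, g(knots))  # piecewise-linear interpolant
    err = np.max(np.abs(g(tt) - spline))
    print(f"n = {n:4d}   sup-error = {err:.3e}   n^2 * error = {err * n**2:.3f}")
```

Doubling $n$ quarters the error, so the printed quantity $n^2 \cdot \text{error}$ stays essentially constant, which is the univariate engine behind the dimension-friendly $O(1/n^2)$ rate with $O(nd)$ parameters.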