Coordinate Descent Methods for Sparse Optimal Scoring and its Applications

Date
2021
Publisher
University of Alabama Libraries
Abstract

Linear discriminant analysis (LDA) is a popular tool for performing supervised classification in a high-dimensional setting. It seeks to reduce the dimension by projecting the data to a lower-dimensional space using a set of optimal discriminant vectors to separate the classes. One formulation of LDA is optimal scoring, which uses a sequence of scores to turn the categorical class labels into quantitative variables. In this way, optimal scoring recasts the classification problem as a generalized linear regression problem. The sparse optimal scoring formulation of LDA uses an elastic-net penalty on the discriminant vectors to induce sparsity and perform feature selection. We propose coordinate descent algorithms for finding optimal discriminant vectors in the sparse optimal scoring formulation of LDA, along with parallel implementations for large-scale problems. We then present numerical results illustrating the efficacy of these algorithms in classifying real and simulated data. Finally, we use sparse optimal scoring to analyze and classify visual comprehension of Deaf persons based on EEG data.
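To make the computational core concrete: the sparse optimal scoring problem is commonly posed as minimizing ||Y theta - X beta||^2 + lambda ||beta||_1 + gamma ||beta||_2^2 over a score vector theta and a discriminant vector beta, subject to a normalization constraint on theta. With theta held fixed, the beta-subproblem is an elastic-net penalized least-squares problem. The Python sketch below is not the dissertation's implementation; the function names, the parameters lam and gamma, and the plain elastic-net setting are illustrative assumptions showing a standard coordinate descent update with soft-thresholding for such a subproblem.

import numpy as np

def soft_threshold(z, lam):
    # Soft-thresholding operator arising from the l1 part of the elastic-net penalty.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def elastic_net_cd(X, y, lam, gamma, n_iter=100, tol=1e-6):
    # Coordinate descent for (1/(2n))||y - X b||^2 + lam*||b||_1 + (gamma/2)*||b||^2.
    # In sparse optimal scoring, y would play the role of the scored class
    # indicators Y @ theta with the current score vector held fixed (assumption).
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n        # (1/n) * x_j^T x_j, precomputed once
    r = y - X @ beta                         # running residual
    for _ in range(n_iter):
        max_step = 0.0
        for j in range(p):
            old = beta[j]
            # Correlation of feature j with the residual, adding back its own
            # current contribution: (1/n) * x_j^T (r + x_j * old).
            rho = X[:, j] @ r / n + col_sq[j] * old
            beta[j] = soft_threshold(rho, lam) / (col_sq[j] + gamma)
            if beta[j] != old:
                r += X[:, j] * (old - beta[j])   # keep residual consistent
                max_step = max(max_step, abs(beta[j] - old))
        if max_step < tol:
            break
    return beta

Maintaining the residual in place keeps each coordinate update at O(n) cost, which is the property that makes coordinate descent attractive for large-scale problems of this kind.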

Description
Electronic Thesis or Dissertation
Keywords
Coordinate Descent, Linear Discriminant Analysis, Sparse Optimal Scoring