Show simple item record

dc.contributor.advisor    Ding, Chris
dc.creator    Zheng, Shuai
dc.date.accessioned    2017-07-03T14:05:35Z
dc.date.available    2017-07-03T14:05:35Z
dc.date.created    2017-05
dc.date.issued    2017-04-24
dc.date.submitted    May 2017
dc.identifier.uri    http://hdl.handle.net/10106/26760
dc.description.abstract    Machine learning technology is now widely used in engineering, science, finance, healthcare, and other fields. In this dissertation, we make several advances in machine learning technologies for high-dimensional data analysis, image data classification, recommender systems, and classification algorithms. In the big data era, much data is high dimensional and difficult to analyze. We propose two efficient Linear Discriminant Analysis (LDA) based methods to reduce data to low dimensions. Kernel alignment measures the degree of similarity between two kernels. We propose a kernel-alignment-inspired LDA that finds a subspace maximizing the alignment between the subspace-transformed data kernel and the class indicator kernel. Classical LDA uses the arithmetic mean of all between-class distances, which has two limitations: first, large between-class distances can dominate the arithmetic mean; second, the arithmetic mean does not account for individual pairwise between-class distances, so some classes may overlap with each other in the subspace. We propose a harmonic-mean-based LDA to overcome these limitations. Low-rank models can capture correlations in data. We propose an efficient low-rank regression model for image and website classification and a regularized Singular Value Decomposition (SVD) model for recommender systems. Real-life data often include information from different channels; these different aspects/channels of the same object are called multi-view data. We propose a multi-view low-rank regression model that imposes low-rank constraints on multi-view data, and we provide a closed-form solution for it. Recommender systems are important for online advertising, online shopping, social networks, and similar applications, and regularization has become an increasingly common ingredient in recent models. We present a regularized SVD (RSVD) model for recommender systems that improves on standard SVD-based models. Support Vector Machine (SVM) is an efficient classification approach that finds a hyperplane separating data from different classes; this hyperplane is determined by support vectors. Existing SVM formulations use the L2 norm or L1 norm on the slack variables in the objective function. The number of support vectors is a measure of the generalization error. We propose a Minimal SVM, which uses the L0.5 norm on the slack variables; the resulting model further reduces the number of support vectors and improves classification performance.
dc.format.mimetype    application/pdf
dc.language.iso    en_US
dc.subject    Machine learning
dc.subject    Linear discriminant analysis
dc.subject    Multi-view data
dc.subject    Low-rank
dc.subject    Regression
dc.subject    Singular value decomposition
dc.subject    Recommender system
dc.subject    Support vector machines
dc.title    Machine Learning: Several Advances in Linear Discriminant Analysis, Multi-View Regression and Support Vector Machine
dc.type    Thesis
dc.degree.department    Computer Science and Engineering
dc.degree.name    Doctor of Philosophy in Computer Science
dc.date.updated    2017-07-03T14:07:42Z
thesis.degree.department    Computer Science and Engineering
thesis.degree.grantor    The University of Texas at Arlington
thesis.degree.level    Doctoral
thesis.degree.name    Doctor of Philosophy in Computer Science
dc.type.material    text
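
The kernel alignment mentioned in the abstract is the standard empirical alignment between two kernel matrices; a minimal sketch of the objective the abstract describes, under assumed notation (data matrix X in R^{d x n} with samples as columns, class indicator Y in {0,1}^{c x n}, projection W in R^{d x k}), is

    A(K_1, K_2) = \frac{\langle K_1, K_2 \rangle_F}{\|K_1\|_F \, \|K_2\|_F},
    \qquad
    \max_{W} \; A\big( (W^\top X)^\top (W^\top X), \; Y^\top Y \big),

i.e. the linear kernel of the subspace-transformed data is aligned with the class indicator kernel. The dissertation's exact formulation (for example, centering of the kernels or constraints on W) may differ.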

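The arithmetic-mean versus harmonic-mean contrast in the abstract can be made concrete with pairwise between-class distances in the projected space. This is a sketch under assumed notation (K classes with means \mu_j, projection W), not the dissertation's exact objective:

    d_{jk}(W) = \| W^\top (\mu_j - \mu_k) \|_2^2, \qquad
    J_{\mathrm{arith}}(W) = \frac{2}{K(K-1)} \sum_{j<k} d_{jk}(W), \qquad
    J_{\mathrm{harm}}(W) = \frac{K(K-1)/2}{\sum_{j<k} d_{jk}(W)^{-1}}.

Because a harmonic mean is dominated by its smallest terms, maximizing J_{\mathrm{harm}} (with the usual within-class scatter normalization) pushes apart even the closest pair of classes, whereas J_{\mathrm{arith}} can be inflated by a few already well-separated pairs while two classes still overlap.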

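The Minimal SVM described at the end of the abstract can be read against the standard soft-margin formulations it modifies. A hedged sketch, writing the slack penalty with an exponent p (p = 1 for the usual hinge-loss SVM, p = 2 for the squared-hinge variant, and p = 1/2 for the proposed penalty; the unified form below is an assumption for illustration):

    \min_{w,\, b,\, \xi \ge 0} \; \tfrac{1}{2} \|w\|_2^2 + C \sum_{i=1}^{n} \xi_i^{\,p}
    \quad \text{s.t.} \quad y_i (w^\top x_i + b) \ge 1 - \xi_i, \; i = 1, \dots, n.

For p < 1 the penalty is non-convex but concentrates the slack budget on fewer points, which is consistent with the abstract's claim that the L0.5 penalty reduces the number of support vectors.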