
Linear Discriminant Analysis (LDA) | Data Reduction Using MATLAB

LDA Objective

• The objective of LDA is to perform dimensionality reduction

PCA

• In PCA, the main idea is to re-express the available dataset so as to extract the relevant information, reducing redundancy and minimizing noise.

• We did not care whether the dataset represented features from one class or from several, i.e. discriminative power was not taken into consideration in PCA.

• In PCA, we had a dataset matrix X of dimensions m×n, where the columns represent different data samples.

• We first subtracted the mean to obtain a zero-mean dataset, then computed the covariance matrix Sx = XX^T.

• The eigenvalues and eigenvectors of Sx were then computed. The new basis vectors are the eigenvectors with the highest eigenvalues, where the number of vectors to keep was our choice.

• Thus, using the new basis, we can project the dataset onto a lower-dimensional space with a more compact data representation.
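The PCA recipe above can be sketched in a few lines of NumPy (a Python sketch rather than MATLAB, since the same linear algebra applies either way; the toy dataset below is made up purely for illustration):

```python
import numpy as np

# Toy dataset: columns are samples (m = 2 features, n = 5 samples)
X = np.array([[2.0, 4.0, 6.0, 8.0, 10.0],
              [1.0, 3.0, 5.0, 7.0,  9.0]])

Xc = X - X.mean(axis=1, keepdims=True)   # subtract the mean -> zero-mean dataset
Sx = Xc @ Xc.T                           # scatter/covariance matrix Sx = XX^T
evals, evecs = np.linalg.eigh(Sx)        # eigh: Sx is symmetric
order = np.argsort(evals)[::-1]          # sort eigenvalues in decreasing order

k = 1                                    # number of basis vectors kept (our choice)
P = evecs[:, order[:k]]                  # top-k eigenvectors form the new basis
Y = P.T @ Xc                             # project onto the k-dimensional subspace
```

Here the two features are perfectly correlated, so a single eigenvector captures all the variance and Y preserves the data exactly in one dimension.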

LDA

• Consider a pattern classification problem with C classes, e.g. sea bass, tuna, salmon.

• Each class has Ni m-dimensional samples, where i = 1,2, …, C.

• Hence we have a set of m-dimensional samples {x1, x2, …, xNi} belonging to class ωi.

• We stack these samples from the different classes into one big matrix X such that each column represents one sample.

• We seek a transformation of X to Y by projecting the samples in X onto a hyperplane of dimension C−1.

• Let's see what this means.
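Concretely, the transformation can be written Y = Wᵀ X, where W is an m×(C−1) projection matrix. A shape-only sketch in NumPy (the sizes below are assumptions, and W here is a random placeholder rather than the LDA solution, just to show the dimensions involved):

```python
import numpy as np

m, n, C = 4, 10, 3                  # assumed sizes: 4 features, 10 samples, 3 classes
rng = np.random.default_rng(0)

X = rng.normal(size=(m, n))         # columns are the stacked samples
W = rng.normal(size=(m, C - 1))     # placeholder projection matrix (LDA chooses this)
Y = W.T @ X                         # projected samples live in C-1 = 2 dimensions
```

Each column of Y is the C−1-dimensional image of the corresponding column of X.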

Example

LDA for two classes

Compute the Linear Discriminant projection for the following two-dimensional dataset.

– Samples for class ω1: X1 = (x1, x2) = {(4,2), (2,4), (2,3), (3,6), (4,4)}

– Samples for class ω2: X2 = (x1, x2) = {(9,10), (6,8), (9,5), (8,7), (10,8)}
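For a two-class problem the Fisher direction can be computed directly as w ∝ Sw⁻¹(μ1 − μ2), where Sw is the within-class scatter matrix. A NumPy sketch for this dataset (Python rather than MATLAB, but it translates line for line):

```python
import numpy as np

# Samples from the example above; rows are samples here for convenience
X1 = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)   # class means: (3, 3.8) and (8.4, 7.6)

# Within-class scatter: S_i = sum over class i of (x - mu_i)(x - mu_i)^T
S1 = (X1 - mu1).T @ (X1 - mu1)
S2 = (X2 - mu2).T @ (X2 - mu2)
Sw = S1 + S2                                   # = [[13.2, -1.2], [-1.2, 22.0]]

# Fisher direction for two classes: w proportional to Sw^{-1} (mu1 - mu2)
w = np.linalg.solve(Sw, mu1 - mu2)
w /= np.linalg.norm(w)                         # normalize for readability
```

The projection direction is defined only up to sign and scale, so implementations may return it negated.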

Between-class scatter matrix
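For two classes, the between-class scatter matrix is SB = (μ1 − μ2)(μ1 − μ2)ᵀ. Using the class means computed from the samples above, a small Python check (again a sketch, easily rewritten in MATLAB):

```python
import numpy as np

X1 = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)    # class w1
X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)  # class w2

mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)   # mu1 = (3, 3.8), mu2 = (8.4, 7.6)

d = (mu1 - mu2).reshape(-1, 1)                # mean difference as a column vector
SB = d @ d.T                                   # between-class scatter matrix
```

Note that SB is an outer product of a single vector, so it has rank 1 for two classes; this is why the LDA projection for C classes has at most C−1 dimensions.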
