


Question

please solve these questions :

1/ Explain the principle of Fisher's linear discriminant function (LDF), and express the projection vector V (the d-dimensional direction used to project the samples of two classes) as a function of the within-class scatter matrix Sw and the means of classes 1 and 2.

2/ Define principal components (PCs). How do you compute principal components for a given data set? What are the uses of principal components?

3/ Define support vectors in the context of Support Vector Machine.

4/ Derive the distance between separation boundaries in SVM as a function of W, where W is a d-dimensional vector perpendicular to the decision boundary.

5/ Present Lagrangian formulation of maximizing the distance between separation boundaries.

6/ What do you understand by linear separability and the kernel trick in SVM?

7/ Describe the k-means algorithm for clustering data into k clusters.

8/ Express the probability density function when the data set is modeled as a mixture of c Gaussians (Gaussian Mixture Model, GMM).

9/ Describe the Expectation Maximization (EM) algorithm briefly, using a GMM.

10/ What are the differences between the k-means and EM algorithms?

Explanation / Answer

2. Principal components are the eigenvectors of the covariance (or correlation) matrix of the data. The ith principal component is the direction given by the eigenvector associated with the ith largest eigenvalue, and that eigenvalue measures the variance along the ith principal component. The ratio of an eigenvalue to the sum of all eigenvalues is the proportion of total variance explained by the corresponding PC. To compute the PCs for a given data set: center the data, form the covariance matrix, compute its eigenvalues and eigenvectors, and sort the eigenvectors by decreasing eigenvalue.
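The computation just described can be sketched in NumPy. The data set below (two correlated variables, echoing the length/width example) is made up for illustration; the steps themselves are the standard covariance eigendecomposition.

```python
import numpy as np

# Hypothetical data: two highly correlated variables (e.g. length, width).
rng = np.random.default_rng(0)
length = rng.normal(10.0, 2.0, 200)
width = 0.8 * length + rng.normal(0.0, 0.5, 200)
X = np.column_stack([length, width])

# 1. Center the data, 2. form the covariance matrix,
# 3. eigendecompose it: the eigenvectors are the principal components.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
order = np.argsort(eigvals)[::-1]           # re-sort: largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Proportion of variance explained by each PC = eigenvalue / sum of eigenvalues.
explained = eigvals / eigvals.sum()
print(explained)
```

Because the two variables are strongly correlated, the first PC captures nearly all of the variance, which is exactly the "long axis of the cloud of points" described above.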

Suppose two variables, length and width, are measured and plotted; both have nearly the same variance and are highly correlated with one another. The first PC is a vector passed through the long axis of the cloud of points, and the second is a vector at right angles to the first, with both passing through the centroid of the data. The PC axes usually do not coincide exactly with any of the original variables.

Uses: Principal component analysis is useful for finding patterns or structure in high-dimensional data sets. Its two most immediate uses are dimensionality reduction and the elimination of collinearity among variables.
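Both uses amount to projecting the centered data onto the top eigenvectors. A minimal sketch, on made-up data with one deliberately collinear column:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=100)   # nearly collinear column

# Eigendecompose the covariance matrix and keep the top 4 directions.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]
W = eigvecs[:, order[:4]]   # projection matrix of the top 4 PCs

# Z is the reduced representation: fewer dimensions, decorrelated columns.
Z = Xc @ W
print(Z.shape)   # (100, 4)
```

Dropping the smallest-eigenvalue direction removes the redundant (collinear) dimension while retaining almost all of the variance, and the projected columns are mutually uncorrelated by construction.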