
Question

In this problem, you will implement the soft margin SVM using different gradient descent methods. You can use either R or Python for this problem. Our training data consists of $n$ pairs $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, with $x_i \in \mathbb{R}^d$ and $y_i \in \{-1, 1\}$. Define a hyperplane by $f(x) = x^\top w + b$. A classification rule induced by $f(x)$ is $G(x) = \operatorname{sign}(x^\top w + b)$. To recap, to estimate the $(w, b)$ of the soft margin SVM, we can minimize the cost:

$$f(w, b) = \frac{1}{2}\|w\|_2^2 + C \sum_{i=1}^{n} \max\{0,\ 1 - y_i (x_i^\top w + b)\} \tag{2}$$

In order to minimize the cost function, we first define

$$L(w, b; x_i, y_i) = \max\{0,\ 1 - y_i (x_i^\top w + b)\},$$

then obtain the gradient with respect to $w^{(j)}$, the $j$-th item in the vector $w$, and $b$ as follows:

$$\frac{\partial f(w, b)}{\partial w^{(j)}} = w^{(j)} + C \sum_{i=1}^{n} \frac{\partial L(w, b; x_i, y_i)}{\partial w^{(j)}}, \qquad \frac{\partial f(w, b)}{\partial b} = C \sum_{i=1}^{n} \frac{\partial L(w, b; x_i, y_i)}{\partial b}, \tag{3}$$

where

$$\frac{\partial L(w, b; x_i, y_i)}{\partial w^{(j)}} = \begin{cases} 0 & \text{if } y_i (x_i^\top w + b) \ge 1, \\ -y_i x_i^{(j)} & \text{otherwise}, \end{cases} \qquad \frac{\partial L(w, b; x_i, y_i)}{\partial b} = \begin{cases} 0 & \text{if } y_i (x_i^\top w + b) \ge 1, \\ -y_i & \text{otherwise}. \end{cases}$$

Now, we will implement and compare the following gradient descent techniques.

Batch gradient descent: iterate through the entire dataset and update the parameters as follows:

$$w^{(j)} \leftarrow w^{(j)} - \eta \, \frac{\partial f(w, b)}{\partial w^{(j)}} \ \text{for all } j, \qquad b \leftarrow b - \eta \, \frac{\partial f(w, b)}{\partial b},$$

where $\eta$ is the learning rate.

Explanation / Answer

Everything about the SVM and the gradient descent algorithm is nicely explained in the question. Please try to implement this on your own.
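
As a starting point, here is a minimal NumPy sketch of the batch gradient descent method described in the question, using the cost (2) and the gradients (3). The function names and the default values of C, the learning rate, and the epoch count are my own illustrative choices, not part of the assignment:

```python
import numpy as np

def cost(w, b, X, y, C):
    """Soft margin SVM cost, equation (2): 0.5*||w||^2 + C * sum of hinge losses."""
    hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
    return 0.5 * np.dot(w, w) + C * hinge.sum()

def batch_gradient_descent(X, y, C=1.0, eta=1e-4, n_epochs=200):
    """Batch GD: every update uses the gradient over the whole dataset."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_epochs):
        # Per equation (3), the hinge term contributes to the gradient only
        # where the margin constraint y_i (x_i^T w + b) >= 1 is violated.
        violated = y * (X @ w + b) < 1.0
        grad_w = w - C * (y[violated][:, None] * X[violated]).sum(axis=0)
        grad_b = -C * y[violated].sum()
        w = w - eta * grad_w
        b = b - eta * grad_b
    return w, b
```

Printing `cost(w, b, X, y, C)` every few epochs is a cheap sanity check that the objective is actually decreasing.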

Tips: Try training multiple times with different values of the mini-batch size and learning rate, then choose the best of them. To determine the best, you should split the data into two parts, training and test data (with a 0.7/0.3 ratio, respectively). The best model will be the one that gives the highest accuracy on the test data set.
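
One possible way to carry out this tip is sketched below. It uses a mini-batch variant of the trainer above; `minibatch_gd`, `accuracy`, and `select_best` are hypothetical names, and the candidate grids of batch sizes and learning rates are illustrative only:

```python
import numpy as np

def minibatch_gd(X, y, C=1.0, eta=1e-4, batch_size=32, n_epochs=200, seed=0):
    """Mini-batch GD on the soft margin SVM cost (hinge subgradient)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_epochs):
        order = rng.permutation(n)  # reshuffle the data each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            violated = yb * (Xb @ w + b) < 1.0
            grad_w = w - C * (yb[violated][:, None] * Xb[violated]).sum(axis=0)
            grad_b = -C * yb[violated].sum()
            w = w - eta * grad_w
            b = b - eta * grad_b
    return w, b

def accuracy(w, b, X, y):
    """Fraction of points where sign(x^T w + b) matches the label."""
    return float(np.mean(np.sign(X @ w + b) == y))

def select_best(X, y, batch_sizes=(16, 32, 64), etas=(1e-4, 1e-3, 1e-2), seed=0):
    """70/30 train/test split; keep the setting with the best test accuracy."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(y))
    cut = int(0.7 * len(y))
    train, test = order[:cut], order[cut:]
    best = None
    for bs in batch_sizes:
        for eta in etas:
            w, b = minibatch_gd(X[train], y[train], eta=eta, batch_size=bs)
            acc = accuracy(w, b, X[test], y[test])
            if best is None or acc > best[0]:
                best = (acc, bs, eta, w, b)
    return best  # (test accuracy, batch size, learning rate, w, b)
```

The fixed random seed keeps the 70/30 split reproducible, so accuracy differences between settings come from the hyperparameters rather than from the split.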
