Question
T/F Questions. Justify your answer.
(a) In the case of binary classification, if the training data is linearly separable, then the Perceptron finds
the hyperplane that maximizes the margin between the positive examples and the negative ones.
(b) If the training data is linearly separable, it is possible that the Perceptron algorithm runs forever.
(c) When the training data is linearly separable, the smaller the margin, the easier the problem of finding a separating hyperplane.
(d) In the case of SVM, the maximum-margin hyperplane that separates the positive and negative examples is a linear combination of the training data points.
(e) In the case of SVM, the maximum-margin hyperplane is specified by the data points that are not support vectors.
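For parts (a) and (b), the following is a minimal illustrative sketch (the 2D data points are made up, and it assumes numpy and scikit-learn are available) that contrasts the hyperplane a Perceptron converges to with the maximum-margin hyperplane a linear SVM finds on the same linearly separable data:

import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.svm import SVC

# Two linearly separable clusters in 2D (illustrative toy data).
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],   # positive class
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])  # negative class
y = np.array([+1, +1, +1, -1, -1, -1])

# The Perceptron stops at the first separating hyperplane it reaches;
# the hard-margin SVM (approximated here with a very large C) maximizes the margin.
perc = Perceptron(max_iter=1000, tol=None).fit(X, y)
svm = SVC(kernel="linear", C=1e6).fit(X, y)

print("Perceptron w, b:", perc.coef_[0], perc.intercept_[0])
print("SVM        w, b:", svm.coef_[0], svm.intercept_[0])
print("SVM support vectors:\n", svm.support_vectors_)

Both models separate this data perfectly, but in general they find different hyperplanes; only the SVM's hyperplane is guaranteed to maximize the margin.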
Explanation / Answer
(a) (5 points) In SVM learning for the non-separable case, how does C balance the width of the margin against the number of misclassified data points? What happens when C is small? What happens when C is large?
(b) (5 points) In SVM learning for the linearly separable case, why are only a few of the α_i's nonzero in the SVM solution?
(c) (5 points) Can you give a simple placement of 7 data points in 2D, where 4 data points belong to the positive class and 3 data points belong to the negative class, such that the maximum-margin SVM solution has all 7 points as support vectors?
(d) (5 points) How do different values of the slack variable ξ_i give rise to three different kinds of support vectors?
(e) (5 points) How is C in the SVM solution related to the maximum value of any α_i?
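A minimal sketch related to parts (a) and (e) above (the synthetic blobs and the scikit-learn calls are illustrative assumptions, not the course's intended derivation): varying C on a non-separable problem changes the margin width, the number of support vectors, and the largest observed dual value α_i, which the soft-margin box constraint keeps at or below C.

import numpy as np
from sklearn.svm import SVC

# Two overlapping Gaussian blobs: a non-separable toy problem (made-up data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2)),
               rng.normal(loc=[2.0, 2.0], scale=1.0, size=(50, 2))])
y = np.array([-1] * 50 + [+1] * 50)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    w = clf.coef_[0]
    margin_width = 2.0 / np.linalg.norm(w)   # geometric margin width is 2 / ||w||
    alphas = np.abs(clf.dual_coef_[0])       # |y_i * alpha_i| for the support vectors
    print(f"C={C:>6}: {len(clf.support_):3d} support vectors, "
          f"margin width {margin_width:.2f}, max alpha_i {alphas.max():.4f}")

Because each dual variable satisfies 0 <= α_i <= C, a small C permits a wide margin with many margin violations (many support vectors), while a large C penalizes violations heavily and narrows the margin.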