


Question

Consider the training examples shown in the following table for a binary classification problem. What is the entropy of this collection of training examples with respect to the target class variable? What are the information gains of a_1 and a_2 relative to these training examples? For a_3, which is a continuous attribute, compute the information gain for every possible split. What is the best split (among a_1, a_2, and a_3) according to the information gain? What is the best split (between a_1 and a_2) according to the Gini index?

Explanation / Answer

a)

The collection contains 4 positive and 4 negative examples out of 8, so

Entropy = -(4/8) log2(4/8) - (4/8) log2(4/8) = 1 bit
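The calculation above can be checked with a short sketch. Since the training table itself is not reproduced on this page, the functions below are generic helpers over class counts (the names `entropy`, `gini`, and `information_gain` are illustrative, not from the original question); only the 4-positive / 4-negative split used in the answer is verified here.

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini index of a list of class counts."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def information_gain(parent_counts, children_counts):
    """Entropy of the parent minus the weighted entropy of the children."""
    total = sum(parent_counts)
    weighted = sum(sum(ch) / total * entropy(ch) for ch in children_counts)
    return entropy(parent_counts) - weighted

print(entropy([4, 4]))  # 1.0, matching the answer above
print(gini([4, 4]))     # 0.5
```

With the actual table, the information gains of a_1, a_2, and each candidate split of a_3 would be computed by passing the child class counts of each split to `information_gain`.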
