
Question

Create a confusion matrix for each model and an ROC curve for each one.

Suppose two classification models (e.g., logistic regression and a decision tree) were used for a classification problem (e.g., predicting survived/died in an accident). The actual observed outcomes and the predicted probabilities for 10 test samples are given in the following table (P = positive outcome, N = negative outcome):

Obs #   Actual outcome   Probability (Model 1)   Probability (Model 2)
  1          P                  0.70                    0.75
  2          P                  0.80                    0.80
  3          P                  0.65                    0.65
  4          P                  0.90                    0.85
  5          P                  0.45                    0.30
  6          P                  0.50                    0.45
  7          N                  0.55                    0.55
  8          N                  0.35                    0.35
  9          N                  0.40                    0.40
 10          N                  0.60                    0.25

1. Create a confusion matrix with a threshold of 0.5 for each model.
2. Calculate the TPR (true positive rate) and FPR (false positive rate) with the same threshold (0.5) for each model.
3. Draw two ROC curves, one for Model 1 and another for Model 2, preferably in one figure using

Explanation / Answer

Let's look at the outcomes of each model at the 0.5 threshold.

1. The confusion matrix for Model 1 (predicting P when the probability is at least 0.5) is as follows:

                 Predicted P   Predicted N
    Actual P        5 (TP)        1 (FN)
    Actual N        2 (FP)        2 (TN)

The confusion matrix for Model 2 is:

                 Predicted P   Predicted N
    Actual P        4 (TP)        2 (FN)
    Actual N        1 (FP)        3 (TN)
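Here is a minimal sketch of how these two matrices can be reproduced programmatically. The question truncates before naming a tool, so Python with scikit-learn is an assumption; the P/N-to-1/0 label coding and the "probability ≥ 0.5 counts as positive" convention (implied by observation 6 under Model 1) are also assumptions made explicit in the code:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Data from the question's table: P -> 1, N -> 0 (label coding is our choice).
actual  = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])
prob_m1 = np.array([0.70, 0.80, 0.65, 0.90, 0.45, 0.50, 0.55, 0.35, 0.40, 0.60])
prob_m2 = np.array([0.75, 0.80, 0.65, 0.85, 0.30, 0.45, 0.55, 0.35, 0.40, 0.25])

# Classify as positive when the probability is at least the 0.5 threshold.
pred_m1 = (prob_m1 >= 0.5).astype(int)
pred_m2 = (prob_m2 >= 0.5).astype(int)

# scikit-learn orders the matrix as [[TN, FP], [FN, TP]] for labels {0, 1}.
print(confusion_matrix(actual, pred_m1))  # [[2 2], [1 5]]
print(confusion_matrix(actual, pred_m2))  # [[3 1], [2 4]]
```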

2. The true positive rate for Model 1 is

TPR = (# true positives) / (# actual positives) = 5 / 6 ≈ 0.8333

The true positive rate for Model 2 is

TPR = 4 / 6 ≈ 0.6667

The false positive rate for Model 1 is

FPR = (# false positives) / (# actual negatives) = 2 / 4 = 0.5

The false positive rate for Model 2 is

FPR = 1 / 4 = 0.25
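The same rates can be sanity-checked in a few lines; this is a sketch in which the count arguments are simply the four cell values read off the confusion matrices above:

```python
def rates(tp, fn, fp, tn):
    """Return (TPR, FPR) from the four confusion-matrix counts."""
    return tp / (tp + fn), fp / (fp + tn)

# Counts taken from the matrices above.
print("Model 1: TPR=%.4f, FPR=%.4f" % rates(tp=5, fn=1, fp=2, tn=2))  # 0.8333, 0.5000
print("Model 2: TPR=%.4f, FPR=%.4f" % rates(tp=4, fn=2, fp=1, tn=3))  # 0.6667, 0.2500
```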

3. The predicted outcomes at the 0.5 threshold are tabulated below. An ROC curve is traced by sweeping the threshold across the predicted probabilities and plotting the resulting (FPR, TPR) pairs, so the 0.5 threshold contributes one point to each curve: (0.5, 0.8333) for Model 1 and (0.25, 0.6667) for Model 2. A plotting sketch follows the table.

Obs #   Actual   Probability (Model 1)   Predicted (Model 1)   Probability (Model 2)   Predicted (Model 2)
  1        P            0.70                     P                    0.75                     P
  2        P            0.80                     P                    0.80                     P
  3        P            0.65                     P                    0.65                     P
  4        P            0.90                     P                    0.85                     P
  5        P            0.45                     N                    0.30                     N
  6        P            0.50                     P                    0.45                     N
  7        N            0.55                     P                    0.55                     P
  8        N            0.35                     N                    0.35                     N
  9        N            0.40                     N                    0.40                     N
 10        N            0.60                     P                    0.25                     N
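Since the original figure is not reproduced here, the following sketch draws both ROC curves in one figure. It assumes Python with scikit-learn and matplotlib (the question leaves the tool unspecified); roc_curve does the threshold sweep described above:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

actual  = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])  # P -> 1, N -> 0
prob_m1 = np.array([0.70, 0.80, 0.65, 0.90, 0.45, 0.50, 0.55, 0.35, 0.40, 0.60])
prob_m2 = np.array([0.75, 0.80, 0.65, 0.85, 0.30, 0.45, 0.55, 0.35, 0.40, 0.25])

# roc_curve sweeps the threshold over the scores and returns (FPR, TPR) pairs.
fpr1, tpr1, _ = roc_curve(actual, prob_m1)
fpr2, tpr2, _ = roc_curve(actual, prob_m2)

plt.plot(fpr1, tpr1, marker="o", label="Model 1 (AUC = %.3f)" % auc(fpr1, tpr1))
plt.plot(fpr2, tpr2, marker="s", label="Model 2 (AUC = %.3f)" % auc(fpr2, tpr2))
plt.plot([0, 1], [0, 1], "k--", label="Chance")  # diagonal reference line
plt.xlabel("False positive rate (FPR)")
plt.ylabel("True positive rate (TPR)")
plt.title("ROC curves for Model 1 and Model 2")
plt.legend()
plt.show()
```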