Keywords: AUC = Area Under the Curve; ROC = Receiver Operating Characteristic; FPR = False Positive Rate; TPR = True Positive Rate
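A minimal sketch of how these quantities are computed with scikit-learn, using toy labels and scores (the `y_true`/`y_scores` values are illustrative placeholders, not the notebook's data):

```python
from sklearn.metrics import roc_curve, roc_auc_score

y_true = [0, 0, 1, 1]            # ground-truth class labels
y_scores = [0.1, 0.4, 0.35, 0.8]  # predicted probabilities for class 1

# roc_curve returns the FPR and TPR at each decision threshold
fpr, tpr, thresholds = roc_curve(y_true, y_scores)

# roc_auc_score summarizes the ROC curve as a single number
auc = roc_auc_score(y_true, y_scores)
print(auc)  # 0.75
```

Plotting `fpr` against `tpr` gives the ROC curve; the AUC is the area under that curve.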
a) Metric functions (e.g. the confusion matrix) b) Scoring parameters
Comment: The confusion matrix summary agrees with the classification report, confirming that the classifier's predictions are accurate.
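A sketch of producing both summaries with scikit-learn; `y_true`/`y_pred` here are assumed placeholder data standing in for the notebook's predictions:

```python
from sklearn.metrics import confusion_matrix, classification_report

y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 1]  # perfect predictions for illustration

# Rows are true classes, columns are predicted classes
print(confusion_matrix(y_true, y_pred))
# [[2 0]
#  [0 4]]

# Per-class precision, recall, f1-score, and support
print(classification_report(y_true, y_pred))
```

With perfect predictions, all off-diagonal counts are zero and every metric in the report is 1.00.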
The mean absolute error between the predicted and actual values is 0.0, meaning the predictions do not deviate from the true values at all, which is a perfect score.
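A zero MAE like the one above can be reproduced with scikit-learn on toy data (the values below are illustrative, not the notebook's):

```python
from sklearn.metrics import mean_absolute_error

y_true = [1, 0, 1, 1]
y_pred = [1, 0, 1, 1]  # predictions match the truth exactly

# MAE is the average of |y_true - y_pred|; identical arrays give 0.0
print(mean_absolute_error(y_true, y_pred))  # 0.0
```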
Consulting the scikit-learn machine learning map (the "choosing the right estimator" flowchart) to try LinearSVC
Comment: LinearSVC gave a mean accuracy score of 79.5%.
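A minimal sketch of fitting and scoring a LinearSVC; the dataset (`load_breast_cancer`) and split are assumptions, since the notebook's data is not shown here:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = LinearSVC(max_iter=10000)  # raise max_iter to help convergence
clf.fit(X_train, y_train)

# .score() on a classifier returns mean accuracy on the given data
print(clf.score(X_test, y_test))
```

The printed value will differ from the 79.5% above, since the data and split here are placeholders.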
Comparison of a model's true positive rate vs. its false positive rate:
* True positive: the model predicts 1 when the truth is 1
* False positive: the model predicts 1 when the truth is 0
* True negative: the model predicts 0 when the truth is 0
* False negative: the model predicts 0 when the truth is 1
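From those four counts the two rates follow directly; `tp`, `fp`, `tn`, `fn` below are illustrative placeholder counts:

```python
# Hypothetical confusion-matrix counts
tp, fp, tn, fn = 40, 5, 50, 5

# TPR: fraction of actual positives the model caught (also called recall)
tpr = tp / (tp + fn)

# FPR: fraction of actual negatives the model wrongly flagged as positive
fpr = fp / (fp + tn)

print(tpr, fpr)
```

Sweeping the decision threshold and recomputing these two rates at each point traces out the ROC curve.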
The confusion matrix compares the labels a model predicts against the actual labels it was meant to predict, revealing where the model is getting confused.
A perfect model has an f1-score of 1.00 and an accuracy of 1.0; we recorded both, so the model's predictions on this data are perfect.
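The perfect scores above can be reproduced with scikit-learn's metric functions; `y_true`/`y_pred` are toy placeholders:

```python
from sklearn.metrics import accuracy_score, f1_score

y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 1, 1, 0]  # every prediction matches the truth

print(accuracy_score(y_true, y_pred))  # 1.0
print(f1_score(y_true, y_pred))        # 1.0
```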
Comments: Comparing the baseline model with the improved one, we find that the accuracy, precision, recall, and f1 metrics are all fully optimized. The model is a good candidate for deployment.