SVC predicts everything as one class when I try to plot the decision boundaries

  machine-learning, python, svm

I’m just trying out an SVC classifier on the Social Media Ads dataset from Kaggle, and it performs well, but when I go to plot the decision boundaries, it predicts the entire mesh as class 1.

Here’s the initial SVC, all hyperparameters on default:

from sklearn.svm import SVC

model = SVC()  # all hyperparameters left at their defaults
model.fit(X_train, y_train)
y_train_pred = model.predict(X_train)
y_test_pred = model.predict(X_test)

For the test data I get these scores:

Test:
               precision    recall  f1-score   support

           0       0.94      0.88      0.91        68
           1       0.78      0.88      0.82        32

    accuracy                           0.88       100
   macro avg       0.86      0.88      0.87       100
weighted avg       0.89      0.88      0.88       100
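Those scores are the output of scikit-learn’s `classification_report`; here is a minimal sketch of how they are generated (the tiny label arrays below are stand-ins for illustration, not my real `y_test` / `y_test_pred`):

```python
from sklearn.metrics import classification_report

# Stand-in labels for illustration only; in the real code these are
# y_test and y_test_pred from the fitted model above.
y_true = [0, 0, 0, 1, 1]
y_pred = [0, 0, 1, 1, 1]

# Prints per-class precision/recall/f1 plus accuracy and the macro/weighted averages
print(classification_report(y_true, y_pred))
```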

All good, so I code the plot with this:

import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns

plt.figure()
ax = sns.scatterplot(x=ads_data.Age,
                     y=ads_data.EstimatedSalary,
                     data=ads_data,
                     hue=ads_data.Purchased,
                     palette='Set2')
plt.legend().remove()

# Build a 200x200 mesh spanning the plot limits and predict over it
xlim = ax.get_xlim()
ylim = ax.get_ylim()
xx, yy = np.meshgrid(np.linspace(*xlim, num=200),
                     np.linspace(*ylim, num=200))
Z = model.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

contours = ax.contourf(xx, yy, Z, alpha=0.3, cmap='Set2')
plt.show()

And Z turns out to be all 1s, so the entire plot gets cast in a single blue-ish tint. What’s wrong here? Thanks a lot for any insight.
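In case it helps, here is how I’d compare the feature ranges the model saw at fit time against the ranges of the mesh points (a sketch with hypothetical stand-in arrays, since my preprocessing code isn’t shown above). If `X_train` was scaled before fitting, the raw Age/EstimatedSalary mesh would lie far outside the training range, which could make `predict` collapse to one class:

```python
import numpy as np

# Stand-in training features (Age, EstimatedSalary) for illustration;
# in the real code this is the X_train passed to model.fit above.
X_train = np.array([[19, 19000], [35, 20000], [46, 41000], [27, 57000]], dtype=float)

# Build the same kind of mesh as in the plotting code
xx, yy = np.meshgrid(np.linspace(18, 60, 200), np.linspace(15000, 150000, 200))
mesh_points = np.c_[xx.ravel(), yy.ravel()]

# If the ranges below differ wildly (e.g. training data scaled to roughly
# [-2, 2] but the mesh spanning tens of thousands), the mesh is being
# predicted far outside the region the model was trained on.
print("train min/max:", X_train.min(axis=0), X_train.max(axis=0))
print("mesh  min/max:", mesh_points.min(axis=0), mesh_points.max(axis=0))
```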

Source: Python Questions
