I've built an XGBoost classifier with the following code:
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn import metrics
from xgboost import XGBClassifier

# Hold out 30% of the data for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=42)
xgbc = XGBClassifier()
xgbc.fit(X_train, y_train)
y_pred = xgbc.predict(X_test)
rep = metrics.classification_report(y_test, y_pred)
print(rep)
And I get the scores as follows.

Then I performed 5-fold cross-validation:
scores = cross_val_score(xgbc, X, y, cv=5)
print(scores)
And got the following CV scores (summarized below):
- 1.0
- 1.0
- 0.99497487
- 0.98994975
- 0.98492462
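For what it's worth, a quick way to boil those five numbers down to a single figure (just a small sketch using the values listed above) is:

import numpy as np

# The five fold scores reported above, pasted into an array
cv_scores = np.array([1.0, 1.0, 0.99497487, 0.98994975, 0.98492462])
print(f"mean accuracy: {cv_scores.mean():.4f} (+/- {cv_scores.std():.4f})")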
I'm fairly new to machine learning, and I can't figure out whether my model is performing excellently or whether it has overfit.
The 5 numbers that cross_val_score returns are the performance of the same model on different slices of the data; please see my answer below for more details.
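For anyone else wondering, this is roughly what cross_val_score(xgbc, X, y, cv=5) does under the hood (a minimal sketch, assuming X and y are NumPy arrays; for a classifier passed an integer cv, scikit-learn splits with StratifiedKFold and scores with the estimator's default accuracy scorer):

from sklearn.base import clone
from sklearn.model_selection import StratifiedKFold

skf = StratifiedKFold(n_splits=5)
fold_scores = []
for train_idx, test_idx in skf.split(X, y):
    model = clone(xgbc)  # fresh, unfitted copy of the estimator for every fold
    model.fit(X[train_idx], y[train_idx])
    fold_scores.append(model.score(X[test_idx], y[test_idx]))  # accuracy on the held-out fold
print(fold_scores)

So each of the 5 scores comes from a model trained on roughly 80% of the data and evaluated on the remaining 20%, and no fold is ever scored on rows it was trained on.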