I used both XGBoost and random forest for a two-class classification problem.
With random forest, the accuracy is 77%.
With XGBoost, the accuracy is 71%.
When I feed a sample to the random forest model, it gives a probability score of about 51% for class 1 and 49% for class 2 (for almost every example the probability score is near 50%). But when I feed a sample to XGBoost (the model with lower overall accuracy), the probability score for the predicted class is above 70%. It seems XGBoost is less accurate but more confident. Can anyone tell me the story behind this?
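For context, here is a minimal sketch of how I am comparing the two models. It assumes scikit-learn's RandomForestClassifier and xgboost's XGBClassifier on the same train/test split; the synthetic data and parameter values are just placeholders, not my real setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Hypothetical data standing in for my real dataset
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
xgb = XGBClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

print("RF accuracy: ", accuracy_score(y_test, rf.predict(X_test)))
print("XGB accuracy:", accuracy_score(y_test, xgb.predict(X_test)))

# Predicted probability of the positive class for a few test samples.
# In my case, RF sits near 0.5 while XGBoost is pushed toward 0 or 1.
print("RF  p(class 1):", rf.predict_proba(X_test[:5])[:, 1])
print("XGB p(class 1):", xgb.predict_proba(X_test[:5])[:, 1])
```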