I am using explained deviance (sometimes called percent deviance, or deviance explained by the model) as a goodness-of-fit measure for my species distribution model. Explained deviance is calculated as (null deviance - residual deviance) / null deviance, and the greater the explained deviance, the greater the explanatory power of the model. One of my deviance values is greater than 1.0 (greater than 100% when expressed as a percentage). Why is that?
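For concreteness, here is a minimal example of that calculation in R, using a toy logistic regression (the data here are made up, purely to illustrate the formula; this is not my actual model):

# Toy data: binary response and one predictor (made-up, for illustration only)
set.seed(1)
x <- rnorm(100)
y <- rbinom(100, 1, plogis(0.5 * x))
fit <- glm(y ~ x, family = binomial)

# Explained deviance = (null deviance - residual deviance) / null deviance
(fit$null.deviance - fit$deviance) / fit$null.deviance

For a GLM fit this way, the residual deviance cannot exceed the null deviance, so the value should fall between 0 and 1; that is why a value above 1.0 confuses me.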
Edit: Here is my R code & output for the boosted regression tree:
CODE

library(dismo)  # gbm.step() comes from the dismo package (which uses gbm)

spurge10.tc5.lr001 <- gbm.step(data = model.data10,
                               gbm.x = 6:34,           # predictor columns
                               gbm.y = 4,              # response column (PresOrAbs)
                               family = "bernoulli",
                               tree.complexity = 5,
                               learning.rate = 0.001,
                               bag.fraction = 0.5)
OUTPUT (the value I am using is the "estimated cv deviance"):
fitting final gbm model with a fixed number of 1400 trees for PresOrAbs

mean total deviance = 1.386
mean residual deviance = 0.718

estimated cv deviance = 1.079 ; se = 0.046

training data correlation = 0.861
cv correlation = 0.58 ; se = 0.038

training data ROC score = 0.982
cv ROC score = 0.819 ; se = 0.021
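For reference, this is how I have been plugging the reported values into the formula above. I am assuming that "mean total deviance" plays the role of the null deviance and that the residual deviance is either the training "mean residual deviance" or the "estimated cv deviance"; that mapping is my assumption, not something stated in the gbm.step output itself:

null_dev  <- 1.386  # "mean total deviance" from the output above
train_dev <- 0.718  # "mean residual deviance" (training data)
cv_dev    <- 1.079  # "estimated cv deviance" (assumed to act as residual deviance)

(null_dev - train_dev) / null_dev  # ~0.48 explained deviance (training data)
(null_dev - cv_dev) / null_dev     # ~0.22 explained deviance (cross-validated)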
Thanks!