It is often said on this site that R-squared is problematic for nonlinear models (search ...), but I cannot find a canonical thread. A published paper, with simulations, is this; the authors also point to information measures like AIC as better alternatives. From an answer in this R-help thread:
> There is a good reason that an nls model fit in R does not provide r-squared - r-squared doesn't make sense for a general nls model.
>
> One way of thinking of r-squared is as a comparison of the residual sum of squares for the fitted model to the residual sum of squares for a trivial model that consists of a constant only. You cannot guarantee that this is a comparison of nested models when dealing with an nls model. If the models aren't nested this comparison is not terribly meaningful.
>
> So the answer is that you probably don't want to do this in the first place.
>
> -- Douglas Bates
(This would apply equally to the adjusted version.) But that answer also points to a solution: R-squared is a kind of comparison to a simple, default (and nested) model, and if you can find some other simple model for comparison, you can use that. For a regression tree, the constant-only model corresponds to a default tree with only one node, the root, and that trivial tree is nested within any larger tree, so R-squared does make sense in this case.
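To make the comparison concrete, $R^2 = 1 - \mathrm{RSS}_{\text{tree}}/\mathrm{RSS}_{\text{root}}$, where the root-only tree just predicts the overall mean. A minimal sketch (in Python rather than R, purely for illustration; the data-generating process and the single-split stump are my own toy choices, not anything from the thread):

```python
import random

random.seed(0)

# Toy data: a linear signal plus noise (an illustrative assumption).
x = [random.uniform(0, 10) for _ in range(200)]
y = [xi + random.gauss(0, 1) for xi in x]

def rss_around_mean(values):
    """Residual sum of squares of a constant (mean-only) fit."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values)

# Root-only tree: predicts the overall mean everywhere.
rss_root = rss_around_mean(y)

# One-split regression stump: exhaustive search over thresholds,
# minimizing the total RSS of the two resulting leaves.
best_rss = rss_root
for t in sorted(set(x))[:-1]:
    left = [yi for xi, yi in zip(x, y) if xi <= t]
    right = [yi for xi, yi in zip(x, y) if xi > t]
    total = rss_around_mean(left) + rss_around_mean(right)
    if total < best_rss:
        best_rss = total

# R-squared as a comparison of the stump to its nested root-only model.
r_squared = 1 - best_rss / rss_root
print(round(r_squared, 3))
```

Because the root-only tree is nested in the stump (and in any deeper tree), the comparison is exactly the one Bates describes as meaningful.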
But your question about the meaning of $p$ (the number of parameters) for a tree model is still interesting! For models like splines, GAMs and many others, we use a measure called the effective number of parameters, or effective degrees of freedom. See this list of questions with answers. But can this be defined for a very nonlinear model like trees (or forests)? The book The Elements of Statistical Learning by Hastie, Tibshirani & Friedman discusses this in Exercise 9.5 (page 336, 2nd edition). What is proposed there is to define the degrees of freedom of the fit as $\DeclareMathOperator{\C}{\mathbb{Cov}} \sum_i \C(y_i, \hat{y}_i)/\sigma^2$. Doing the exercise is left for you; it should involve some simulation.
Some related posts: Decision Tree Quality Metric, and computing AIC or BIC for nonlinear regression models.