In machine learning, ensemble methods combine the predictions of multiple models to produce a single, usually more accurate, prediction. Bagging, boosting, and stacking are common examples.
Questions tagged [ensemble-learning]
466 questions
3
votes
1 answer
Weighting variables for an index
I have been tasked with trying to modify our current "index" which basically takes 4 observations per person and calculates a score based on what they achieve. Here is how the score is created (all variables are ordinal in nature, higher scores…
Btibert3
- 1,334
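One way to approach a weighted index like this is to rescale each ordinal variable to a common range and then take a weighted sum. A minimal sketch in Python, where the four variables, the 1-5 scale, and the weights are all illustrative assumptions, not the original scoring scheme:

import numpy as np

# Hypothetical ordinal observations (1-5 scale), one row per person,
# one column per variable; the weights are purely illustrative.
scores = np.array([
    [3, 4, 2, 5],
    [1, 2, 2, 3],
    [5, 5, 4, 4],
])
weights = np.array([0.4, 0.3, 0.2, 0.1])  # chosen to sum to 1

# Rescale each variable to [0, 1] so no variable dominates by range alone,
# then take the weighted sum as the person's index.
normalized = (scores - 1) / 4.0
index = normalized @ weights
print(index)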
3
votes
1 answer
Simple voting scheme using confidence for each vote
I am doing classification by splitting each observation into 14 subparts and then classifying each of these subparts individually. The overall classification of the observation is then performed using an ensemble of 14 votes.
I can see how much…
pir
- 5,056
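A common way to use per-vote confidences like these is to let each of the 14 subparts contribute its confidence to its predicted class and pick the class with the largest total. A minimal sketch, with made-up labels and confidences:

import numpy as np

# Hypothetical (label, confidence) pairs for the 14 subparts of one observation.
labels = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1])
confidence = np.array([0.9, 0.6, 0.8, 0.55, 0.7, 0.95, 0.6,
                       0.5, 0.65, 0.8, 0.52, 0.9, 0.75, 0.6])

# Sum the confidence mass behind each class instead of counting raw votes.
totals = np.bincount(labels, weights=confidence, minlength=2)
print(totals, "->", totals.argmax())

Unweighted majority voting is the special case where every confidence equals 1.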
3
votes
1 answer
Minimizing Curve fit for predictive model
Let's assume we've found 100 independent variables that can predict y. Each of those independent variables is close to uncorrelated with the others, and they are all curve fitted. Using any single one to predict y equates to a large probability of failing out of…
user1234440
- 163
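The usual argument for combining many weak, nearly uncorrelated predictors is that averaging them cancels much of their individual error. A minimal simulation, with synthetic noise standing in for the 100 curve-fitted variables:

import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=1000)

# Simulate 100 weak predictors: each is the target plus heavy independent
# noise, so any single one predicts y poorly on its own.
predictors = y + rng.normal(scale=3.0, size=(100, 1000))

single_mse = np.mean((predictors[0] - y) ** 2)
ensemble_mse = np.mean((predictors.mean(axis=0) - y) ** 2)
print(single_mse, ensemble_mse)  # averaging cuts the error dramatically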
2
votes
0 answers
Ensemble classification model on partially overlapped datasets?
Given two partially overlapping datasets $X_1$ and $X_2$ (say past 10K hours and past 10K minutes), how could one go about creating an ensemble model of classifiers of these datasets? Standard approaches such as boosting or stacking will not work,…
Raven
- 131
2
votes
1 answer
How to determine if the errors made by the classifiers are uncorrelated
I am working on ensemble methods to improve the area under the ROC curve in an experiment. In "Ensemble Methods in Machine Learning", Dietterich says "A necessary and sufficient condition for an ensemble of classifiers to be more accurate than any…
Jorge Amaral
- 311
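A direct empirical check of the condition Dietterich states is to compare the classifiers' per-example error indicators. A minimal sketch, using made-up predictions:

import numpy as np

y_true = np.array([0, 1, 1, 0, 1, 0, 1, 1, 0, 0])
pred_a = np.array([0, 1, 0, 0, 1, 1, 1, 1, 0, 0])
pred_b = np.array([0, 0, 1, 0, 1, 0, 1, 0, 1, 0])

# 1 where a classifier is wrong, 0 where it is right.
err_a = (pred_a != y_true).astype(int)
err_b = (pred_b != y_true).astype(int)

# Pearson correlation of the error indicators; values near 0 suggest the
# classifiers tend to fail on different examples.
print(np.corrcoef(err_a, err_b)[0, 1])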
1
vote
1 answer
Can majority voting be applied in this situation?
I have 6 class labels, say: a, b, c, d, e, and f.
I am using 3 classification models: decision tree, random forest, and naive Bayes.
Can majority voting be applied to this ensemble, say with more than 50% of votes required for a decision?
vidhi
- 11
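With three classifiers, a strict majority over six classes means at least two of the three agree; when all three disagree, some tie-breaking rule is needed. A minimal sketch of that rule, with hypothetical predictions:

from collections import Counter

def majority_vote(predictions):
    # Return the label predicted by more than half of the classifiers;
    # fall back to the first classifier's prediction on a three-way tie.
    label, votes = Counter(predictions).most_common(1)[0]
    return label if votes > len(predictions) / 2 else predictions[0]

# Hypothetical outputs from decision tree, random forest, naive Bayes.
print(majority_vote(['a', 'a', 'c']))  # 'a' wins with 2 of 3 votes
print(majority_vote(['a', 'b', 'c']))  # no majority -> falls back to 'a'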
1
vote
1 answer
What is this ensemble learning technique called?
For example, I trained two models: one with SVM and one with KNN.
Final Prediction = 0.4*KNN + 0.6*SVM
Is this considered blending?
user46925
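A fixed-weight combination like this is usually applied to the two models' predicted class probabilities. A minimal scikit-learn sketch with the weights from the question; the generated dataset is just a placeholder:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier().fit(X_tr, y_tr)
svm = SVC(probability=True, random_state=0).fit(X_tr, y_tr)

# Final prediction = 0.4 * KNN + 0.6 * SVM, applied to class probabilities.
blend = 0.4 * knn.predict_proba(X_te) + 0.6 * svm.predict_proba(X_te)
print((blend.argmax(axis=1) == y_te).mean())

Usage varies, but "blending" more often implies the weights were fit on a held-out set rather than chosen by hand; a hand-picked combination like this is usually just called a weighted average ensemble.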
0
votes
1 answer
Using a logistic model on the estimates of several other classification models
I'm working on a classification model that will predict whether a sales opportunity will end up 'won' or 'lost', given various attributes of the opportunity. I've been using my training data to build several models, including a random forest model,…
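The setup described here is classic stacking. To keep the meta-model honest, the base models' estimates should come from out-of-fold predictions rather than from models scored on their own training data. A minimal sketch on placeholder data, with a reduced set of base models:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=500, random_state=0)
base_models = [RandomForestClassifier(random_state=0), GaussianNB()]

# Out-of-fold probabilities avoid leaking each model's training fit
# into the logistic meta-model's inputs.
meta_features = np.column_stack([
    cross_val_predict(m, X, y, cv=5, method='predict_proba')[:, 1]
    for m in base_models
])

meta_model = LogisticRegression().fit(meta_features, y)
print(meta_model.coef_)  # each base model's weight in the final decision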
0
votes
0 answers
How to choose the meta learner for the super learner model?
I am building a super learner ensemble model using the classifiers SVM, kNN, AdaBoost, XGBoost, and Random Forest. However, I am not sure of the logic behind which classifier to use as the meta learner. I have seen many applications using logistic…
Lucy
- 1
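In scikit-learn's stacking implementation, the meta learner is just the final_estimator argument, so candidates can be compared empirically by cross-validation. A minimal sketch with a reduced base library and placeholder data:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
base = [('svm', SVC(probability=True, random_state=0)),
        ('knn', KNeighborsClassifier()),
        ('rf', RandomForestClassifier(random_state=0))]

# Compare candidate meta learners by cross-validated accuracy. Logistic
# regression is the common default because it is hard to overfit on the
# low-dimensional inputs formed by the base models' predictions.
for meta in [LogisticRegression(), DecisionTreeClassifier(random_state=0)]:
    clf = StackingClassifier(estimators=base, final_estimator=meta, cv=5)
    print(type(meta).__name__, cross_val_score(clf, X, y, cv=3).mean())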
0
votes
0 answers
Math behind ensemble learning
I'm struggling to find some clear math behind ensemble learning.
I can simulate it very easily, e.g.:
import numpy as np, scipy.stats
r = np.random.random(1000)
d = np.array([0]*1000)
cors = []
for i in range(100):
    v = np.random.random(1000)
    c =…
Blaze
- 73
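The closed form behind simulations like this is the variance of an average of $n$ estimators, each with variance $\sigma^2$ and pairwise correlation $\rho$:

$$\operatorname{Var}\left(\frac{1}{n}\sum_{i=1}^{n}\hat{f}_i\right) = \rho\sigma^2 + \frac{1-\rho}{n}\sigma^2$$

As $n$ grows the second term vanishes, so the ensemble's error floor is set entirely by $\rho$; this is why ensembles need members whose errors are uncorrelated, not merely numerous.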
-1
votes
3 answers
Interpreting Ensemble Models
The project I'm working on uses a lot of different variables to predict sales. The best model, in terms of mean squared error, is an ensemble model that combines a regression model and a probability tree model.
How do I interpret…
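For interpretation, one model-agnostic option is permutation importance computed on the combined prediction function, since it treats the ensemble as a black box. A minimal sketch with scikit-learn, using stand-in models and placeholder data:

from sklearn.datasets import make_regression
from sklearn.ensemble import VotingRegressor
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=5, random_state=0)

# Stand-ins for the regression model and the tree model in the question.
ensemble = VotingRegressor([('lr', LinearRegression()),
                            ('tree', DecisionTreeRegressor(random_state=0))])
ensemble.fit(X, y)

# How much shuffling each variable hurts the ensemble's score, regardless
# of what is inside the ensemble.
result = permutation_importance(ensemble, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)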