Questions tagged [stacking]

Stacking is a meta-ensemble machine learning technique in which a second-level model (a meta-learner) is trained on the predictions of multiple base models that were themselves trained on the data.

Stacking works by:

  1. Training a variety of machine learning models on the dataset
  2. Generating predictions from each of the trained models
  3. Training a second-level machine learning model (a meta-learner) on the predictions from step #2

Stacking often produces more accurate results than simple voting or averaging of the base models' predictions, because the meta-learner can learn how much to trust each base model.
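The three steps above can be sketched with scikit-learn's `StackingClassifier`, which handles generating out-of-fold predictions from the base models and fitting the meta-learner on them (the dataset and model choices here are illustrative, not prescriptive):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: a variety of level-0 (base) models.
level0 = [
    ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("svm", SVC(random_state=0)),
]

# Steps 2-3: StackingClassifier produces cross-validated predictions from
# each base model and trains the meta-learner (logistic regression) on them.
clf = StackingClassifier(
    estimators=level0,
    final_estimator=LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
score = clf.score(X_test, y_test)
```

Note that the base-model predictions fed to the meta-learner are generated via internal cross-validation rather than on the training data directly, which helps prevent the meta-learner from simply rewarding whichever base model overfits the most.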

103 questions
2 votes · 1 answer

Stacking or Voting - Multiple feature sets extracted with different parameters

I am extracting features from time series data using different parameters and then creating a SINGLE feature based data set with all features to perform classification. If I wanted to create separate feature sets corresponding to different…
Atif • 123
1 vote · 0 answers

Stacking with two distinct level-0 predictors

I'm looking to employ a blended approach using, potentially, least squares linear regression for a classification problem. The level-1 classifier should output a probability between two classes. Could I train the level-1 when the level-0 classifiers…
Clocker • 111 • 3
0 votes · 1 answer

Superlearner Without OOB Results

I'm interested in creating a superlearner algorithm. Unfortunately, my situation is such that I have access to the predictions of submodels I'm interested in on new data, but don't necessarily have access to the OOB results of their original…
Nate • 21