Plot individual and voting regression predictions
A voting regressor is an ensemble meta-estimator that fits several base
regressors, each on the whole dataset. Then it averages the individual
predictions to form a final prediction.
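As a minimal sketch of what "averages" means here (using two simple base regressors on synthetic data, purely for illustration), the VotingRegressor prediction with default equal weights matches the plain mean of the base regressors' own predictions:

import numpy as np

from sklearn.ensemble import VotingRegressor
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.RandomState(0)
X_demo = rng.rand(50, 2)
y_demo = X_demo @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=50)

# Two deterministic base regressors, so refitting them outside the
# ensemble reproduces the clones fitted inside the VotingRegressor.
base = [("lr", LinearRegression()), ("ridge", Ridge(alpha=1.0))]
vote = VotingRegressor(base).fit(X_demo, y_demo)

individual = np.column_stack(
    [est.fit(X_demo, y_demo).predict(X_demo) for _, est in base]
)
# With the default (equal) weights the ensemble prediction is the
# unweighted mean of the individual predictions.
assert np.allclose(vote.predict(X_demo), individual.mean(axis=1))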
We will use three different regressors to predict the data:
GradientBoostingRegressor,
RandomForestRegressor, and
LinearRegression.
These three regressors will then be combined into a
VotingRegressor.
Finally, we will plot the predictions made by all models for comparison.
We will work with the diabetes dataset which consists of 10 features collected from a cohort of diabetes patients. The target is a quantitative measure of disease progression one year after baseline.
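As a quick sanity check on the dataset description above (a side illustration, not part of the example's own code), one can inspect the shapes returned by load_diabetes:

from sklearn.datasets import load_diabetes

X, y = load_diabetes(return_X_y=True)
print(X.shape)  # (442, 10): 442 patients, 10 baseline features
print(y.shape)  # (442,): disease progression one year after baseline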
# Authors: The scikit-learn developers
# SPDX-License-Identifier: BSD-3-Clause

import matplotlib.pyplot as plt

from sklearn.datasets import load_diabetes
from sklearn.ensemble import (
    GradientBoostingRegressor,
    RandomForestRegressor,
    VotingRegressor,
)
from sklearn.linear_model import LinearRegression
Training regressors
First, we will load the diabetes dataset and instantiate a gradient boosting regressor, a random forest regressor, and a linear regression. Next, we will use the three regressors to build the voting regressor:
X, y = load_diabetes(return_X_y=True)

# Train the individual regressors
reg1 = GradientBoostingRegressor(random_state=1)
reg2 = RandomForestRegressor(random_state=1)
reg3 = LinearRegression()

reg1.fit(X, y)
reg2.fit(X, y)
reg3.fit(X, y)

# Combine them into a voting regressor and fit it on the same data
ereg = VotingRegressor([("gb", reg1), ("rf", reg2), ("lr", reg3)])
ereg.fit(X, y)
VotingRegressor(estimators=[('gb', GradientBoostingRegressor(random_state=1)),
                            ('rf', RandomForestRegressor(random_state=1)),
                            ('lr', LinearRegression())])
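The comparison plot mentioned above could then be produced along the following lines (a sketch: restricting to the first 20 training samples is an arbitrary choice made here to keep the figure readable):

xt = X[:20]

pred1 = reg1.predict(xt)
pred2 = reg2.predict(xt)
pred3 = reg3.predict(xt)
pred4 = ereg.predict(xt)

plt.figure()
plt.plot(pred1, "gd", label="GradientBoostingRegressor")
plt.plot(pred2, "b^", label="RandomForestRegressor")
plt.plot(pred3, "ys", label="LinearRegression")
plt.plot(pred4, "r*", ms=10, label="VotingRegressor")
plt.xlabel("Training samples")
plt.ylabel("Predicted values")
plt.legend(loc="best")
plt.title("Regressor predictions and their average")
plt.show()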