Decision Tree Regression
In this example, we demonstrate the effect of changing the maximum depth of a decision tree on how it fits the data. We perform this once on a 1D regression task and once on a multi-output regression task.
# Authors: The scikit-learn developers
# SPDX-License-Identifier: BSD-3-Clause
Decision Tree on a 1D Regression Task
Here we fit a tree on a 1D regression task.
The decision tree is used to fit a sine curve with additional noisy observations. As a result, it learns local linear regressions approximating the sine curve.
We can see that if the maximum depth of the tree (controlled by the max_depth parameter) is set too high, the decision trees learn overly fine details of the training data and learn from the noise, i.e. they overfit.
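To make this effect concrete, here is a minimal sketch (not part of the original example) that compares training and test R^2 scores across depths on a hypothetical noisy sine dataset, analogous to the one built below; the names X_demo and y_demo are illustrative:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Hypothetical noisy sine data, similar in spirit to the dataset below
rng = np.random.RandomState(0)
X_demo = np.sort(5 * rng.rand(200, 1), axis=0)
y_demo = np.sin(X_demo).ravel() + 0.2 * rng.randn(200)

X_train, X_test, y_train, y_test = train_test_split(X_demo, y_demo, random_state=0)

for depth in [2, 5, 10, None]:  # None grows the tree until all leaves are pure
    tree = DecisionTreeRegressor(max_depth=depth, random_state=0).fit(X_train, y_train)
    print(f"max_depth={depth}: "
          f"train R^2 = {tree.score(X_train, y_train):.2f}, "
          f"test R^2 = {tree.score(X_test, y_test):.2f}")

With this setup, the training score keeps increasing with depth, while the test score stops improving (and can degrade) once the tree starts fitting the noise rather than the underlying sine curve.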
Create a random 1D dataset
import numpy as np

rng = np.random.RandomState(1)
X = np.sort(5 * rng.rand(80, 1), axis=0)  # 80 points in [0, 5), sorted for plotting
y = np.sin(X).ravel()
y[::5] += 3 * (0.5 - rng.rand(16))  # add noise to every fifth observation
Fit regression model
Here we fit two models with different maximum depths.
from sklearn.tree import DecisionTreeRegressor

# Shallow tree (coarse fit) vs. deeper tree (finer fit, more prone to overfitting)
regr_1 = DecisionTreeRegressor(max_depth=2)
regr_2 = DecisionTreeRegressor(max_depth=5)
regr_1.fit(X, y)
regr_2.fit(X, y)
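The example then compares the two fits visually. A minimal sketch of that step, assuming matplotlib is available (the grid spacing and colors here are illustrative choices, not prescribed by the example):

import matplotlib.pyplot as plt
import numpy as np

# Predict on a dense grid covering the training range
X_grid = np.arange(0.0, 5.0, 0.01)[:, np.newaxis]
y_1 = regr_1.predict(X_grid)
y_2 = regr_2.predict(X_grid)

plt.figure()
plt.scatter(X, y, s=20, c="darkorange", label="data")
plt.plot(X_grid, y_1, color="cornflowerblue", linewidth=2, label="max_depth=2")
plt.plot(X_grid, y_2, color="yellowgreen", linewidth=2, label="max_depth=5")
plt.xlabel("data")
plt.ylabel("target")
plt.title("Decision Tree Regression")
plt.legend()
plt.show()

The max_depth=5 curve tracks the noisy points much more closely than the max_depth=2 curve, which is the overfitting behavior described above.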