Isotonic Regression
An illustration of isotonic regression on generated data (a non-linear monotonic trend with homoscedastic uniform noise).

The isotonic regression algorithm finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data. The benefit of such a non-parametric model is that it does not assume any shape for the target function besides monotonicity. For comparison, a linear regression is also presented.
The plot on the right-hand side shows the model prediction function that results from the linear interpolation of threshold points. The threshold points are a subset of the training input observations, and their matching target values are computed by the isotonic non-parametric fit.
```python
# Authors: The scikit-learn developers
# SPDX-License-Identifier: BSD-3-Clause

import matplotlib.pyplot as plt
import numpy as np
from matplotlib.collections import LineCollection

from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import LinearRegression
from sklearn.utils import check_random_state

n = 100
x = np.arange(n)
rs = check_random_state(0)
y = rs.randint(-50, 50, size=(n,)) + 50.0 * np.log1p(np.arange(n))
```
Fit IsotonicRegression and LinearRegression models:
```python
ir = IsotonicRegression(out_of_bounds="clip")
y_ = ir.fit_transform(x, y)

lr = LinearRegression()
lr.fit(x[:, np.newaxis], y)  # x needs to be 2d for LinearRegression
```
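As a sketch of what the fitted model looks like, recent scikit-learn versions expose the learned threshold points via the `X_thresholds_` and `y_thresholds_` attributes, and `out_of_bounds="clip"` maps inputs beyond the training range to the boundary values. Assuming the data generated above:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression
from sklearn.utils import check_random_state

# Same synthetic data as in the example above.
n = 100
x = np.arange(n)
rs = check_random_state(0)
y = rs.randint(-50, 50, size=(n,)) + 50.0 * np.log1p(np.arange(n))

ir = IsotonicRegression(out_of_bounds="clip")
ir.fit(x, y)

# The fitted function is piecewise linear between threshold points,
# which form a subset of the training inputs.
print(len(ir.X_thresholds_), "threshold points out of", n, "samples")

# Predictions are non-decreasing over the training range, and inputs
# outside that range are clipped to the boundary values.
pred = ir.predict(np.array([-10.0, 200.0]))
print(pred)
```

This makes the linear-interpolation behavior described above directly inspectable, without plotting.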