Path: /opt/alt/python35/share/doc/alt-python35-scikit-learn-0.18.1/examples/linear_model/
File Content: plot_polynomial_interpolation.py

#!/usr/bin/env python
"""
========================
Polynomial interpolation
========================

This example demonstrates how to approximate a function with a polynomial of
degree n_degree by using ridge regression. Concretely, from n_samples 1d
points, it suffices to build the Vandermonde matrix, which is
n_samples x (n_degree + 1) and has the following form:

[[1, x_1, x_1 ** 2, x_1 ** 3, ...],
 [1, x_2, x_2 ** 2, x_2 ** 3, ...],
 ...]

Intuitively, this matrix can be interpreted as a matrix of pseudo features
(the points raised to some power). The matrix is akin to (but different from)
the matrix induced by a polynomial kernel.

This example shows that you can do non-linear regression with a linear model,
using a pipeline to add non-linear features. Kernel methods extend this idea
and can induce very high (even infinite) dimensional feature spaces.
"""
print(__doc__)

# Author: Mathieu Blondel
#         Jake Vanderplas
# License: BSD 3 clause

import numpy as np
import matplotlib.pyplot as plt

from sklearn.linear_model import Ridge
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline


def f(x):
    """Function to approximate by polynomial interpolation."""
    return x * np.sin(x)


# generate points used to plot
x_plot = np.linspace(0, 10, 100)

# generate points and keep a subset of them
x = np.linspace(0, 10, 100)
rng = np.random.RandomState(0)
rng.shuffle(x)
x = np.sort(x[:20])
y = f(x)

# create matrix versions of these arrays
X = x[:, np.newaxis]
X_plot = x_plot[:, np.newaxis]

colors = ['teal', 'yellowgreen', 'gold']
lw = 2
plt.plot(x_plot, f(x_plot), color='cornflowerblue', linewidth=lw,
         label="ground truth")
plt.scatter(x, y, color='navy', s=30, marker='o', label="training points")

for count, degree in enumerate([3, 4, 5]):
    model = make_pipeline(PolynomialFeatures(degree), Ridge())
    model.fit(X, y)
    y_plot = model.predict(X_plot)
    plt.plot(x_plot, y_plot, color=colors[count], linewidth=lw,
             label="degree %d" % degree)

plt.legend(loc='lower left')

plt.show()
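The Vandermonde construction described in the docstring can be checked directly: `PolynomialFeatures` with the default `include_bias=True` emits exactly the `[1, x, x**2, ...]` columns the text describes. A minimal sketch (the sample values here are illustrative, not from the example above):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

x = np.array([1.0, 2.0, 3.0])
X = x[:, np.newaxis]                       # shape (n_samples, 1)

# Each row of V is [1, x_i, x_i**2, x_i**3]: the Vandermonde matrix
# of shape n_samples x (n_degree + 1) referenced in the docstring.
V = PolynomialFeatures(degree=3).fit_transform(X)
print(V)
# [[ 1.  1.  1.  1.]
#  [ 1.  2.  4.  8.]
#  [ 1.  3.  9. 27.]]
```

Fitting `Ridge` on these columns is then an ordinary linear regression in the pseudo-feature space, which is what makes the pipeline's non-linear fit possible.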
Directory listing (examples/linear_model/):

Name                                   Size        Permission
README.txt                             135 bytes   0644
lasso_dense_vs_sparse_data.py          1862 bytes  0644
plot_ard.py                            2828 bytes  0644
plot_bayesian_ridge.py                 2733 bytes  0644
plot_huber_vs_ridge.py                 2206 bytes  0644
plot_iris_logistic.py                  1679 bytes  0644
plot_lasso_and_elasticnet.py           2074 bytes  0644
plot_lasso_coordinate_descent_path.py  2945 bytes  0644
plot_lasso_lars.py                     1080 bytes  0644
plot_lasso_model_selection.py          5431 bytes  0644
plot_logistic.py                       1568 bytes  0644
plot_logistic_l1_l2_sparsity.py        2601 bytes  0644
plot_logistic_multinomial.py           2480 bytes  0644
plot_logistic_path.py                  1195 bytes  0644
plot_multi_task_lasso_support.py       2319 bytes  0644
plot_ols.py                            1936 bytes  0644
plot_ols_3d.py                         2040 bytes  0644
plot_ols_ridge_variance.py             2060 bytes  0644
plot_omp.py                            2263 bytes  0644
plot_polynomial_interpolation.py       2088 bytes  0644
plot_ransac.py                         1859 bytes  0644
plot_ridge_coeffs.py                   2785 bytes  0644
plot_ridge_path.py                     2138 bytes  0644
plot_robust_fit.py                     3050 bytes  0644
plot_sgd_comparison.py                 1819 bytes  0644
plot_sgd_iris.py                       2202 bytes  0644
plot_sgd_loss_functions.py             1232 bytes  0644
plot_sgd_penalties.py                  1877 bytes  0644
plot_sgd_separating_hyperplane.py      1221 bytes  0644
plot_sgd_weighted_samples.py           1458 bytes  0644
plot_sparse_recovery.py                7486 bytes  0644
plot_theilsen.py                       3846 bytes  0644