Path: ~/opt/alt/python35/share/doc/alt-python35-scikit-learn-0.18.1/examples/linear_model

File Content: plot_ridge_path.py
""" =========================================================== Plot Ridge coefficients as a function of the regularization =========================================================== Shows the effect of collinearity in the coefficients of an estimator. .. currentmodule:: sklearn.linear_model :class:`Ridge` Regression is the estimator used in this example. Each color represents a different feature of the coefficient vector, and this is displayed as a function of the regularization parameter. This example also shows the usefulness of applying Ridge regression to highly ill-conditioned matrices. For such matrices, a slight change in the target variable can cause huge variances in the calculated weights. In such cases, it is useful to set a certain regularization (alpha) to reduce this variation (noise). When alpha is very large, the regularization effect dominates the squared loss function and the coefficients tend to zero. At the end of the path, as alpha tends toward zero and the solution tends towards the ordinary least squares, coefficients exhibit big oscillations. In practise it is necessary to tune alpha in such a way that a balance is maintained between both. """ # Author: Fabian Pedregosa -- <fabian.pedregosa@inria.fr> # License: BSD 3 clause print(__doc__) import numpy as np import matplotlib.pyplot as plt from sklearn import linear_model # X is the 10x10 Hilbert matrix X = 1. 
/ (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis]) y = np.ones(10) ############################################################################### # Compute paths n_alphas = 200 alphas = np.logspace(-10, -2, n_alphas) clf = linear_model.Ridge(fit_intercept=False) coefs = [] for a in alphas: clf.set_params(alpha=a) clf.fit(X, y) coefs.append(clf.coef_) ############################################################################### # Display results ax = plt.gca() ax.plot(alphas, coefs) ax.set_xscale('log') ax.set_xlim(ax.get_xlim()[::-1]) # reverse axis plt.xlabel('alpha') plt.ylabel('weights') plt.title('Ridge coefficients as a function of the regularization') plt.axis('tight') plt.show()
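The example fits `linear_model.Ridge` at each alpha, but the behavior it illustrates can also be seen directly from the closed-form ridge solution w = (X^T X + alpha * I)^{-1} X^T y. The following NumPy-only sketch (the helper `ridge_coefs` is illustrative, not part of the example) applies that formula to the same 10x10 Hilbert matrix, showing the ill-conditioning and the coefficient shrinkage the docstring describes:

```python
import numpy as np

# Same 10x10 Hilbert matrix and target as in the example above.
X = 1. / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
y = np.ones(10)

def ridge_coefs(X, y, alpha):
    """Closed-form ridge solution without an intercept:
    w = (X^T X + alpha * I)^{-1} X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# The Hilbert matrix is severely ill-conditioned (condition number on
# the order of 1e13), which is why the unregularized weights oscillate.
print("cond(X) =", np.linalg.cond(X))

# Near-zero alpha: close to ordinary least squares, large weights.
w_small = ridge_coefs(X, y, 1e-10)
# Larger alpha: regularization dominates, much smaller weights.
w_large = ridge_coefs(X, y, 1e-2)

print("max |w| at alpha=1e-10:", np.max(np.abs(w_small)))
print("max |w| at alpha=1e-2: ", np.max(np.abs(w_large)))
```

The endpoints of the plotted path correspond to these two solves: tiny alpha reproduces the oscillating least-squares end, while alpha = 1e-2 sits at the shrunken end of the curves.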
Name                                    Size        Permission
README.txt                              135 bytes   0644
lasso_dense_vs_sparse_data.py           1862 bytes  0644
plot_ard.py                             2828 bytes  0644
plot_bayesian_ridge.py                  2733 bytes  0644
plot_huber_vs_ridge.py                  2206 bytes  0644
plot_iris_logistic.py                   1679 bytes  0644
plot_lasso_and_elasticnet.py            2074 bytes  0644
plot_lasso_coordinate_descent_path.py   2945 bytes  0644
plot_lasso_lars.py                      1080 bytes  0644
plot_lasso_model_selection.py           5431 bytes  0644
plot_logistic.py                        1568 bytes  0644
plot_logistic_l1_l2_sparsity.py         2601 bytes  0644
plot_logistic_multinomial.py            2480 bytes  0644
plot_logistic_path.py                   1195 bytes  0644
plot_multi_task_lasso_support.py        2319 bytes  0644
plot_ols.py                             1936 bytes  0644
plot_ols_3d.py                          2040 bytes  0644
plot_ols_ridge_variance.py              2060 bytes  0644
plot_omp.py                             2263 bytes  0644
plot_polynomial_interpolation.py        2088 bytes  0644
plot_ransac.py                          1859 bytes  0644
plot_ridge_coeffs.py                    2785 bytes  0644
plot_ridge_path.py                      2138 bytes  0644
plot_robust_fit.py                      3050 bytes  0644
plot_sgd_comparison.py                  1819 bytes  0644
plot_sgd_iris.py                        2202 bytes  0644
plot_sgd_loss_functions.py              1232 bytes  0644
plot_sgd_penalties.py                   1877 bytes  0644
plot_sgd_separating_hyperplane.py       1221 bytes  0644
plot_sgd_weighted_samples.py            1458 bytes  0644
plot_sparse_recovery.py                 7486 bytes  0644
plot_theilsen.py                        3846 bytes  0644