Path: ~//proc/thread-self/root/opt/alt/python35/lib64/python3.5/site-packages/sklearn/__pycache__/
File Content:
kernel_ridge.cpython-35.pyc
# kernel_ridge.cpython-35.pyc -- readable source recovered from the CPython 3.5
# bytecode strings; the dump is truncated partway through fit(), so nothing
# after that point is shown.
"""Module :mod:`sklearn.kernel_ridge` implements kernel ridge regression."""

import numpy as np

from .base import BaseEstimator, RegressorMixin
from .metrics.pairwise import pairwise_kernels
from .linear_model.ridge import _solve_cholesky_kernel
from .utils import check_X_y
from .utils.validation import check_is_fitted


class KernelRidge(BaseEstimator, RegressorMixin):
    """Kernel ridge regression.

    Kernel ridge regression (KRR) combines ridge regression (linear least
    squares with l2-norm regularization) with the kernel trick. It thus
    learns a linear function in the space induced by the respective kernel
    and the data. For non-linear kernels, this corresponds to a non-linear
    function in the original space.

    The form of the model learned by KRR is identical to support vector
    regression (SVR). However, different loss functions are used: KRR uses
    squared error loss while support vector regression uses
    epsilon-insensitive loss, both combined with l2 regularization. In
    contrast to SVR, fitting a KRR model can be done in closed-form and is
    typically faster for medium-sized datasets. On the other hand, the
    learned model is non-sparse and thus slower than SVR, which learns a
    sparse model for epsilon > 0, at prediction-time.

    This estimator has built-in support for multi-variate regression
    (i.e., when y is a 2d-array of shape [n_samples, n_targets]).

    Read more in the :ref:`User Guide <kernel_ridge>`.

    Parameters
    ----------
    alpha : {float, array-like}, shape = [n_targets]
        Small positive values of alpha improve the conditioning of the
        problem and reduce the variance of the estimates. Alpha corresponds
        to ``(2*C)^-1`` in other linear models such as LogisticRegression
        or LinearSVC. If an array is passed, penalties are assumed to be
        specific to the targets. Hence they must correspond in number.

    kernel : string or callable, default="linear"
        Kernel mapping used internally. A callable should accept two
        arguments and the keyword arguments passed to this object as
        kernel_params, and should return a floating point number.

    gamma : float, default=None
        Gamma parameter for the RBF, laplacian, polynomial, exponential
        chi2 and sigmoid kernels. Interpretation of the default value is
        left to the kernel; see the documentation for
        sklearn.metrics.pairwise. Ignored by other kernels.

    degree : float, default=3
        Degree of the polynomial kernel. Ignored by other kernels.

    coef0 : float, default=1
        Zero coefficient for polynomial and sigmoid kernels.
        Ignored by other kernels.

    kernel_params : mapping of string to any, optional
        Additional parameters (keyword arguments) for kernel function
        passed as callable object.

    Attributes
    ----------
    dual_coef_ : array, shape = [n_samples] or [n_samples, n_targets]
        Representation of weight vector(s) in kernel space

    X_fit_ : {array-like, sparse matrix}, shape = [n_samples, n_features]
        Training data, which is also required for prediction

    References
    ----------
    * Kevin P. Murphy
      "Machine Learning: A Probabilistic Perspective", The MIT Press
      chapter 14.4.3, pp. 492-493

    See also
    --------
    Ridge
        Linear ridge regression.
    SVR
        Support Vector Regression implemented using libsvm.

    Examples
    --------
    >>> from sklearn.kernel_ridge import KernelRidge
    >>> import numpy as np
    >>> n_samples, n_features = 10, 5
    >>> rng = np.random.RandomState(0)
    >>> y = rng.randn(n_samples)
    >>> X = rng.randn(n_samples, n_features)
    >>> clf = KernelRidge(alpha=1.0)
    >>> clf.fit(X, y) # doctest: +NORMALIZE_WHITESPACE
    KernelRidge(alpha=1.0, coef0=1, degree=3, gamma=None, kernel='linear',
                kernel_params=None)
    """
    def __init__(self, alpha=1, kernel="linear", gamma=None, degree=3,
                 coef0=1, kernel_params=None):
        self.alpha = alpha
        self.kernel = kernel
        self.gamma = gamma
        self.degree = degree
        self.coef0 = coef0
        self.kernel_params = kernel_params

    def _get_kernel(self, X, Y=None):
        if callable(self.kernel):
            params = self.kernel_params or {}
        else:
            params = {"gamma": self.gamma,
                      "degree": self.degree,
                      "coef0": self.coef0}
        return pairwise_kernels(X, Y, metric=self.kernel,
                                filter_params=True, **params)

    @property
    def _pairwise(self):
        return self.kernel == "precomputed"

    def fit(self, X, y=None, sample_weight=None):
        """Fit Kernel Ridge regression model

        Parameters
        ----------
        X : {array-like, sparse matrix}, shape = [n_samples, n_features]
            Training data

        y : array-like, shape = [n_samples] or [n_samples, n_targets]
            Target values

        sample_weight : float or numpy array of shape [n_samples]
            Individual weights for each sample, ignored if None is passed.

        Returns
        -------
        self : returns an instance of self.
        """
        # Convert data
        X, y = check_X_y(X, y, accept_sparse=("csr", "csc"),
                         multi_output=True, y_numeric=True)

        K = self._get_kernel(X)
        alpha = np.atleast_1d(self.alpha)

        ravel = False
        if len(y.shape) == 1:
            y = y.reshape(-1, 1)
            ravel = True

        copy = self.kernel == "precomputed"
        self.dual_coef_ = _solve_cholesky_kernel(K, y, alpha,
                                                 sample_weight, copy)
        if ravel:
            self.dual_coef_ = self.dual_coef_.ravel()

        self.X_fit_ = X

        return self

    # [dump truncated here; the remainder of the module is not recoverable]
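The docstring's claim that, unlike SVR, KRR can be fit in closed form comes down to one linear solve: the dual coefficients satisfy (K + alpha * I) c = y, and predictions at new points are K(X_new, X_fit) @ c, which is what `_solve_cholesky_kernel` computes internally. A minimal NumPy sketch of that idea, under assumptions: the helper names `rbf_kernel`, `krr_fit`, and `krr_predict` are hypothetical, only an RBF kernel is shown, and sample weights and the Cholesky-based solver are omitted in favor of a plain `np.linalg.solve`:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def krr_fit(X, y, alpha=1.0, gamma=1.0):
    # Closed-form dual solution: (K + alpha * I) c = y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + alpha * np.eye(K.shape[0]), y)

def krr_predict(X_fit, dual_coef, X_new, gamma=1.0):
    # Prediction is a kernel expansion over the training data,
    # which is why the estimator must keep X_fit_ around.
    return rbf_kernel(X_new, X_fit, gamma) @ dual_coef

rng = np.random.RandomState(0)
X = rng.randn(10, 3)
y = rng.randn(10)
coef = krr_fit(X, y, alpha=1e-8)   # near-zero alpha -> near interpolation
pred = krr_predict(X, coef, X)
```

With a strictly positive-definite kernel and alpha close to zero, the fitted function nearly interpolates the training targets; larger alpha trades that off for better conditioning and lower variance, as the docstring describes.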