

[MRG + 1] adding sample weights for BayesianRidge #10112


Merged (2 commits, Nov 13, 2017)
Changes from all commits
4 changes: 4 additions & 0 deletions doc/whats_new/v0.20.rst
@@ -81,6 +81,10 @@ Classifiers and regressors
    ``inverse_func`` are the inverse of each other.
    :issue:`9399` by :user:`Guillaume Lemaitre <glemaitre>`.
 
+- Add `sample_weight` parameter to the fit method of
+  :class:`linear_model.BayesianRidge` for weighted linear regression.
+  :issue:`10111` by :user:`Peter St. John <pstjohn>`.
+
 Model evaluation and meta-estimators
 
 - A scorer based on :func:`metrics.brier_score_loss` is also available.
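For context, a minimal sketch of how the new parameter is used once this change lands (assumes scikit-learn >= 0.20 with this PR merged; the toy data below is illustrative, not from the PR):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Toy data: 5 samples, 2 features; weights up-weight the earlier samples.
X = np.array([[1., 1.], [3., 4.], [5., 7.], [4., 1.], [2., 6.]])
y = np.array([1., 2., 3., 2., 0.])
w = np.array([4., 3., 3., 1., 1.])

model = BayesianRidge()
model.fit(X, y, sample_weight=w)  # sample_weight is new in 0.20

print(model.coef_.shape)  # (2,)
```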
18 changes: 15 additions & 3 deletions sklearn/linear_model/bayes.py
@@ -11,7 +11,7 @@
 from scipy import linalg
 from scipy.linalg import pinvh
 
-from .base import LinearModel
+from .base import LinearModel, _rescale_data
 from ..base import RegressorMixin
 from ..utils.extmath import fast_logdet
 from ..utils import check_X_y
@@ -140,7 +140,7 @@ def __init__(self, n_iter=300, tol=1.e-3, alpha_1=1.e-6, alpha_2=1.e-6,
         self.copy_X = copy_X
         self.verbose = verbose
 
-    def fit(self, X, y):
+    def fit(self, X, y, sample_weight=None):
         """Fit the model
 
         Parameters
@@ -150,13 +150,25 @@ def fit(self, X, y):
         y : numpy array of shape [n_samples]
             Target values. Will be cast to X's dtype if necessary
 
+        sample_weight : numpy array of shape [n_samples]
+            Individual weights for each sample
+
+            .. versionadded:: 0.20
+               parameter *sample_weight* support to BayesianRidge.
+
         Returns
         -------
         self : returns an instance of self.
         """
         X, y = check_X_y(X, y, dtype=np.float64, y_numeric=True)
         X, y, X_offset_, y_offset_, X_scale_ = self._preprocess_data(
-            X, y, self.fit_intercept, self.normalize, self.copy_X)
+            X, y, self.fit_intercept, self.normalize, self.copy_X,
+            sample_weight=sample_weight)
+
+        if sample_weight is not None:
+            # Sample weight can be implemented via a simple rescaling.
+            X, y = _rescale_data(X, y, sample_weight)
+
         self.X_offset_ = X_offset_
         self.X_scale_ = X_scale_
         n_samples, n_features = X.shape
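The "simple rescaling" referred to in the patch is the standard reduction of weighted least squares to ordinary least squares: multiply each row of X and each entry of y by the square root of that sample's weight. A self-contained sketch of the idea (the `rescale_data` function below is a simplified stand-in for sklearn's private `_rescale_data` helper, not its exact implementation):

```python
import numpy as np

def rescale_data(X, y, sample_weight):
    # Scaling row i of X and entry i of y by sqrt(w_i) turns ordinary
    # least squares into weighted least squares, because
    #   sum_i w_i * (y_i - x_i @ coef)**2 == ||sqrt(W) @ (y - X @ coef)||**2
    sw = np.sqrt(sample_weight)
    return X * sw[:, np.newaxis], y * sw

rng = np.random.RandomState(0)
X = rng.randn(20, 3)
y = rng.randn(20)
w = rng.uniform(1, 4, size=20)

# Ordinary least squares on the rescaled data...
Xw, yw = rescale_data(X, y, w)
coef_rescaled, *_ = np.linalg.lstsq(Xw, yw, rcond=None)

# ...matches the weighted normal equations (X^T W X) coef = X^T W y.
W = np.diag(w)
coef_direct = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

print(np.allclose(coef_rescaled, coef_direct))  # True
```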
15 changes: 15 additions & 0 deletions sklearn/linear_model/tests/test_bayes.py
@@ -50,6 +50,21 @@ def test_bayesian_ridge_parameter():
     assert_almost_equal(rr_model.intercept_, br_model.intercept_)
 
 
+def test_bayesian_sample_weights():
+    # Test correctness of the sample_weights method
+    X = np.array([[1, 1], [3, 4], [5, 7], [4, 1], [2, 6], [3, 10], [3, 2]])
+    y = np.array([1, 2, 3, 2, 0, 4, 5]).T
+    w = np.array([4, 3, 3, 1, 1, 2, 3]).T
+
+    # A Ridge regression model using an alpha value equal to the ratio of
+    # lambda_ and alpha_ from the Bayesian Ridge model must be identical
+    br_model = BayesianRidge(compute_score=True).fit(X, y, sample_weight=w)
+    rr_model = Ridge(alpha=br_model.lambda_ / br_model.alpha_).fit(
+        X, y, sample_weight=w)
+    assert_array_almost_equal(rr_model.coef_, br_model.coef_)
+    assert_almost_equal(rr_model.intercept_, br_model.intercept_)
+
+
 def test_toy_bayesian_ridge_object():
     # Test BayesianRidge on toy
     X = np.array([[1], [2], [6], [8], [10]])
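The test's comparison against Ridge works because, for fixed precisions `alpha_` (noise) and `lambda_` (coefficients), the BayesianRidge MAP estimate minimizes `alpha_ * ||y - Xw||^2 + lambda_ * ||w||^2`, which is the plain ridge objective with `alpha = lambda_ / alpha_`. A quick numerical check of that identity in closed form (illustrative values, no intercept term):

```python
import numpy as np

rng = np.random.RandomState(42)
X = rng.randn(30, 4)
y = rng.randn(30)
a, l = 2.5, 0.7  # stand-ins for the learned alpha_ and lambda_

# Minimizer of a * ||y - Xw||^2 + l * ||w||^2:
#   (a * X^T X + l * I) w = a * X^T y
w_bayes = np.linalg.solve(a * X.T @ X + l * np.eye(4), a * X.T @ y)

# Ridge closed form with alpha = l / a:
#   (X^T X + (l / a) * I) w = X^T y
w_ridge = np.linalg.solve(X.T @ X + (l / a) * np.eye(4), X.T @ y)

print(np.allclose(w_bayes, w_ridge))  # True
```

Dividing the first system by `a` gives the second exactly, which is why the test can demand near-equality of the coefficients rather than a loose tolerance.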