[MRG+1] Fix scipy-dev-wheels build because numpy.core.umath_tests has been renamed #10880


Merged 1 commit on Apr 3, 2018
sklearn/ensemble/weight_boosting.py: 5 changes (2 additions, 3 deletions)
@@ -26,7 +26,6 @@
 from abc import ABCMeta, abstractmethod

 import numpy as np
-from numpy.core.umath_tests import inner1d

 from .base import BaseEnsemble
 from ..base import ClassifierMixin, RegressorMixin, is_regressor, is_classifier
@@ -522,8 +521,8 @@ def _boost_real(self, iboost, X, y, sample_weight, random_state):

         # Boost weight using multi-class AdaBoost SAMME.R alg
         estimator_weight = (-1. * self.learning_rate
-                            * (((n_classes - 1.) / n_classes) *
-                               inner1d(y_coding, np.log(y_predict_proba))))
+                            * ((n_classes - 1.) / n_classes)
+                            * (y_coding * np.log(y_predict_proba)).sum(axis=1))
Member Author:

I found that np.einsum('ij,ij->i', y_coding, np.log(y_predict_proba)) is another possibility, but I find it harder to grok (maybe because I don't use einsum very regularly).

Note that (at least in the tests) both y_coding and y_predict_proba shapes are (n, p).
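Not part of the PR itself, just a quick sanity sketch with made-up arrays showing that the expression used in the diff, the einsum spelling mentioned above, and an explicit per-row dot (what inner1d computed) all give the same per-sample values:

```python
import numpy as np

# Toy arrays with the shapes discussed above: (n_samples, n_classes).
rng = np.random.RandomState(0)
y_coding = rng.choice([-1. / 2, 1.], size=(5, 3))        # illustrative SAMME.R-style coding values
y_predict_proba = rng.dirichlet(np.ones(3), size=5)      # rows sum to 1
log_proba = np.log(y_predict_proba)

# Three equivalent ways to take a row-wise inner product:
a = (y_coding * log_proba).sum(axis=1)                   # form used in this PR
b = np.einsum('ij,ij->i', y_coding, log_proba)           # einsum alternative
c = np.array([np.dot(u, v) for u, v in zip(y_coding, log_proba)])  # explicit per-row dot

assert np.allclose(a, b) and np.allclose(a, c)           # all have shape (5,)
```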

Member:

Can't this be done with np.dot?

Member Author:

I don't see how, but maybe I am missing a numpy trick. In the tests, y_coding and y_predict_proba are 2d of shape (n_samples, n_classes), and you want an output of shape (n_samples,).
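To make the shape issue concrete (toy arrays, not from the PR): a single np.dot of the two matrices gives all pairwise inner products, shape (n_samples, n_samples), and only its diagonal is the per-sample quantity wanted here, so it would compute far more than needed:

```python
import numpy as np

# Hypothetical shapes: n_samples=4, n_classes=3.
a = np.arange(12.).reshape(4, 3)
b = np.full((4, 3), 0.5)

rowwise = (a * b).sum(axis=1)    # shape (4,): one value per sample, as needed
pairwise = np.dot(a, b.T)        # shape (4, 4): every sample against every sample
from_dot = np.diag(pairwise)     # recovers the row-wise result, at O(n_samples**2) cost

assert rowwise.shape == (4,) and pairwise.shape == (4, 4)
assert np.allclose(rowwise, from_dot)
```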

Member:

Right


         # Only boost the weights if it will fit again
         if not iboost == self.n_estimators - 1:
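As a side note (not something this PR does): if one wanted to keep the inner1d call sites unchanged instead of inlining the computation, a small replacement written only against public NumPy API would also avoid depending on the renamed private module. A minimal sketch, with the helper name chosen only for illustration:

```python
import numpy as np

def inner1d(a, b):
    """Inner product over the last axis, broadcasting over the rest.

    Stand-in for the old numpy.core.umath_tests.inner1d, using only
    public NumPy API (np.einsum).
    """
    return np.einsum('...i,...i->...', a, b)
```

The PR takes the simpler route of removing the dependency entirely, which keeps scikit-learn off NumPy's private test helpers.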