@@ -2489,7 +2489,7 @@ class PowerTransformer(BaseEstimator, TransformerMixin):
     or other situations where normality is desired.
 
     Currently, PowerTransformer supports the Box-Cox transform and the
-    Yeo-Johson transform. The optimal parameter for stabilizing variance and
+    Yeo-Johnson transform. The optimal parameter for stabilizing variance and
     minimizing skewness is estimated through maximum likelihood.
 
     Box-Cox requires input data to be strictly positive, while Yeo-Johnson
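For context, a minimal usage sketch of the estimator this docstring hunk describes (illustrative only, not part of the diff; assumes scikit-learn >= 0.20, where PowerTransformer and its lambdas_ attribute are available):

import numpy as np
from sklearn.preprocessing import PowerTransformer

rng = np.random.RandomState(0)
X = rng.lognormal(size=(100, 2))  # skewed, strictly positive data

# Yeo-Johnson accepts any real-valued input; Box-Cox would require
# strictly positive values, as the docstring notes.
pt = PowerTransformer(method='yeo-johnson', standardize=True)
X_trans = pt.fit_transform(X)

# One lambda per feature, estimated by maximum likelihood.
print(pt.lambdas_)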
@@ -2851,8 +2851,8 @@ def power_transform(X, method='yeo-johnson', standardize=True, copy=True):
     modeling issues related to heteroscedasticity (non-constant variance),
     or other situations where normality is desired.
 
-    Currently, PowerTransformer supports the Box-Cox transform and the
-    Yeo-Johson transform. The optimal parameter for stabilizing variance and
+    Currently, power_transform supports the Box-Cox transform and the
+    Yeo-Johnson transform. The optimal parameter for stabilizing variance and
     minimizing skewness is estimated through maximum likelihood.
 
     Box-Cox requires input data to be strictly positive, while Yeo-Johnson
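A corresponding sketch for the functional interface that this hunk now refers to by name (illustrative only; power_transform and its method parameter are part of the public sklearn.preprocessing API):

import numpy as np
from sklearn.preprocessing import power_transform

X = np.array([[1.0, 2.0], [3.0, 2.0], [4.0, 5.0]])

# Box-Cox is only defined for strictly positive data.
X_bc = power_transform(X, method='box-cox')

# Yeo-Johnson also handles zeros and negative values.
X_yj = power_transform(X - 2.0, method='yeo-johnson')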
@@ -2897,7 +2897,7 @@ def power_transform(X, method='yeo-johnson', standardize=True, copy=True):
         ``Transformer`` API (e.g. as part of a preprocessing
         :class:`sklearn.pipeline.Pipeline`).
 
-    QuantileTransformer : Maps data to a standard normal distribution with
+    quantile_transform : Maps data to a standard normal distribution with
         the parameter `output_distribution='normal'`.
 
     Notes
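The corrected See Also entry points at the functional quantile_transform rather than the QuantileTransformer class; a minimal sketch of that alternative (illustrative only, not part of the diff):

import numpy as np
from sklearn.preprocessing import quantile_transform

rng = np.random.RandomState(0)
X = rng.exponential(size=(1000, 1))

# Non-parametric mapping to a standard normal distribution based on
# the empirical quantiles, per output_distribution='normal'.
X_norm = quantile_transform(X, output_distribution='normal', copy=True)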