
Commit b09afb0

DOC/MAINT final touches to FunctionTransformer
* copyedit doc
* give the module an underscore (we should do this to all private modules...)
* use public import path in test
* change all logs to log1p to prevent warning in test and because it's genuinely useful for frequency data
* what's new
1 parent 3c6cf99 commit b09afb0

File tree: 5 files changed, +16 −12 lines changed


doc/modules/preprocessing.rst

Lines changed: 8 additions & 8 deletions
@@ -509,22 +509,22 @@ Note that polynomial features are used implicitily in `kernel methods <http://en
 
 See :ref:`example_linear_model_plot_polynomial_interpolation.py` for Ridge regression using created polynomial features.
 
-Custom Transformers
+Custom transformers
 ===================
 
-Often, you will want to convert an existing python function into a transformer
-to assist in data cleaning or processing. Users may implement a transformer from
-an arbitrary function with :class:`FunctionTransformer`. For example, one could
-apply a log transformation in a pipeline like::
+Often, you will want to convert an existing Python function into a transformer
+to assist in data cleaning or processing. You can implement a transformer from
+an arbitrary function with :class:`FunctionTransformer`. For example, to build
+a transformer that applies a log transformation in a pipeline, do::
 
     >>> import numpy as np
     >>> from sklearn.preprocessing import FunctionTransformer
-    >>> transformer = FunctionTransformer(np.log)
-    >>> X = np.array([[1, 2], [3, 4]])
+    >>> transformer = FunctionTransformer(np.log1p)
+    >>> X = np.array([[0, 1], [2, 3]])
     >>> transformer.transform(X)
     array([[ 0.        ,  0.69314718],
            [ 1.09861229,  1.38629436]])
 
 For a full code example that demonstrates using a :class:`FunctionTransformer`
-to do column selection,
+to do custom feature selection,
 see :ref:`example_preprocessing_plot_function_transformer.py`
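As a quick sanity check on the documented example (my sketch, not part of the commit itself), the snippet below runs `FunctionTransformer(np.log1p)` standalone and then inside a `Pipeline`; the `StandardScaler` step is an illustrative assumption, not something this diff adds:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler

# log1p(x) = log(1 + x): maps 0 -> 0, so it is safe on count/frequency
# data containing zeros (unlike np.log, which warns and yields -inf).
transformer = FunctionTransformer(np.log1p)
X = np.array([[0, 1], [2, 3]])
Xt = transformer.transform(X)
# Matches the doctest output in the diff above:
# [[0.         0.69314718]
#  [1.09861229 1.38629436]]

# FunctionTransformer is a regular transformer, so it composes in a
# Pipeline (the StandardScaler step is my addition, not from the diff).
pipe = Pipeline([("log", FunctionTransformer(np.log1p)),
                 ("scale", StandardScaler())])
Xs = pipe.fit_transform(X)
```

The point of the commit's `log` → `log1p` switch is visible here: the same example now works on data containing zeros without emitting warnings.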

doc/whats_new.rst

Lines changed: 4 additions & 0 deletions
@@ -29,6 +29,10 @@ New features
    range normalization when the data is already centered or sparse.
    By `Thomas Unterthiner`_.
 
+   - The new class :class:`preprocessing.FunctionTransformer` turns a Python
+     function into a ``Pipeline``-compatible transformer object.
+     By Joe Jevnik.
+
 Enhancements
 ............
 
sklearn/preprocessing/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@
 normalization, binarization and imputation methods.
 """
 
-from .function_transformer import FunctionTransformer
+from ._function_transformer import FunctionTransformer
 
 from .data import Binarizer
 from .data import KernelCenterer
File renamed without changes: sklearn/preprocessing/function_transformer.py → sklearn/preprocessing/_function_transformer.py

sklearn/preprocessing/tests/test_function_transformer.py

Lines changed: 3 additions & 3 deletions
@@ -1,7 +1,7 @@
 from nose.tools import assert_equal
 import numpy as np
 
-from ..function_transformer import FunctionTransformer
+from sklearn.preprocessing import FunctionTransformer
 
 
 def _make_func(args_store, kwargs_store, func=lambda X, *a, **k: X):
@@ -78,6 +78,6 @@ def test_np_log():
 
     # Test that the numpy.log example still works.
     np.testing.assert_array_equal(
-        FunctionTransformer(np.log).transform(X),
-        np.log(X),
+        FunctionTransformer(np.log1p).transform(X),
+        np.log1p(X),
     )
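As an aside (my illustration, not part of the diff), the warning the commit message mentions can be demonstrated directly: `np.log` emits a RuntimeWarning on zero while `np.log1p` does not, which is why both the doc example data and this test switched functions:

```python
import warnings
import numpy as np

X = np.array([[0.0, 1.0], [2.0, 3.0]])

# np.log(0) is -inf and emits "divide by zero encountered in log".
with warnings.catch_warnings(record=True) as caught_log:
    warnings.simplefilter("always")
    np.log(X)

# np.log1p(0) is exactly 0.0: no warning on non-negative inputs.
with warnings.catch_warnings(record=True) as caught_log1p:
    warnings.simplefilter("always")
    np.log1p(X)

# caught_log holds a RuntimeWarning; caught_log1p stays empty.
```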
