Flow
sklearn.preprocessing._data.PowerTransformer

Visibility: public. Uploaded 25-03-2021 by Pieter Gijsbers. Dependencies: sklearn==0.23.1, numpy>=1.6.1, scipy>=0.9. 0 runs.
  • Tags: openml-python, python, scikit-learn, sklearn, sklearn_0.23.1


Apply a power transform featurewise to make data more Gaussian-like. Power transforms are a family of parametric, monotonic transformations applied to make data more Gaussian-like. This is useful for modeling issues related to heteroscedasticity (non-constant variance), or other situations where normality is desired.

Currently, PowerTransformer supports the Box-Cox transform and the Yeo-Johnson transform. The optimal parameter for stabilizing variance and minimizing skewness is estimated through maximum likelihood. Box-Cox requires the input data to be strictly positive, while Yeo-Johnson supports both positive and negative data. By default, zero-mean, unit-variance normalization is applied to the transformed data.
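As a sketch of the usage described above, the snippet below fits the transformer to synthetic right-skewed (log-normal) data; the data and seed are illustrative assumptions, while `PowerTransformer`, `fit_transform`, and the fitted `lambdas_` attribute are part of the scikit-learn API.

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

# Illustrative skewed, strictly positive data (log-normal samples)
rng = np.random.RandomState(0)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(1000, 1))

# Yeo-Johnson is the default method; standardize=True is also the default
pt = PowerTransformer(method="yeo-johnson", standardize=True)
X_trans = pt.fit_transform(X)

# The maximum-likelihood estimate of lambda, one value per feature
print(pt.lambdas_)

# With standardize=True the output is (approximately) zero-mean, unit-variance
print(X_trans.mean(), X_trans.std())
```

Because the transform is fitted per feature, the same fitted object can be applied to new data with `pt.transform(...)`, and inverted with `pt.inverse_transform(...)`.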

Parameters

copy — Set to False to perform inplace computation during transformation. default: true
method — The power transform method. Available methods are:
  • 'yeo-johnson' [1], works with positive and negative values
  • 'box-cox' [2], only works with strictly positive values
  default: "yeo-johnson"
standardize — Set to True to apply zero-mean, unit-variance normalization to the transformed output. default: true
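To illustrate the `method` parameter's constraint, here is a minimal sketch (the input arrays are made up for demonstration): 'box-cox' raises an error on non-positive input, while the default 'yeo-johnson' accepts it.

```python
import numpy as np
from sklearn.preprocessing import PowerTransformer

X_pos = np.array([[1.0], [2.0], [3.0]])       # strictly positive
X_mixed = np.array([[-1.0], [0.0], [2.0]])    # contains zero and a negative value

# 'box-cox' works on strictly positive data
PowerTransformer(method="box-cox").fit_transform(X_pos)

# ...but rejects non-positive values with a ValueError
try:
    PowerTransformer(method="box-cox").fit_transform(X_mixed)
except ValueError as exc:
    print("box-cox failed:", exc)

# 'yeo-johnson' (the default) handles the same data without error
PowerTransformer().fit_transform(X_mixed)
```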
