Run 10418621

Task 3022 (Supervised Classification) on dataset vowel. Uploaded 22-11-2019 by Jan van Rijn.
0 likes, downloaded by 0 people (0 total downloads), 0 issues, 0 downvotes.


Flow

sklearn.pipeline.Pipeline(columntransformer=sklearn.compose._column_transformer.ColumnTransformer(numeric=sklearn.pipeline.Pipeline(imputer=sklearn.preprocessing.imputation.Imputer,standardscaler=sklearn.preprocessing.data.StandardScaler),nominal=sklearn.pipeline.Pipeline(simpleimputer=sklearn.impute._base.SimpleImputer,onehotencoder=sklearn.preprocessing._encoders.OneHotEncoder)),variancethreshold=sklearn.feature_selection.variance_threshold.VarianceThreshold,mlpclassifier=sklearn.neural_network.multilayer_perceptron.MLPClassifier) (1)

Pipeline of transforms with a final estimator. Sequentially apply a list of transforms and a final estimator. Intermediate steps of the pipeline must be 'transforms', that is, they must implement fit and transform methods. The final estimator only needs to implement fit. The transformers in the pipeline can be cached using the ``memory`` argument. The purpose of the pipeline is to assemble several steps that can be cross-validated together while setting different parameters. For this, it enables setting parameters of the various steps using their names and the parameter name separated by a '__', as in the example below. A step's estimator may be replaced entirely by setting the parameter with its name to another estimator, or a transformer removed by setting it to 'passthrough' or ``None``.
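
For orientation, a rough scikit-learn equivalent of this flow is sketched below. It is an illustrative reconstruction rather than the serialized flow itself: the numeric branch in the original uses the deprecated sklearn.preprocessing.Imputer (approximated here with SimpleImputer), and the column indices for the two branches are taken from the ColumnTransformer settings listed further down.

    from sklearn.pipeline import Pipeline
    from sklearn.compose import ColumnTransformer
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import StandardScaler, OneHotEncoder
    from sklearn.feature_selection import VarianceThreshold
    from sklearn.neural_network import MLPClassifier

    # Numeric branch: mean imputation followed by standardization.
    # (The original flow uses the old sklearn.preprocessing.Imputer here;
    #  SimpleImputer(strategy="mean") is a modern stand-in.)
    numeric = Pipeline([
        ("imputer", SimpleImputer(strategy="mean")),
        ("standardscaler", StandardScaler()),
    ])

    # Nominal branch: constant-value imputation followed by one-hot encoding.
    nominal = Pipeline([
        ("simpleimputer", SimpleImputer(strategy="constant", fill_value=-1)),
        ("onehotencoder", OneHotEncoder(handle_unknown="ignore")),
    ])

    # Column indices come from the ColumnTransformer parameters below:
    # numeric features at positions 2-11, nominal features at positions 0-1.
    preprocessing = ColumnTransformer(
        transformers=[
            ("numeric", numeric, [2, 3, 4, 5, 6, 7, 8, 9, 10, 11]),
            ("nominal", nominal, [0, 1]),
        ],
        remainder="passthrough",
        sparse_threshold=0.3,
    )

    clf = Pipeline([
        ("columntransformer", preprocessing),
        ("variancethreshold", VarianceThreshold(threshold=0.0)),
        ("mlpclassifier", MLPClassifier(hidden_layer_sizes=(100,), random_state=0)),
    ])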
Parameter settings

sklearn.impute._base.SimpleImputer (10)
  add_indicator: false
  copy: true
  fill_value: -1
  missing_values: NaN
  strategy: "constant"
  verbose: 0

sklearn.compose._column_transformer.ColumnTransformer (3)
  n_jobs: null
  remainder: "passthrough"
  sparse_threshold: 0.3
  transformer_weights: null
  transformers: numeric -> columns [2, 3, 4, 5, 6, 7, 8, 9, 10, 11]; nominal -> columns [0, 1]
  verbose: false

sklearn.pipeline.Pipeline (8), numeric branch (imputer, standardscaler)
  memory: null
  steps: imputer, standardscaler
  verbose: false

sklearn.preprocessing.imputation.Imputer (50)
  axis: 0
  copy: true
  missing_values: "NaN"
  strategy: "mean"
  verbose: 0

sklearn.preprocessing.data.StandardScaler (36)
  copy: true
  with_mean: true
  with_std: true

sklearn.pipeline.Pipeline (4), nominal branch (simpleimputer, onehotencoder)
  memory: null
  steps: simpleimputer, onehotencoder
  verbose: false

sklearn.preprocessing._encoders.OneHotEncoder (17)
  categorical_features: null
  categories: null
  drop: null
  dtype: np.float64
  handle_unknown: "ignore"
  n_values: null
  sparse: true

sklearn.feature_selection.variance_threshold.VarianceThreshold (28)
  threshold: 0.0

sklearn.pipeline.Pipeline (1), top-level pipeline (columntransformer, variancethreshold, mlpclassifier)
  memory: null
  steps: columntransformer, variancethreshold, mlpclassifier
  verbose: false

sklearn.neural_network.multilayer_perceptron.MLPClassifier (17)
  activation: "relu"
  alpha: 0.0001
  batch_size: "auto"
  beta_1: 0.9
  beta_2: 0.999
  early_stopping: false
  epsilon: 1e-08
  hidden_layer_sizes: [100]
  learning_rate: "constant"
  learning_rate_init: 0.001
  max_iter: 200
  momentum: 0.9
  n_iter_no_change: 10
  nesterovs_momentum: true
  power_t: 0.5
  random_state: 0
  shuffle: true
  solver: "adam"
  tol: 0.0001
  validation_fraction: 0.1
  verbose: false
  warm_start: false
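
The run, its flow, and the exact parameter settings listed above can also be retrieved programmatically with the openml-python client. A minimal sketch, assuming a recent openml-python release (attribute names as in its documentation):

    import openml

    # Fetch this run, the flow it instantiates, and the underlying task.
    run = openml.runs.get_run(10418621)
    flow = openml.flows.get_flow(run.flow_id)
    task = openml.tasks.get_task(run.task_id)  # task 3022, dataset vowel

    print(flow.name)               # the full pipeline name shown above
    print(run.parameter_settings)  # the parameter values used in this run
    print(run.evaluations)         # server-side measures, cf. the list further down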

Result files

Description (xml): XML file describing the run, including user-defined evaluation measures.

Predictions (arff): ARFF file with instance-level predictions generated by the model.
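
The predictions file can be inspected locally once downloaded. A minimal sketch using SciPy's ARFF reader; the local filename and the 'prediction'/'correct' column names are assumptions, so check the file's header first:

    import numpy as np
    from scipy.io import arff

    # Load the downloaded predictions file (local filename is hypothetical).
    data, meta = arff.loadarff("predictions.arff")
    print(meta.names())  # repeat, fold, row_id, per-class confidences, prediction, correct, ...

    # Recompute overall accuracy, assuming the predicted and true labels
    # are stored in columns named 'prediction' and 'correct'.
    acc = np.mean(data["prediction"] == data["correct"])
    print(f"accuracy over all folds: {acc:.4f}")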

17 Evaluation measures (10-fold Crossvalidation)

area_under_roc_curve: 0.9993 ± 0.0007 (per class)
f_measure: 0.9747 ± 0.0187 (per class)
kappa: 0.9722 ± 0.0205
kb_relative_information_score: 0.9355 ± 0.0144
mean_absolute_error: 0.0223 ± 0.0035
mean_prior_absolute_error: 0.1653
number_of_instances: 990 (per class)
precision: 0.9751 ± 0.0166 (per class)
predictive_accuracy: 0.9747 ± 0.0186
prior_entropy: 3.4594
recall: 0.9747 ± 0.0186 (per class)
relative_absolute_error: 0.1352 ± 0.0211
root_mean_prior_squared_error: 0.2875
root_mean_squared_error: 0.0756 ± 0.0147
root_relative_squared_error: 0.2628 ± 0.051
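
Several of these numbers are tied together by the task itself: vowel has 990 instances spread evenly over 11 classes, so the prior-based baselines and kappa follow from the reported accuracy. A quick sanity check, assuming the usual definitions of these measures:

    import math

    n_classes = 11      # vowel: 990 instances, 11 equally frequent classes
    accuracy = 0.9747   # predictive_accuracy reported above

    # Entropy of the uniform class prior.
    prior_entropy = math.log2(n_classes)                          # ~ 3.4594

    # Cohen's kappa with chance agreement 1/11 (balanced problem).
    kappa = (accuracy - 1 / n_classes) / (1 - 1 / n_classes)      # ~ 0.9722

    # Errors of always predicting the uniform prior probabilities.
    mean_prior_absolute_error = 2 * (n_classes - 1) / n_classes**2        # ~ 0.1653
    root_mean_prior_squared_error = math.sqrt(n_classes - 1) / n_classes  # ~ 0.2875

    # Relative errors: the model's errors divided by those baselines.
    relative_absolute_error = 0.0223 / mean_prior_absolute_error           # ~ 0.135
    root_relative_squared_error = 0.0756 / root_mean_prior_squared_error   # ~ 0.263

    print(prior_entropy, kappa, relative_absolute_error, root_relative_squared_error)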