Flow
HROCH.classifier.NonlinearLogisticRegressor

Visibility: public. Uploaded 02-02-2024 by Jano P.
Dependencies: sklearn==1.3.2, numpy>=1.17.3, scipy>=1.5.0, joblib>=1.1.1, threadpoolctl>=2.0.0
Runs: 0
Tags: openml-python, python, scikit-learn, sklearn, sklearn_1.3.2


Nonlinear Logistic Regressor

Parameters

algo_settings (default: null)
If not defined, SymbolicSolver.ALGO_SETTINGS is used.
```python
algo_settings = {'neighbours_count': 15, 'alpha': 0.15, 'beta': 0.5, 'pretest_size': 1, 'sample_size': 16}
```
- 'neighbours_count' : (int) Number of tested neighbours in each iteration
- 'alpha' : (float) Score-worsening limit for an iteration
- 'beta' : (float) Tree breadth-wise expansion factor in the range 0 to 1
- 'pretest_size' : (int) Batch count (a batch is a 64-row sample) for fast fitness pre-evaluation
- 'sample_size' : (int) Number of batches of the sample used to calculate the score during training

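As an illustration, the sketch below overrides one of these search settings while keeping the documented values for the rest; the new value is arbitrary, and since it is not stated here whether partial dicts are merged with the defaults, the full dict is given.

```python
# Sketch: the documented defaults with a larger neighbourhood search.
algo_settings = {
    'neighbours_count': 30,  # changed from the default of 15 (illustrative)
    'alpha': 0.15,           # score-worsening limit per iteration
    'beta': 0.5,             # breadth-wise tree expansion factor, 0..1
    'pretest_size': 1,       # 64-row batches for fast fitness pre-evaluation
    'sample_size': 16,       # batches used to compute the score during training
}
# e.g. NonlinearLogisticRegressor(algo_settings=algo_settings)
```
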
class_weight (default: null)
Weights associated with classes in the form ``{class_label: weight}``. If not given, all classes are supposed to have weight one. The "balanced" mode uses the values of y to automatically adjust weights inversely proportional to class frequencies in the input data as ``n_samples / (n_classes * np.bincount(y))``. Note that these weights will be multiplied with sample_weight (passed through the fit method) if sample_weight is specified.

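For reference, a short NumPy sketch of the "balanced" weighting formula quoted above, using made-up labels:

```python
import numpy as np

# Hypothetical labels for illustration: 6 samples, 2 classes.
y = np.array([0, 0, 0, 0, 1, 1])

n_samples = len(y)
n_classes = len(np.unique(y))

# The "balanced" heuristic from the description above.
weights = n_samples / (n_classes * np.bincount(y))
print(weights)  # [0.75 1.5 ] -> the minority class gets the larger weight

# The same weights expressed in the {class_label: weight} form this parameter accepts.
class_weight = {label: float(w) for label, w in enumerate(weights)}
```
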
code_settings (default: null)
If not defined, SymbolicSolver.CODE_SETTINGS is used.
```python
code_settings = {'min_size': 32, 'max_size': 32, 'const_size': 8}
```
- 'const_size' : (int) Maximum allowed constants in the symbolic model; 0 is also accepted
- 'min_size' : (int) Minimum allowed equation size (as a linear program)
- 'max_size' : (int) Maximum allowed equation size (as a linear program)

const_settings (default: null)
If not defined, SymbolicSolver.CONST_SETTINGS is used.
```python
const_settings = {'const_min': -LARGE_FLOAT, 'const_max': LARGE_FLOAT, 'predefined_const_prob': 0.0, 'predefined_const_set': []}
```
- 'const_min' : (float) Lower range for constants used in equations
- 'const_max' : (float) Upper range for constants used in equations
- 'predefined_const_prob' : (float) Probability of selecting one of the predefined constants during the search process (mutation)
- 'predefined_const_set' : (array of floats) Predefined constants used during the search process (mutation)

cv_params (default: null)
If not defined, SymbolicSolver.CLASSIFICATION_CV_PARAMS is used.
```python
cv_params = {'n': 0, 'cv_params': {}, 'select': 'mean', 'opt_params': {'method': 'Nelder-Mead'}, 'opt_metric': make_scorer(log_loss, greater_is_better=False, needs_proba=True)}
```
- 'n' : (int) Cross-validate the n top models
- 'cv_params' : (dict) Parameters passed to the scikit-learn cross_validate method
- 'select' : (str) Best model selection method; choose from 'mean' or 'median'
- 'opt_params' : (dict) Parameters passed to the scipy.optimize.minimize method
- 'opt_metric' : (make_scorer) Scoring method

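As a sketch, the default 'opt_metric' shown above can be rebuilt with scikit-learn's make_scorer; the other values below ('n', 'cv_params', 'select') are illustrative overrides rather than the documented defaults.

```python
# Sketch only: rebuilding the default 'opt_metric' and overriding a few
# cross-validation settings. The keys mirror the dict in the description above.
from sklearn.metrics import make_scorer, log_loss

cv_params = {
    'n': 4,                                   # cross-validate the 4 top models (illustrative)
    'cv_params': {'cv': 3},                   # forwarded to sklearn cross_validate
    'select': 'median',                       # pick the best model by median CV score
    'opt_params': {'method': 'Nelder-Mead'},  # forwarded to scipy.optimize.minimize
    'opt_metric': make_scorer(log_loss, greater_is_better=False, needs_proba=True),
}
```
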
feature_probs (default: null)
The probability that a mutation will select a feature.

init_const_settings (default: null)
If not defined, SymbolicSolver.INIT_CONST_SETTINGS is used.
```python
init_const_settings = {'const_min': -1.0, 'const_max': 1.0, 'predefined_const_prob': 0.0, 'predefined_const_set': []}
```
- 'const_min' : (float) Lower range for initializing constants
- 'const_max' : (float) Upper range for initializing constants
- 'predefined_const_prob' : (float) Probability of selecting one of the predefined constants during initialization
- 'predefined_const_set' : (array of floats) Predefined constants used during initialization

iter_limit (default: 0)
Iteration limit. If set to 0, there is no limit and the algorithm runs until time_limit is met.

metric (default: "LogLoss")
Metric used for evaluating error. Choose from {'MSE', 'MAE', 'MSLE', 'LogLoss'}.

num_threads (default: 1)
Number of threads used.

population_settings (default: null)
If not defined, SymbolicSolver.POPULATION_SETTINGS is used.
```python
population_settings = {'size': 64, 'tournament': 4}
```
- 'size' : (int) Number of individuals in the population
- 'tournament' : (int) Tournament selection

precision (default: "f32")
'f64' or 'f32'. Internal floating-point number representation.

problem (default: "math")
Predefined instruction set 'math', 'simple', or 'fuzzy', or a custom set of instructions with mutation probabilities.
```python
problem = {'add': 10.0, 'mul': 10.0, 'gt': 1.0, 'lt': 1.0, 'nop': 1.0}
```

| **supported instructions** | |
|-|-|
| **math** | add, sub, mul, div, pdiv, inv, minv, sq2, pow, exp, log, sqrt, cbrt, aq |
| **goniometric** | sin, cos, tan, asin, acos, atan, sinh, cosh, tanh |
| **other** | nop, max, min, abs, floor, ceil, lt, gt, lte, gte |
| **fuzzy** | f_and, f_or, f_xor, f_impl, f_not, f_nand, f_nor, f_nxor, f_nimpl |

*nop - no operation*
*pdiv - protected division*
*inv - inverse* $(-x)$
*minv - multiplicative inverse* $(1/x)$
*lt, gt, lte, gte -* $<, >, \le, \ge$

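Below is a hypothetical custom instruction set mixing arithmetic and fuzzy operators from the table above; the probability weights are purely illustrative.

```python
# Hypothetical custom instruction set: basic arithmetic plus a few fuzzy-logic
# operators from the table above, weighted by mutation probability.
fuzzy_math = {
    'add': 10.0, 'sub': 10.0, 'mul': 5.0, 'pdiv': 5.0,  # pdiv = protected division
    'f_and': 2.0, 'f_or': 2.0, 'f_not': 1.0,
    'nop': 1.0,                                         # no operation
}
# Passed as the `problem` argument instead of one of the predefined sets,
# e.g. NonlinearLogisticRegressor(problem=fuzzy_math).
```
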
random_state (default: 0)
Random generator seed. If 0, the random generator is initialized by the system time.

target_clip (default: null)
Array of two float values, clip_min and clip_max. If not defined, SymbolicSolver.CLASSIFICATION_TARGET_CLIP is used.
```python
target_clip = [3e-7, 1.0 - 3e-7]
```

time_limit (default: 5.0)
Timeout in seconds. If set to 0, there is no limit and the algorithm runs until iter_limit is met.

transformation (default: "LOGISTIC")
Final transformation for the computed value. Choose from {None, 'LOGISTIC', 'ORDINAL'}.

verbose (default: 0)
Controls the verbosity when fitting and predicting.

warm_start (default: false)

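A minimal end-to-end sketch, assuming the import path follows the flow name (HROCH.classifier.NonlinearLogisticRegressor) and that the estimator exposes the usual scikit-learn fit/predict interface implied by the flow's sklearn dependency; all parameter values are illustrative.

```python
# Minimal usage sketch (assumptions noted above).
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from HROCH.classifier import NonlinearLogisticRegressor  # path assumed from the flow name

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = NonlinearLogisticRegressor(
    time_limit=5.0,     # stop after 5 seconds (the documented default)
    iter_limit=0,       # no iteration limit; time_limit terminates the search
    metric='LogLoss',   # error metric, as documented above
    problem='math',     # predefined instruction set
    num_threads=1,
    random_state=42,
)
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```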
