Flow
xgboost.sklearn.XGBClassifier

Visibility: public | Uploaded 02-04-2020 by Antonio Barata | Dependencies: sklearn==0.22.2.post1, numpy>=1.6.1, scipy>=0.9 | 5 runs
  • openml-python python scikit-learn sklearn sklearn_0.22.2.post1


Implementation of the scikit-learn API for XGBoost classification.

Parameters

base_score: The initial prediction score of all instances, global bias. (default: null)
booster: Specify which booster to use: gbtree, gblinear or dart. (default: null)
colsample_bylevel: Subsample ratio of columns for each level. (default: null)
colsample_bynode: Subsample ratio of columns for each split. (default: null)
colsample_bytree: Subsample ratio of columns when constructing each tree. (default: null)
gamma: Minimum loss reduction required to make a further partition on a leaf node of the tree. (default: null)
gpu_id: (default: null)
importance_type: The feature importance type for the feature_importances_ property: either "gain", "weight", "cover", "total_gain" or "total_cover". (default: "gain")
interaction_constraints: Constraints for interaction representing permitted interactions. The constraints must be specified in the form of a nested list, e.g. [[0, 1], [2, 3, 4]], where each inner list is a group of indices of features that are allowed to interact with each other. See tutorial for more information. (default: null)
learning_rate: Boosting learning rate (xgb's "eta"). (default: null)
max_delta_step: Maximum delta step we allow each tree's weight estimation to be. (default: null)
max_depth: Maximum tree depth for base learners. (default: null)
min_child_weight: Minimum sum of instance weight (hessian) needed in a child. (default: null)
missing: Value in the data which needs to be present as a missing value. If None, defaults to np.nan. (default: NaN)
monotone_constraints: Constraint of variable monotonicity. See tutorial for more information. (default: null)
n_estimators: (default: 100)
n_jobs: Number of parallel threads used to run xgboost. (default: null)
num_parallel_tree: Used for boosting random forest. (default: null)
objective: Specify the learning task and the corresponding learning objective or a custom objective function to be used. (default: "binary:logistic")
random_state: Random number seed. Note: using the gblinear booster with the shotgun updater is nondeterministic, as it uses the Hogwild algorithm. (default: null)
reg_alpha: L1 regularization term on weights. (default: null)
reg_lambda: L2 regularization term on weights. (default: null)
scale_pos_weight: Balancing of positive and negative weights. (default: null)
subsample: Subsample ratio of the training instances. (default: null)
tree_method: Specify which tree method to use. Defaults to auto. If this parameter is set to default, XGBoost will choose the most conservative option available. It is recommended to study this option in the parameters document. (default: null)
validate_parameters: (default: false)
verbosity: The degree of verbosity. Valid values are 0 (silent) to 3 (debug). (default: null)
