When building the model, XGBClassifier was given objective='reg:linear', yet the fitted model reports objective='multi:softprob'; see the code below. The reason is that 'reg:linear' is a regression objective: when fit() sees more than two classes in the labels, the scikit-learn wrapper overrides whatever objective was passed with 'multi:softprob'. The warning after fitting has a related cause: num_boosting_rounds is not a recognized XGBClassifier parameter (the wrapper's equivalent is n_estimators), and silent was superseded by verbosity, so both are silently ignored.
import xgboost as xgb

model = xgb.XGBClassifier(max_depth=5, learning_rate=0.1,
                          colsample_bytree=0.7, subsample=0.7,
                          silent=True, objective='reg:linear',
                          num_boosting_rounds=10)
model.fit(x_train, y_train, eval_metric='rmse')
The output becomes:
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
colsample_bynode=1, colsample_bytree=0.7, gamma=0, gpu_id=-1,
importance_type='gain', interaction_constraints='',
learning_rate=0.1, max_delta_step=0, max_depth=5,
min_child_weight=1, missing=nan, monotone_constraints='()',
n_estimators=100, n_jobs=0, num_boosting_rounds=10,
num_parallel_tree=1, objective='multi:softprob', random_state=0,
reg_alpha=0, reg_lambda=1, scale_pos_weight=None, silent=True,
subsample=0.7, tree_method='exact', validate_parameters=1,
verbosity=None)
[16:26:15] WARNING: /workspace/src/learner.cc:480:
Parameters: { num_boosting_rounds, silent } might not be used.
This may not be accurate due to some parameters are only used in language bindings but
passed down to XGBoost core. Or some parameters are not used but slip through this
verification. Please open an issue if you find above cases.