Are there rules of thumb for XGBoost's hyperparameter selection?
There are multiple parameters that need to be specified in the XGBClassifier. GridSearchCV could certainly give some insight into optimal hyperparameters, but I would imagine there are rules of thumb for choosing reasonable values up front. For example, for a training set of ~200,000 examples with ~1,000 features, is it possible to specify reasonable values for n_estimators, learning_rate, and max_depth from this information alone?
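For concreteness, this is the kind of setup I have in mind (a minimal sketch using the scikit-learn wrapper; the toy data and the specific values are placeholders I picked for illustration, not tuned choices, and the data shape is scaled down from the ~200k x 1k in my actual problem to keep the example light):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Stand-in data; my real training set is ~200,000 rows x ~1,000 features.
X, y = make_classification(n_samples=20_000, n_features=100, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = XGBClassifier(
    n_estimators=1_000,        # upper bound; early stopping trims it
    learning_rate=0.1,         # commonly suggested starting range is 0.01-0.3
    max_depth=6,               # the library default; 3-10 is a common search range
    subsample=0.8,             # row subsampling for regularization
    colsample_bytree=0.8,      # column subsampling, relevant with many features
    early_stopping_rounds=50,  # constructor argument in xgboost >= 1.6
    eval_metric="logloss",
)
clf.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)
print(f"Best iteration (effective n_estimators): {clf.best_iteration}")
```

In other words: is a configuration like this defensible as a starting point given only the dataset size, or does every one of these values really need to come out of a grid search?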
Tags: xgboost, classification, machine-learning
Category: Data Science