I'm comparing several regression models (mostly ensembles), including:
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor
from sklearn.ensemble import AdaBoostRegressor
from sklearn.ensemble import GradientBoostingRegressor
from xgboost import XGBRegressor
For XGBRegressor(), I can check the model parameters by:
xgb_model = XGBRegressor()
xgb_model.fit(X_train, y_train)
xgb_model
The result is:
XGBRegressor(base_score=0.5, booster='gbtree', colsample_bylevel=1,
colsample_bynode=1, colsample_bytree=1, enable_categorical=False,
gamma=0, gpu_id=-1, importance_type=None,
interaction_constraints='', learning_rate=0.300000012,
max_delta_step=0, max_depth=6, min_child_weight=1, missing=nan,
monotone_constraints='()', n_estimators=100, n_jobs=52,
num_parallel_tree=1, predictor='auto', random_state=0, reg_alpha=0,
reg_lambda=1, scale_pos_weight=1, subsample=1, tree_method='exact',
validate_parameters=1, verbosity=None)
However, for the other regressors I cannot see the model parameters; there is nothing inside the parentheses. The same happens with AdaBoostRegressor and GradientBoostingRegressor:
RF_model = RandomForestRegressor()
RF_model.fit(X_train, y_train)
RF_model
RandomForestRegressor()
My question is: how can I check the model parameters of these estimators?
Hi, you can use the get_params() method on these estimators:
RF_model = RandomForestRegressor()
RF_model.get_params()
This method returns all the parameters as a dict, like this (for RandomForestRegressor; the exact defaults depend on your scikit-learn version):
{'bootstrap': True,
 'ccp_alpha': 0.0,
 'criterion': 'squared_error',
 'max_depth': None,
 'max_features': 1.0,
 'max_leaf_nodes': None,
 'max_samples': None,
 'min_impurity_decrease': 0.0,
 'min_samples_leaf': 1,
 'min_samples_split': 2,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 100,
 'n_jobs': None,
 'oob_score': False,
 'random_state': None,
 'verbose': 0,
 'warm_start': False}
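As a side note, the empty parentheses are just scikit-learn's display behaviour: since version 0.23 the repr only prints parameters that differ from their defaults, which is why RandomForestRegressor() looks "empty" while XGBoost prints everything. If you want the full repr like XGBoost's, you can switch that off. A minimal sketch (the rf variable name is just for illustration):

from sklearn import set_config
from sklearn.ensemble import RandomForestRegressor

# Show every parameter in the repr instead of only the non-default ones
set_config(print_changed_only=False)
print(RandomForestRegressor())

# get_params() also works before or after fitting, and individual values
# can be read by key, e.g. the number of trees:
rf = RandomForestRegressor()
print(rf.get_params()['n_estimators'])  # 100

Note that set_config is global, so you can restore the compact display with set_config(print_changed_only=True) once you are done.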