Is there any way to extract the actual feature importance coefficients/scores from the following code snippet (as opposed to the top num_feats features)?
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
rfe_selector = RFE(estimator=LogisticRegression(), n_features_to_select=num_feats, step=10, verbose=5)
rfe_selector.fit(X_norm, y)
If your goal is to extract the feature importances for the estimator that was fit on the final, reduced feature set, you can access that estimator through the estimator_ attribute and read its coefficients or importance scores from it:
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

rfe_selector = RFE(estimator=LogisticRegression(), n_features_to_select=num_feats, step=10, verbose=5)
rfe_selector.fit(X_norm, y)

# estimator_ is the LogisticRegression refit on only the num_feats selected features;
# for binary classification coef_ has shape (1, num_feats), hence the [0]
coefs = rfe_selector.estimator_.coef_[0]
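Note that estimator_.coef_ only contains one entry per surviving feature, so it helps to pair the coefficients with the columns RFE kept. A minimal sketch, assuming X_norm is a pandas DataFrame (for a plain NumPy array, use rfe_selector.get_support(indices=True) to get the column indices instead):

selected_cols = X_norm.columns[rfe_selector.get_support()]  # names of the features RFE kept
for name, coef in zip(selected_cols, coefs):
    print(name, coef)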
Whether you need coef_ or feature_importances_ depends on the estimator you use: linear models such as LogisticRegression expose coef_, while tree-based models expose feature_importances_.
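For example, with a tree-based estimator you would read feature_importances_ instead (a minimal sketch, reusing the X_norm, y, and num_feats from above):

from sklearn.feature_selection import RFE
from sklearn.ensemble import RandomForestClassifier

rfe_selector = RFE(estimator=RandomForestClassifier(), n_features_to_select=num_feats, step=10)
rfe_selector.fit(X_norm, y)
importances = rfe_selector.estimator_.feature_importances_  # one importance score per selected feature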