HistGradientBoostingClassifier feature importance

Hi, I would like to know if feature importance is available for HistGradientBoostingClassifier, say for one of the exercises in module 3. Thanks in advance, Tsvika.

We decided not to implement impurity-based feature importance in HistGradientBoosting because it is known to suffer from some bias. Instead, you can use the permutation_importance function. You can check an example that explains the bias and shows the usage of permutation_importance: Permutation Importance vs Random Forest Feature Importance (MDI) — scikit-learn 1.2.0 documentation

That example uses a RandomForest because both the MDI and permutation approaches can be applied to the same model, which makes the bias easy to illustrate. The permutation approach itself works with any fitted estimator, including HistGradientBoostingClassifier. A minimal sketch of the idea is below.
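For anyone landing here later, here is a minimal sketch of that workflow, assuming a fitted HistGradientBoostingClassifier and a held-out test set. The dataset and the parameter values are illustrative placeholders, not taken from the module exercise.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Illustrative dataset; any (X, y) with named columns works the same way.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = HistGradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature on the held-out set and measure the drop in score;
# a large drop means the model relies on that feature.
result = permutation_importance(
    model, X_test, y_test, n_repeats=10, random_state=0, n_jobs=2
)

# Report the five features with the largest mean score drop.
for idx in result.importances_mean.argsort()[::-1][:5]:
    print(
        f"{X.columns[idx]:<25} "
        f"{result.importances_mean[idx]:.3f} +/- {result.importances_std[idx]:.3f}"
    )
```

Computing the importances on a held-out set (rather than the training set) avoids giving inflated scores to features the model has merely overfit on.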


@glemaitre Thanks for your informative answer. I will follow your instructions and use permutation importance instead of feature importance. Let’s see if I can do it without an explicit example :slight_smile: Tsvika

@glemaitre Thanks, it works great with HGBC as you said :+1: