
plot_feature_importance


method plot_feature_importance(models=None, show=None, title=None, figsize=None, filename=None, display=True) [source]

Plot a model's feature importance. The importances are normalized so they can be compared between models. The feature_importance attribute is updated with the extracted importance ranking. Only available for models whose estimator has a feature_importances_ attribute.

Parameters:

models: str, sequence or None, optional (default=None)
Name of the models to plot. If None, all the models in the pipeline are selected.

show: int, optional (default=None)
Number of features (ordered by importance) to show. None to show all.

title: str or None, optional (default=None)
Plot's title. If None, the title is left empty.

figsize: tuple or None, optional (default=None)
Figure's size in the format (x, y). If None, the size adapts to the number of features shown.

filename: str or None, optional (default=None)
Name of the file. Use "auto" for automatic naming. If None, the figure is not saved.

display: bool or None, optional (default=True)
Whether to render the plot. If None, it returns the matplotlib figure.

Returns: fig: matplotlib.figure.Figure
Plot object. Only returned if display=None.
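
As a minimal sketch of the display=None behavior described above (the dataset and the model choice are only illustrative), the returned figure can be adjusted or saved with plain matplotlib:

from atom import ATOMClassifier
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset

atom = ATOMClassifier(X, y)
atom.run(["RF"])

# display=None returns the matplotlib figure instead of rendering it
fig = atom.RF.plot_feature_importance(show=10, display=None)
fig.suptitle("Random Forest feature importance")
fig.savefig("rf_importance.png")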


Example

from atom import ATOMClassifier
from sklearn.datasets import load_breast_cancer

# Illustrative dataset; any feature matrix X and target y work here
X, y = load_breast_cancer(return_X_y=True)

atom = ATOMClassifier(X, y)
atom.run(["LR", "RF"], metric="recall_weighted")
atom.RF.plot_feature_importance(show=11, filename="random_forest_importance")
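
Continuing the example above, the ranking extracted during the call is available afterwards through the feature_importance attribute mentioned in the description:

# Importance ranking extracted from the plot
print(atom.RF.feature_importance)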