plot_feature_importance
Plot a model's feature importance. The importances are normalized so that they can be compared between models. Only available for models whose estimator has a feature_importances_ or coef_ attribute. The trainer's feature_importance attribute is updated with the extracted importance ranking.
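
For intuition, the sketch below shows one way such importances could be extracted from a fitted scikit-learn estimator and scaled for cross-model comparison. The helper name normalized_importances and the max-scaling scheme are illustrative assumptions, not ATOM's actual implementation.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

def normalized_importances(estimator):
    # Hypothetical helper (not part of ATOM): extract importances and scale them to [0, 1].
    if hasattr(estimator, "feature_importances_"):
        scores = np.asarray(estimator.feature_importances_, dtype=float)
    elif hasattr(estimator, "coef_"):
        coef = np.atleast_2d(np.abs(estimator.coef_))
        scores = coef.mean(axis=0)  # average coefficient magnitude across classes/outputs
    else:
        raise ValueError("estimator has no feature_importances_ or coef_ attribute")
    return scores / scores.max()  # assumed max-scaling so values are comparable across models

X, y = load_breast_cancer(return_X_y=True)
rf = RandomForestClassifier(random_state=1).fit(X, y)
print(normalized_importances(rf)[:5])
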
Parameters:
models: str, sequence or None, optional (default=None)
show: int, optional (default=None)
title: str or None, optional (default=None)
figsize: tuple or None, optional (default=None)
filename: str or None, optional (default=None)
display: bool or None, optional (default=True)
Returns:
fig: matplotlib.figure.Figure
Plot object. Only returned if display=None.
Example
from atom import ATOMClassifier
from sklearn.datasets import load_breast_cancer

# Load a sample dataset so the example is self-contained
X, y = load_breast_cancer(return_X_y=True)

atom = ATOMClassifier(X, y)
atom.run(["LR", "RF"], metric="recall_weighted")
atom.RF.plot_feature_importance(show=11, filename="random_forest_importance")
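
Because the figure is only returned when display=None, it can also be captured and adjusted further with matplotlib. A minimal sketch, reusing the atom instance from the example above; the added title and file name are illustrative:

fig = atom.RF.plot_feature_importance(show=11, display=None)  # fig is returned because display=None
fig.suptitle("Random forest feature importance")
fig.savefig("random_forest_importance.png", dpi=300)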