plot_hyperparameter_importance
method plot_hyperparameter_importance(models=None, metric=0, show=None, title=None, legend=None, figsize=None, filename=None, display=True)[source]
Plot a model's hyperparameter importance.
The hyperparameter importances are calculated using the fANOVA importance evaluator. The importances of all hyperparameters sum to 1 per model. This plot is only available for models that ran hyperparameter tuning.
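The fANOVA evaluator comes from Optuna, the library ATOM uses for hyperparameter tuning. As a rough illustration, similar importances can be computed directly from an Optuna study; the objective and search space below are made up for the example, and only get_param_importances and FanovaImportanceEvaluator are actual Optuna APIs.

>>> # Sketch: compute fANOVA importances directly from an Optuna study.
>>> import optuna
>>> from optuna.importance import FanovaImportanceEvaluator
>>> def objective(trial):  # hypothetical two-parameter objective
...     n_estimators = trial.suggest_int("n_estimators", 10, 500)
...     max_depth = trial.suggest_int("max_depth", 1, 20)
...     return 0.001 * n_estimators + 0.01 * max_depth
>>> study = optuna.create_study(direction="maximize")
>>> study.optimize(objective, n_trials=10)
>>> # The returned importances are normalized to sum to 1.
>>> optuna.importance.get_param_importances(study, evaluator=FanovaImportanceEvaluator())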
Parameters
models: int, str, Model, segment, sequence or None, default=None
Models to plot. If None, all models that used hyperparameter
tuning are selected.
metric: int or str, default=0
Metric to plot (only for multi-metric runs).
show: int or None, default=None
Number of hyperparameters (ordered by importance) to show.
None to show all.
title: str, dict or None, default=None
Title for the plot.
legend: str, dict or None, default=None
Legend for the plot. See the user guide for
an extended description of the choices.
figsize: tuple or None, default=None
Figure's size in pixels, format as (x, y). If None, it
adapts the size to the number of hyperparameters shown.
filename: str, Path or None, default=None
Save the plot using this name. Use "auto" for automatic
naming. The type of the file depends on the provided name
(.html, .png, .pdf, etc...). If filename has no file type,
the plot is saved as html. If None, the plot is not saved.
display: bool or None, default=True
Whether to render the plot. If None, it returns the figure.
Returns
go.Figure or None
Plot object. Only returned if display=None.
See Also
Plot a model's feature importance.
Plot hyperparameter relationships in a study.
Plot the hyperparameter tuning trials.
Example
>>> from atom import ATOMClassifier
>>> from sklearn.datasets import load_breast_cancer
>>> X, y = load_breast_cancer(return_X_y=True, as_frame=True)
>>> atom = ATOMClassifier(X, y, random_state=1)
>>> atom.run(["ET", "RF"], n_trials=10)
>>> atom.plot_hyperparameter_importance()
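To limit the plot to the most important hyperparameters, save it under an automatic name, and get the figure object back instead of rendering it, the parameters documented above can be combined (the values here are illustrative):

>>> fig = atom.plot_hyperparameter_importance(show=5, filename="auto", display=None)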