
plot_threshold


method plot_threshold(models=None, metric=None, rows="test", target=0, steps=100, title=None, legend="lower left", figsize=(900, 600), filename=None, display=True)

Plot metric performances against threshold values.

This plot is available only for models with a predict_proba method in a binary or multilabel classification task.
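
Conceptually, the curve sweeps a grid of probability thresholds, binarizes the predicted probabilities at each one, and recomputes the metric. The following is a minimal sketch of that idea using plain scikit-learn, meant only to illustrate the computation, not ATOM's internal implementation:

>>> import numpy as np
>>> from sklearn.datasets import make_classification
>>> from sklearn.linear_model import LogisticRegression
>>> from sklearn.metrics import f1_score

>>> X, y = make_classification(n_samples=1000, flip_y=0.2, random_state=1)
>>> clf = LogisticRegression().fit(X, y)
>>> proba = clf.predict_proba(X)[:, 1]  # probability of the positive class

>>> # Evaluate the metric at evenly spaced thresholds (cf. the steps parameter).
>>> thresholds = np.linspace(0, 1, 100)
>>> scores = [f1_score(y, (proba >= t).astype(int)) for t in thresholds]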

Parameters

models: int, str, Model, segment, sequence or None, default=None
Models to plot. If None, all models are selected.

metric: str, func, scorer, sequence or None, default=None
Metric to plot. Choose from any of sklearn's scorers, a function with signature metric(y_true, y_pred, **kwargs) or a scorer object. Use a sequence or add + between options to select more than one. If None, the metric used to run the pipeline is selected.

rows: hashable, segment, sequence or dataframe, default="test"
Selection of rows on which to calculate the metric.

target: int or str, default=0
Target column to look at. Only for multilabel tasks.

steps: int, default=100
Number of thresholds measured.

title: str, dict or None, default=None
Title for the plot.

legend: str, dict or None, default="lower left"
Legend for the plot. See the user guide for an extended description of the choices.

  • If None: No legend is shown.
  • If str: Position to display the legend.
  • If dict: Legend configuration.

figsize: tuple, default=(900, 600)
Figure's size in pixels, format as (x, y).

filename: str, Path or None, default=None
Save the plot using this name. Use "auto" for automatic naming. The type of the file depends on the provided name (.html, .png, .pdf, etc.). If filename has no file type, the plot is saved as html. If None, the plot is not saved.

display: bool or None, default=True
Whether to render the plot. If None, it returns the figure.

Returns

go.Figure or None
Plot object. Only returned if display=None.
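
Because the figure is returned when display=None, it can be adjusted or saved afterwards. A short sketch, assuming a fitted atom instance like the one in the Example below and that the go.Figure return type is a plotly figure:

>>> fig = atom.plot_threshold(display=None)  # returns the go.Figure object
>>> fig.update_layout(title_text="Metric vs. threshold")  # tweak with plotly
>>> fig.write_html("threshold.html")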


See Also

plot_calibration

Plot the calibration curve for a binary classifier.

plot_confusion_matrix

Plot a model's confusion matrix.

plot_probabilities

Plot the probability distribution of the target classes.


Example

>>> from atom import ATOMClassifier
>>> from sklearn.datasets import make_classification

>>> X, y = make_classification(n_samples=1000, flip_y=0.2, random_state=1)

>>> atom = ATOMClassifier(X, y, random_state=1)
>>> atom.run(["LR", "RF"])
>>> atom.plot_threshold()
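
The selection syntax described in the parameters above allows more targeted calls, e.g., plotting several metrics at once or a single model on the training set. A hypothetical continuation of the example, based on the parameter descriptions:

>>> atom.plot_threshold(metric=["f1", "recall"], rows="train")
>>> atom.plot_threshold(models="LR", metric="f1+accuracy", filename="auto")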