Configuration¶
This page describes how to configure the metrics module used to score model predictions.
Options¶
| Name | Allowed | Default | Description |
|---|---|---|---|
| `_target_` | | | The type of metrics to use. See Underlying libraries for more details. |
| `task` | | | Classification task type. Only used by classification metrics. |
| `num_classes` | | | Number of classes for `multiclass` tasks. |
| `num_labels` | | | Number of labels for `multilabel` tasks. |
| `average` | | | Aggregation mode. |
| `ignore_index` | | | Optional target value to ignore when computing classification metrics. |
| `threshold` | | 0.5 | Threshold applied by binary and multilabel classification metrics. |
| `box_format` | | | Bounding-box format expected by the detection metrics. |
| `iou_type` | | | IoU mode passed to detection mean average precision. |
| `iou_thresholds` | | | Optional custom IoU thresholds for detection evaluation. |
| `rec_thresholds` | | | Optional custom recall thresholds for detection evaluation. |
| `max_detection_thresholds` | | | Optional maximum-detection cutoffs for detection evaluation. |
| `class_metrics` | | | Whether per-class detection metrics are computed. |
| `extended_summary` | | | Whether an extended summary is included in the detection results. |
| `backend` | | | Backend used by detection mean average precision. |
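Taken together, the detection-side options could be combined as in the following sketch. Every concrete value here (the `DetectionMetrics` target name, the `xyxy` box format, the `pycocotools` backend) is an assumption for illustration, not a documented default; check the underlying library for the actual allowed values:

```yaml
metrics:
  _target_: "DetectionMetrics"     # assumed target name for detection metrics
  box_format: "xyxy"               # assumed value
  iou_type: "bbox"                 # assumed value
  iou_thresholds: null             # null = use the library defaults
  rec_thresholds: null
  max_detection_thresholds: null
  class_metrics: true
  extended_summary: false
  backend: "pycocotools"           # assumed backend name
```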
YAML example¶
```yaml
metrics:
  _target_: "ClassificationMetrics"
  task: "multiclass"
  num_classes: 7
  num_labels: null
  average: "macro"
  ignore_index: null
```
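The `average: "macro"` setting aggregates per-class scores with equal weight per class, regardless of class frequency. As a rough illustration, independent of any metrics library, here is macro-averaged recall computed by hand (the function name and data are illustrative only):

```python
from collections import defaultdict

def macro_recall(preds, targets, num_classes):
    """Unweighted mean of per-class recall: each class counts equally,
    no matter how many samples it has."""
    tp = defaultdict(int)  # true positives per class
    fn = defaultdict(int)  # false negatives per class
    for p, t in zip(preds, targets):
        if p == t:
            tp[t] += 1
        else:
            fn[t] += 1
    per_class = []
    for c in range(num_classes):
        support = tp[c] + fn[c]
        per_class.append(tp[c] / support if support else 0.0)
    return sum(per_class) / num_classes

preds   = [0, 0, 1, 2, 2, 2]
targets = [0, 1, 1, 2, 2, 0]
# Per-class recall: class 0 = 0.5, class 1 = 0.5, class 2 = 1.0
print(macro_recall(preds, targets, num_classes=3))  # prints 0.6666666666666666
```

A weighted average would instead scale each class's recall by its support, so frequent classes dominate; macro averaging surfaces poor performance on rare classes.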
CLI override example¶
```bash
# With uv:
uv run raitap metrics=detection metrics.class_metrics=true metrics.extended_summary=true

# Or, if raitap is installed on your PATH:
raitap metrics=detection metrics.class_metrics=true metrics.extended_summary=true
```
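Each `key=value` token overrides one (possibly nested) config entry, with dots selecting nested keys. A minimal sketch of how such dotted overrides map onto a nested config dict; this parsing is hypothetical and not raitap's actual implementation:

```python
def apply_override(config, override):
    """Apply a single 'a.b.c=value' override to a nested dict in place."""
    path, _, raw = override.partition("=")
    keys = path.split(".")
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    # Coerce a few common literals; everything else stays a string.
    value = {"true": True, "false": False, "null": None}.get(raw, raw)
    node[keys[-1]] = value

config = {"metrics": {"class_metrics": False, "extended_summary": False}}
for ov in ["metrics.class_metrics=true", "metrics.extended_summary=true"]:
    apply_override(config, ov)
print(config)  # {'metrics': {'class_metrics': True, 'extended_summary': True}}
```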