Model evaluation metrics for video object tracking problems. Evaluates prediction quality of both labeled bounding boxes and labeled tracks (i.e., series of bounding boxes that share the same label and instance ID).
boundingBoxMetrics[]
The bounding box match metrics for each pair of intersection-over-union threshold (0.05, 0.10, ..., 0.95, 0.96, 0.97, 0.98, 0.99) and label confidence threshold (0.05, 0.10, ..., 0.95, 0.96, 0.97, 0.98, 0.99).
trackMetrics[]
UNIMPLEMENTED. The track match metrics for each pair of intersection-over-union threshold (0.05, 0.10, ..., 0.95, 0.96, 0.97, 0.98, 0.99) and label confidence threshold (0.05, 0.10, ..., 0.95, 0.96, 0.97, 0.98, 0.99).
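The metrics above are reported once per (IoU threshold, confidence threshold) pair. A minimal sketch of that grid, assuming the thresholds run in 0.05 steps up to 0.95 and then continue at 0.96 through 0.99 as listed (the variable names are illustrative, not part of the API):

```python
# IoU thresholds: 0.05, 0.10, ..., 0.95, then 0.96, 0.97, 0.98, 0.99
iou_thresholds = [round(0.05 * i, 2) for i in range(1, 20)] + [0.96, 0.97, 0.98, 0.99]

# The label confidence thresholds use the same values.
confidence_thresholds = list(iou_thresholds)

# Every (IoU, confidence) combination is one evaluation point,
# i.e. one entry in the match metrics.
threshold_pairs = [(iou, conf)
                   for iou in iou_thresholds
                   for conf in confidence_thresholds]
```

With 23 values per axis this yields 529 threshold pairs, which is why the match metrics are a list rather than a single value.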
evaluatedFrameCount (integer)
UNIMPLEMENTED. The number of video frames used to create this evaluation.
evaluatedBoundingBoxCount (integer)
UNIMPLEMENTED. The total number of bounding boxes in the ground truth used to create this evaluation (i.e., summed over all frames).
evaluatedTrackCount (integer)
UNIMPLEMENTED. The total number of tracks in the ground truth used to create this evaluation (i.e., as seen across all frames).
boundingBoxMeanAveragePrecision (number)
The single metric for bounding box evaluation: the meanAveragePrecision averaged over all boundingBoxMetrics entries.
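The averaging described above can be sketched as follows. The field names mirror the JSON representation of this message; the helper function itself is illustrative, not part of the API:

```python
def bounding_box_mean_average_precision(bounding_box_metrics):
    """Average meanAveragePrecision across all boundingBoxMetrics entries.

    `bounding_box_metrics` is expected to be a list of dicts, one per
    (IoU threshold, confidence threshold) pair, each carrying a
    "meanAveragePrecision" value as in the JSON representation.
    """
    values = [m["meanAveragePrecision"] for m in bounding_box_metrics]
    return sum(values) / len(values)

# Hypothetical two-entry example; real responses contain one entry
# per threshold pair.
summary = bounding_box_mean_average_precision([
    {"meanAveragePrecision": 0.6},
    {"meanAveragePrecision": 0.8},
])
```

The UNIMPLEMENTED track summaries below (trackMeanAveragePrecision, trackMeanBoundingBoxIou, trackMeanMismatchRate) are described as the same kind of average taken over trackMetrics instead.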
trackMeanAveragePrecision (number)
UNIMPLEMENTED. The single metric for track accuracy evaluation: the meanAveragePrecision averaged over all trackMetrics entries.
trackMeanBoundingBoxIou (number)
UNIMPLEMENTED. The single metric for track bounding box IoU evaluation: the meanBoundingBoxIou averaged over all trackMetrics entries.
trackMeanMismatchRate (number)
UNIMPLEMENTED. The single metric for tracking consistency evaluation: the meanMismatchRate averaged over all trackMetrics entries.
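The message does not define meanMismatchRate here. A common MOT-style reading of "mismatch" is an identity switch: the predicted instance ID matched to a ground-truth track changes between consecutive frames. The sketch below implements that assumed definition for a single track; it is an illustration, not the API's specification:

```python
def track_mismatch_rate(matched_ids):
    """Fraction of frame-to-frame transitions where the predicted
    instance ID matched to one ground-truth track changes.

    `matched_ids` lists, frame by frame, the predicted instance ID
    matched to the track. This MOT-style definition is an assumption
    about what meanMismatchRate measures.
    """
    if len(matched_ids) < 2:
        return 0.0
    switches = sum(1 for a, b in zip(matched_ids, matched_ids[1:]) if a != b)
    return switches / (len(matched_ids) - 1)

# One identity switch (1 -> 2) across three transitions.
rate = track_mismatch_rate([1, 1, 2, 2])
```

Under this reading, trackMeanMismatchRate would then average such per-track rates over all trackMetrics entries.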
| JSON representation |
|---|
{ "boundingBoxMetrics": [ { object ( |
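The JSON snippet above is truncated, but the scalar fields described in this section can be read from a parsed response. A minimal sketch, using an invented sample payload (the values are illustrative only):

```python
import json

# Hypothetical response fragment carrying the summary fields defined above.
payload = json.loads("""
{
  "evaluatedFrameCount": 120,
  "evaluatedBoundingBoxCount": 480,
  "evaluatedTrackCount": 12,
  "boundingBoxMeanAveragePrecision": 0.71
}
""")

# The single bounding box summary metric for this evaluation.
box_map = payload["boundingBoxMeanAveragePrecision"]
```

The UNIMPLEMENTED fields (evaluatedFrameCount, the track metrics, and so on) may be absent from real responses, so production code should treat them as optional, e.g. via `payload.get(...)`.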