GitHub Repository: keras-team/keras-io
Path: blob/master/guides/keras_cv/coco_metrics.py
"""
Title: Using KerasCV COCO Metrics
Author: [lukewood](https://twitter.com/luke_wood_ml)
Date created: 2022/04/13
Last modified: 2022/04/13
Description: Use KerasCV COCO metrics to evaluate object detection models.
Accelerator: None
"""

"""
11
## Overview
12
13
With KerasCV's COCO metrics implementation, you can easily evaluate your object
14
detection model's performance all from within the TensorFlow graph. This guide
15
shows you how to use KerasCV's COCO metrics and integrate it into your own model
16
evaluation pipeline. Historically, users have evaluated COCO metrics as a post training
17
step. KerasCV offers an in graph implementation of COCO metrics, enabling users to
18
evaluate COCO metrics *during* training!
19
20
Let's get started using KerasCV's COCO metrics.
21
"""
22
"""
## Input format

All KerasCV components that process bounding boxes, including COCO metrics, require a
`bounding_box_format` parameter. This parameter tells the components what
format your bounding boxes are in. While this guide uses the `xyxy` format, a full
list of supported formats is available in
[the bounding_box API documentation](https://keras.io/api/keras_cv/bounding_box/formats/).

The metrics expect `y_true` to be a `float` Tensor with the shape `[batch,
num_images, num_boxes, 5]`, with the ordering of the last axis determined by the
provided format. The same is true of `y_pred`, except that an additional
`confidence` axis must be provided.

Because each image may have a different number of bounding boxes, the
`num_boxes` dimension may differ between images. KerasCV works around this by
allowing you to either pass a `RaggedTensor` as an input to the KerasCV COCO
metrics, or pad unused bounding boxes with `-1`.

Utility functions to manipulate bounding boxes, transform between formats, and
pad bounding box Tensors with `-1`s are available in the
[`keras_cv.bounding_box`](https://github.com/keras-team/keras-cv/blob/master/keras_cv/bounding_box)
package.
"""

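The `-1` padding workaround described above can be sketched with TensorFlow's standard `RaggedTensor.to_tensor` API (an illustrative sketch, not KerasCV-specific code):

```python
import tensorflow as tf

# Two images with different numbers of boxes, each box as [x1, y1, x2, y2, class].
boxes = tf.ragged.stack(
    [
        tf.constant([[0, 0, 10, 10, 1], [11, 12, 30, 30, 2]], tf.float32),
        tf.constant([[0, 0, 10, 10, 1]], tf.float32),
    ]
)

# Pad the ragged num_boxes dimension with -1 so both images share one dense shape.
dense = boxes.to_tensor(default_value=-1)
print(dense.shape)  # (2, 2, 5)
```

Either representation, the `RaggedTensor` itself or the `-1`-padded dense Tensor, can be fed to the metrics.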
"""
50
## Independent metric use
51
52
The usage first pattern for KerasCV COCO metrics is to manually call
53
`update_state()` and `result()` methods. This pattern is recommended for users
54
who want finer grained control of their metric evaluation, or want to use a
55
different format for `y_pred` in their model.
56
57
Let's run through a quick code example.
58
"""
59
60
"""
61
1.) First, we must construct our metric:
62
"""
63
64
import keras_cv
65
66
# import all modules we will need in this example
67
import tensorflow as tf
68
from tensorflow import keras
69
70
# only consider boxes with areas less than a 32x32 square.
71
metric = keras_cv.metrics.COCORecall(
72
bounding_box_format="xyxy", class_ids=[1, 2, 3], area_range=(0, 32**2)
73
)
74
75
"""
76
2.) Create Some Bounding Boxes:
77
"""
78
79
y_true = tf.ragged.stack(
80
[
81
# image 1
82
tf.constant([[0, 0, 10, 10, 1], [11, 12, 30, 30, 2]], tf.float32),
83
# image 2
84
tf.constant([[0, 0, 10, 10, 1]], tf.float32),
85
]
86
)
87
y_pred = tf.ragged.stack(
88
[
89
# predictions for image 1
90
tf.constant([[5, 5, 10, 10, 1, 0.9]], tf.float32),
91
# predictions for image 2
92
tf.constant([[0, 0, 10, 10, 1, 1.0], [5, 5, 10, 10, 1, 0.9]], tf.float32),
93
]
94
)
95
96
"""
97
3.) Update metric state:
98
"""
99
100
metric.update_state(y_true, y_pred)
101
102
"""
103
4.) Evaluate the result:
104
"""
105
106
metric.result()
107
108
"""
109
Evaluating COCORecall for your object detection model is as simple as that!
110
"""
111
112
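COCORecall follows the standard Keras streaming-metric protocol: repeated `update_state()` calls accumulate across batches, and `reset_state()` clears the accumulation between evaluation runs. A minimal sketch of that protocol, using the built-in `tf.keras.metrics.Mean` as a stand-in so the snippet runs without `keras_cv`:

```python
import tensorflow as tf

# Mean stands in for COCORecall here; the protocol is identical.
m = tf.keras.metrics.Mean()

# update_state() accumulates across calls, one call per batch.
m.update_state([1.0, 2.0])
m.update_state([3.0])
print(float(m.result()))  # 2.0

# reset_state() clears accumulated state before the next evaluation run.
m.reset_state()
print(float(m.result()))  # 0.0
```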
"""
113
## Metric use in a model
114
115
You can also leverage COCORecall in your model's training loop. Let's walk through this
116
process.
117
118
1.) Construct your the metric and a dummy model
119
"""
120
121
i = keras.layers.Input((None, 6))
122
model = keras.Model(i, i)
123
124
"""
125
2.) Create some fake bounding boxes:
126
"""
127
128
y_true = tf.constant([[[0, 0, 10, 10, 1], [5, 5, 10, 10, 1]]], tf.float32)
129
y_pred = tf.constant([[[0, 0, 10, 10, 1, 1.0], [5, 5, 10, 10, 1, 0.9]]], tf.float32)
130
131
"""
132
3.) Create the metric and compile the model
133
"""
134
135
recall = keras_cv.metrics.COCORecall(
136
bounding_box_format="xyxy",
137
max_detections=100,
138
class_ids=[1],
139
area_range=(0, 64**2),
140
name="coco_recall",
141
)
142
model.compile(metrics=[recall])
143
144
"""
145
4.) Use `model.evaluate()` to evaluate the metric
146
"""
147
148
model.evaluate(y_pred, y_true, return_dict=True)
149
150
"""
151
Looks great! That's all it takes to use KerasCV's COCO metrics to evaluate object
152
detection models.
153
"""
154
155
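Metrics passed to `compile()` are reported by `fit()` as well as `evaluate()`, which is what enables train-time COCO evaluation. The mechanism can be sketched with a built-in metric standing in for COCORecall (illustrative only; the guide's real code requires `keras_cv` at runtime):

```python
import tensorflow as tf
from tensorflow import keras

# Identity model, mirroring the dummy model above; MeanAbsoluteError stands in
# for COCORecall so the snippet runs without keras_cv.
i = keras.layers.Input((4,))
model = keras.Model(i, i)
model.compile(loss="mse", metrics=[keras.metrics.MeanAbsoluteError(name="mae")])

x = tf.ones((8, 4))
y = tf.zeros((8, 4))

# The identity model predicts x itself, so MAE against zeros is 1.0.
results = model.evaluate(x, y, return_dict=True, verbose=0)
print(results["mae"])  # 1.0
```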
"""
156
## Supported constructor parameters
157
158
KerasCV COCO Metrics are sufficiently parameterized to support all of the
159
permutations evaluated in the original COCO challenge, all metrics evaluated in
160
the accompanying `pycocotools` library, and more!
161
162
Check out the full documentation for [`COCORecall`](/api/keras_cv/metrics/coco_recall/) and
163
[`COCOMeanAveragePrecision`](/api/keras_cv/metrics/coco_mean_average_precision/).
164
165
166
## Conclusion & next steps
167
KerasCV makes it easier than ever before to evaluate a Keras object detection model.
168
Historically, users had to perform post training evaluation. With KerasCV, you can
169
perform train time evaluation to see how these metrics evolve over time!
170
171
As an additional exercise for readers, you can:
172
173
- Configure `iou_thresholds`, `max_detections`, and `area_range` to reproduce the suite
174
of metrics evaluted in `pycocotools`
175
- Integrate COCO metrics into a RetinaNet using the
176
[keras.io RetinaNet example](https://keras.io/examples/vision/retinanet/)
177
"""
178
179
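As context for the `iou_thresholds` exercise above: COCO metrics match predictions to ground truth by intersection-over-union (IoU). A minimal pure-Python IoU for two `xyxy` boxes (an illustrative sketch, not KerasCV's implementation):

```python
def iou_xyxy(a, b):
    """IoU of two boxes in [x1, y1, x2, y2] format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

# The prediction [5, 5, 10, 10] covers a quarter of the ground truth [0, 0, 10, 10].
print(iou_xyxy([0, 0, 10, 10], [5, 5, 10, 10]))  # 0.25
```

A prediction counts as a true positive only when its IoU with a ground-truth box meets the configured threshold, which is why sweeping `iou_thresholds` changes recall.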