If you want to run the model on an edge device like a Raspberry Pi, or on a smartphone, it's a good idea to convert your model to the Tensorflow Lite format. This can be done with the ```export_tflite_ssd_graph.py``` file.

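The exact call depends on your training setup; a typical invocation might look like the following, where the config path and the checkpoint number are placeholders for your own training run:

```bash
# Config path and checkpoint number are placeholders; point them at your own
# pipeline config and the latest checkpoint from training.
python export_tflite_ssd_graph.py \
    --pipeline_config_path=training/ssd_mobilenet_v2.config \
    --trained_checkpoint_prefix=training/model.ckpt-XXXX \
    --output_directory=inference_graph \
    --add_postprocessing_op=true
```
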
After executing the command, there should be two new files in the inference_graph folder: a tflite_graph.pb and a tflite_graph.pbtxt file.

Now you have a graph architecture and network operations that are compatible with Tensorflow Lite. To finish the conversion, you now need to convert the actual model.

### 9. Using TOCO to Create an Optimized TensorFlow Lite Model

To convert the frozen graph to Tensorflow Lite we need to run it through the Tensorflow Lite Optimizing Converter (TOCO). TOCO converts the model into an optimized FlatBuffer format that runs efficiently on Tensorflow Lite.

For this to work, you need to have built Tensorflow from source. This is a tedious task which I won't cover in this tutorial, but you can follow the [official installation guide](https://www.tensorflow.org/install/source_windows). I'd recommend you create an [Anaconda Environment](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html) specifically for this purpose.

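For example, a dedicated build environment could be set up like this (the environment name and Python version are just placeholders):

```bash
# Create and activate a fresh environment for building Tensorflow
# (name and Python version are examples, pick what fits your setup)
conda create -n tf-build python=3.6
conda activate tf-build
```
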
After building Tensorflow from source, you're ready to start with the conversion.

#### 9.1 Create Tensorflow Lite model

To create an optimized Tensorflow Lite model, we need to run TOCO. TOCO is located in the tensorflow/lite directory, which you should have after installing Tensorflow from source.

If you want to convert a quantized model, you can run the following command:

```bash
# The flags below follow the official TOCO instructions linked underneath;
# they assume the tflite_graph.pb from the previous step sits in $OUTPUT_DIR.
# Adjust input_shapes if your model expects a different input size.
export OUTPUT_DIR=/tmp/tflite
bazel run --config=opt tensorflow/lite/toco:toco -- \
--input_file=$OUTPUT_DIR/tflite_graph.pb \
--output_file=$OUTPUT_DIR/detect.tflite \
--input_shapes=1,300,300,3 \
--input_arrays=normalized_input_image_tensor \
--output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
--inference_type=QUANTIZED_UINT8 \
--mean_values=128 \
--std_values=128 \
--change_concat_input_ranges=false \
--allow_custom_ops
```

If you are working on Windows, you might need to remove the ' characters if the command doesn't work. For more information on how to use TOCO, check out [the official instructions](https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/running_on_mobile_tensorflowlite.md).

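If your model is not quantized, the linked guide uses the same command with float inference settings instead; a sketch:

```bash
# Float variant (based on the official instructions linked above)
bazel run --config=opt tensorflow/lite/toco:toco -- \
--input_file=$OUTPUT_DIR/tflite_graph.pb \
--output_file=$OUTPUT_DIR/detect.tflite \
--input_shapes=1,300,300,3 \
--input_arrays=normalized_input_image_tensor \
--output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
--inference_type=FLOAT \
--allow_custom_ops
```
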
#### 9.2 Create new labelmap for Tensorflow Lite

Next you need to create a labelmap for Tensorflow Lite, since it doesn't use the same format as a classical Tensorflow labelmap.

Tensorflow labelmap:

```bash
item {
    name: "a"
    id: 1
    display_name: "a"
}
item {
    name: "b"
    id: 2
    display_name: "b"
}
item {
    name: "c"
    id: 3
    display_name: "c"
}
```

The Tensorflow Lite labelmap format only contains the display_names (if there is no display_name, the name is used instead).

```bash
a
b
c
```

So basically the only thing you need to do is create a new labelmap file and copy the display_names (names) from the other labelmap file into it.

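With only a few classes this is quickly done by hand; for larger labelmaps, a one-liner like the following can extract the names (a sketch, assuming your Tensorflow labelmap is saved as labelmap.pbtxt):

```bash
# Pull every display_name out of the Tensorflow labelmap into a plain text
# file (both file names are placeholders)
grep 'display_name' labelmap.pbtxt | sed -E 's/.*display_name: "(.*)"/\1/' > labelmap.txt
```
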
### 10. Using the model for inference

After training, the model can be used in many ways. For examples of how to use it, check out my other repositories.

* [Inference with Tensorflow 1.x](https://github.com/TannerGilbert/Tutorials/tree/master/Tensorflow%20Object%20Detection)