|  | 
| 13 | 13 |  }, | 
| 14 | 14 |  { | 
| 15 | 15 |  "cell_type": "code", | 
| 16 |  | - "execution_count": 0, | 
|  | 16 | + "execution_count": null, | 
| 17 | 17 |  "metadata": { | 
| 18 | 18 |  "cellView": "both", | 
| 19 | 19 |  "colab": {}, | 
|  | 
| 52 | 52 |  "id": "TaX0smDP7xQY" | 
| 53 | 53 |  }, | 
| 54 | 54 |  "source": [ | 
| 55 |  | - "In this tutorial, we'll use TensorFlow 2.3 to create an image classification model, train it with a flowers dataset, and convert it to TensorFlow Lite using post-training quantization. Finally, we compile it for compatibility with the Edge TPU (available in [Coral devices](https://coral.ai/products/)).\n", | 
|  | 55 | + "In this tutorial, we'll use TensorFlow 2 to create an image classification model, train it with a flowers dataset, and convert it to TensorFlow Lite using post-training quantization. Finally, we compile it for compatibility with the Edge TPU (available in [Coral devices](https://coral.ai/products/)).\n", | 
| 56 | 56 |  "\n", | 
| 57 | 57 |  "The model is based on a pre-trained version of MobileNet V2. We'll start by retraining only the classification layers, reusing MobileNet's pre-trained feature extractor layers. Then we'll fine-tune the model by updating weights in some of the feature extractor layers. This type of transfer learning is much faster than training the entire model from scratch.\n", | 
| 58 | 58 |  "\n", | 
| 59 | 59 |  "Once it's trained, we'll use post-training quantization to convert all parameters to int8 format, which reduces the model size and increases inferencing speed. This format is also required for compatibility on the Edge TPU.\n", | 
| 60 | 60 |  "\n", | 
| 61 | 61 |  "For more information about how to create a model compatible with the Edge TPU, see the [documentation at coral.ai](https://coral.ai/docs/edgetpu/models-intro/).\n", | 
| 62 | 62 |  "\n", | 
| 63 |  | - "**Note:** This tutorial requires TensorFlow 2.3+ and depends on an early release version of the `TFliteConverter` for full quantization, which currently does not work for all types of models. In particular, this tutorial expects a Keras-built model and this conversion strategy currently doesn't work with models imported from a frozen graph. (If you're using TF 1.x, see [the 1.x version of this tutorial](https://colab.research.google.com/github/google-coral/tutorials/blob/master/retrain_classification_ptq_tf1.ipynb).)" | 
|  | 63 | + "**Note:** This tutorial requires TensorFlow 2.3+ for full quantization, which currently does not work for all types of models. In particular, this tutorial expects a Keras-built model and this conversion strategy currently doesn't work with models imported from a frozen graph. (If you're using TF 1.x, see [the 1.x version of this tutorial](https://colab.research.google.com/github/google-coral/tutorials/blob/master/retrain_classification_ptq_tf1.ipynb).)" | 
| 64 | 64 |  ] | 
| 65 | 65 |  }, | 
| 66 | 66 |  { | 
|  | 
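The retraining approach described in the notebook text above (a pre-trained MobileNet V2 feature extractor with a new classification head, trained first with the base frozen) can be sketched roughly as follows. The 224×224 input size, the five-class softmax head, and the optimizer and loss choices are illustrative assumptions here, not details taken from the notebook itself:

```python
import tensorflow as tf

IMAGE_SIZE = 224  # assumed MobileNet V2 input resolution

# Pre-trained feature extractor, frozen for the first training phase.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(IMAGE_SIZE, IMAGE_SIZE, 3),
    include_top=False,
    weights='imagenet')
base_model.trainable = False

# New classification head for the flowers dataset (5 classes assumed).
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

For the fine-tuning phase mentioned above, a common pattern is to set `base_model.trainable = True`, re-freeze all but the top few layers, and recompile with a much lower learning rate before training for a few more epochs.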
| 102 | 102 |  "id": "02MxhCyFmpzn" | 
| 103 | 103 |  }, | 
| 104 | 104 |  "source": [ | 
| 105 |  | - "**Note:** Until TensorFlow 2.3 is released as stable, we need to install the nightly build in order to use the latest `TFLiteConverter` that supports quantization for input and output tensors:" | 
|  | 105 | + "In order to quantize both the input and output tensors, we need `TFLiteConverter` APIs that are available in TensorFlow r2.3 or higher:" | 
| 106 | 106 |  ] | 
| 107 | 107 |  }, | 
| 108 | 108 |  { | 
| 109 | 109 |  "cell_type": "code", | 
| 110 |  | - "execution_count": 0, | 
| 111 |  | - "metadata": { | 
| 112 |  | - "colab": {}, | 
| 113 |  | - "colab_type": "code", | 
| 114 |  | - "id": "L-YbcBDDmaxO" | 
| 115 |  | - }, | 
| 116 |  | - "outputs": [], | 
| 117 |  | - "source": [ | 
| 118 |  | - "! pip uninstall -y tensorflow\n", | 
| 119 |  | - "! pip install tf-nightly" | 
| 120 |  | - ] | 
| 121 |  | - }, | 
| 122 |  | - { | 
| 123 |  | - "cell_type": "code", | 
| 124 |  | - "execution_count": 0, | 
|  | 110 | + "execution_count": null, | 
| 125 | 111 |  "metadata": { | 
| 126 | 112 |  "colab": {}, | 
| 127 | 113 |  "colab_type": "code", | 
|  | 
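The `TFLiteConverter` capability referred to in the hunk above is post-training full-integer quantization with quantized input and output tensors, available from TensorFlow 2.3 onward. A minimal sketch, assuming a trained Keras `model` and a `train_batches` dataset of image/label pairs (both hypothetical names used only for calibration here):

```python
import tensorflow as tf

def representative_data_gen():
    # Yield a few batches of real training images so the converter can
    # calibrate quantization ranges ('train_batches' is a hypothetical
    # tf.data.Dataset of (image, label) pairs).
    for images, _ in train_batches.take(100):
        yield [images]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Restrict to integer-only ops and quantize the input/output tensors
# themselves; these converter attributes are the TF 2.3+ APIs in question.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open('model_quant.tflite', 'wb') as f:
    f.write(tflite_model)
```

The resulting fully quantized .tflite file is what then gets passed to the Edge TPU Compiler (typically `edgetpu_compiler model_quant.tflite`) to produce a model that runs on Coral devices.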
| 161 | 147 |  }, | 
| 162 | 148 |  { | 
| 163 | 149 |  "cell_type": "code", | 
| 164 |  | - "execution_count": 0, | 
|  | 150 | + "execution_count": null, | 
| 165 | 151 |  "metadata": { | 
| 166 | 152 |  "colab": {}, | 
| 167 | 153 |  "colab_type": "code", | 
|  | 
| 190 | 176 |  }, | 
| 191 | 177 |  { | 
| 192 | 178 |  "cell_type": "code", | 
| 193 |  | - "execution_count": 0, | 
|  | 179 | + "execution_count": null, | 
| 194 | 180 |  "metadata": { | 
| 195 | 181 |  "colab": {}, | 
| 196 | 182 |  "colab_type": "code", | 
|  | 
| 231 | 217 |  }, | 
| 232 | 218 |  { | 
| 233 | 219 |  "cell_type": "code", | 
| 234 |  | - "execution_count": 0, | 
|  | 220 | + "execution_count": null, | 
| 235 | 221 |  "metadata": { | 
| 236 | 222 |  "colab": {}, | 
| 237 | 223 |  "colab_type": "code", | 
|  | 
| 255 | 241 |  }, | 
| 256 | 242 |  { | 
| 257 | 243 |  "cell_type": "code", | 
| 258 |  | - "execution_count": 0, | 
|  | 244 | + "execution_count": null, | 
| 259 | 245 |  "metadata": { | 
| 260 | 246 |  "colab": {}, | 
| 261 | 247 |  "colab_type": "code", | 
|  | 
| 273 | 259 |  }, | 
| 274 | 260 |  { | 
| 275 | 261 |  "cell_type": "code", | 
| 276 |  | - "execution_count": 0, | 
|  | 262 | + "execution_count": null, | 
| 277 | 263 |  "metadata": { | 
| 278 | 264 |  "colab": {}, | 
| 279 | 265 |  "colab_type": "code", | 
|  | 
| 313 | 299 |  }, | 
| 314 | 300 |  { | 
| 315 | 301 |  "cell_type": "code", | 
| 316 |  | - "execution_count": 0, | 
|  | 302 | + "execution_count": null, | 
| 317 | 303 |  "metadata": { | 
| 318 | 304 |  "colab": {}, | 
| 319 | 305 |  "colab_type": "code", | 
|  | 
| 344 | 330 |  }, | 
| 345 | 331 |  { | 
| 346 | 332 |  "cell_type": "code", | 
| 347 |  | - "execution_count": 0, | 
|  | 333 | + "execution_count": null, | 
| 348 | 334 |  "metadata": { | 
| 349 | 335 |  "colab": {}, | 
| 350 | 336 |  "colab_type": "code", | 
|  | 
| 375 | 361 |  }, | 
| 376 | 362 |  { | 
| 377 | 363 |  "cell_type": "code", | 
| 378 |  | - "execution_count": 0, | 
|  | 364 | + "execution_count": null, | 
| 379 | 365 |  "metadata": { | 
| 380 | 366 |  "colab": {}, | 
| 381 | 367 |  "colab_type": "code", | 
|  | 
| 400 | 386 |  }, | 
| 401 | 387 |  { | 
| 402 | 388 |  "cell_type": "code", | 
| 403 |  | - "execution_count": 0, | 
|  | 389 | + "execution_count": null, | 
| 404 | 390 |  "metadata": { | 
| 405 | 391 |  "colab": {}, | 
| 406 | 392 |  "colab_type": "code", | 
|  | 
| 423 | 409 |  }, | 
| 424 | 410 |  { | 
| 425 | 411 |  "cell_type": "code", | 
| 426 |  | - "execution_count": 0, | 
|  | 412 | + "execution_count": null, | 
| 427 | 413 |  "metadata": { | 
| 428 | 414 |  "colab": {}, | 
| 429 | 415 |  "colab_type": "code", | 
|  | 
| 460 | 446 |  }, | 
| 461 | 447 |  { | 
| 462 | 448 |  "cell_type": "code", | 
| 463 |  | - "execution_count": 0, | 
|  | 449 | + "execution_count": null, | 
| 464 | 450 |  "metadata": { | 
| 465 | 451 |  "colab": {}, | 
| 466 | 452 |  "colab_type": "code", | 
|  | 
| 487 | 473 |  }, | 
| 488 | 474 |  { | 
| 489 | 475 |  "cell_type": "code", | 
| 490 |  | - "execution_count": 0, | 
|  | 476 | + "execution_count": null, | 
| 491 | 477 |  "metadata": { | 
| 492 | 478 |  "colab": {}, | 
| 493 | 479 |  "colab_type": "code", | 
|  | 
| 567 | 553 |  }, | 
| 568 | 554 |  { | 
| 569 | 555 |  "cell_type": "code", | 
| 570 |  | - "execution_count": 0, | 
|  | 556 | + "execution_count": null, | 
| 571 | 557 |  "metadata": { | 
| 572 | 558 |  "colab": {}, | 
| 573 | 559 |  "colab_type": "code", | 
|  | 
| 590 | 576 |  }, | 
| 591 | 577 |  { | 
| 592 | 578 |  "cell_type": "code", | 
| 593 |  | - "execution_count": 0, | 
|  | 579 | + "execution_count": null, | 
| 594 | 580 |  "metadata": { | 
| 595 | 581 |  "colab": {}, | 
| 596 | 582 |  "colab_type": "code", | 
|  | 
| 620 | 606 |  }, | 
| 621 | 607 |  { | 
| 622 | 608 |  "cell_type": "code", | 
| 623 |  | - "execution_count": 0, | 
|  | 609 | + "execution_count": null, | 
| 624 | 610 |  "metadata": { | 
| 625 | 611 |  "colab": {}, | 
| 626 | 612 |  "colab_type": "code", | 
|  | 
| 635 | 621 |  }, | 
| 636 | 622 |  { | 
| 637 | 623 |  "cell_type": "code", | 
| 638 |  | - "execution_count": 0, | 
|  | 624 | + "execution_count": null, | 
| 639 | 625 |  "metadata": { | 
| 640 | 626 |  "colab": {}, | 
| 641 | 627 |  "colab_type": "code", | 
|  | 
| 648 | 634 |  }, | 
| 649 | 635 |  { | 
| 650 | 636 |  "cell_type": "code", | 
| 651 |  | - "execution_count": 0, | 
|  | 637 | + "execution_count": null, | 
| 652 | 638 |  "metadata": { | 
| 653 | 639 |  "colab": {}, | 
| 654 | 640 |  "colab_type": "code", | 
|  | 
| 681 | 667 |  }, | 
| 682 | 668 |  { | 
| 683 | 669 |  "cell_type": "code", | 
| 684 |  | - "execution_count": 0, | 
|  | 670 | + "execution_count": null, | 
| 685 | 671 |  "metadata": { | 
| 686 | 672 |  "colab": {}, | 
| 687 | 673 |  "colab_type": "code", | 
|  | 
| 708 | 694 |  }, | 
| 709 | 695 |  { | 
| 710 | 696 |  "cell_type": "code", | 
| 711 |  | - "execution_count": 0, | 
|  | 697 | + "execution_count": null, | 
| 712 | 698 |  "metadata": { | 
| 713 | 699 |  "colab": {}, | 
| 714 | 700 |  "colab_type": "code", | 
|  | 
| 780 | 766 |  }, | 
| 781 | 767 |  { | 
| 782 | 768 |  "cell_type": "code", | 
| 783 |  | - "execution_count": 0, | 
|  | 769 | + "execution_count": null, | 
| 784 | 770 |  "metadata": { | 
| 785 | 771 |  "colab": {}, | 
| 786 | 772 |  "colab_type": "code", | 
|  | 
| 811 | 797 |  }, | 
| 812 | 798 |  { | 
| 813 | 799 |  "cell_type": "code", | 
| 814 |  | - "execution_count": 0, | 
|  | 800 | + "execution_count": null, | 
| 815 | 801 |  "metadata": { | 
| 816 | 802 |  "colab": {}, | 
| 817 | 803 |  "colab_type": "code", | 
|  | 
| 873 | 859 |  }, | 
| 874 | 860 |  { | 
| 875 | 861 |  "cell_type": "code", | 
| 876 |  | - "execution_count": 0, | 
|  | 862 | + "execution_count": null, | 
| 877 | 863 |  "metadata": { | 
| 878 | 864 |  "colab": {}, | 
| 879 | 865 |  "colab_type": "code", | 
|  | 
| 907 | 893 |  }, | 
| 908 | 894 |  { | 
| 909 | 895 |  "cell_type": "code", | 
| 910 |  | - "execution_count": 0, | 
|  | 896 | + "execution_count": null, | 
| 911 | 897 |  "metadata": { | 
| 912 | 898 |  "colab": {}, | 
| 913 | 899 |  "colab_type": "code", | 
|  | 
| 989 | 975 |  }, | 
| 990 | 976 |  { | 
| 991 | 977 |  "cell_type": "code", | 
| 992 |  | - "execution_count": 0, | 
|  | 978 | + "execution_count": null, | 
| 993 | 979 |  "metadata": { | 
| 994 | 980 |  "colab": {}, | 
| 995 | 981 |  "colab_type": "code", | 
|  | 
| 1018 | 1004 |  }, | 
| 1019 | 1005 |  { | 
| 1020 | 1006 |  "cell_type": "code", | 
| 1021 |  | - "execution_count": 0, | 
|  | 1007 | + "execution_count": null, | 
| 1022 | 1008 |  "metadata": { | 
| 1023 | 1009 |  "colab": {}, | 
| 1024 | 1010 |  "colab_type": "code", | 
|  | 
| 1063 | 1049 |  }, | 
| 1064 | 1050 |  { | 
| 1065 | 1051 |  "cell_type": "code", | 
| 1066 |  | - "execution_count": 0, | 
|  | 1052 | + "execution_count": null, | 
| 1067 | 1053 |  "metadata": { | 
| 1068 | 1054 |  "colab": {}, | 
| 1069 | 1055 |  "colab_type": "code", | 
|  | 