
Commit 14f99e6

Update colab now that TF r2.3 is stable
PiperOrigin-RevId: 324875735
1 parent 62b4e46 commit 14f99e6

File tree: 1 file changed (+31, -45 lines)

retrain_classification_ptq_tf2.ipynb

Lines changed: 31 additions & 45 deletions
@@ -13,7 +13,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "cellView": "both",
 "colab": {},
@@ -52,15 +52,15 @@
 "id": "TaX0smDP7xQY"
 },
 "source": [
-"In this tutorial, we'll use TensorFlow 2.3 to create an image classification model, train it with a flowers dataset, and convert it to TensorFlow Lite using post-training quantization. Finally, we compile it for compatibility with the Edge TPU (available in [Coral devices](https://coral.ai/products/)).\n",
+"In this tutorial, we'll use TensorFlow 2 to create an image classification model, train it with a flowers dataset, and convert it to TensorFlow Lite using post-training quantization. Finally, we compile it for compatibility with the Edge TPU (available in [Coral devices](https://coral.ai/products/)).\n",
 "\n",
 "The model is based on a pre-trained version of MobileNet V2. We'll start by retraining only the classification layers, reusing MobileNet's pre-trained feature extractor layers. Then we'll fine-tune the model by updating weights in some of the feature extractor layers. This type of transfer learning is much faster than training the entire model from scratch.\n",
 "\n",
 "Once it's trained, we'll use post-training quantization to convert all parameters to int8 format, which reduces the model size and increases inferencing speed. This format is also required for compatibility on the Edge TPU.\n",
 "\n",
 "For more information about how to create a model compatible with the Edge TPU, see the [documentation at coral.ai](https://coral.ai/docs/edgetpu/models-intro/).\n",
 "\n",
-"**Note:** This tutorial requires TensorFlow 2.3+ and depends on an early release version of the `TFliteConverter` for full quantization, which currently does not work for all types of models. In particular, this tutorial expects a Keras-built model and this conversion strategy currently doesn't work with models imported from a frozen graph. (If you're using TF 1.x, see [the 1.x version of this tutorial](https://colab.research.google.com/github/google-coral/tutorials/blob/master/retrain_classification_ptq_tf1.ipynb).)"
+"**Note:** This tutorial requires TensorFlow 2.3+ for full quantization, which currently does not work for all types of models. In particular, this tutorial expects a Keras-built model and this conversion strategy currently doesn't work with models imported from a frozen graph. (If you're using TF 1.x, see [the 1.x version of this tutorial](https://colab.research.google.com/github/google-coral/tutorials/blob/master/retrain_classification_ptq_tf1.ipynb).)"
 ]
 },
 {
@@ -102,26 +102,12 @@
 "id": "02MxhCyFmpzn"
 },
 "source": [
-"**Note:** Until TensorFlow 2.3 is released as stable, we need to install the nightly build in order to use the latest `TFLiteConverter` that supports quantization for input and output tensors:"
+"In order to quantize both the input and output tensors, we need `TFLiteConverter` APIs that are available in TensorFlow r2.3 or higher:"
 ]
 },
 {
 "cell_type": "code",
-"execution_count": 0,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "L-YbcBDDmaxO"
-},
-"outputs": [],
-"source": [
-"! pip uninstall -y tensorflow\n",
-"! pip install tf-nightly"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -161,7 +147,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -190,7 +176,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -231,7 +217,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -255,7 +241,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -273,7 +259,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -313,7 +299,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -344,7 +330,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -375,7 +361,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -400,7 +386,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -423,7 +409,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -460,7 +446,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -487,7 +473,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -567,7 +553,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -590,7 +576,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -620,7 +606,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -635,7 +621,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -648,7 +634,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -681,7 +667,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -708,7 +694,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -780,7 +766,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -811,7 +797,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -873,7 +859,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -907,7 +893,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -989,7 +975,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -1018,7 +1004,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
@@ -1063,7 +1049,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 0,
+"execution_count": null,
 "metadata": {
 "colab": {},
 "colab_type": "code",
