
Commit 52c7703: "fixed some typos"
1 parent: befe560

File tree: 1 file changed, +15 -11 lines


basic_Pytorch_introduction_NeuralNetworks.py

Lines changed: 15 additions & 11 deletions
@@ -1129,9 +1129,9 @@ def forward(self, input):
     print(acc_train)

 #%%
-# now we are ready to see how finte-tuning works in Pytorch
-# there are several models in github models repository that we can use
-# lets choose one but before lets see what we have at our disposal
+# OK we are ready to see how fine-tuning works in Pytorch
+# there are several models in models repository that we can use
+# lets choose one but before that lets see what we have at our disposal
 from torchvision import models
 import torch.nn as nn

@@ -1142,19 +1142,23 @@ def forward(self, input):
 # lets print the model
 print(f'\nORIGINAL MODEL : \n{resnet18}\n')

-# by looking at the architecure, we notice
+# by looking at the architecure, we notice :
 # (fc): Linear(in_features=512, out_features=1000, bias=True)
-# this means in order to make retrain this network for our usecase
-# lets train this for cifar10 which has 10 classes.
+# In order to retrain this network for our usecase
+# we need to alter this layer. this was trained on imagenet
+# which had 1000 classes. lets train this for cifar10 which
+# has 10 classes. all we need to do is just defining a new
+# fully connected (fc) layer and assigning it back to
+# resnet18.fc attribute!
 resnet18.fc = nn.Linear(512, 10)
-# instead of hardcoding the 512 which we saw from the printed version of
-# our model. we can simply use the in_features attribute of the fc layer!
-# and write :
+# instead of hardcoding the 512 which we saw by looking at the
+# printed version of our model, we can simply use the
+# 'in_features' attribute of the fc layer! and write :
 # resnet18.fc = nn.Linear(resnet18.fc.in_features, 10)

 print(f'\nNEW MODEL(after adding the new fc layer): \n{resnet18}')
-# now before we dive in to train our net we should frst
-# freeze all layers but this new one, and train for several epochs,
+# now before we dive in to train our network we should first
+# freeze all layers except this new one, and train for several epochs,
 # so that it converges to a reasonable set of weights
 # then we unfreeze all previous layers and train the whole net
 # altogether again.

0 commit comments
