Commit ff0c622

code update
1 parent 1ab1775 commit ff0c622

File tree

8 files changed: +648 −23 lines changed


05. Improvements to the RNN/.ipynb_checkpoints/5.02 Understanding the LSTM cell-checkpoint.ipynb

Lines changed: 14 additions & 3 deletions
@@ -36,7 +36,10 @@
 "## Forget Gate \n",
 "\n",
 "The forget gate $f_t$ is responsible for deciding what information should be removed from\n",
-"the cell state (memory). Consider the following sentences. Harry is a good singer. He lives in\n",
+"the cell state (memory). \n",
+"\n",
+"\n",
+"Consider the following sentences: Harry is a good singer. He lives in\n",
 "New York. Zayn is also a good singer.\n",
 "\n",
 "As soon as we start talking about Zayn, the network will understand that the subject has\n",
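The forget-gate behavior this hunk describes can be sketched in a few lines of NumPy. The weight names `W_f`, `U_f`, `b_f` and the dimensions below are illustrative assumptions, not taken from the notebook:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forget_gate(x_t, h_prev, W_f, U_f, b_f):
    # f_t = sigmoid(W_f x_t + U_f h_{t-1} + b_f); every entry lies in (0, 1),
    # so multiplying the old cell state by f_t scales down ("forgets") memory.
    return sigmoid(W_f @ x_t + U_f @ h_prev + b_f)

# Random toy weights just to show the shapes involved.
rng = np.random.default_rng(0)
d_in, d_hid = 4, 3
f_t = forget_gate(rng.normal(size=d_in), np.zeros(d_hid),
                  rng.normal(size=(d_hid, d_in)),
                  rng.normal(size=(d_hid, d_hid)),
                  np.zeros(d_hid))
```

Because the sigmoid keeps each gate value strictly between 0 and 1, a value near 0 erases that slot of the cell state (e.g. dropping "Harry" once "Zayn" becomes the subject), while a value near 1 keeps it.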
@@ -70,8 +73,12 @@
 "\n",
 "\n",
 "The input gate is responsible for deciding what information should be stored in the cell\n",
-"state. Let's consider the same example. Harry is a good singer. He lives in New York. Zayn is\n",
+"state.\n",
+"\n",
+"Let's consider the same example: Harry is a good singer. He lives in New York. Zayn is\n",
 "also a good singer.\n",
+"\n",
+"\n",
 "After the forget gate removes information from the cell state, the input gate decides what\n",
 "information it has to keep in the memory. Here, since the information about Harry is\n",
 "removed from the cell state by the forget gate, the input gate decides to update the cell state\n",
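The update this hunk describes (forget old information, then write new information) can be sketched as follows. The names `W_i`, `U_i`, `W_g`, `U_g` and the 1-D toy check are assumptions for illustration, not the notebook's code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cell_update(c_prev, f_t, x_t, h_prev, W_i, U_i, b_i, W_g, U_g, b_g):
    i_t = sigmoid(W_i @ x_t + U_i @ h_prev + b_i)  # input gate: how much to write
    g_t = np.tanh(W_g @ x_t + U_g @ h_prev + b_g)  # candidate new content
    return f_t * c_prev + i_t * g_t                # keep what f_t allows, add i_t * g_t

# Toy 1-D check: with f_t = 0 the forget gate fully erases the old memory (5.0),
# so the new cell state is only the gated candidate i_t * g_t.
c_new = cell_update(np.array([5.0]), np.array([0.0]),
                    np.zeros(1), np.zeros(1),
                    np.zeros((1, 1)), np.zeros((1, 1)), np.zeros(1),
                    np.zeros((1, 1)), np.zeros((1, 1)), np.ones(1))
```

In the sentence example, `f_t * c_prev` drops "Harry" while `i_t * g_t` writes "Zayn" into memory.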
@@ -110,8 +117,12 @@
 "\n",
 "We will have a lot of information in the cell state (memory). The output gate is responsible\n",
 "for deciding what information should be taken from the cell state to give as an\n",
-"output. Consider the following sentences. Zayn's debut album was a huge success. Congrats\n",
+"output. \n",
+"\n",
+"Consider the following sentences. Zayn's debut album was a huge success. Congrats\n",
 "____.\n",
+"\n",
+"\n",
 "The output gate will look up all the information in the cell state and select the correct\n",
 "information to fill the blank. Here, congrats is an adjective which is used to describe a noun.\n",
 "So the output gate will predict Zayn (noun), to fill the blank. Similar to other gates, it is also\n",
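The output-gate readout this hunk describes can be sketched the same way; `W_o`, `U_o`, `b_o` and the zero-weight check are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def output_gate(x_t, h_prev, c_t, W_o, U_o, b_o):
    o_t = sigmoid(W_o @ x_t + U_o @ h_prev + b_o)  # which parts of memory to expose
    h_t = o_t * np.tanh(c_t)                       # hidden state read out of the cell
    return o_t, h_t

# With zero weights o_t is 0.5 everywhere, so exactly half of tanh(c_t) is exposed.
o_t, h_t = output_gate(np.zeros(2), np.zeros(3), np.array([1.0, -1.0, 0.0]),
                       np.zeros((3, 2)), np.zeros((3, 3)), np.zeros(3))
```

The cell state holds everything remembered so far; `o_t` selects the slice relevant to the current prediction, such as the noun needed to fill the blank.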

07. Learning Text Representations/.ipynb_checkpoints/7.02 Continuous Bag of words-checkpoint.ipynb

Lines changed: 13 additions & 1 deletion
@@ -16,7 +16,10 @@
 "of size to choose the context word. If the window size is 2 then we use two words before\n",
 "and two words after the target word as the context words.\n",
 "\n",
-"Let us consider the sentence 'The sun rises in the east' with 'rises' as the target word. If we set\n",
+"Let us consider the sentence 'The sun rises in the east' with the word 'rises' as the target word. \n",
+"\n",
+"\n",
+"If we set\n",
 "the window size =2 then we take the words 'the' and 'sun' which are the two words before\n",
 "and 'in' and 'the' which are the two words after to the target word 'rises' as context words as\n",
 "shown below:\n",
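The window-size example in this hunk ('The sun rises in the east', window = 2) can be reproduced with a small helper; `context_window` is a hypothetical name, not from the notebook:

```python
def context_window(tokens, target_index, window=2):
    # Up to `window` words before and up to `window` words after the target word.
    left = tokens[max(0, target_index - window):target_index]
    right = tokens[target_index + 1:target_index + 1 + window]
    return left + right

tokens = "The sun rises in the east".split()
context = context_window(tokens, tokens.index("rises"), window=2)
# context == ['The', 'sun', 'in', 'the']
```

Near the sentence boundary the window is simply truncated, so the first word has only right-hand context.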
@@ -78,6 +81,15 @@
 "does it learn the optimal weights using backpropagation? Let us inspect that in the next\n",
 "section"
 ]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {
+"collapsed": true
+},
+"outputs": [],
+"source": []
 }
 ],
 "metadata": {

07. Learning Text Representations/7.02 Continuous Bag of words.ipynb

Lines changed: 9 additions & 0 deletions
@@ -81,6 +81,15 @@
 "does it learn the optimal weights using backpropagation? Let us inspect that in the next\n",
 "section"
 ]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {
+"collapsed": true
+},
+"outputs": [],
+"source": []
 }
 ],
 "metadata": {

0 commit comments

