# [Chapter 7. Learning Text Representations](#)

* [7.1. Understanding Word2vec Model](https://github.com/sudharsan13296/Hands-On-Deep-Learning-Algorithms-with-Python/blob/master/07.%20Learning%20Text%20Representations/7.01%20Understanding%20Word2vec%20Model.ipynb)
* 7.2. Continuous Bag-of-Words (CBOW)
* 7.3. Math of CBOW
	* 7.3.1. Deriving Forward Propagation
	* 7.3.2. Deriving Backward Propagation
* 7.4. Skip-Gram Model
* 7.5. Math of Skip-Gram
	* 7.5.1. Deriving Forward Propagation
	* 7.5.2. Deriving Backward Propagation
* 7.6. Various Training Strategies
	* 7.6.1. Hierarchical Softmax
	* 7.6.2. Negative Sampling
	* 7.6.3. Subsampling Frequent Words
* [7.7. Building word2vec model using Gensim](https://github.com/sudharsan13296/Hands-On-Deep-Learning-Algorithms-with-Python/blob/master/07.%20Learning%20Text%20Representations/7.07%20Building%20word2vec%20model%20using%20Gensim.ipynb)
* [7.8. Visualizing word embeddings in TensorBoard](https://github.com/sudharsan13296/Hands-On-Deep-Learning-Algorithms-with-Python/blob/master/07.%20Learning%20Text%20Representations/7.08%20Visualizing%20Word%20Embeddings%20in%20TensorBoard.ipynb)
* 7.9. Converting Documents to Vectors Using doc2vec
	* 7.9.1. PV-DM
	* 7.9.2. PV-DBOW
* [7.10. Finding similar documents using Doc2vec](https://github.com/sudharsan13296/Hands-On-Deep-Learning-Algorithms-with-Python/blob/master/07.%20Learning%20Text%20Representations/7.10%20Finding%20similar%20documents%20using%20Doc2Vec.ipynb)
* 7.11. Understanding the Skip-Thoughts Algorithm
* 7.12. Quick-Thoughts for Sentence Embeddings