Collection of unfinished tutorials. May be good for educational purposes.
1 - simple seq2seq

A deliberately slow-moving, explicit tutorial: I tried to thoroughly explain everything that I found in any way confusing.
Implements the simple seq2seq model described in Sutskever et al., 2014, and tests it against a toy memorization task.

[Figure: 1-seq2seq. Picture from Sutskever et al., 2014]
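For reference, here is a minimal sketch of that encoder-decoder in TF 1.x graph code. The vocabulary size, embedding size, and hidden-unit count are illustrative placeholders rather than the notebook's exact values; for the memorization task, decoder_targets would simply be the input sequence again, followed by an end-of-sequence token.

```python
# Minimal TF 1.x sketch of tutorial 1's encoder-decoder (all sizes are illustrative).
import tensorflow as tf

vocab_size, embed_dim, hidden_units = 10, 20, 32

# Sequences are fed time-major: [max_time, batch_size].
encoder_inputs = tf.placeholder(tf.int32, [None, None], name='encoder_inputs')
decoder_inputs = tf.placeholder(tf.int32, [None, None], name='decoder_inputs')
decoder_targets = tf.placeholder(tf.int32, [None, None], name='decoder_targets')

embeddings = tf.Variable(tf.random_uniform([vocab_size, embed_dim], -1.0, 1.0))
encoder_emb = tf.nn.embedding_lookup(embeddings, encoder_inputs)
decoder_emb = tf.nn.embedding_lookup(embeddings, decoder_inputs)

# Encoder: only its final state is passed on, so it has to summarize the whole input.
encoder_cell = tf.contrib.rnn.LSTMCell(hidden_units)
_, encoder_final_state = tf.nn.dynamic_rnn(
    encoder_cell, encoder_emb, dtype=tf.float32, time_major=True, scope='encoder')

# Decoder: starts from the encoder's final state and reads the shifted target sequence.
decoder_cell = tf.contrib.rnn.LSTMCell(hidden_units)
decoder_outputs, _ = tf.nn.dynamic_rnn(
    decoder_cell, decoder_emb, initial_state=encoder_final_state,
    time_major=True, scope='decoder')

logits = tf.layers.dense(decoder_outputs, vocab_size)
loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=decoder_targets, logits=logits))
train_op = tf.train.AdamOptimizer().minimize(loss)
```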
2 - advanced dynamic seq2seq

The encoder is now bidirectional. The decoder is implemented with tf.nn.raw_rnn; during training it feeds previously generated tokens back in as inputs, instead of the target sequence.
[Figure: 2-seq2seq-feed-previous. Picture from Deep Learning for Chatbots]
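A condensed sketch of that wiring, assuming TF 1.x: the projection weights W and b, the extra decoding steps added to decoder_lengths, and all sizes are illustrative assumptions, not the notebook's exact code. The core piece is loop_fn, which raw_rnn calls at every step to decide what the next decoder input should be.

```python
# TF 1.x sketch of tutorial 2: bidirectional encoder plus a raw_rnn decoder
# that feeds its own previous prediction back in as the next input.
import tensorflow as tf

vocab_size, embed_dim, hidden_units = 10, 20, 32   # token id 1 is treated as EOS below

encoder_inputs = tf.placeholder(tf.int32, [None, None])   # [max_time, batch]
encoder_lengths = tf.placeholder(tf.int32, [None])
decoder_lengths = encoder_lengths + 3                      # give the decoder a few extra steps

embeddings = tf.Variable(tf.random_uniform([vocab_size, embed_dim], -1.0, 1.0))
encoder_emb = tf.nn.embedding_lookup(embeddings, encoder_inputs)

# Bidirectional encoder: concatenate the forward and backward final LSTM states.
fw_cell = tf.contrib.rnn.LSTMCell(hidden_units)
bw_cell = tf.contrib.rnn.LSTMCell(hidden_units)
_, (fw_state, bw_state) = tf.nn.bidirectional_dynamic_rnn(
    fw_cell, bw_cell, encoder_emb, sequence_length=encoder_lengths,
    dtype=tf.float32, time_major=True)
encoder_final_state = tf.contrib.rnn.LSTMStateTuple(
    c=tf.concat([fw_state.c, bw_state.c], 1),
    h=tf.concat([fw_state.h, bw_state.h], 1))

decoder_cell = tf.contrib.rnn.LSTMCell(hidden_units * 2)   # matches the concatenated state
W = tf.Variable(tf.random_uniform([hidden_units * 2, vocab_size], -1.0, 1.0))
b = tf.Variable(tf.zeros([vocab_size]))

batch_size = tf.shape(encoder_inputs)[1]
eos_embedded = tf.nn.embedding_lookup(embeddings, tf.ones([batch_size], dtype=tf.int32))

def loop_fn(time, previous_output, previous_state, previous_loop_state):
    if previous_state is None:
        # time == 0: provide the first input (EOS) and the encoder's final state.
        return (0 >= decoder_lengths, eos_embedded, encoder_final_state, None, None)
    # Later steps: project the previous output, take the argmax token, embed it,
    # and feed that embedding back in as the next decoder input.
    prediction = tf.argmax(tf.matmul(previous_output, W) + b, axis=1)
    next_input = tf.nn.embedding_lookup(embeddings, prediction)
    elements_finished = (time >= decoder_lengths)
    return (elements_finished, next_input, previous_state,
            previous_output, previous_loop_state)

decoder_outputs_ta, _, _ = tf.nn.raw_rnn(decoder_cell, loop_fn)
decoder_outputs = decoder_outputs_ta.stack()               # [max_time, batch, 2 * hidden_units]
```

Because loop_fn decides each next input, switching between feeding previous predictions (as here) and teacher forcing on the target sequence is a local change inside that one function.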
3 - Using tf.contrib.seq2seq (TF<=1.1)
A new dynamic seq2seq API appeared in r1.0. Let's try it.

UPDATE: this tutorial does not work with TF versions above 1.1 because the tf.contrib.seq2seq API changed. I recommend checking out the new official tutorial instead to learn the high-level seq2seq API.
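For orientation, this is roughly the decoding flow of the class-based tf.contrib.seq2seq API (TrainingHelper, BasicDecoder, dynamic_decode) that the newer official tutorial teaches. Placeholders and sizes are illustrative, and the exact class names shifted between contrib releases, which is precisely why the original notebook broke after 1.1.

```python
# Rough sketch of the class-based tf.contrib.seq2seq training-time decoding flow.
# All names below are illustrative; treat this as orientation, not working 1.0/1.1 code.
import tensorflow as tf

vocab_size, embed_dim, hidden_units = 10, 20, 32

encoder_inputs = tf.placeholder(tf.int32, [None, None])    # [batch, max_time]
decoder_inputs = tf.placeholder(tf.int32, [None, None])
decoder_lengths = tf.placeholder(tf.int32, [None])

embeddings = tf.Variable(tf.random_uniform([vocab_size, embed_dim], -1.0, 1.0))
encoder_emb = tf.nn.embedding_lookup(embeddings, encoder_inputs)
decoder_emb = tf.nn.embedding_lookup(embeddings, decoder_inputs)

encoder_cell = tf.contrib.rnn.LSTMCell(hidden_units)
_, encoder_state = tf.nn.dynamic_rnn(encoder_cell, encoder_emb, dtype=tf.float32)

# The helper feeds the ground-truth decoder inputs at each step (teacher forcing);
# swapping in GreedyEmbeddingHelper gives greedy feed-previous decoding at inference.
helper = tf.contrib.seq2seq.TrainingHelper(decoder_emb, decoder_lengths)
decoder = tf.contrib.seq2seq.BasicDecoder(
    tf.contrib.rnn.LSTMCell(hidden_units), helper, encoder_state,
    output_layer=tf.layers.Dense(vocab_size))
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
logits = outputs.rnn_output                                 # [batch, max_time, vocab_size]
```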