Open
@tangchen2
Description
When the network processes the attention match RNN, the first loop actually computes the question-passage attention. In the function `attention_rnn`, when it calls `attention`, `outputs_` is `[[batch, max_p_len, attn_size * 2], [batch, max_q_len, attn_size * 2]]`. I wonder how `outputs = sum(outputs_)` can run, since their dimensions do not match.
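
For reference, a minimal sketch of the shape question (not the repo's actual code, just NumPy stand-ins with hypothetical sizes): Python's built-in `sum()` reduces the list with element-wise `+`, so it only works if the two tensors broadcast, and `max_p_len != max_q_len` would make the add fail.

```python
import numpy as np

batch, max_p_len, max_q_len, attn_size = 4, 10, 7, 16

# outputs_[0]: passage-side attention output, outputs_[1]: question-side output
outputs_ = [
    np.zeros((batch, max_p_len, attn_size * 2)),
    np.zeros((batch, max_q_len, attn_size * 2)),
]

# sum(outputs_) is just outputs_[0] + outputs_[1] (starting from 0),
# so mismatched middle dimensions raise a broadcasting error.
try:
    outputs = sum(outputs_)
except ValueError as e:
    print("shape mismatch:", e)
```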