BiDAF/layers/char_embedding.py
Line 28 in 3e5ac9c
x = x.sum(2)  # (N, seq_len, c_embd_size)
This code looks strange to me.
Why do you sum over the word_len dimension?
Why not apply a 1D filter over the word_len dimension instead?
Thank you.
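For reference, here is a minimal numpy sketch of the alternative being suggested: slide a 1D convolutional filter along the word_len (character) axis and max-pool over the resulting positions, instead of summing the character embeddings. The function name, filter shapes, and sizes below are all hypothetical, chosen only to illustrate the idea.

```python
import numpy as np

def char_cnn_embedding(char_embs, filters):
    # char_embs: (N, seq_len, word_len, c_embd_size) -- per-character embeddings
    # filters:   (num_filters, width, c_embd_size)   -- hypothetical 1D conv weights
    N, seq_len, word_len, c = char_embs.shape
    num_filters, width, _ = filters.shape
    # Running max over all valid filter positions (max-pooling over word_len)
    out = np.full((N, seq_len, num_filters), -np.inf)
    for t in range(word_len - width + 1):
        window = char_embs[:, :, t:t + width, :]            # (N, seq_len, width, c)
        conv = np.einsum('nswc,fwc->nsf', window, filters)  # one 1D conv step
        out = np.maximum(out, conv)                         # pool across positions
    return out  # (N, seq_len, num_filters)

# Hypothetical sizes for a quick shape check
N, seq_len, word_len, c_embd, num_f = 2, 5, 8, 16, 32
x = np.random.randn(N, seq_len, word_len, c_embd)
w = np.random.randn(num_f, 3, c_embd)
y = char_cnn_embedding(x, w)
print(y.shape)  # (2, 5, 32)
```

Unlike `x.sum(2)`, which collapses word_len by simple addition, the conv + max-pool keeps local character n-gram information before pooling.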
bkgoksel