
Conversation

@diana3372 commented:

No description provided.

@gautierdag (Member) commented:

You might want to separate the commits of the previous PR #68 (Sender) from this PR. Otherwise it is confusing which changes belong to which PR, and we can't tell whether you have modified any of the files you created in #68.

@gautierdag self-requested a review on March 18, 2019 at 18:38
# c0: initial cell state, all zeros
c = torch.zeros([batch_size, self.hidden_size], device=device)

# add the leading num_layers dimension expected by nn.LSTM: (1, batch, hidden_size)
state = (h[None, ...], c[None, ...])
@gautierdag (Member) commented:

You can remove the whole state variable, since you are initializing it to zero in all cases. See https://pytorch.org/docs/stable/nn.html?highlight=lstm#torch.nn.LSTM
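
As the linked docs note, nn.LSTM already defaults h_0 and c_0 to zero tensors when no state tuple is passed, so the call reduces to self.rnn(input). A minimal sketch of that behaviour (the sizes below are illustrative, not the PR's actual dimensions):

import torch
import torch.nn as nn

rnn = nn.LSTM(input_size=64, hidden_size=128, batch_first=True)  # illustrative sizes
x = torch.randn(32, 10, 64)       # (batch, seq_len, input_size)
output, (h_n, c_n) = rnn(x)       # no (h_0, c_0) passed: both default to zeros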

# zero all hidden-to-hidden biases, then set the forget-gate biases to 1
# (PyTorch orders the LSTM gates as input, forget, cell, output)
nn.init.constant_(self.rnn.bias_hh_l0, val=0)
nn.init.constant_(self.rnn.bias_hh_l0[self.hidden_size:2 * self.hidden_size], val=1)

def forward(self, m):
@gautierdag (Member) commented:

I don't particularly like one-letter variable names when they represent something uncommon or are not loop iterators. Could I suggest changing m to messages?
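
A hypothetical sketch of the suggested rename; only the argument name changes, and the body below merely illustrates a typical Receiver forward (the embedding layer and return value are assumptions, not the PR's code):

def forward(self, messages):
    # `messages` replaces the one-letter `m` for readability
    emb = self.embedding(messages)   # assumed embedding of the message tokens
    _, (h, c) = self.rnn(emb)        # self.rnn is the nn.LSTM initialized above
    return h.squeeze(0)              # drop the num_layers dimension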

@diana3372 closed this on March 18, 2019
@diana3372 deleted the receiver branch on March 18, 2019 at 19:13