
Support zoneout on lstm cell state and add recurrent dropout #29

Open
azraelkuan opened this issue Nov 12, 2020 · 2 comments

@azraelkuan

Hi, are there any plans for these two features?

  1. Apply zoneout to the LSTM cell state, the same as for the hidden state (rough sketch after this list):

    if zoneout_prob:
      if training:
        h[-1] = (h[-1] - h[-2]) * zoneout_mask[t] + h[-2]
      else:
        h[-1] = zoneout_prob * h[-2] + (1 - zoneout_prob) * h[-1]

  2. Add recurrent dropout, the same as in Keras:
    https://github.com/tensorflow/tensorflow/blob/fcc4b966f1265f466e82617020af93670141b009/tensorflow/python/keras/layers/recurrent.py#L2450-L2459
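For reference, a rough sketch of what I mean for the first item, assuming a per-timestep loop around torch.nn.LSTMCell (the `zoneout` helper, `lstm_step`, and the rate arguments are just illustrative names, not haste's API):

    import torch

    def zoneout(prev, new, p, training):
        # Zoneout: during training each unit keeps its previous value with
        # probability p, otherwise it takes the newly computed value.
        # At inference the expected value is used, as in the snippet above.
        if p == 0.0:
            return new
        if training:
            keep_prev = torch.bernoulli(torch.full_like(new, p))
            return keep_prev * prev + (1.0 - keep_prev) * new
        return p * prev + (1.0 - p) * new

    def lstm_step(cell, x_t, h_prev, c_prev, zoneout_h, zoneout_c, training):
        # Apply zoneout to both the hidden state and the cell state,
        # with (possibly different) rates for each.
        h_new, c_new = cell(x_t, (h_prev, c_prev))
        h_new = zoneout(h_prev, h_new, zoneout_h, training)
        c_new = zoneout(c_prev, c_new, zoneout_c, training)
        return h_new, c_new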

thanks!

@sharvil
Contributor

sharvil commented Nov 13, 2020

Do you have a specific use-case in mind for these two additions? I'm open to adding these features – it would be nice to have a sense of the scope and value of these changes.

@azraelkuan
Author

azraelkuan commented Nov 15, 2020

@sharvil Thanks for your reply.
For the first question: in the original zoneout paper, zoneout is applied separately to the two LSTM states (cell and hidden).
For the second question: recurrent dropout is beneficial for very long sequences, e.g. TTS acoustic modeling.
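To make the second request concrete, here is a rough sketch of the Keras-style behaviour (one dropout mask sampled per sequence and applied to the previous hidden state at every step), again written around torch.nn.LSTMCell with illustrative names rather than haste's actual API:

    import torch

    def lstm_with_recurrent_dropout(cell, inputs, p, training):
        # inputs: (batch, seq_len, input_size); cell: a torch.nn.LSTMCell.
        batch, seq_len, _ = inputs.shape
        h = inputs.new_zeros(batch, cell.hidden_size)
        c = inputs.new_zeros(batch, cell.hidden_size)
        mask = None
        if training and p > 0.0:
            # Inverted dropout: scale by 1/(1-p) so inference needs no rescaling.
            mask = torch.bernoulli(h.new_full(h.shape, 1.0 - p)) / (1.0 - p)
        outputs = []
        for t in range(seq_len):
            # The same mask is reused at every step, applied to the recurrent input.
            h_in = h * mask if mask is not None else h
            h, c = cell(inputs[:, t], (h_in, c))
            outputs.append(h)
        return torch.stack(outputs, dim=1), (h, c)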
