This repository has been archived by the owner on Mar 12, 2020. It is now read-only.

Add recurrent_dropout and recurrent_regularizer to LSTM Layer #29

Open
bmigette opened this issue Jan 16, 2018 · 5 comments

Comments

@bmigette

Hello,

It would be great to consider adding the recurrent_dropout and recurrent_regularizer parameters to the LSTM layer.
See: https://keras.io/layers/recurrent/#lstm
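For reference, this is roughly how the two parameters look in Keras itself, per the linked docs. This is the Keras API, not an existing SiaNet one, and the values are only illustrative:

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras import regularizers

model = Sequential()
model.add(LSTM(64,
               input_shape=(100, 8),       # (timesteps, features)
               dropout=0.2,                # dropout on the layer inputs
               recurrent_dropout=0.2,      # dropout on the recurrent state
               recurrent_regularizer=regularizers.l2(1e-4)))  # L2 penalty on the recurrent weights
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
```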

@deepakkumar1984
Member

LSTM is a bit tricky, and I need support from the CNTK team. If you know any developers who understand RNNs very well, please ask them to contribute.

@bmigette
Author

I know how the theory goes for RNNs and dropout, but I am not familiar with CNTK at all...

Maybe these could help; a rough sketch of the idea follows the links:
https://stackoverflow.com/questions/44924690/keras-the-difference-between-lstm-dropout-and-lstm-recurrent-dropout

https://pdfs.semanticscholar.org/3061/db5aab0b3f6070ea0f19f8e76470e44aefa5.pdf

http://www.aclweb.org/anthology/C16-1165
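To make the difference concrete, here is a minimal NumPy sketch of the variational recurrent dropout idea from the Gal & Ghahramani paper above: the mask on the recurrent state is sampled once and reused at every timestep, rather than resampled per step. This is an illustration only, not CNTK or SiaNet code, and all names in it are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x_seq, W, U, b, recurrent_dropout=0.2, training=True):
    # x_seq: (timesteps, input_dim); W: (input_dim, 4*units); U: (units, 4*units)
    units = U.shape[0]
    h = np.zeros(units)
    c = np.zeros(units)
    # Key point: sample ONE dropout mask for the recurrent state and reuse it
    # at every timestep (variational dropout), instead of resampling per step.
    if training and recurrent_dropout > 0:
        keep = 1.0 - recurrent_dropout
        mask = (np.random.rand(units) < keep) / keep  # inverted-dropout scaling
    else:
        mask = np.ones(units)
    for x_t in x_seq:
        gates = x_t @ W + (h * mask) @ U + b  # dropout applied to the recurrent input h
        i, f, o, g = np.split(gates, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h

# tiny smoke test with random weights
rng = np.random.RandomState(0)
units, input_dim, timesteps = 4, 3, 5
h = lstm_forward(rng.randn(timesteps, input_dim),
                 rng.randn(input_dim, 4 * units),
                 rng.randn(units, 4 * units),
                 np.zeros(4 * units))
```

Plain dropout (Keras's dropout argument) would instead mask x_t, typically with a fresh mask at each step.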

If you have support from the CNTK team, it might be worth asking them whether an example is available.

@deepakkumar1984
Member

I have asked for help. I'll wait, and hopefully I can implement it soon.

@bmigette
Author

Good stuff!

@bmigette
Author

Any update on this?
