Layer normalization seems to be pretty popular for RNNs nowadays, and it is worth having an implementation available. Several people seem to have already rolled their own (like @ajbrock), so this issue also aims to prevent several people working on this at the same time, as they can announce their intentions here.
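For reference, the core computation is small: normalize each sample across its feature dimension, then apply a learned gain and bias (Ba et al., 2016). A minimal NumPy sketch, independent of any particular PyTorch API (the function name and argument names here are illustrative, not a proposed interface):

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    """Normalize x over its last axis per sample, then scale and shift.

    gain and bias would be learned parameters in a real module;
    eps avoids division by zero for near-constant activations.
    """
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gain * (x - mean) / np.sqrt(var + eps) + bias

x = np.random.randn(4, 8)                        # batch of 4, 8 features
out = layer_norm(x, np.ones(8), np.zeros(8))     # identity gain/bias
# each row of `out` now has mean ~0 and unit variance
```

Unlike batch normalization, the statistics are computed per sample rather than per batch, which is what makes it usable inside RNN time steps.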
The same applies to weight normalization. This issue can be used to discuss both, but the implementations could land in separate PRs.
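Weight normalization is an even smaller reparameterization: each weight vector is expressed as a magnitude times a direction, w = g · v / ‖v‖ (Salimans & Kingma, 2016), so the optimizer can adjust scale and direction independently. A hedged NumPy sketch of the math, not a proposed API:

```python
import numpy as np

def weight_norm(v, g):
    """Reparameterize weights as w = g * v / ||v||.

    v holds the direction parameters (one row per output unit),
    g the per-unit magnitudes; both would be learned in practice.
    """
    return g * v / np.linalg.norm(v, axis=-1, keepdims=True)

v = np.random.randn(16, 8)    # raw direction parameters
g = np.ones((16, 1))          # unit magnitudes for illustration
w = weight_norm(v, g)
# each row of `w` has Euclidean norm equal to its entry in g
```

In a library implementation this would typically be exposed as a wrapper that recomputes w from (g, v) on every forward pass, rather than as a standalone function.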