Conversation

@ZhangYuef

To extend the general support of the ANN model, this PR adds the following new features:

  • Add a dropout layer
  • Add an embedding layer
  • Add two activation functions: relu and tanh
  • Fix some typos and incorrect statements in comments

Please review and comment on the code :)

- Add a dropout layer and its corresponding model (see the dropout sketch after this list).
  - Tested on the Iris dataset: with the dropout rate set to 0.2 the accuracy is 0.96; without dropout (rate = 0) the accuracy is 0.88.
- Add an embedding layer and its corresponding model for the ANN (see the embedding sketch below).
  - This is NOT part of the default construction of the MLP feed-forward topology.
- Add relu and tanh activation methods (see the activation sketch below).
  - Use them in the feed-forward topology (path: `alink/operator/common/classification/ann/FeedForwardTopology.java`).
  - Keep sigmoid as the default.
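
For reference, here is a minimal sketch of what an inverted-dropout forward pass looks like. The class and method names are illustrative only and do not correspond to Alink's actual layer API:

```java
import java.util.Random;

/**
 * Minimal inverted-dropout forward pass (illustrative sketch, not Alink's API).
 * At training time each unit is zeroed with probability dropoutRate; survivors
 * are scaled by 1 / keepProb so the expected activation is unchanged and no
 * rescaling is needed at inference time.
 */
public class DropoutSketch {
    public static double[] forward(double[] input, double dropoutRate, boolean training, Random rnd) {
        double[] output = new double[input.length];
        if (!training || dropoutRate <= 0.0) {
            // Inference (or rate 0): dropout is the identity function.
            System.arraycopy(input, 0, output, 0, input.length);
            return output;
        }
        double keepProb = 1.0 - dropoutRate;
        for (int i = 0; i < input.length; i++) {
            // Keep the unit with probability keepProb, scaling it up; otherwise drop it.
            output[i] = rnd.nextDouble() < keepProb ? input[i] / keepProb : 0.0;
        }
        return output;
    }
}
```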
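The embedding layer is essentially a trainable lookup table. A minimal sketch of the forward pass, again with hypothetical names rather than Alink's actual classes:

```java
import java.util.Random;

/**
 * Minimal embedding lookup (illustrative sketch, not Alink's API): maps integer
 * token ids to rows of a vocabSize x embeddingDim weight matrix. In a real
 * layer these weights would be trained by backpropagation.
 */
public class EmbeddingSketch {
    private final double[][] weights;

    public EmbeddingSketch(int vocabSize, int embeddingDim) {
        weights = new double[vocabSize][embeddingDim];
        Random rnd = new Random(42);
        for (int i = 0; i < vocabSize; i++) {
            for (int j = 0; j < embeddingDim; j++) {
                weights[i][j] = rnd.nextGaussian() * 0.01; // small random init
            }
        }
    }

    /** Forward pass: each id selects one row of the weight matrix. */
    public double[][] forward(int[] ids) {
        double[][] out = new double[ids.length][];
        for (int i = 0; i < ids.length; i++) {
            out[i] = weights[ids[i]].clone();
        }
        return out;
    }
}
```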
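For the activation methods, a feed-forward layer needs both the function and its derivative for the backward pass. A self-contained sketch of relu, tanh, and the default sigmoid (illustrative only):

```java
/**
 * Element-wise activations and their derivatives (illustrative sketch).
 */
public final class ActivationSketch {
    public static double relu(double x) { return Math.max(0.0, x); }
    public static double reluGrad(double x) { return x > 0.0 ? 1.0 : 0.0; }

    public static double tanh(double x) { return Math.tanh(x); }
    // d/dx tanh(x) = 1 - tanh(x)^2
    public static double tanhGrad(double x) { double t = Math.tanh(x); return 1.0 - t * t; }

    // Sigmoid stays the default activation in the feed-forward topology.
    public static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }
    public static double sigmoidGrad(double x) { double s = sigmoid(x); return s * (1.0 - s); }
}
```
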
@CLAassistant

CLAassistant commented Sep 20, 2020

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@ZhangYuef ZhangYuef mentioned this pull request Sep 20, 2020
@LastBlackRose LastBlackRose mentioned this pull request Feb 9, 2022