5 files changed: +8 −8 lines changed

SiLU activation function documentation (Markdown):

````diff
@@ -14,5 +14,4 @@ $activationFunction = new SiLU();
 ```
 
 ### References
-> - S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning.
-> - P. Ramachandran er al. (2017). Swish: A Self-gated Activation Function.
+[^1]: S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning.
````
Swish layer documentation (Markdown):

````diff
@@ -17,4 +17,5 @@ $layer = new Swish(new Constant(1.0));
 ```
 
 ## References
-[^1]: P. Ramachandran et al. (2017). Searching for Activation Functions.
+[^1]: P. Ramachandran er al. (2017). Swish: A Self-gated Activation Function.
+[^2]: P. Ramachandran et al. (2017). Searching for Activation Functions.
````
Decision node interface docblock (PHP):

````diff
@@ -14,7 +14,7 @@
 interface Decision extends BinaryNode
 {
     /**
-     * Return the impurity of the labels as a result of the decision.
+     * Return the impurity of the labels within the node.
      *
      * @return float
      */
````
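For readers unfamiliar with the term, the impurity this docblock refers to measures how mixed the labels inside a node are. Gini impurity is one common choice, used here purely as an illustration; it is not necessarily the measure every implementation of this interface computes. A minimal plain-PHP sketch:

```php
<?php

// Gini impurity of a set of labels: 1 - sum over classes of p_c^2.
// 0.0 means the node is pure (a single class); higher values mean more mixing.
function giniImpurity(array $labels) : float
{
    $n = count($labels);

    if ($n === 0) {
        return 0.0;
    }

    $impurity = 1.0;

    foreach (array_count_values($labels) as $count) {
        $p = $count / $n;

        $impurity -= $p ** 2;
    }

    return $impurity;
}

var_dump(giniImpurity(['cat', 'cat', 'dog', 'dog']));  // float(0.5)
var_dump(giniImpurity(['cat', 'cat', 'cat']));         // float(0)
```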
SiLU activation function class docblock (PHP):

````diff
@@ -13,9 +13,8 @@
  * the [Sigmoid](sigmoid.md) activation function acting as a self-gating mechanism.
  *
  * References:
- * [1] S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function
- * Approximation in Reinforcement Learning.
- * [2] P. Ramachandran et al. (2017). Swish: A Self-gated Activation Function.
+ * [1] S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in
+ * Reinforcement Learning.
  *
  * @category Machine Learning
  * @package Rubix/ML
````
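As background on the self-gating behaviour the docblock describes: SiLU multiplies the input by its own logistic sigmoid, so the sigmoid acts as a smooth gate computed from the input itself. A minimal plain-PHP sketch of the math, illustrative only and not the library's implementation:

```php
<?php

// Logistic sigmoid: squashes any real input into (0, 1).
function sigmoid(float $x) : float
{
    return 1.0 / (1.0 + exp(-$x));
}

// SiLU: the input multiplied by its own sigmoid, i.e. the sigmoid
// acts as a smooth, self-computed gate on the activation.
function silu(float $x) : float
{
    return $x * sigmoid($x);
}

echo silu(2.0) . PHP_EOL;   // ~1.7616 (gate mostly open)
echo silu(-2.0) . PHP_EOL;  // ~-0.2384 (gate mostly closed)
```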
Swish hidden layer class docblock (PHP):

````diff
@@ -19,7 +19,8 @@
  * *beta* parameter allows each activation function in the layer to tailor its output to the training set by
  * interpolating between the linear function and ReLU.
  *
- * [1] P. Ramachandran et al. (2017). Searching for Activation Functions.
+ * [1] P. Ramachandran et al. (2017). Swish: A Self-gated Activation Function.
+ * [2] P. Ramachandran et al. (2017). Searching for Activation Functions.
  *
  * @category Machine Learning
  * @package Rubix/ML
````
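To make the interpolation mentioned in the docblock concrete: assuming the standard definition from the referenced papers, Swish(x) = x · sigmoid(βx), so as β approaches 0 the output tends toward the scaled linear function x / 2, and as β grows large it approaches ReLU. A minimal plain-PHP sketch under that definition, not the layer's actual code:

```php
<?php

// Logistic sigmoid.
function sigmoid(float $x) : float
{
    return 1.0 / (1.0 + exp(-$x));
}

// Swish with a tunable beta parameter: x * sigmoid(beta * x).
function swish(float $x, float $beta) : float
{
    return $x * sigmoid($beta * $x);
}

// With beta near 0, sigmoid(beta * x) is close to 0.5, so the output is roughly x / 2 (scaled linear).
echo swish(3.0, 0.001) . PHP_EOL;   // ~1.5

// With a large beta the sigmoid saturates, so the output approaches ReLU.
echo swish(3.0, 100.0) . PHP_EOL;   // ~3.0
echo swish(-3.0, 100.0) . PHP_EOL;  // effectively 0 (ReLU would give exactly 0)
```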