
Commit b9105c0

Arrange citations
1 parent 59dc4c0 commit b9105c0

File tree

5 files changed (+8 -8 lines changed)

docs/neural-network/activation-functions/silu.md (+1 -2)

@@ -14,5 +14,4 @@ $activationFunction = new SiLU();
 ```
 
 ### References
->- S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning.
->- P. Ramachandran et al. (2017). Swish: A Self-gated Activation Function.
+[^1]: S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning.

docs/neural-network/hidden-layers/swish.md (+2 -1)

@@ -17,4 +17,5 @@ $layer = new Swish(new Constant(1.0));
 ```
 
 ## References
-[^1]: P. Ramachandran et al. (2017). Searching for Activation Functions.
+[^1]: P. Ramachandran et al. (2017). Swish: A Self-gated Activation Function.
+[^2]: P. Ramachandran et al. (2017). Searching for Activation Functions.

src/Graph/Nodes/Decision.php (+1 -1)

@@ -14,7 +14,7 @@
 interface Decision extends BinaryNode
 {
     /**
-     * Return the impurity of the labels as a result of the decision.
+     * Return the impurity of the labels within the node.
      *
      * @return float
      */

src/NeuralNet/ActivationFunctions/SiLU.php (+2 -3)

@@ -13,9 +13,8 @@
  * the [Sigmoid](sigmoid.md) activation function acting as a self-gating mechanism.
  *
  * References:
- * [1] S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function
- *     Approximation in Reinforcement Learning.
- * [2] P. Ramachandran et al. (2017). Swish: A Self-gated Activation Function.
+ * [1] S. Elwing et al. (2017). Sigmoid-Weighted Linear Units for Neural Network Function Approximation in
+ *     Reinforcement Learning.
  *
  * @category Machine Learning
  * @package Rubix/ML
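The SiLU docblock above describes the input gating itself through the Sigmoid function. As a minimal sketch of that idea (written in Python for brevity; the actual Rubix ML implementation is PHP), SiLU computes x · σ(x):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: squashes x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def silu(x: float) -> float:
    # Self-gating: the input is scaled by its own sigmoid activation,
    # so large positive inputs pass through nearly unchanged while
    # large negative inputs are gated toward zero.
    return x * sigmoid(x)
```

For large positive x the function approaches the identity, and for large negative x it approaches zero, which is the smooth, non-monotonic shape the docblock refers to.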

src/NeuralNet/Layers/Swish.php (+2 -1)

@@ -19,7 +19,8 @@
  * *beta* parameter allows each activation function in the layer to tailor its output to the training set by
  * interpolating between the linear function and ReLU.
  *
- * [1] P. Ramachandran et al. (2017). Searching for Activation Functions.
+ * [1] P. Ramachandran et al. (2017). Swish: A Self-gated Activation Function.
+ * [2] P. Ramachandran et al. (2017). Searching for Activation Functions.
  *
  * @category Machine Learning
  * @package Rubix/ML
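The Swish docblock describes a *beta* parameter that lets each activation interpolate between a linear function and ReLU. A minimal Python sketch of that interpolation (illustrative only; the actual layer is PHP and learns beta from the training set):

```python
import math

def swish(x: float, beta: float) -> float:
    # swish(x) = x * sigmoid(beta * x).
    # beta -> 0 gives the scaled linear function x / 2,
    # while large beta makes the gate a hard step, approaching ReLU.
    return x * (1.0 / (1.0 + math.exp(-beta * x)))
```

With beta = 1 this reduces to SiLU, which is why the two papers cited in the diff above are closely related.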
