
Commit 0e2d6b2

cozek authored and cclauss committed
adding softmax function (TheAlgorithms#1267)
* adding softmax function
* wrapped lines as asked
1 parent 0e333ae commit 0e2d6b2

File tree

1 file changed: +56 -0 lines changed

maths/softmax.py

@@ -0,0 +1,56 @@
"""
This script demonstrates the implementation of the Softmax function.

It's a function that takes as input a vector of K real numbers and normalizes
it into a probability distribution consisting of K probabilities proportional
to the exponentials of the input numbers. After softmax, the elements of the
vector always sum up to 1.

Script inspired by the corresponding Wikipedia article
https://en.wikipedia.org/wiki/Softmax_function
"""

import numpy as np


def softmax(vector):
    """
    Implements the softmax function

    Parameters:
        vector (np.array, list, tuple): A numpy array of shape (1, n)
        consisting of real values, or a similar list or tuple

    Returns:
        softmax_vec (np.array): The input numpy array after applying
        softmax.

    The softmax vector adds up to one. We need to ceil to mitigate for
    floating point precision.
    >>> np.ceil(np.sum(softmax([1, 2, 3, 4])))
    1.0

    >>> vec = np.array([5, 5])
    >>> softmax(vec)
    array([0.5, 0.5])

    >>> softmax([0])
    array([1.])
    """

    # Calculate e^x for each x in the vector, where e is Euler's
    # number (approximately 2.718)
    exponent_vector = np.exp(vector)

    # Add up all the exponentials
    sum_of_exponents = np.sum(exponent_vector)

    # Divide every exponent by the sum of all exponents
    softmax_vector = exponent_vector / sum_of_exponents

    return softmax_vector


if __name__ == "__main__":
    print(softmax((0,)))
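
As a side note, a common refinement of this kind of implementation (not part of this commit) is to subtract the maximum of the input before exponentiating. The result is mathematically identical, but it avoids overflow when the inputs are large. A minimal sketch under that assumption, with stable_softmax as a hypothetical helper name:

import numpy as np

def stable_softmax(vector):
    # Hypothetical helper, not in the commit: shift by the maximum value
    # so the largest exponent is e^0 = 1, preventing overflow for large inputs
    shifted = np.asarray(vector, dtype=float) - np.max(vector)
    exponent_vector = np.exp(shifted)
    return exponent_vector / np.sum(exponent_vector)

For example, stable_softmax([1000, 1000]) returns array([0.5, 0.5]), whereas the unshifted version overflows in np.exp and yields nan values.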
