isanet.activation¶
Activation Functions Module.

class isanet.activation.Activation¶
Bases: object

Base class for the activation function.

Warning: This class should not be used directly. Use derived classes instead.
derivative(x)¶
Compute the derivative of the activation function on x.

Warning: Override this method in order to implement the derivative of an activation function.

Parameters
x (array-like) – The input on which the derivative is computed.
f(x)¶
Compute the activation function on x.

Warning: Override this method in order to implement the activation function.

Parameters
x (array-like, shape = [n_samples, out_layer_dim]) – The output of a layer, usually corresponding to x = np.dot(A, W), where A is the input matrix to the layer and W is the weight matrix.
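
A custom activation can be defined by subclassing Activation and overriding f and derivative. The LeakyRelu class below is a hypothetical sketch (it is not part of isanet) and assumes only the base class and the numpy conventions documented above:

    import numpy as np
    from isanet.activation import Activation

    class LeakyRelu(Activation):
        """Hypothetical example: leaky ReLU with slope 0.01 for x <= 0."""
        def f(self, x):
            # Elementwise on the layer output: x where x > 0, else 0.01*x.
            return np.where(x > 0, x, 0.01 * x)
        def derivative(self, x):
            # 1 where x > 0, 0.01 elsewhere.
            return np.where(x > 0, 1.0, 0.01)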

class isanet.activation.Identity¶
Bases: isanet.activation.Activation

This class provides the identity function and its derivative.
derivative(x)¶
Return the derivative of the activation function on x:
f'(x) = 1.

Parameters
x (array-like) – The input on which the derivative is computed.

Returns
The derivative of the activation function on x.
f(x)¶
Return the value of the activation function on x:
f(x) = x.

Parameters
x (array-like, shape = [n_samples, out_layer_dim]) – The output of a layer, usually corresponding to x = np.dot(A, W), where A is the input matrix to the layer and W is the weight matrix.

Returns
The value of the activation function on x.
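
A minimal usage sketch, assuming only the f and derivative methods documented above:

    import numpy as np
    from isanet.activation import Identity

    identity = Identity()
    x = np.array([[-1.5, 0.0, 2.0]])
    out = identity.f(x)             # per the formula above: f(x) = x
    grad = identity.derivative(x)   # per the formula above: f'(x) = 1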

class isanet.activation.Relu¶
Bases: isanet.activation.Activation

This class provides the rectified linear unit (ReLU) function and its derivative.
derivative(x)¶
Return the derivative of the activation function on x:
f'(x) = 1 if x > 0, else 0.

Parameters
x (array-like) – The input on which the derivative is computed.

Returns
The derivative of the activation function on x.
f(x)¶
Return the value of the activation function on x:
f(x) = max(0, x).

Parameters
x (array-like, shape = [n_samples, out_layer_dim]) – The output of a layer, usually corresponding to x = np.dot(A, W), where A is the input matrix to the layer and W is the weight matrix.

Returns
The value of the activation function on x.
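
A short usage sketch; np.maximum is used here only to restate the documented formula f(x) = max(0, x), under the assumption that f matches it:

    import numpy as np
    from isanet.activation import Relu

    relu = Relu()
    x = np.array([[-2.0, -0.5, 0.0, 3.0]])
    out = relu.f(x)
    expected = np.maximum(0.0, x)        # the documented formula, restated
    assert np.allclose(out, expected)    # holds if f follows the formula above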

class isanet.activation.Sigmoid(a=1)¶
Bases: isanet.activation.Activation

This class provides the logistic sigmoid function and its derivative.

a¶
A value used to dilate and shrink the sigmoid:
f(x) = 1 / (1 + exp(-a*x)).

Type
float
derivative(x)¶
Return the derivative of the activation function on x:
f'(x) = a*f(x)*(1 - f(x)).

Parameters
x (array-like) – The input on which the derivative is computed.

Returns
The derivative of the activation function on x.
f(x)¶
Return the value of the activation function on x:
f(x) = 1 / (1 + exp(-a*x)).

Parameters
x (array-like, shape = [n_samples, out_layer_dim]) – The output of a layer, usually corresponding to x = np.dot(A, W), where A is the input matrix to the layer and W is the weight matrix.

Returns
The value of the activation function on x.
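
A usage sketch showing the role of a (a larger a gives a steeper sigmoid around 0), relying only on the formulas documented above:

    import numpy as np
    from isanet.activation import Sigmoid

    x = np.array([[0.0, 1.0]])
    gentle = Sigmoid(a=1)
    steep = Sigmoid(a=4)
    # Both give 0.5 at x = 0; the steeper sigmoid saturates faster away from 0.
    y1, y4 = gentle.f(x), steep.f(x)
    # The documented identity f'(x) = a*f(x)*(1 - f(x)):
    grad = gentle.derivative(x)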

class isanet.activation.Softmax¶
Bases: isanet.activation.Activation

Softmax activation function.

Warning: This class has not been fully implemented.
derivative(x)¶
Compute the derivative of the activation function on x.

Warning: Override this method in order to implement the derivative of an activation function.

Parameters
x (array-like) – The input on which the derivative is computed.
f(x)¶
Compute the activation function on x.

Warning: Override this method in order to implement the activation function.

Parameters
x (array-like, shape = [n_samples, out_layer_dim]) – The output of a layer, usually corresponding to x = np.dot(A, W), where A is the input matrix to the layer and W is the weight matrix.
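
Since the class is not fully implemented, the sketch below shows one way a numerically stable softmax could be written by subclassing Activation. The SoftmaxExample class and its row-wise convention are assumptions, not isanet's implementation:

    import numpy as np
    from isanet.activation import Activation

    class SoftmaxExample(Activation):
        """Hypothetical sketch; not the isanet implementation."""
        def f(self, x):
            # Subtract the row-wise max before exponentiating for stability.
            z = x - np.max(x, axis=1, keepdims=True)
            e = np.exp(z)
            return e / np.sum(e, axis=1, keepdims=True)
        # The softmax derivative is a per-sample Jacobian and is omitted here.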

class isanet.activation.Tanh(a=2)¶
Bases: isanet.activation.Activation

This class provides the hyperbolic tangent function and its derivative.

a¶
A value used to dilate and shrink the tanh:
f(x) = tanh(a*x/2).

Type
float
derivative(x)¶
Return the derivative of the activation function on x:
f'(x) = (a/2)*(1 - tanh(a*x/2)^2).

Parameters
x (array-like) – The input on which the derivative is computed.

Returns
The derivative of the activation function on x.
f(x)¶
Return the value of the activation function on x:
f(x) = tanh(a*x/2).

Parameters
x (array-like, shape = [n_samples, out_layer_dim]) – The output of a layer, usually corresponding to x = np.dot(A, W), where A is the input matrix to the layer and W is the weight matrix.

Returns
The value of the activation function on x.
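
A sketch that checks the derivative formula above with a central finite difference, assuming only the f and derivative methods documented for this class:

    import numpy as np
    from isanet.activation import Tanh

    tanh = Tanh(a=2)
    x = np.array([[0.3, -0.7]])
    h = 1e-6
    numeric = (tanh.f(x + h) - tanh.f(x - h)) / (2 * h)  # central difference
    analytic = tanh.derivative(x)
    assert np.allclose(numeric, analytic, atol=1e-5)  # holds if derivative matches f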