# activation

Module: `activation.py`

This module implements activation functions for neural networks.

Version info:

- 01/01/2025: Initial version
### celu(x)

### elu(x)

### gelu(x)

### glu(x)
### hard_sigmoid(x)

Compute hard_sigmoid activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
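The page does not give the formula for hard_sigmoid. A common definition (used, for example, by `jax.nn.hard_sigmoid`) is `relu6(x + 3) / 6`, a piecewise-linear approximation of the sigmoid. A minimal NumPy sketch under that assumption:

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise-linear sigmoid approximation: relu6(x + 3) / 6,
    # which simplifies to clip(x / 6 + 0.5, 0, 1).
    return np.clip(x / 6.0 + 0.5, 0.0, 1.0)
```

Note that some libraries (e.g. older Keras versions) use a steeper slope of 0.2 instead of 1/6, so check the module source for the exact constant.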
### hard_swish(x)

Compute hard_swish activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
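The formula is not stated here. hard_swish is commonly defined as `x * hard_sigmoid(x)` with the hard_sigmoid slope of 1/6 (as in the MobileNetV3 paper); a minimal NumPy sketch assuming that definition:

```python
import numpy as np

def hard_swish(x):
    # x * hard_sigmoid(x), with hard_sigmoid(x) = clip(x / 6 + 0.5, 0, 1)
    return x * np.clip(x / 6.0 + 0.5, 0.0, 1.0)
```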
### hard_tanh(x)

Compute hard_tanh activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
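hard_tanh is conventionally the identity clamped to [-1, 1]; a minimal NumPy sketch assuming that standard definition:

```python
import numpy as np

def hard_tanh(x):
    # Linear on [-1, 1], saturating at -1 and 1 outside that interval
    return np.clip(x, -1.0, 1.0)
```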
### leaky_relu(x)

Compute leaky_relu activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
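The documented signature takes only `x`, so the negative slope is presumably fixed internally; 0.01 is the conventional default. A minimal NumPy sketch assuming that slope:

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Identity for x >= 0; a small linear slope for x < 0.
    # negative_slope=0.01 is an assumed default, not stated on this page.
    return np.where(x >= 0, x, negative_slope * x)
```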
### log_sigmoid(x)

Compute log_sigmoid activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
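log_sigmoid is `log(sigmoid(x))`, usually computed as `-softplus(-x)` to avoid overflow for large negative inputs; a minimal NumPy sketch of that stable form:

```python
import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + exp(-x)) = -softplus(-x),
    # computed with logaddexp for numerical stability.
    return -np.logaddexp(0.0, -x)
```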
### mish(x)

### relu(x)

### relu6(x)

### selu(x)

### sigmoid(x)

### silu(x)
### soft_sign(x)

Compute soft_sign activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
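soft_sign is conventionally `x / (1 + |x|)`, a tanh-like squashing function that saturates polynomially rather than exponentially; a minimal NumPy sketch of that standard definition:

```python
import numpy as np

def soft_sign(x):
    # Smoothly maps the real line into (-1, 1)
    return x / (1.0 + np.abs(x))
```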
### softplus(x)
### sparse_plus(x)

Compute sparse_plus activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
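No formula is given here. In JAX, `jax.nn.sparse_plus` is defined piecewise: 0 for `x <= -1`, `(x + 1)**2 / 4` on `(-1, 1)`, and `x` for `x >= 1`. A minimal NumPy sketch assuming this module follows that definition:

```python
import numpy as np

def sparse_plus(x):
    # Piecewise smooth ReLU variant that is exactly zero for x <= -1
    # and exactly the identity for x >= 1.
    return np.where(x <= -1.0, 0.0,
                    np.where(x >= 1.0, x, (x + 1.0) ** 2 / 4.0))
```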
### sparse_sigmoid(x)

Compute sparse_sigmoid activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
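Again no formula is given. In JAX, `jax.nn.sparse_sigmoid` is piecewise linear: 0 for `x <= -1`, `(x + 1) / 2` on `(-1, 1)`, and 1 for `x >= 1`, so it reaches exact 0 and 1 at finite inputs. A minimal NumPy sketch assuming that definition:

```python
import numpy as np

def sparse_sigmoid(x):
    # Sigmoid-like map that saturates exactly at 0 and 1 for |x| >= 1
    return np.where(x <= -1.0, 0.0,
                    np.where(x >= 1.0, 1.0, (x + 1.0) / 2.0))
```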
### squareplus(x)

Compute squareplus activation.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x | Array | input data | required |
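squareplus is a smooth ReLU approximation usually written as `(x + sqrt(x**2 + b)) / 2`; `b = 4` is the common default (at which `squareplus(0) = 1`, matching softplus's `log 2` only approximately). A minimal NumPy sketch assuming that formula and default:

```python
import numpy as np

def squareplus(x, b=4.0):
    # Algebraic (square-root based) softplus alternative; b = 4.0 is an
    # assumed default, not stated on this page.
    return (x + np.sqrt(x * x + b)) / 2.0
```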