Module: others.py

This module implements other helpful functions for neural networks.

Authors
- Lokesh Mohanty (lokeshm@iisc.ac.in)

Version Info
- 01/01/2025: Initial version
- 06/01/2025: Rename and add other functions from jax.nn
log_softmax(x)

Compute the log_softmax activation.

Parameters:

| Name | Type        | Description | Default    |
|------|-------------|-------------|------------|
| `x`  | `jax.Array` | Input data  | *required* |

Source code in scirex/core/dl/nn/others.py

```python
def log_softmax(x: jax.Array) -> jax.Array:
    """
    Compute the log_softmax activation.

    Args:
        x: input data
    """
    return jax.nn.log_softmax(x)
```
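As a quick sanity check (not part of the library source), `log_softmax` returns log-probabilities: exponentiating the output recovers a distribution that sums to 1. The sketch below calls `jax.nn.log_softmax` directly, which the wrapper above forwards to.

```python
import jax
import jax.numpy as jnp

x = jnp.array([0.0, 0.0])       # two equal logits
out = jax.nn.log_softmax(x)     # each entry is log(1/2) ~= -0.6931
probs = jnp.exp(out)            # exponentiating recovers the softmax
```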
one_hot(x, num_classes)

Compute the one-hot encoding.

Parameters:

| Name          | Type        | Description                 | Default    |
|---------------|-------------|-----------------------------|------------|
| `x`           | `jax.Array` | Input data (class indices)  | *required* |
| `num_classes` | `int`       | Number of classes to encode | *required* |

Source code in scirex/core/dl/nn/others.py

```python
def one_hot(x: jax.Array, num_classes: int) -> jax.Array:
    """
    Compute the one-hot encoding.

    Args:
        x: input data (class indices)
        num_classes: number of classes to encode
    """
    # jax.nn.one_hot requires the number of classes as its second argument
    return jax.nn.one_hot(x, num_classes)
```
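One-hot encoding turns integer class indices into indicator vectors; note that `jax.nn.one_hot` takes the number of classes as a required second argument. A minimal illustration:

```python
import jax
import jax.numpy as jnp

labels = jnp.array([0, 2, 1])                    # three class indices
encoded = jax.nn.one_hot(labels, num_classes=3)  # shape (3, 3)
# each row has a single 1.0 at the position given by the label
```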
softmax(x)

Compute the softmax activation.

Parameters:

| Name | Type        | Description | Default    |
|------|-------------|-------------|------------|
| `x`  | `jax.Array` | Input data  | *required* |

Source code in scirex/core/dl/nn/others.py

```python
def softmax(x: jax.Array) -> jax.Array:
    """
    Compute the softmax activation.

    Args:
        x: input data
    """
    return jax.nn.softmax(x)
```
standardize(x)

Standardize the input to zero mean and unit variance along the last axis.

Parameters:

| Name | Type        | Description | Default    |
|------|-------------|-------------|------------|
| `x`  | `jax.Array` | Input data  | *required* |

Source code in scirex/core/dl/nn/others.py

```python
def standardize(x: jax.Array) -> jax.Array:
    """
    Standardize the input to zero mean and unit variance.

    Args:
        x: input data
    """
    return jax.nn.standardize(x)
```