tensorflow_wrapper
TensorFlow layer factory implementations.
This module provides factory methods for creating TensorFlow/Keras layers with consistent configuration options. It currently supports Dense and Conv2D layers.
Key classes
- TensorflowDense: for Dense layers
- TensorflowConv2D: for Conv2D layers
Example
```python
dense_layer = TensorflowDense.create_layer(units=64, activation='relu')
conv_layer = TensorflowConv2D.create_layer(filters=32, kernel_size=3)
```
TensorflowConv2D
Factory class for creating Keras Conv2D layers.
Provides a static method for creating and configuring Keras Conv2D layers with consistent parameters.
Example
```python
# Create a conv layer with 64 filters and a 3x3 kernel
conv1 = TensorflowConv2D.create_layer(
    filters=64,
    kernel_size=3,
    activation='relu'
)

# Create a conv layer with 32 filters and a 5x5 kernel
conv2 = TensorflowConv2D.create_layer(
    filters=32,
    kernel_size=(5, 5),
    activation='relu'
)
```
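The created layer behaves like any other Keras layer and can be called on a tensor directly. A minimal usage sketch, assuming scirex is installed under the import path quoted on this page and the Keras default 'valid' padding:

```python
import tensorflow as tf

# Assumed import path, matching the source file quoted on this page.
from scirex.core.dl.tensorflow_wrapper import TensorflowConv2D

conv = TensorflowConv2D.create_layer(filters=32, kernel_size=3, activation='relu')

# Apply the layer to a dummy batch of 28x28 RGB images.
x = tf.random.normal((8, 28, 28, 3))
y = conv(x)
print(y.shape)  # (8, 26, 26, 32): 28 - 3 + 1 = 26 with 'valid' padding
```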
create_layer(filters, kernel_size, activation=None, kernel_initializer='glorot_uniform', bias_initializer='zeros', dtype=None)
staticmethod
Create and return a Keras Conv2D layer.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
filters | int | Number of output filters | required |
kernel_size | Union[int, Tuple[int, int]] | Size of convolution kernel | required |
activation | Optional[Union[str, callable]] | Activation function to use | None |
kernel_initializer | str | Initializer for kernel weights | 'glorot_uniform' |
bias_initializer | str | Initializer for bias vector | 'zeros' |
dtype | Optional[Union[str, DType]] | Data type for layer computations | None |
Returns:
Type | Description |
---|---|
Conv2D | tf.keras.layers.Conv2D: The configured Conv2D layer |
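The parameter table suggests the factory simply forwards its arguments to the Keras constructor. A minimal sketch of what such a factory could look like — a hypothetical reconstruction, not the verbatim source from scirex/core/dl/tensorflow_wrapper.py:

```python
import tensorflow as tf
from typing import Callable, Optional, Tuple, Union


class TensorflowConv2D:
    """Hypothetical reconstruction of the factory; the real class may differ."""

    @staticmethod
    def create_layer(
        filters: int,
        kernel_size: Union[int, Tuple[int, int]],
        activation: Optional[Union[str, Callable]] = None,
        kernel_initializer: str = "glorot_uniform",
        bias_initializer: str = "zeros",
        dtype: Optional[Union[str, tf.DType]] = None,
    ) -> tf.keras.layers.Conv2D:
        # Forward the documented arguments to the Keras constructor unchanged.
        return tf.keras.layers.Conv2D(
            filters=filters,
            kernel_size=kernel_size,
            activation=activation,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            dtype=dtype,
        )
```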
TensorflowDense
Factory class for creating Keras Dense layers.
Provides a static method for creating and configuring Keras Dense layers with consistent parameters.
Example
```python
# Create a dense layer with 32 units and ReLU activation
dense1 = TensorflowDense.create_layer(
    units=32,
    activation='relu'
)

# Create an output layer with 1 unit and no activation
dense2 = TensorflowDense.create_layer(
    units=1,
    activation=None
)
```
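Because the factory returns ordinary Keras layers, they compose directly with the rest of the Keras API. A small sketch, again assuming the import path quoted on this page:

```python
import tensorflow as tf

# Assumed import path, matching the source file quoted on this page.
from scirex.core.dl.tensorflow_wrapper import TensorflowDense

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    TensorflowDense.create_layer(units=32, activation='relu'),
    TensorflowDense.create_layer(units=1, activation=None),  # linear output
])
model.summary()
```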
create_layer(units, activation=None, kernel_initializer='glorot_uniform', bias_initializer='zeros', dtype=None)
staticmethod
Create and return a Keras Dense layer.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
units | int | Number of output units | required |
activation | Optional[Union[str, callable]] | Activation function to use | None |
kernel_initializer | str | Initializer for kernel weights | 'glorot_uniform' |
bias_initializer | str | Initializer for bias vector | 'zeros' |
dtype | Optional[Union[str, DType]] | Data type for layer computations | None |
Returns:
Type | Description |
---|---|
Dense | tf.keras.layers.Dense: The configured Dense layer |
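Since dtype is passed through to the layer, it can be used to run computations at higher precision, e.g. float64 for numerically sensitive scientific workloads. A brief sketch, assuming the import path quoted above:

```python
import tensorflow as tf

# Assumed import path, matching the source file quoted on this page.
from scirex.core.dl.tensorflow_wrapper import TensorflowDense

layer = TensorflowDense.create_layer(units=4, dtype='float64')
x = tf.ones((2, 3), dtype=tf.float64)
print(layer(x).dtype)  # float64
```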