attention

Module: attention.py

This module implements attention layers for neural networks.

Authors
  • Lokesh Mohanty (lokeshm@iisc.ac.in)
Version Info
  • 06/01/2025: Initial version

MultiheadAttention

Bases: eqx.nn.MultiheadAttention

Implements a Multihead Attention layer

Source code in scirex/core/dl/nn/layers/attention.py
class MultiheadAttention(eqx.nn.MultiheadAttention):
    """
    Implements a Multihead Attention layer
    """

RotaryPositionalEmbedding

Bases: eqx.nn.RotaryPositionalEmbedding

Implements a Rotary Positional Embedding layer

Source code in scirex/core/dl/nn/layers/attention.py
class RotaryPositionalEmbedding(eqx.nn.RotaryPositionalEmbedding):
    """
    Implements a Rotary Positional Embedding layer
    """