Module pysimt.layers

Different layer types that may be used in seq-to-seq models.

"""Different layer types that may be used in seq-to-seq models."""

# Basic layers
from .ff import FF
from .pool import Pool
from .fusion import Fusion
from .selector import Selector
from .positionwise_ff import PositionwiseFF

from .embedding import TFEmbedding, ProjectedEmbedding

# Attention layers
from .attention import DotAttention
from .attention import MLPAttention
from .attention import UniformAttention
from .attention import ScaledDotAttention
from .attention import MultiheadAttention
from .attention import HierarchicalAttention

# Encoder layers
from .encoders import RecurrentEncoder
from .encoders import TFEncoder
from .encoders import VisualFeaturesEncoder

# Decoder layers
from .decoders import ConditionalGRUDecoder
from .decoders import TFDecoder
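Several of the attention variants exported above (`DotAttention`, `ScaledDotAttention`, `MultiheadAttention`) build on the same underlying mechanism: score each key against the query, normalize the scores with a softmax, and return a weighted sum of the values. The sketch below illustrates that mechanism in plain Python over lists; it is a conceptual illustration only, not pysimt's actual tensor-based implementation, and the function name `scaled_dot_attention` is ours, not the library's.

```python
import math

def scaled_dot_attention(query, keys, values):
    """Conceptual sketch of scaled dot-product attention over plain lists.

    Illustrative only; pysimt's ScaledDotAttention operates on batched
    tensors, not Python lists.
    """
    d_k = len(query)
    # Dot-product similarity between the query and each key,
    # scaled by sqrt(d_k) to keep the softmax well-conditioned
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d_k)
              for key in keys]
    # Numerically stable softmax over the scores
    max_s = max(scores)
    exps = [math.exp(s - max_s) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of the value vectors
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return weights, context
```

The attention weights always sum to one, and a key identical to the query receives the highest weight.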

Sub-modules

pysimt.layers.attention

Attention variants.

pysimt.layers.decoders

GRU- and Transformer-based sequential decoders.

pysimt.layers.embedding

Embedding layer variants.

pysimt.layers.encoders

RNN- and Transformer-based text, image, and speech encoders.

pysimt.layers.ff

A convenience feed-forward layer with non-linearity support.

pysimt.layers.fusion

A convenience layer that merges an arbitrary number of inputs.

pysimt.layers.pool

A convenience layer to apply pooling to a sequential tensor.

pysimt.layers.positionwise_ff

Positionwise feed-forward layer.

pysimt.layers.selector

A utility layer that returns a particular element from the previous layer.

pysimt.layers.transformers

Transformer-related sub-layer implementations.
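To make the "merges an arbitrary number of inputs" idea behind `pysimt.layers.fusion` concrete, here is a minimal plain-Python sketch of fusing equally-sized inputs by summation or concatenation. This is an illustration of the concept, not pysimt's implementation; the function name `fuse` and the method names `"sum"` and `"concat"` are our assumptions.

```python
def fuse(inputs, method="sum"):
    """Merge an arbitrary number of equally-sized input vectors.

    Conceptual sketch of a Fusion-style layer; illustrative only.
    """
    if method == "sum":
        # Element-wise sum across all inputs
        return [sum(vals) for vals in zip(*inputs)]
    if method == "concat":
        # Concatenate the inputs along the feature dimension
        return [x for vec in inputs for x in vec]
    raise ValueError(f"unknown fusion method: {method!r}")
```

With summation the output keeps the input dimensionality, while concatenation multiplies it by the number of inputs, which is the usual trade-off when choosing a fusion operation.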