grelu.model.position

Functions to generate positional encodings.

Functions

get_central_mask(→ torch.Tensor)

Create a positional embedding based on a central mask.

get_exponential_embedding(→ torch.Tensor)

Create a positional embedding based on exponential decay.

Module Contents

grelu.model.position.get_central_mask(x: torch.Tensor, out_channels: int) → torch.Tensor[source]

Create a positional embedding based on a central mask.

Parameters:
  • x – Input tensor of shape (N, L, C)

  • out_channels – Number of channels in the output

Returns:

Positional embedding tensor of shape (L, out_channels)
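The library's implementation is not shown here, but a central-mask positional embedding of this kind can be sketched as follows. This is an illustrative re-implementation under stated assumptions: each output channel is a binary mask that is 1 within an exponentially growing distance of the sequence center (the function name `central_mask_embedding`, the power-of-two widths, and the centering convention are assumptions, not the library's confirmed behavior).

```python
import torch

def central_mask_embedding(x: torch.Tensor, out_channels: int) -> torch.Tensor:
    """Illustrative sketch of a central-mask positional embedding.

    x: input tensor of shape (N, L, C); only L is used.
    Returns a tensor of shape (L, out_channels).
    """
    seq_len = x.shape[1]
    # Distance of each position from the center of the sequence
    center = (seq_len - 1) / 2
    distance = torch.abs(torch.arange(seq_len) - center)  # (L,)
    # Assumed exponentially increasing mask widths: 1, 2, 4, ...
    widths = 2 ** torch.arange(out_channels)  # (out_channels,)
    # Channel j is 1 where the position lies within widths[j] of the center
    return (distance[:, None] <= widths[None, :]).float()  # (L, out_channels)
```

Wider channels therefore mark progressively larger windows around the center, so each position's embedding encodes roughly how far it sits from the middle of the sequence.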

grelu.model.position.get_exponential_embedding(x: torch.Tensor, out_channels: int, min_half_life: float = 3.0) → torch.Tensor[source]

Create a positional embedding based on exponential decay.

Parameters:
  • x – Input tensor of shape (N, L, C)

  • out_channels – Number of channels in the output

  • min_half_life – Minimum half-life for exponential decay

Returns:

Positional embedding tensor of shape (L, out_channels)
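An exponential-decay embedding of this shape can be sketched as below. This is a hedged illustration, not the library's code: it assumes each channel decays from the sequence center with its own half-life, with half-lives spaced geometrically from `min_half_life` up to the sequence length (the function name `exponential_embedding` and the half-life spacing are assumptions).

```python
import math

import torch

def exponential_embedding(
    x: torch.Tensor, out_channels: int, min_half_life: float = 3.0
) -> torch.Tensor:
    """Illustrative sketch of an exponential-decay positional embedding.

    x: input tensor of shape (N, L, C); only L is used.
    Returns a tensor of shape (L, out_channels).
    """
    seq_len = x.shape[1]
    # Distance of each position from the center of the sequence
    center = (seq_len - 1) / 2
    distance = torch.abs(torch.arange(seq_len) - center)  # (L,)
    # Assumed geometric spacing of half-lives from min_half_life to seq_len
    half_life = 2 ** torch.linspace(
        math.log2(min_half_life), math.log2(seq_len), out_channels
    )  # (out_channels,)
    # Channel j decays by half every half_life[j] positions from the center
    return torch.exp(
        -math.log(2.0) / half_life[None, :] * distance[:, None]
    )  # (L, out_channels)
```

Small half-lives give sharply localized channels while large half-lives vary slowly across the sequence, so together the channels cover multiple distance scales.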