Transformer Attention Heatmaps Viewer

These visualizations show the transformer's attention weights across input time steps, for a selection of signal parameters and attention heads.

Legend: ❌ = cross-attention, 🔄 = self-attention.

Grid layout: rows (↓) list the signal-parameter labels; columns (→) list the attention heads: head0 head1 head2 head3 head4.