The standard attention mechanism in transformers becomes increasingly expensive as sequence length grows, because it compares every token to every other token: compute and memory scale quadratically with the number of tokens.
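To make the quadratic cost concrete, here is a minimal NumPy sketch of standard scaled dot-product attention. The function name and the toy shapes are illustrative, not from the original text; the key point is the `scores` matrix, which holds one entry per pair of tokens and therefore grows as the square of the sequence length.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard attention: every query attends to every key.

    Q, K, V have shape (seq_len, d). The intermediate scores
    matrix has shape (seq_len, seq_len), so time and memory
    grow quadratically with sequence length.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (n, n): all token pairs compared
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of value vectors

# Toy usage: doubling seq_len quadruples the size of `scores`.
n, d = 8, 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Doubling the sequence length here doubles the rows and columns of `scores` at once, which is exactly why long inputs become expensive under standard attention.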