State-space models (Mamba) enable efficient EEG foundation models that work across varying electrode setups, which is crucial for real-world clinical deployment where recording equipment differs across hospitals.
LuMamba is an efficient EEG foundation model that handles different electrode configurations by combining topology-invariant encodings with linear-complexity state-space modeling. Pre-trained on 21,000+ hours of unlabeled EEG data, it achieves strong performance on clinical tasks while using 377× fewer computations than transformer-based alternatives.
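The two ingredients named above can be sketched in a few lines: an electrode encoding derived from scalp coordinates (so any montage maps into a shared feature space, rather than relying on a fixed channel order) and a linear-time state-space scan. This is a minimal illustrative sketch, not LuMamba's actual architecture; the function names, the sinusoidal coordinate encoding, and the diagonal recurrence are assumptions for the example.

```python
import numpy as np

def position_encoding(coords, dim=16):
    # Hypothetical topology-invariant encoding: map each electrode's 3D
    # scalp coordinates to a shared embedding, so different montages
    # (channel counts and layouts) produce comparable features.
    freqs = 2.0 ** np.arange(dim // 6)        # a few frequency bands
    enc = []
    for axis in range(3):                      # x, y, z coordinates
        angles = np.outer(coords[:, axis], freqs)
        enc.append(np.sin(angles))
        enc.append(np.cos(angles))
    return np.concatenate(enc, axis=1)         # (n_channels, features)

def ssm_scan(x, A, B, C):
    # Diagonal linear state-space recurrence, O(T) in sequence length
    # (vs. O(T^2) for transformer self-attention):
    #   h_t = A * h_{t-1} + B * x_t,   y_t = C . h_t
    h = np.zeros_like(A)
    y = np.empty(len(x))
    for t, x_t in enumerate(x):
        h = A * h + B * x_t
        y[t] = C @ h
    return y
```

The linear-in-time scan is what makes long EEG recordings tractable; the coordinate-based encoding is what lets one pre-trained model ingest recordings from hospitals with different electrode setups.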