From Brain to Text
NEST uses a transformer-based encoder-decoder architecture to decode EEG signals into natural language. This page gives a complete overview of the pipeline, the model, and the research context.
Why We Built NEST
Brain-computer interfaces hold transformative potential for people with motor or speech disabilities, for scientific research into cognition, and for the future of human-computer interaction.
Most existing approaches require invasive surgery, proprietary hardware, or closed-source implementations. NEST is our answer: a fully open, non-invasive, reproducible framework that researchers and engineers can build on.
By publishing weights, datasets, and complete training code under an MIT license, we aim to accelerate the entire field of EEG-based BCI research.
End-to-End Architecture
NEST transforms raw EEG recordings into natural language through a four-stage pipeline designed for accuracy and speed.
EEG Input
105-channel EEG sampled at 500 Hz, captured during natural reading with a standard research-grade headset.
Preprocessing
Bandpass filtering (0.5–100 Hz), artifact removal via ICA, and epoch extraction aligned to word onsets.
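The band-pass stage can be sketched with SciPy; this is a minimal illustration of the 0.5–100 Hz filter on 105-channel data, not NEST's actual preprocessing code (the ICA and epoching steps, and the function name `bandpass_eeg`, are assumptions for this sketch).

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass_eeg(eeg, fs=500.0, low=0.5, high=100.0, order=4):
    """Zero-phase band-pass filter applied per channel.

    eeg: array of shape (channels, samples), e.g. (105, n) at 500 Hz.
    """
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

# Example: one second of 105-channel EEG at 500 Hz
raw = np.random.randn(105, 500)
filtered = bandpass_eeg(raw)
assert filtered.shape == raw.shape
```

`sosfiltfilt` filters forward and backward, so the output stays time-aligned with the word onsets used for epoching.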
Transformer
6-layer EEG encoder with 8-head attention. Cross-attention decoder generates word-level text tokens.
Text Output
Natural language text decoded from brain signals, ready for NLP downstream tasks.
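Conceptually, the four stages above compose into a single function from raw EEG to text. The stage names below are placeholders, not NEST's actual API; the shapes follow the figures on this page.

```
pipeline(raw_eeg):                      # (105 channels, T samples) at 500 Hz
    x = preprocess(raw_eeg)             # band-pass + ICA + word-aligned epochs
    z = encode(x)                       # 6-layer encoder -> 512-dim features
    tokens = decode(z)                  # cross-attention decoder, beam search
    return detokenize(tokens)           # natural-language text
```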
Architecture Details
The NEST model is built on a standard transformer architecture adapted for the unique structure of EEG time-series data.
EEG Encoder
6 Transformer Layers
512 hidden dimensions with 8-head self-attention in each layer. Processes temporal and spatial EEG patterns into a rich latent representation.
Cross-Attention Bridge
8 Attention Heads
Learns alignment between EEG feature sequences and text token positions. Enables the decoder to selectively attend to relevant brain patterns.
Text Decoder
6 Transformer Layers
50,000-token vocabulary. Autoregressive generation with beam-search decoding and length normalization.
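The architecture above maps directly onto PyTorch's built-in transformer. The sketch below uses the hyperparameters published on this page (6+6 layers, 512 dims, 8 heads, 50k vocabulary); the input projection, positional handling, and class name `NESTSketch` are illustrative assumptions, not the released implementation.

```python
import torch
import torch.nn as nn

class NESTSketch(nn.Module):
    """Minimal EEG-to-text encoder-decoder with NEST's published sizes."""

    def __init__(self, n_channels=105, d_model=512, n_heads=8,
                 n_layers=6, vocab_size=50_000):
        super().__init__()
        # Project each EEG timestep (105 channels) into the model dimension.
        self.input_proj = nn.Linear(n_channels, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=n_layers, num_decoder_layers=n_layers,
            batch_first=True,
        )
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, eeg, tokens):
        # eeg: (batch, time, channels); tokens: (batch, seq)
        src = self.input_proj(eeg)
        tgt = self.token_emb(tokens)
        # Causal mask so each position only attends to earlier tokens.
        causal = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.transformer(src, tgt, tgt_mask=causal)
        return self.lm_head(h)  # (batch, seq, vocab) logits

model = NESTSketch()
logits = model(torch.randn(2, 50, 105), torch.randint(0, 50_000, (2, 7)))
assert logits.shape == (2, 7, 50_000)
```

At inference time the decoder would be run autoregressively with beam search over these logits; a production model would also add positional encodings, which are omitted here for brevity.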
Training Details
Trained on the ZuCo dataset: 12 subjects, 400+ naturally read sentences, and 100 epochs of supervised learning.
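Supervised training here means a standard teacher-forced language-modeling objective. The sketch below shows one such step; the ZuCo loading is omitted (data is random), and `TinyDecoder` is a deliberately small stand-in for the transformer described above, not NEST's model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyDecoder(nn.Module):
    """Stand-in: any module mapping (eeg, tokens) -> (batch, seq, vocab) logits."""

    def __init__(self, n_channels=105, vocab=1000):
        super().__init__()
        self.proj = nn.Linear(n_channels, vocab)

    def forward(self, eeg, tokens):
        # Pool EEG over time and broadcast over target positions (illustration only).
        pooled = self.proj(eeg.mean(dim=1))               # (batch, vocab)
        return pooled[:, None, :].expand(-1, tokens.size(1), -1)

model = TinyDecoder()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

eeg = torch.randn(4, 50, 105)                             # (batch, time, channels)
tokens = torch.randint(0, 1000, (4, 9))                   # word-level token ids

# One teacher-forced step: predict token t from the tokens before t.
inp, target = tokens[:, :-1], tokens[:, 1:]
logits = model(eeg, inp)                                  # (4, 8, 1000)
loss = F.cross_entropy(logits.reshape(-1, 1000), target.reshape(-1))
loss.backward()
opt.step()
```

The real training loop repeats this over ZuCo batches for 100 epochs, typically with a validation split held out per subject.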
Hardware & Software
NEST is designed to run on standard research hardware. No exotic setups required.
Minimum Requirements
- ✓ Python 3.9+
- ✓ PyTorch 2.0+
- ✓ 8GB RAM
- ✓ CPU-only inference (slow)
- ✓ 10GB disk space
Recommended for Training
- ★ NVIDIA GPU (RTX 3080+)
- ★ CUDA 11.8+
- ★ 16GB+ GPU VRAM
- ★ 32GB System RAM
- ★ 50GB disk space
Start Your BCI Research
Dive into the demo, read the research paper, or clone the repo and start training your own models.