Transform Brain Signals
Into Natural Language
NEST is an open-source deep learning framework that decodes EEG brain activity into readable text with state-of-the-art accuracy on the ZuCo benchmark.
The Future of Brain-Computer Interfaces
NEST bridges the gap between neuroscience and natural language processing, enabling a new generation of assistive technology and cognitive research tools.
Works with standard 105-channel EEG headsets — no surgical implants, no specialized hardware beyond a research-grade EEG cap.
Optimized transformer architecture processes raw EEG signals and generates text output in under 100ms per word on modern GPUs.
Benchmarked on the ZuCo dataset with 12 subjects and 400+ sentences. Reproducible results with released checkpoints and training code.
Pre-trained on ZuCo, fine-tunable on custom datasets. Works with reading, listening, and imagination paradigms.
Attention maps reveal which EEG channels and time windows contribute most to each decoded word — enabling neuroscientific insights.
MIT licensed with full training pipeline, pre-trained weights, dataset loaders, and evaluation scripts. Built for the research community.
From Brain Activity to Text
NEST uses a transformer-based encoder-decoder architecture to translate raw EEG waveforms into natural language sequences.
EEG Recording
Capture 105-channel brain signals at 500 Hz while the subject reads or imagines text
Preprocessing
Bandpass filtering, epoch extraction, and normalization across channels and subjects
NEST Model
6-layer transformer encoder processes EEG features, cross-attention decoder generates text tokens
Text Output
Decoded thoughts appear as natural language text — ready for downstream NLP tasks
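The preprocessing step above (bandpass filtering, epoch extraction, and per-channel normalization) can be sketched in a few lines of NumPy and SciPy. The channel count and sampling rate follow the figures quoted on this page; the filter band, filter order, and epoch length are illustrative assumptions, not NEST's actual defaults:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_eeg(raw, fs=500, band=(0.5, 40.0), epoch_s=2.0):
    """Bandpass-filter a (channels, samples) EEG array, slice it into
    fixed-length epochs, and z-score each channel within each epoch."""
    # 4th-order Butterworth bandpass, applied forward-backward (zero phase)
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw, axis=-1)

    # Slice into non-overlapping epochs of epoch_s seconds
    win = int(epoch_s * fs)
    n_epochs = filtered.shape[-1] // win
    epochs = filtered[:, : n_epochs * win].reshape(filtered.shape[0], n_epochs, win)
    epochs = epochs.transpose(1, 0, 2)  # (epochs, channels, samples)

    # Z-score each channel within each epoch
    mean = epochs.mean(axis=-1, keepdims=True)
    std = epochs.std(axis=-1, keepdims=True) + 1e-8
    return (epochs - mean) / std

# 10 seconds of synthetic 105-channel EEG at 500 Hz
raw = np.random.randn(105, 5000)
epochs = preprocess_eeg(raw)
print(epochs.shape)  # (5, 105, 1000)
```

In practice the epoching would be aligned to word-level fixation windows (as in ZuCo's eye-tracking co-registration) rather than fixed-length slices, but the filter-epoch-normalize order is the same.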
Transformer-Based Pipeline
End-to-end architecture combining EEG signal processing with large language model decoding.
Try It Yourself
Experience NEST's real-time EEG-to-text decoding with our interactive visualization.
Click "Start Demo" to begin decoding...
Powerful Features
Everything you need for brain signal decoding research and production applications.
Real-Time Decoding
Process EEG signals and generate text output in milliseconds with our optimized inference pipeline. Sub-100ms latency on NVIDIA GPUs.
Competitive Accuracy
Subject-independent evaluation on ZuCo — train on 8 subjects, test on 2 held-out. Honest WER/CER metrics reported from real trained checkpoints.
Simple Python API
Install with pip, load a checkpoint, and start decoding in under 10 lines of code. Comprehensive documentation and examples included.
Pre-Trained Checkpoints
Download ready-to-use model weights trained on ZuCo. Multiple model sizes from lightweight to high-accuracy variants.
Research Ready
Full training pipeline for custom datasets. Transfer learning support, attention visualization, and electrode importance maps.
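The electrode importance maps mentioned above can be derived from the decoder's cross-attention weights. A minimal sketch, assuming the model exposes per-word attention of shape (words, channels, timesteps) — the array layout and function name here are hypothetical, not NEST's actual API:

```python
import numpy as np

def electrode_importance(attn, channel_names=None, top_k=5):
    """Rank EEG channels by the attention mass the decoder assigned to them.

    attn: array of shape (n_words, n_channels, n_timesteps) holding the
    (assumed) cross-attention weights used while generating each word.
    """
    # Average attention over decoded words and time windows
    scores = attn.mean(axis=(0, 2))           # (n_channels,)
    scores = scores / scores.sum()            # normalize to a distribution
    order = np.argsort(scores)[::-1][:top_k]  # indices of top-k channels
    if channel_names is not None:
        return [(channel_names[i], float(scores[i])) for i in order]
    return [(int(i), float(scores[i])) for i in order]

# Toy example: 10 words, 105 channels, 250 time bins; boost channel 42
rng = np.random.default_rng(0)
attn = rng.random((10, 105, 250))
attn[:, 42, :] *= 3.0
top = electrode_importance(attn)
print(top[0][0])  # channel 42 ranks first
```

The same averaged scores can be plotted on a scalp topography (e.g. with MNE-Python) to visualize which electrode sites drive the decoding.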
MIT Open Source
Fully open-source codebase with permissive licensing. Active development, community contributions welcome.
Our results demonstrate that transformer architectures can effectively bridge the modality gap between EEG signals and natural language, opening new avenues for non-invasive brain-computer interfaces.
From the NEST Technical Report, ZuCo Benchmark Results (2026)
Start Decoding Brain Signals Today
Install NEST with pip and decode your first EEG recording in minutes.
# Install NEST
pip install nest-eeg

# Or install from source
git clone https://github.com/wazder/NEST.git
cd NEST && pip install -e .

# Run inference
from nest import NESTDecoder
decoder = NESTDecoder.from_pretrained("nest-zuco-v1")
text = decoder.decode(eeg_epochs)