Orpheus Music Transformer

SOTA 8k multi-instrumental music transformer trained on 2.31M+ high-quality MIDIs

Check out Godzilla MIDI Dataset on Hugging Face

Duplicate this Space on Hugging Face for faster execution and endless generation!

Key Features

  • Efficient Architecture with RoPE: Compact and very fast 479M-parameter full-attention autoregressive transformer with rotary position embeddings (RoPE).
  • Extended Sequence Length: 8k tokens that comfortably fit most music compositions and facilitate long-term music structure generation.
  • Premium Training Data: Trained solely on the highest-quality MIDIs from the Godzilla MIDI Dataset.
  • Optimized MIDI Encoding: Extremely efficient MIDI representation using only 3 tokens per note and 7 tokens per tri-chord (see the sketch after this list).
  • Distinct Encoding Order: A unique duration/velocity-last MIDI encoding order for refined musical expression.
  • Full-Range Instrumental Learning: True full-range MIDI instrument encoding that lets the model learn each instrument separately.
  • Natural Composition Endings: Outro tokens that help generate smooth and natural musical conclusions.
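
To make the token counts above concrete, here is a minimal sketch of a 3-tokens-per-note, duration/velocity-last encoding. The vocabulary offsets, bin sizes, and value ranges are assumptions chosen purely for illustration; only the per-note/per-chord token counts and the duration/velocity-last ordering come from the feature list.

```python
# Minimal sketch of a 3-tokens-per-note, duration/velocity-last MIDI encoding.
# All offsets, bin sizes and ranges below are illustrative assumptions; only the
# token counts (3 per note, 7 per tri-chord) reflect the feature list above.

def encode_note(delta_time, instrument, pitch, duration, velocity):
    """Return 3 tokens for one note: [time-shift, instrument+pitch, duration+velocity]."""
    time_tok = min(delta_time, 255)                # assumed 256 time-shift bins
    pitch_tok = 256 + instrument * 128 + pitch     # full-range: each instrument keeps its own 128 pitches
    durvel_tok = 256 + 129 * 128 + min(duration, 255) * 8 + velocity // 16  # duration/velocity placed last
    return [time_tok, pitch_tok, durvel_tok]

def encode_chord(notes):
    """Simultaneous notes share one time-shift token: a tri-chord costs 1 + 3 * 2 = 7 tokens."""
    tokens = [min(notes[0][0], 255)]               # single shared time-shift token
    for _, instrument, pitch, duration, velocity in notes:
        tokens += encode_note(0, instrument, pitch, duration, velocity)[1:]  # drop per-note time token
    return tokens

# One note -> 3 tokens; a tri-chord (three simultaneous notes) -> 7 tokens.
assert len(encode_note(4, 0, 62, 8, 90)) == 3
assert len(encode_chord([(0, 0, 60, 8, 90), (0, 0, 64, 8, 90), (0, 0, 67, 8, 90)])) == 7
```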

If you enjoyed Orpheus Music Transformer, please star and duplicate this Space. It helps a lot! 🤗

⭐ Star this Space

🔁 Duplicate this Space

⭐ Star models repo

Upload a seed MIDI or click 'Generate' for random output

PLEASE NOTE:

  • Orpheus Music Transformer is primarily a music continuation/co-composition model!
  • The model works best when given some musical context to work with, as sketched below.
  • Random generation from the SOS token/embeddings alone may not always produce good results.
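
A rough sketch of that continuation workflow: tokenize a seed MIDI into the model's vocabulary, prepend it as a prime, and sample the continuation autoregressively. The interface, the default temperature, and the `midi_to_tokens` helper are assumptions for illustration, not the Space's actual code.

```python
import torch

def continue_sequence(model, prime_tokens, max_new_tokens=512, temperature=0.9, sos_id=0):
    """Autoregressively extend a primed token sequence (illustrative sketch, not the Space's code)."""
    ids = torch.tensor([[sos_id] + list(prime_tokens)], dtype=torch.long)
    for _ in range(max_new_tokens):
        with torch.no_grad():
            logits = model(ids)[:, -1, :]          # assumes the model returns (batch, seq, vocab) logits
        probs = torch.softmax(logits / temperature, dim=-1)
        next_tok = torch.multinomial(probs, num_samples=1)
        ids = torch.cat([ids, next_tok], dim=-1)
    return ids[0].tolist()

# Priming with tokens from a real seed MIDI usually yields far more coherent output
# than sampling from the bare SOS token alone:
# seed_tokens = midi_to_tokens("seed.mid")        # hypothetical tokenizer helper
# continuation = continue_sequence(model, seed_tokens)
```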

Generation options

Sliders control the number of prime tokens (16–6656), the number of tokens to generate (16–1024), model temperature (0.1–1.0), and top-p (0.1–0.99).
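
Temperature and top-p correspond to standard autoregressive sampling controls. Below is a small, self-contained sketch of nucleus (top-p) filtering that could be combined with the temperature-scaled sampling shown earlier; the names and defaults are illustrative, not the Space's actual sampling code.

```python
import torch

def top_p_filter(logits, top_p=0.96):
    """Keep the smallest set of tokens whose cumulative probability reaches top_p; zero out the rest."""
    probs = torch.softmax(logits, dim=-1)
    sorted_p, sorted_ix = torch.sort(probs, descending=True, dim=-1)
    cumulative = torch.cumsum(sorted_p, dim=-1)
    keep = (cumulative - sorted_p) < top_p         # a token is kept if the mass before it is below top_p
    sorted_p = sorted_p * keep
    filtered = torch.zeros_like(probs).scatter_(-1, sorted_ix, sorted_p)
    return filtered / filtered.sum(dim=-1, keepdim=True)

# Example: temperature-scale the logits first, then sample from the top-p filtered distribution.
# next_tok = torch.multinomial(top_p_filter(logits / 0.9, top_p=0.96), num_samples=1)
```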

Batch Previews

Use Add/Remove Batch to manage the generated preview batches (batch selector 0–9).