MOE Merger

Merge the expert layers of a mixture-of-experts (MoE) model, using safetensors and SLERP, to produce a smaller model.
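For reference, SLERP (spherical linear interpolation) blends two weight tensors along the arc between them rather than along a straight line, which tends to preserve weight norms better than plain averaging. Below is a minimal, hypothetical sketch in PyTorch; the function name and the flatten-then-reshape treatment are illustrative assumptions, not this repository's actual code.

import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float = 0.5, eps: float = 1e-8) -> torch.Tensor:
    # Treat both tensors as flat vectors and compute the angle between them.
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    dot = torch.clamp(torch.dot(a / (a.norm() + eps), b / (b.norm() + eps)), -1.0, 1.0)
    omega = torch.acos(dot)
    # Nearly parallel weights: fall back to plain linear interpolation.
    if omega < 1e-4:
        return (1 - t) * w_a + t * w_b
    s = torch.sin(omega)
    out = (torch.sin((1 - t) * omega) / s) * a + (torch.sin(t * omega) / s) * b
    return out.reshape(w_a.shape).to(w_a.dtype)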

Install the script by running:

git clone https://github.com/isEmmanuelOlowe/moe_merger
cd moe_merger
pip install -r requirements.txt

The command below converts Jamba's 16 experts into 4 experts.

A notebook version is also available here.

python main.py --model_id "ai21labs/Jamba-v0.1" --temp_dir "tmp/temp" --save_dir "tmp/Jamba-4xMoE_slerp" --num_experts_per_tok=2 --num_local_experts=4
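Under the hood, going from 16 experts to --num_local_experts=4 amounts to grouping the expert weight tensors and folding each group into a single tensor. The sketch below assumes consecutive grouping and the slerp function sketched above; the helper names and the safetensors round-trip are illustrative assumptions, not the script's actual internals.

from safetensors.torch import load_file, save_file

def merge_expert_group(experts: list[torch.Tensor]) -> torch.Tensor:
    # Fold the group left to right; t = 1/(i+1) keeps the experts
    # roughly equally weighted, like a running mean on the sphere.
    merged = experts[0]
    for i, w in enumerate(experts[1:], start=1):
        merged = slerp(merged, w, t=1.0 / (i + 1))
    return merged

def reduce_experts(experts: list[torch.Tensor], num_local_experts: int) -> list[torch.Tensor]:
    # e.g. 16 experts with num_local_experts=4 -> 4 groups of 4.
    group = len(experts) // num_local_experts
    return [merge_expert_group(experts[i * group:(i + 1) * group])
            for i in range(num_local_experts)]

# Illustrative shard round-trip (file and tensor names are hypothetical):
# tensors = load_file("tmp/temp/model-00001.safetensors")
# ...replace the per-expert tensors with the merged ones, then...
# save_file(tensors, "tmp/Jamba-4xMoE_slerp/model-00001.safetensors")

The model config's expert count would also have to be updated to match the merged weights, so that the router selects among 4 experts instead of 16.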

This script is inspired by work on Mistral Merger, available here.
