Article: "Sparse Mixture of Experts Language Model from Scratch: Extending makeMoE with Expert Capacity" by AviSoori1x, Mar 18
Article: "makeMoE: Implement a Sparse Mixture of Experts Language Model from Scratch" by AviSoori1x, May 7
Collection: Model Merging. Model merging is a very popular technique for LLMs nowadays; here is a chronological list of papers in this space to help you get started. 30 items, updated Jun 12