---
license: cc-by-nc-nd-4.0
language:
- ko
pipeline_tag: text-generation
tags:
- Mixture of experts
---

# Model Card for ME-MOE-7Bx2_test

Developed by : 메가스터디교육, 프리딕션, 마이스

Base Model : megastudyedu/ME-7B-v1.0

Expert Models : megastudyedu/ME-7B-v1.1, macadeliccc/WestLake-7B-v2-laser-truthy-dpo

Method : The MoE was built using merge-kit.
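
The exact merge configuration is not published. Below is a minimal sketch of what a mergekit MoE (`mergekit-moe`) config for this model could look like, assuming the standard config format and using the base and expert models listed above; the `gate_mode`, `dtype`, and `positive_prompts` values are illustrative placeholders, not the settings actually used.

```yaml
# Hypothetical mergekit-moe configuration (illustrative only; the actual
# config and gating prompts for ME-MOE-7Bx2_test are not published).
base_model: megastudyedu/ME-7B-v1.0
gate_mode: hidden          # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: megastudyedu/ME-7B-v1.1
    positive_prompts:
      - "한국어로 답변해 주세요."              # placeholder Korean prompt
  - source_model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
    positive_prompts:
      - "Answer the question truthfully."   # placeholder English prompt
```

With a config like this, a two-expert merged model would be produced with a command along the lines of `mergekit-moe config.yml ./ME-MOE-7Bx2_test`.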