doshisha-mil/llm-jp-3-13b-LoRAMoE4MATH_v1
MIL, Doshisha University
Tags: PEFT · Japanese · Mixture of Experts
Datasets: p1atdev/gsm8k-ja-slim · HachiML/alpaca_jp_math · baber/hendrycks_math
License: apache-2.0
kimura committed 5 days ago
Commit 64eb4bb · verified · 1 parent: a6921ef
Update README.md

Files changed (1): README.md (+1, −0)
README.md CHANGED

```diff
@@ -20,5 +20,6 @@ LoRAMoE combines multiple low-rank adapters (LoRA) with Mixture-of-Exper
 
 
 Work in progress.
 
+It cannot be imported in the usual way.
 
```