doshisha-mil/llm-jp-3-13b-LoRAMoE4MATH_v1
MIL, Doshisha University
Tags: PEFT · p1atdev/gsm8k-ja-slim · HachiML/alpaca_jp_math · baber/hendrycks_math · Japanese · Mixture of Experts
License: apache-2.0
kimura committed 5 days ago
Commit 4403cb3 (verified) · 1 parent: 64eb4bb
Update README.md
Files changed (1): README.md (+1, −2)
@@ -20,6 +20,5 @@ LoRAMoE combines multiple low-rank adapters (LoRA) with a Mixture-of-Experts
 Work in progress (執筆中)
-It cannot be imported in the usual way. (普通の方法じゃインポートできません.)
-
+It cannot be loaded in the usual way. (普通の方法では読み込めません.)