---
library_name: peft
base_model: KT-AI/midm-bitext-S-7B-inst-v1
datasets:
- nsmc
metrics:
- accuracy
---

# Model Card: midm-bitext-S-7B-inst-v1 fine-tuned on NSMC

## Model Description

### Fine-tuning midm-bitext-S-7B-inst-v1
This model is KT-AI/midm-bitext-S-7B-inst-v1 fine-tuned on NSMC, the Naver movie review sentiment dataset.

Given a movie review embedded in the prompt, the model directly generates the prediction text '긍정' (positive) or '부정' (negative).

The resulting model achieves an accuracy of 90.0%.
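The prompt-and-parse flow described above can be sketched as follows. This is a minimal illustration, not the exact template used for training: `build_prompt` and `parse_label` are hypothetical helpers, and the instruction wording is an assumption.

```python
# Hypothetical sketch of the inference flow: the review is embedded in an
# instruction prompt, and the generated text is mapped back to a label.
# The exact prompt template used for fine-tuning may differ.

def build_prompt(review: str) -> str:
    """Wrap a movie review in a sentiment-classification instruction prompt."""
    return (
        "다음 영화 리뷰의 감정을 '긍정' 또는 '부정'으로 분류하세요.\n"
        f"리뷰: {review}\n"
        "감정:"
    )

def parse_label(generated: str) -> str:
    """Map the model's generated text to a binary label ('긍정' or '부정')."""
    return "긍정" if "긍정" in generated else "부정"
```

In practice, `build_prompt(review)` is passed to the fine-tuned model's `generate` call and `parse_label` is applied to the decoded output.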


### Train and test datasets
The first 2,000 samples of the NSMC train split were used for training.

The first 1,000 samples of the NSMC test split were used for evaluation.


## Training step loss
![image/png](https://cdn-uploads.huggingface.co/production/uploads/651bf3be3fa6c4e182910420/WIBCaCIemHUS1QXqKzyPy.png)


## Confusion matrix
![image/png](https://cdn-uploads.huggingface.co/production/uploads/651bf3be3fa6c4e182910420/X2nTz9ltBFbAeqWJBfQz_.png)


## Accuracy and classification report
![image/png](https://cdn-uploads.huggingface.co/production/uploads/651bf3be3fa6c4e182910420/GJpzAVO7ee3cQcbHxiYSG.png)


## Training procedure

The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16

### Framework versions

- PEFT 0.7.0