---
base_model:
- Bllossom/llama-3-Korean-Bllossom-70B
library_name: transformers
tags:
- mergekit
- merge

---
πŸŒ‹πŸŒ‹ Hugging Face Upload Issue

The maximum individual file size for a Hugging Face upload is 50.0GB.

To get around this, the GGUF file is split into part_aa, part_ab, part_ac, ... chunks, my "Practical Idea" (see the sketch below).
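
For reference, a minimal sketch of how such chunks can be produced with GNU coreutils `split` (the 48G chunk size is an assumption; the exact size used for this upload is not stated, it just has to stay under the 50GB limit):

```
# Produces part_aa, part_ab, part_ac, ... each at most 48GB
split -b 48G llama-3-korean-bllossom-120b-Q4_K_M.gguf part_
```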

After you download this repo to a local folder, run the commands below.

Download from Hugging Face (change the download path as needed; in this case "./"):
```
huggingface-cli download asiansoul/llama-3-Korean-Bllossom-120B-GGUF --local-dir='./'
```

Merge the split files into one GGUF file (run this inside the download path, in this case "./"):
```
cat part_* > llama-3-korean-bllossom-120b-Q4_K_M.gguf
```
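
After merging, a quick sanity check (a sketch assuming GNU coreutils on Linux; on macOS use `stat -f %z` in place of `stat -c %s`): the merged file should be exactly as large as all the parts combined.

```
du -bc part_*                                         # last line is the total bytes of the chunks
stat -c %s llama-3-korean-bllossom-120b-Q4_K_M.gguf   # bytes of the merged file
```

Once the two numbers match, the part_* chunks can be deleted to reclaim disk space.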

I thought uploading this as a ready-to-use GGUF rather than the plain original files would be more useful for you, so I am uploading it this way even though it takes a bit of extra trouble.
```
Perhaps this will be the first GGUF file of over 50GB uploaded to Hugging Face this way?

Other 120B models keep each individual file under 50GB, which is why they can be uploaded directly.

Sometimes we need a little trick called chunking.

```

Please wait while the upload finishes..... 

### πŸ‡°πŸ‡· About JayLee "AsianSoul"

```
"A leader who can make you rich πŸ’΅ !!!"

"Prove yourself with actual results, not just saying I know more than you!!!"
```

<a href="https://ibb.co/4g2SJVM"><img src="https://i.ibb.co/PzMWt64/Screenshot-2024-05-18-at-11-08-12-PM.png" alt="Screenshot-2024-05-18-at-11-08-12-PM" border="0"></a>

### About this model's story

This is a 120B model based on [Bllossom/llama-3-Korean-Bllossom-70B](https://huggingface.co./Bllossom/llama-3-Korean-Bllossom-70B).

β˜• I started this Korean 120B model merge while sipping an iced Americano at Starbucks, using another 120B merge, [Cognitive Computations' MegaDolphin 120B](https://huggingface.co./cognitivecomputations/MegaDolphin-120b), as a reference.

If you walk around a Starbucks in Seoul, Korea, you may see someone building a merge and an application on top of it. 

If that someone is me, please come up and say "hello".

"Also, if there is an application project you want to build and you can provide support, I will design the entire architecture for you, whatever it is."

🏎️ My goal is to turn the great results created by brilliant scientists & groups around the world into profitable ones.

```
My role model is J. Robert Oppenheimer!!!

J. Robert Oppenheimer is highly regarded for his ability to gather and lead a team of brilliant scientists, merging their diverse expertise and efforts towards a common goal. 
```
[Learn more about J. Robert Oppenheimer](https://en.wikipedia.org/wiki/J._Robert_Oppenheimer).

I hope this 120B model is helpful for your future.

```
🌍 Collaboration is always welcome 🌍

πŸ‘Š You can't beat these giant corporations & groups alone, and you will never get rich by yourself. 

Now we have to come together. 

People who can actually get rich together, collaborate with me!!! 🍸
```

```
About Bllossom/llama-3-Korean-Bllossom-70B
- Full Korean model (over 100GB) released by the Bllossom team
- A first for Korean! Korean vocabulary expanded by over 30,000 words
- Handles Korean context roughly 25% longer than Llama 3
- Korean-English knowledge connected via a Korean-English parallel corpus (pre-training)
- Fine-tuned on data produced by linguists, reflecting Korean culture and language
- Reinforcement learning

πŸ›°οΈ About asiansoul/llama-3-Korean-Bllossom-120B-GGUF
- Q4_K_M: needs a GPU with 68GB or more memory, OR a CPU with 68GB or more RAM
- I hope to upload more quantizations, but your computer may not be able to handle them. You know what I mean!!
```

### Models Merged

The following models were included in the merge:
* [Bllossom/llama-3-Korean-Bllossom-70B](https://huggingface.co./Bllossom/llama-3-Korean-Bllossom-70B)


### Ollama 

Check the memory requirements indicated above and run this only when your computer is ready. 

πŸ₯Ά Otherwise, your computer will freeze.
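
Before creating the model, it may help to confirm you actually have enough free memory (a Linux sketch; on macOS, check Activity Monitor instead):

```
free -g   # available system RAM in GiB; Q4_K_M needs roughly 68GB free
```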

* Create

```
ollama create Bllossom -f ./Modelfile_Q4_K_M
```

* MODELFILE (you can change this to suit your preferences)

```
FROM ./llama-3-korean-bllossom-120b-Q4_K_M.gguf
TEMPLATE """{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>"""


SYSTEM """
당신은 μœ μš©ν•œ AI μ–΄μ‹œμŠ€ν„΄νŠΈμž…λ‹ˆλ‹€. μ‚¬μš©μžμ˜ μ§ˆμ˜μ— λŒ€ν•΄ μΉœμ ˆν•˜κ³  μ •ν™•ν•˜κ²Œ λ‹΅λ³€ν•΄μ•Ό ν•©λ‹ˆλ‹€.
You are a helpful AI assistant, you'll need to answer users' queries in a friendly and accurate manner.
"""

PARAMETER num_ctx 1024
PARAMETER num_keep 24
PARAMETER temperature 0.6
PARAMETER top_p 0.9
PARAMETER num_predict 2048
PARAMETER num_thread 20
PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
PARAMETER stop "<|eot_id|>"
```
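
Once created, you can chat with the model from the CLI (a usage sketch; the prompt is just an example):

```
# Example prompt: "Could you recommend some places to visit in Seoul?"
ollama run Bllossom "μ„œμšΈμ—μ„œ κ°€λ³Ό λ§Œν•œ 곳을 μΆ”μ²œν•΄ μ€„λž˜?"
```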

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - layer_range: [0, 20]
    model: Bllossom/llama-3-Korean-Bllossom-70B
- sources:
  - layer_range: [10, 30]
    model: Bllossom/llama-3-Korean-Bllossom-70B
- sources:
  - layer_range: [20, 40]
    model: Bllossom/llama-3-Korean-Bllossom-70B
- sources:
  - layer_range: [30, 50]
    model: Bllossom/llama-3-Korean-Bllossom-70B
- sources:
  - layer_range: [40, 60]
    model: Bllossom/llama-3-Korean-Bllossom-70B
- sources:
  - layer_range: [50, 70]
    model: Bllossom/llama-3-Korean-Bllossom-70B
- sources:
  - layer_range: [60, 80]
    model: Bllossom/llama-3-Korean-Bllossom-70B
merge_method: passthrough
dtype: float16

```
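
The seven slices are overlapping 20-layer windows taken at a stride of 10 from the 70B model's 80 layers, so the passthrough merge stacks 7 Γ— 20 = 140 layers, about 1.75Γ— the original depth, which is where the roughly 120B parameter count comes from. A small sketch that reproduces the windows listed above:

```
# Windows of width 20 at stride 10 over layers [0, 80)
for start in 0 10 20 30 40 50 60; do
  echo "layer_range: [${start}, $(( start + 20 ))]"
done
```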