Nexesenex committed (verified)
Commit 195c80b · 1 parent: 0eb03e1

Update README.md

Files changed (1): README.md (+11 −11)
README.md CHANGED
````diff
@@ -1,13 +1,13 @@
 ---
 base_model:
-- Nexesenex/Llama_3.x_8b_Smarteaz_0.21_R1
-- Nexesenex/Llama_3.x_8b_Smarteaz_0.11a
-- Nexesenex/Llama_3.x_8b_Smarteaz_0.21_SN
+- Nexesenex/Llama_3.1_8b_Smarteaz_0.21_R1
+- Nexesenex/Llama_3.1_8b_Smarteaz_0.11a
+- Nexesenex/Llama_3.1_8b_Smarteaz_0.21_SN
 library_name: transformers
 tags:
 - mergekit
 - merge
-
+license: llama3.1
 ---
 # merge
 
@@ -16,13 +16,13 @@ This is a merge of pre-trained language models created using [mergekit](https://
 ## Merge Details
 ### Merge Method
 
-This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [Nexesenex/Llama_3.x_8b_Smarteaz_0.11a](https://huggingface.co/Nexesenex/Llama_3.x_8b_Smarteaz_0.11a) as a base.
+This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [Nexesenex/Llama_3.1_8b_Smarteaz_0.11a](https://huggingface.co/Nexesenex/Llama_3.x_8b_Smarteaz_0.11a) as a base.
 
 ### Models Merged
 
 The following models were included in the merge:
-* [Nexesenex/Llama_3.x_8b_Smarteaz_0.21_R1](https://huggingface.co/Nexesenex/Llama_3.x_8b_Smarteaz_0.21_R1)
-* [Nexesenex/Llama_3.x_8b_Smarteaz_0.21_SN](https://huggingface.co/Nexesenex/Llama_3.x_8b_Smarteaz_0.21_SN)
+* [Nexesenex/Llama_3.1_8b_Smarteaz_0.21_R1](https://huggingface.co/Nexesenex/Llama_3.1_8b_Smarteaz_0.21_R1)
+* [Nexesenex/Llama_3.1_8b_Smarteaz_0.21_SN](https://huggingface.co/Nexesenex/Llama_3.1_8b_Smarteaz_0.21_SN)
 
 ### Configuration
 
@@ -31,16 +31,16 @@ The following YAML configuration was used to produce this model:
 ```yaml
 merge_method: model_stock
 models:
-  - model: Nexesenex/Llama_3.x_8b_Smarteaz_0.21_R1
+  - model: Nexesenex/Llama_3.1_8b_Smarteaz_0.21_R1
     parameters:
       weight: 1.0
-  - model: Nexesenex/Llama_3.x_8b_Smarteaz_0.21_SN
+  - model: Nexesenex/Llama_3.1_8b_Smarteaz_0.21_SN
     parameters:
       weight: 1.0
-base_model: Nexesenex/Llama_3.x_8b_Smarteaz_0.11a
+base_model: Nexesenex/Llama_3.1_8b_Smarteaz_0.11a
 dtype: bfloat16
 normalize: true
 chat_template: auto
 tokenizer:
   source: union
-```
+```
````
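The `model_stock` method referenced in the config interpolates between the averaged fine-tuned weights and the base model, with a ratio derived from the angle between the models' task vectors. As a toy illustration (not mergekit's actual implementation), here is a minimal per-layer sketch of the two-model formula from the Model Stock paper, `t = 2·cosθ / (1 + cosθ)`, treating each model's weights as a flat list of floats; `model_stock_2` is a hypothetical helper name:

```python
import math

def model_stock_2(base, ft1, ft2):
    """Toy Model Stock merge of two fine-tuned weight vectors.

    Sketches the k=2 formula from the Model Stock paper:
    theta is the angle between the two task vectors (ft - base),
    t = 2*cos(theta) / (1 + cos(theta)), and the merged weights are
    w = t * average(ft1, ft2) + (1 - t) * base.
    """
    # Task vectors: displacement of each fine-tune from the base model.
    d1 = [a - b for a, b in zip(ft1, base)]
    d2 = [a - b for a, b in zip(ft2, base)]
    # Cosine of the angle between the two task vectors.
    dot = sum(a * b for a, b in zip(d1, d2))
    n1 = math.sqrt(sum(a * a for a in d1))
    n2 = math.sqrt(sum(a * a for a in d2))
    cos_theta = dot / (n1 * n2)
    # Interpolation ratio toward the fine-tuned average.
    t = 2.0 * cos_theta / (1.0 + cos_theta)
    avg = [(a + b) / 2.0 for a, b in zip(ft1, ft2)]
    return [t * w + (1.0 - t) * w0 for w, w0 in zip(avg, base)]

# Identical task vectors (cos = 1, t = 1): result is the plain average.
print(model_stock_2([0.0, 0.0], [1.0, 0.0], [1.0, 0.0]))  # [1.0, 0.0]

# Orthogonal task vectors (cos = 0, t = 0): result falls back to the base.
print(model_stock_2([0.0, 0.0], [1.0, 0.0], [0.0, 1.0]))  # [0.0, 0.0]
```

The YAML config itself would be applied with mergekit's `mergekit-yaml` CLI (e.g. `mergekit-yaml config.yaml ./merged`), assuming mergekit is installed; mergekit computes the ratio per layer over real tensors rather than flat lists.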