jaspercatapang committed
Commit aa33d5e · verified · 1 Parent(s): a6de6f3

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -18,7 +18,7 @@ This is a Mixture-of-Experts (MoE) or a merge of pre-trained language models cre
 ## Merge Details
 ### Merge Method
 
-This model is an experimental merge using the [linear](https://arxiv.org/abs/2203.05482) merge method.
+This model is an experimental merge using the [linear](https://arxiv.org/abs/2203.05482) merge method. This is to assess the degree to which the DPO used in [jondurbin/bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2) has an effect in terms of censoring.
 
 ### Models Merged
 
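For context, the [linear](https://arxiv.org/abs/2203.05482) method referenced in the README is weight averaging across models. The snippet below is only a minimal sketch of that idea, assuming the merged models share an identical architecture so their state dicts align key by key; `linear_merge` is a hypothetical helper written for illustration, not part of this repository, and the actual merge was presumably produced with a dedicated merge toolkit rather than this code.

```python
# Minimal sketch of a linear merge (weighted parameter averaging).
# Assumption: all state dicts come from models with identical architectures,
# so every key maps to tensors of matching shape.
from typing import Dict, Sequence

import torch


def linear_merge(
    state_dicts: Sequence[Dict[str, torch.Tensor]],
    weights: Sequence[float],
) -> Dict[str, torch.Tensor]:
    """Return a state dict whose tensors are the weighted average of the
    corresponding tensors in `state_dicts` (weights are normalized to sum to 1)."""
    assert state_dicts and len(state_dicts) == len(weights)
    total = float(sum(weights))
    merged: Dict[str, torch.Tensor] = {}
    for key in state_dicts[0]:
        # Weighted sum of the same parameter across all source models.
        merged[key] = sum(
            (w / total) * sd[key].float() for sd, w in zip(state_dicts, weights)
        )
    return merged


# Hypothetical usage: average two fine-tunes of the same base model 50/50.
# merged_sd = linear_merge([model_a.state_dict(), model_b.state_dict()], [0.5, 0.5])
```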