Update README.md
README.md

---
base_model:
- tiiuae/falcon-11B
library_name: transformers
tags:
- mergekit
- merge
- lazymergekit
- tiiuae/falcon-11B
license: apache-2.0
language:
- da
---

# sliced

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was pruned using the passthrough merge method.

### Models Merged

The following model was included in the merge:

* [tiiuae/falcon-11B](https://huggingface.co/tiiuae/falcon-11B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - model: tiiuae/falcon-11B
    ...
merge_method: passthrough
dtype: bfloat16
```
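
Conceptually, a passthrough slice copies a subset of the source model's decoder blocks into a smaller checkpoint of the same architecture (the configuration above is applied with mergekit's `mergekit-yaml` entry point). The sketch below illustrates that idea only; it is not mergekit's implementation, the `transformer.h` attribute path is an assumption for Falcon-style models, and the layer ranges are placeholders.

```python
# Illustration of the passthrough/slicing idea only -- mergekit's actual
# implementation also handles configs, tokenizers, and sharded weights.
# Assumptions: Falcon-style models expose their decoder blocks at
# `model.transformer.h`; the example ranges below are placeholders.
import torch
from transformers import AutoModelForCausalLM

def slice_layers(model_id, keep_ranges):
    """Return the model with only the decoder blocks in keep_ranges (half-open)."""
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    blocks = model.transformer.h                  # ModuleList of decoder layers
    kept = torch.nn.ModuleList()
    for start, end in keep_ranges:
        kept.extend(blocks[start:end])
    model.transformer.h = kept
    model.config.num_hidden_layers = len(kept)
    return model

# Placeholder ranges -- the real ones come from the layer-similarity analysis below.
pruned = slice_layers("tiiuae/falcon-11B", keep_ranges=[(0, 24), (54, 60)])
pruned.save_pretrained("./sliced")
```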

[PruneMe](https://github.com/arcee-ai/PruneMe) was used to investigate layer similarity on the wikimedia/wikipedia Danish (da) subset with 2,000 samples. The layer ranges to prune were chosen from this analysis so that model size is reduced while performance is largely maintained.

![Layer Similarity Plot](https://cdn-uploads.huggingface.co/production/uploads/660c0a02cf274b3ab77dd6b7/hXfcozWzFUd8Df7HsaHK-.png)
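
As a rough sketch of the kind of measurement PruneMe performs (not its actual code or CLI; the sample text and block size below are placeholders), one can compare hidden states before and after each candidate block of layers and prune the block whose removal changes the representation the least:

```python
# Hedged sketch of block-wise layer similarity, not PruneMe's implementation.
# Placeholders: the sample texts and the block size; in practice ~2,000
# Danish Wikipedia samples were used for this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-11B"
block = 5                                    # consecutive layers considered for removal
texts = ["Danmark er et land i Skandinavien."]

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto", output_hidden_states=True
)

sims = None
with torch.no_grad():
    for text in texts:
        inputs = tok(text, return_tensors="pt").to(model.device)
        hidden = model(**inputs).hidden_states      # embeddings + one tensor per layer
        # cosine similarity between the last-token state entering and leaving each block
        per_block = torch.stack([
            torch.nn.functional.cosine_similarity(
                hidden[i][:, -1, :], hidden[i + block][:, -1, :], dim=-1
            ).mean()
            for i in range(len(hidden) - block)
        ])
        sims = per_block if sims is None else sims + per_block

sims /= len(texts)
start = int(sims.argmax())                   # most redundant block -> best pruning candidate
print(f"Prune layers {start}..{start + block - 1} "
      f"(mean similarity {sims[start].item():.3f})")
```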

## Direct Use

Research on large language models; as a foundation for further specialization and fine-tuning for specific use cases (e.g., summarization, text generation, chatbots).

## Out-of-Scope Use

Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.

## Bias, Risks, and Limitations

Falcon2-5.5B is trained mostly on English, but also on German, Spanish, French, Italian, Portuguese, Polish, Dutch, Romanian, Czech, and Swedish. It will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.

## Recommendations

We recommend that users of Falcon2-5.5B fine-tune it for their specific tasks of interest, and that guardrails and appropriate precautions be taken for any production use.
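
## Usage

A minimal inference sketch with transformers, adapted from the snippet in the earlier revision of this card. The repository id is a placeholder (the published model name is not stated above), and the Danish prompt is only an example.

```python
# Requires: pip install -qU transformers accelerate
# NOTE: "your-username/sliced" is a placeholder; use the actual repository id.
import torch
import transformers

model_id = "your-username/sliced"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = pipeline(
    "Skriv en kort tekst om København:",   # example Danish prompt
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(outputs[0]["generated_text"])
```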
|