add some multilingual examples

README.md (CHANGED):
@@ -5,6 +5,7 @@ tags:
 - generated_from_trainer
 - email generation
 - email
+- emailgen
 datasets:
 - aeslc
 - postbot/multi-emails-100k
@@ -14,32 +15,32 @@ widget:
 
     Hope you are doing well. I just wanted to reach out and ask if differential calculus will be on the exam"
   example_title: "email to prof"
-- text: "
-  example_title: "
+- text: "嘿<NAME>\n\n感谢你注册我的每周通讯。在我们开始之前,你必须确认你的电子邮件地址。"
+  example_title: "通讯"
 - text: "Hi <NAME>,\n\nI hope this email finds you well. I wanted to reach out and ask about office hours"
   example_title: "office hours"
-- text: "
-  example_title: "festival"
-- text: "
+- text: "Grüße <NAME>,\n\nIch hoffe, du hattest einen schönen Abend beim Wurstessen der Firma. Ich melde mich, weil"
+  example_title: "Wurstessen festival"
+- text: "Guten Morgen Harold,\n\nich habe mich gefragt, wann die nächste"
   example_title: "event"
 - text: "URGENT - I need the TPS reports"
   example_title: "URGENT"
-- text: "
-  example_title: "
+- text: "Hoi Archibald,\n\nik hoop dat deze e-mail je goed doet."
+  example_title: "e-mails die je vinden"
 - text: "Hello there.\n\nI just wanted to reach out and check in to"
   example_title: "checking in"
 - text: "Hello <NAME>,\n\nI hope this email finds you well. I wanted to reach out and see if you've enjoyed your time with us"
   example_title: "work well"
 - text: "Hi <NAME>,\n\nI hope this email finds you well. I wanted to reach out and see if we could catch up"
   example_title: "catch up"
-- text: "
-  example_title: "
+- text: "Jestem <NAME>,\n\nWłaśnie wprowadziłem się do obszaru i chciałem dotrzeć i uzyskać kilka szczegółów na temat tego, gdzie mogę dostać artykuły spożywcze i"
+  example_title: "zakupy spożywcze"
 parameters:
   min_length: 32
   max_length: 128
   no_repeat_ngram_size: 2
   do_sample: True
-  temperature: 0.
+  temperature: 0.2
   top_k: 20
   top_p: 0.95
   repetition_penalty: 3.5
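For reference, the widget `parameters` block above maps one-to-one onto `transformers` generation keyword arguments. A minimal sketch follows; the repo id `postbot/bloom-1b1-emailgen-v1` is an assumption based on the card title, and the pipeline call is left commented out because it downloads the full ~1B-parameter model.

```python
# Generation settings mirroring the widget `parameters` block above.
gen_kwargs = dict(
    min_length=32,
    max_length=128,
    no_repeat_ngram_size=2,
    do_sample=True,
    temperature=0.2,
    top_k=20,
    top_p=0.95,
    repetition_penalty=3.5,
)

# Hypothetical usage (repo id assumed from the card title):
# from transformers import pipeline
# generator = pipeline("text-generation", model="postbot/bloom-1b1-emailgen-v1")
# prompt = "Hello there.\n\nI just wanted to reach out and check in to"
# print(generator(prompt, **gen_kwargs)[0]["generated_text"])
```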
@@ -47,7 +48,7 @@ parameters:
 ---
 
 
-# bloom-1b1-emailgen-v1
+# bloom-1b1-emailgen - v1
 
 This model is a fine-tuned version of [bigscience/bloom-1b1](https://huggingface.co/bigscience/bloom-1b1) on the `postbot/multi-emails-100k` dataset.
 
@@ -60,7 +61,8 @@ More information needed
 
 ## Intended uses & limitations
 
-
+- **this model did not have any of the original layers frozen during training**
+- while this is still an area of investigation, the model likely needs some layers frozen during fine-tuning to retain its multilingual capabilities while learning how to write emails.
 
 ## Training and evaluation data
 
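The layer-freezing idea raised under "Intended uses & limitations" can be sketched in PyTorch. This is a sketch of the general technique, not the training code actually used for this model: the stack of `nn.Linear` blocks stands in for the transformer blocks (`model.transformer.h` in BLOOM), and how many blocks to freeze is exactly the open question the note describes.

```python
import torch.nn as nn

def freeze_early_blocks(blocks: nn.ModuleList, n_frozen: int) -> None:
    """Disable gradients for the first `n_frozen` blocks so fine-tuning
    only updates the later blocks (and whatever head follows them)."""
    for block in list(blocks)[:n_frozen]:
        for param in block.parameters():
            param.requires_grad_(False)

# Stand-in for a stack of transformer blocks (e.g. model.transformer.h in BLOOM):
blocks = nn.ModuleList(nn.Linear(8, 8) for _ in range(4))
freeze_early_blocks(blocks, n_frozen=2)

# Only the later blocks remain trainable:
trainable = [any(p.requires_grad for p in b.parameters()) for b in blocks]
# trainable == [False, False, True, True]
```

Frozen parameters receive no gradient updates, so the early (language-general) representations stay intact while the later layers adapt to the email domain.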