Update README.md
README.md
CHANGED
@@ -1,9 +1,49 @@
 ---
 tags:
 - generated_from_trainer
 model-index:
 - name: span-marker-bert-base-multilingual-cased-multinerd
-  results:
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -11,28 +51,76 @@ should probably proofread and complete it, then remove this comment. -->
 
 # span-marker-bert-base-multilingual-cased-multinerd
 
-This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
-It achieves the following results on the evaluation set:
 - Loss: 0.0049
 - Overall Precision: 0.9242
 - Overall Recall: 0.9281
 - Overall F1: 0.9261
 - Overall Accuracy: 0.9852
 
-## Model description
 
-More information needed
 
-## Intended uses & limitations
 
-More information needed
 
-## Training and evaluation data
 
-More information needed
 
 ## Training procedure
 
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
@@ -61,4 +149,4 @@ The following hyperparameters were used during training:
 - Transformers 4.30.2
 - Pytorch 2.0.1+cu117
 - Datasets 2.14.3
-- Tokenizers 0.13.3

README.md (after):

---
tags:
- generated_from_trainer
- ner
- named-entity-recognition
- span-marker
model-index:
- name: span-marker-bert-base-multilingual-cased-multinerd
  results:
  - task:
      type: token-classification
      name: Named Entity Recognition
    dataset:
      type: Babelscape/multinerd
      name: MultiNERD
      split: test
      revision: 2814b78e7af4b5a1f1886fe7ad49632de4d9dd25
    metrics:
    - type: f1
      value: 0.9261
      name: F1
    - type: precision
      value: 0.9242
      name: Precision
    - type: recall
      value: 0.9281
      name: Recall
license: apache-2.0
datasets:
- Babelscape/multinerd
metrics:
- precision
- recall
- f1
pipeline_tag: token-classification
language:
- de
- en
- es
- fr
- it
- nl
- pl
- pt
- ru
- zh
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# span-marker-bert-base-multilingual-cased-multinerd

This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the [Babelscape/multinerd](https://huggingface.co/datasets/Babelscape/multinerd) dataset.
It achieves the following results on the test set:
- Loss: 0.0049
- Overall Precision: 0.9242
- Overall Recall: 0.9281
- Overall F1: 0.9261
- Overall Accuracy: 0.9852
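As a quick consistency check (not part of the original card), the Overall F1 above is the harmonic mean of the reported precision and recall:

```python
# F1 is the harmonic mean of precision and recall; the values
# reported above should reproduce the Overall F1 of 0.9261.
precision = 0.9242
recall = 0.9281
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9261
```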

This is a replication of Tom Aarsen's work. Everything remains unchanged, except that we extended training to 3 epochs for a slightly longer training duration and set `gradient_accumulation_steps` to 2. Please refer to the official [model page](https://huggingface.co/tomaarsen/span-marker-mbert-base-multinerd) to review the original results and training script.

## Label set

| Class | Description | Examples |
|-------|-------------|----------|
| **PER (person)** | People | Ray Charles, Jessica Alba, Leonardo DiCaprio, Roger Federer, Anna Massey. |
| **ORG (organization)** | Associations, companies, agencies, institutions, nationalities and religious or political groups | University of Edinburgh, San Francisco Giants, Google, Democratic Party. |
| **LOC (location)** | Physical locations (e.g. mountains, bodies of water), geopolitical entities (e.g. cities, states), and facilities (e.g. bridges, buildings, airports). | Rome, Lake Paiku, Chrysler Building, Mount Rushmore, Mississippi River. |
| **ANIM (animal)** | Breeds of dogs, cats and other animals, including their scientific names. | Maine Coon, African Wild Dog, Great White Shark, New Zealand Bellbird. |
| **BIO (biological)** | Genus of fungus, bacteria and protoctists, families of viruses, and other biological entities. | Herpes Simplex Virus, Escherichia Coli, Salmonella, Bacillus Anthracis. |
| **CEL (celestial)** | Planets, stars, asteroids, comets, nebulae, galaxies and other astronomical objects. | Sun, Neptune, Asteroid 187 Lamberta, Proxima Centauri, V838 Monocerotis. |
| **DIS (disease)** | Physical, mental, infectious, non-infectious, deficiency, inherited, degenerative, social and self-inflicted diseases. | Alzheimer’s Disease, Cystic Fibrosis, Dilated Cardiomyopathy, Arthritis. |
| **EVE (event)** | Sport events, battles, wars and other events. | American Civil War, 2003 Wimbledon Championships, Cannes Film Festival. |
| **FOOD (food)** | Foods and drinks. | Carbonara, Sangiovese, Cheddar Beer Fondue, Pizza Margherita. |
| **INST (instrument)** | Technological instruments, mechanical instruments, musical instruments, and other tools. | Spitzer Space Telescope, Commodore 64, Skype, Apple Watch, Fender Stratocaster. |
| **MEDIA (media)** | Titles of films, books, magazines, songs and albums, fictional characters and languages. | Forbes, American Psycho, Kiss Me Once, Twin Peaks, Disney Adventures. |
| **PLANT (plant)** | Types of trees, flowers, and other plants, including their scientific names. | Salix, Quercus Petraea, Douglas Fir, Forsythia, Artemisia Maritima. |
| **MYTH (mythological)** | Mythological and religious entities. | Apollo, Persephone, Aphrodite, Saint Peter, Pope Gregory I, Hercules. |
| **TIME (time)** | Specific and well-defined time intervals, such as eras, historical periods, centuries, years and important days. No months and days of the week. | Renaissance, Middle Ages, Christmas, Great Depression, 17th Century, 2012. |
| **VEHI (vehicle)** | Cars, motorcycles and other vehicles. | Ferrari Testarossa, Suzuki Jimny, Honda CR-X, Boeing 747, Fairey Fulmar. |
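For post-processing, the 15 class tags above can be kept in a plain Python list (a small convenience sketch, not part of the original card):

```python
# The 15 MultiNERD entity classes described in the table above.
MULTINERD_LABELS = [
    "PER", "ORG", "LOC", "ANIM", "BIO", "CEL", "DIS", "EVE",
    "FOOD", "INST", "MEDIA", "PLANT", "MYTH", "TIME", "VEHI",
]
print(len(MULTINERD_LABELS))  # 15
```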

## Inference Example

```python
# Install the library first: pip install span_marker
from span_marker import SpanMarkerModel

model = SpanMarkerModel.from_pretrained("lxyuan/span-marker-bert-base-multilingual-cased-multinerd")

description = "Singapore is renowned for its hawker centers offering dishes \
like Hainanese chicken rice and laksa, while Malaysia boasts dishes such as \
nasi lemak and rendang, reflecting its rich culinary heritage."

entities = model.predict(description)
entities
>>> [
{'span': 'Singapore', 'label': 'LOC', 'score': 0.999988317489624, 'char_start_index': 0, 'char_end_index': 9},
{'span': 'Hainanese chicken rice', 'label': 'FOOD', 'score': 0.9894770383834839, 'char_start_index': 66, 'char_end_index': 88},
{'span': 'laksa', 'label': 'FOOD', 'score': 0.9224908947944641, 'char_start_index': 93, 'char_end_index': 98},
{'span': 'Malaysia', 'label': 'LOC', 'score': 0.9999839067459106, 'char_start_index': 106, 'char_end_index': 114}
]

# missed: nasi lemak as FOOD
# missed: rendang as FOOD
```
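Since the predictions are plain dicts, post-processing is straightforward; for example, keeping only confident FOOD spans (the inlined predictions are copied from the output above, and the threshold is illustrative):

```python
# Keep only high-confidence FOOD spans from the example predictions.
entities = [
    {"span": "Singapore", "label": "LOC", "score": 0.999988},
    {"span": "Hainanese chicken rice", "label": "FOOD", "score": 0.989477},
    {"span": "laksa", "label": "FOOD", "score": 0.922491},
    {"span": "Malaysia", "label": "LOC", "score": 0.999984},
]
foods = [e["span"] for e in entities if e["label"] == "FOOD" and e["score"] >= 0.9]
print(foods)  # ['Hainanese chicken rice', 'laksa']
```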

## Training procedure

One can reproduce the results by running this [script](https://huggingface.co/tomaarsen/span-marker-mbert-base-multinerd/blob/main/train.py).

### Training hyperparameters

The following hyperparameters were used during training:

…

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.3
- Tokenizers 0.13.3