---
tags:
- token-classification
task_ids:
- named-entity-recognition
datasets:
- djagatiya/ner-ontonotes-v5-eng-v4
widget:
- text: "On September 1st George won 1 dollar while watching Game of Thrones."
---
# (NER) bert-base-cased : conll2012_ontonotesv5-english-v4
This NER model fine-tunes `bert-base-cased` on the `english-v4` configuration of the `conll2012_ontonotesv5` dataset. <br>
See the [NER-System repository](https://github.com/djagatiya/NER-System) for more information.
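## Usage

A minimal inference sketch using the `transformers` token-classification pipeline. Note the Hub model id below is an assumption inferred from the title of this card; check the NER-System repository for the exact checkpoint id.

```python
from transformers import pipeline

# Assumption: the fine-tuned checkpoint is published on the Hugging Face Hub
# under this id (derived from this card's title); verify before use.
model_id = "djagatiya/ner_bert-base-cased_ontonotesv5-english-v4"

# aggregation_strategy="simple" merges word-piece tokens into whole-entity spans.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

entities = ner("On September 1st George won 1 dollar while watching Game of Thrones.")
for ent in entities:
    print(ent["entity_group"], ent["word"], round(float(ent["score"]), 3))
```

With the widget sentence above, the pipeline should surface spans such as a DATE, a PERSON, a MONEY amount, and a WORK_OF_ART.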
## Evaluation
- Precision: 87.85%
- Recall: 89.63%
- F1-Score: 88.73%
> See the [eval.log](eval.log) file for the full evaluation metrics and classification report.
```
              precision    recall  f1-score   support

    CARDINAL       0.86      0.87      0.86       935
        DATE       0.84      0.88      0.86      1602
       EVENT       0.65      0.67      0.66        63
         FAC       0.69      0.71      0.70       135
         GPE       0.97      0.93      0.95      2240
    LANGUAGE       0.76      0.73      0.74        22
         LAW       0.54      0.55      0.54        40
         LOC       0.73      0.80      0.76       179
       MONEY       0.87      0.90      0.88       314
        NORP       0.93      0.96      0.94       841
     ORDINAL       0.80      0.87      0.83       195
         ORG       0.88      0.90      0.89      1795
     PERCENT       0.88      0.90      0.89       349
      PERSON       0.94      0.95      0.94      1988
     PRODUCT       0.62      0.76      0.69        76
    QUANTITY       0.74      0.81      0.77       105
        TIME       0.61      0.67      0.64       212
 WORK_OF_ART       0.56      0.66      0.61       166

   micro avg       0.88      0.90      0.89     11257
   macro avg       0.77      0.81      0.79     11257
weighted avg       0.88      0.90      0.89     11257
```
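As a sanity check, the macro and weighted recall averages in the report can be re-derived from the per-class rows: the macro average is the unweighted mean over classes, and the weighted average weights each class by its support.

```python
# Per-class (recall, support) pairs copied from the classification report above.
recalls = {
    "CARDINAL": (0.87, 935), "DATE": (0.88, 1602), "EVENT": (0.67, 63),
    "FAC": (0.71, 135), "GPE": (0.93, 2240), "LANGUAGE": (0.73, 22),
    "LAW": (0.55, 40), "LOC": (0.80, 179), "MONEY": (0.90, 314),
    "NORP": (0.96, 841), "ORDINAL": (0.87, 195), "ORG": (0.90, 1795),
    "PERCENT": (0.90, 349), "PERSON": (0.95, 1988), "PRODUCT": (0.76, 76),
    "QUANTITY": (0.81, 105), "TIME": (0.67, 212), "WORK_OF_ART": (0.66, 166),
}

total = sum(s for _, s in recalls.values())                 # total support: 11257
macro = sum(r for r, _ in recalls.values()) / len(recalls)  # unweighted mean
weighted = sum(r * s for r, s in recalls.values()) / total  # support-weighted mean

print(round(macro, 2), round(weighted, 2))  # 0.81 0.90, matching the report
```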