Jorge Lopez Grisman committed
Commit: 51d1934
Parent(s): 869fe62
Update README.md
Updating README with a limitations paragraph and adding custom examples.
README.md CHANGED
@@ -2,6 +2,10 @@
 license: apache-2.0
 tags:
 - generated_from_trainer
+language: en
+widget:
+- text: "My name is Scott and I live in Columbus."
+- text: "Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne."
 datasets:
 - conll2003
 metrics:
@@ -51,13 +55,26 @@ It achieves the following results on the evaluation set:
 
 More information needed
 
-
+#### Limitations and bias
 
-
+This model is limited by its training dataset of entity-annotated news articles from a specific span of time. This may not generalize well for all use cases in different domains. Furthermore, the model occasionally tags subword tokens as entities and post-processing of results may be necessary to handle those cases.
 
-## Training and evaluation data
 
-
+#### How to use
+
+You can use this model with Transformers *pipeline* for NER.
+
+```python
+from transformers import pipeline
+from transformers import AutoTokenizer, AutoModelForTokenClassification
+tokenizer = AutoTokenizer.from_pretrained("Jorgeutd/bert-large-uncased-finetuned-ner")
+model = AutoModelForTokenClassification.from_pretrained("Jorgeutd/bert-large-uncased-finetuned-ner")
+nlp = pipeline("ner", model=model, tokenizer=tokenizer)
+example = "My name is Scott and I live in Ohio"
+ner_results = nlp(example)
+print(ner_results)
+```
+
 
 ## Training procedure
 
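The added limitations paragraph notes that the model occasionally tags subword tokens as entities and that post-processing may be needed. One way to handle that, shown here as a minimal sketch rather than as part of the commit, is to let the pipeline group subwords automatically (this assumes a transformers release new enough to support `aggregation_strategy`, roughly 4.6 and later):

```python
from transformers import pipeline

# Sketch of the post-processing hinted at in the limitations paragraph:
# let the pipeline merge word pieces into whole-word entity groups instead
# of returning raw subword predictions.
# aggregation_strategy="simple" requires transformers >= 4.6.
ner = pipeline(
    "ner",
    model="Jorgeutd/bert-large-uncased-finetuned-ner",
    aggregation_strategy="simple",
)

print(ner("Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne."))
```

With aggregation enabled, each result is an entity group (`entity_group`, `score`, `word`, `start`, `end`) rather than a per-subword prediction, so stray subword tokens should no longer appear as separate entities.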