SOTA Entity Recognition English Foundation Model by NuMind 🔥

This is the BERT model from our paper: NuNER: Entity Recognition Encoder Pre-training via LLM-Annotated Data.

This is the model used in Section 4.2 when comparing against TadNER.

For other sections, NuNER v1.0 is used.

Check out other models by NuMind:

  • SOTA Multilingual Entity Recognition Foundation Model: link
  • SOTA Sentiment Analysis Foundation Model: English, Multilingual

About

bert-base-uncased fine-tuned on NuNER data.

Metrics:

Read more about the evaluation protocol and datasets in Section 4.2 of our paper.

Usage

Embeddings can be used out of the box or fine-tuned on specific datasets; a fine-tuning sketch follows the embedding example below.

Get embeddings:

import torch
import transformers


model = transformers.AutoModel.from_pretrained(
    'numind/NuNER-BERT-v1.0',
    output_hidden_states=True
)
tokenizer = transformers.AutoTokenizer.from_pretrained(
    'numind/NuNER-BERT-v1.0'
)

text = [
    "NuMind is an AI company based in Paris and USA.",
    "See other models from us on https://huggingface.co./numind"
]
encoded_input = tokenizer(
    text,
    return_tensors='pt',
    padding=True,
    truncation=True
)
# inference only: no gradients needed
with torch.no_grad():
    output = model(**encoded_input)

# for better quality: concatenate the last hidden layer with layer -7,
# giving one 1536-dimensional embedding per token
emb = torch.cat(
    (output.hidden_states[-1], output.hidden_states[-7]),
    dim=2
)

# for better speed: use only the 768-dimensional last hidden layer
# emb = output.hidden_states[-1]
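
Fine-tuning works the same way as for any BERT checkpoint. Below is a minimal sketch using Hugging Face's standard token-classification head on top of this encoder; the label scheme, the example label assignment, and the learning rate are illustrative placeholders, not part of this model:

import torch
import transformers

# hypothetical label scheme -- replace with your dataset's labels
labels = ["O", "B-ORG", "I-ORG"]

tokenizer = transformers.AutoTokenizer.from_pretrained('numind/NuNER-BERT-v1.0')
model = transformers.AutoModelForTokenClassification.from_pretrained(
    'numind/NuNER-BERT-v1.0',
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)}
)

encoded = tokenizer(
    "NuMind is an AI company based in Paris and USA.",
    return_tensors='pt'
)
# dummy per-token labels for illustration; -100 is ignored by the loss
label_ids = torch.full_like(encoded['input_ids'], -100)
label_ids[0, 1] = labels.index("B-ORG")  # first subword of "NuMind"

# one gradient step
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
loss = model(**encoded, labels=label_ids).loss
loss.backward()
optimizer.step()

Note that this standard head reads only the last hidden layer; to exploit the two-layer concatenation shown above, attach your own linear classifier to the 1536-dimensional emb tensor instead.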

Citation

@misc{bogdanov2024nuner,
      title={NuNER: Entity Recognition Encoder Pre-training via LLM-Annotated Data}, 
      author={Sergei Bogdanov and Alexandre Constantin and Timothée Bernard and Benoit Crabbé and Etienne Bernard},
      year={2024},
      eprint={2402.15343},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}