Tags: Token Classification · GLiNER · PyTorch · English · NER · information extraction · encoder · entity recognition · modernbert
Commit 244e683 (verified) · Ihor committed · 1 parent: d611a0d

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
````diff
@@ -23,7 +23,7 @@ base_model:
 
 GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using a bidirectional transformer encoders (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entities, and Large Language Models (LLMs) that, despite their flexibility, are costly and large for resource-constrained scenarios.
 
-This particular version utilize bi-encoder architecture, where textual encoder is [ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large) and entity label encoder is sentence transformer - [BGE-base-en](https://huggingface.co/BAAI/bge-base-en-v1.5).
+This particular version utilizes bi-encoder architecture, where the textual encoder is [ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large) and entity label encoder is sentence transformer - [BGE-base-en](https://huggingface.co/BAAI/bge-base-en-v1.5).
 
 Such architecture brings several advantages over uni-encoder GLiNER:
 * An unlimited amount of entities can be recognized at a single time;
@@ -81,7 +81,7 @@ European Championship => competitions
 
 If you want to use **flash attention** or increase sequence length, please, check the following code:
 
-Firstly, install flash attention and triton packages:
+Firstly, install Flash Attention and Triton packages:
 ```bash
 pip install flash-attn triton
 ```
````
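For context, the card this diff edits describes a model used through the `gliner` Python package. A minimal usage sketch follows, matching the `European Championship => competitions` output referenced in the second hunk header; the model ID and label set are illustrative assumptions, not taken from this diff.

```python
# Minimal sketch of using a GLiNER bi-encoder checkpoint via the gliner package.
# The model ID below is an assumption for illustration; substitute this card's ID.
from gliner import GLiNER

model = GLiNER.from_pretrained("knowledgator/modern-gliner-bi-large-v1.0")

text = (
    "Cristiano Ronaldo dos Santos Aveiro is a Portuguese professional footballer "
    "who has won the European Championship with Portugal."
)

# A bi-encoder embeds labels separately from the text, so the label list
# can be arbitrarily long without re-encoding the input for each label.
labels = ["person", "award", "date", "competitions", "teams"]

entities = model.predict_entities(text, labels, threshold=0.5)
for entity in entities:
    print(entity["text"], "=>", entity["label"])
```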
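Likewise, the second hunk stops at installing `flash-attn` and `triton`; the follow-up step is loading the model with flash attention enabled and a longer context. A hedged sketch, assuming the `_attn_implementation` and `max_len` keyword arguments accepted by recent `gliner` releases; verify the names against your installed version.

```python
# Hedged sketch: load with flash attention and an extended sequence length.
# The _attn_implementation and max_len keyword names are assumptions based on
# recent gliner releases; check them against the version you have installed.
from gliner import GLiNER

model = GLiNER.from_pretrained(
    "knowledgator/modern-gliner-bi-large-v1.0",  # assumed model ID
    _attn_implementation="flash_attention_2",    # needs flash-attn and a CUDA GPU
    max_len=2048,                                # raise the max sequence length
).to("cuda:0")  # flash attention typically also requires fp16/bf16 weights
```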