Shitao committed on
Commit 8ce799c · verified · 1 Parent(s): a45d34f

Update README.md

Files changed (1)
  1. README.md +5 -13
README.md CHANGED
@@ -2491,15 +2491,6 @@ model-index:
 
 For more details please refer to our Github: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
 
- FlagEmbedding focuses on retrieval-augmented LLMs, consisting of the following projects currently:
- - **LLM-based Dense Retrieval**: BGE-EN-Mistral, BGE-Multilingual-Gemma2
- - **Long-Context LLM**: [Activation Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon)
- - **Fine-tuning of LM**: [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail)
- - **Dense Retrieval**: [BGE-M3](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3), [LLM Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), [BGE Embedding](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/baai_general_embedding)
- - **Reranker Model**: [BGE Reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
- - **Benchmark**: [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB)
-
-
 **BGE-EN-Mistral** primarily demonstrates the following capabilities:
 - In-context learning ability: By providing few-shot examples in the query, it can significantly enhance the model's ability to handle new tasks.
 - Outstanding performance: The model has achieved state-of-the-art (SOTA) performance on both BEIR and AIR-Bench.
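The in-context learning ability listed in this hunk works by packaging a handful of task demonstrations with each query. As a minimal sketch of what such demonstrations could look like, here is an instruct/query/response record list in the format used by FlagEmbedding's `FlagICLModel` (the field names come from that library's documented usage, not from this diff; the texts are illustrative placeholders):

```python
# Sketch only: hypothetical few-shot demonstrations for a web-search retrieval task.
# Each record pairs a task instruction with a query and a relevant response; such
# records are prepended to the real query so the model can adapt to the task in context.
examples = [
    {
        "instruct": "Given a web search query, retrieve relevant passages that answer the query.",
        "query": "what is a virtual interface",
        "response": "A virtual interface is a software-defined abstraction of a physical network interface.",
    },
    {
        "instruct": "Given a web search query, retrieve relevant passages that answer the query.",
        "query": "causes of back pain in female for a week",
        "response": "Back pain lasting a week can stem from muscle strain, poor posture, or underlying conditions.",
    },
]
```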
@@ -2519,9 +2510,10 @@ We will release the technical report and training data for **BGE-EN-Mistral** in
 
 ### Using FlagEmbedding
 ```
- pip install -U FlagEmbedding
+ git clone https://github.com/FlagOpen/FlagEmbedding.git
+ cd FlagEmbedding
+ pip install -e .
 ```
- If it doesn't work for you, you can see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more methods to install FlagEmbedding.
 
 ```python
 from FlagEmbedding import FlagICLModel
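Continuing from the editable install and the `FlagICLModel` import shown in this hunk, a minimal end-to-end sketch of encoding with in-context examples (the queries, documents, and examples are placeholders; the constructor parameters follow the FlagEmbedding library's `FlagICLModel` interface and are worth verifying against the installed version):

```python
from FlagEmbedding import FlagICLModel

# Placeholder query and corpus; real texts would come from the task at hand.
queries = ["how much protein should a female eat"]
documents = [
    "As a general guideline, the recommended daily protein intake for adult women is about 46 grams."
]

# Few-shot demonstrations in the instruct/query/response format sketched earlier.
examples = [
    {
        "instruct": "Given a web search query, retrieve relevant passages that answer the query.",
        "query": "what is a virtual interface",
        "response": "A virtual interface is a software-defined abstraction of a physical network interface.",
    }
]

model = FlagICLModel(
    "BAAI/bge-en-icl",
    query_instruction_for_retrieval="Given a web search query, retrieve relevant passages that answer the query.",
    examples_for_task=examples,  # pass None to encode queries without in-context examples
    use_fp16=True,               # roughly halves memory at a small cost in precision
)

q_emb = model.encode_queries(queries)    # instruction + examples are prepended to queries only
d_emb = model.encode_corpus(documents)   # documents are encoded as plain text
scores = q_emb @ d_emb.T                 # dot-product similarity on (typically normalized) embeddings
print(scores)
```

Only the queries receive the instruction and few-shot examples; documents are encoded without them, so corpus embeddings can be computed once and reused across tasks.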
@@ -2603,8 +2595,8 @@ documents = [
 ]
 input_texts = queries + documents
 
- tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-en-mistral')
- model = AutoModel.from_pretrained('BAAI/bge-en-mistral')
+ tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-en-icl')
+ model = AutoModel.from_pretrained('BAAI/bge-en-icl')
 model.eval()
 
 max_length = 4096
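The Transformers-based snippet in the last hunk stops at `max_length = 4096`; the surrounding README code presumably goes on to tokenize, pool, and normalize. Here is a sketch of one such continuation, assuming last-token pooling (an assumption here, though it is the usual choice for Mistral-based embedding models); the `input_texts` entries are placeholders:

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def last_token_pool(last_hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Take the hidden state of the final non-padding token of each sequence.
    left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
    if left_padding:
        return last_hidden_states[:, -1]
    sequence_lengths = attention_mask.sum(dim=1) - 1
    batch = torch.arange(last_hidden_states.shape[0], device=last_hidden_states.device)
    return last_hidden_states[batch, sequence_lengths]

tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-en-icl")
model = AutoModel.from_pretrained("BAAI/bge-en-icl")
model.eval()

# Placeholder inputs: first the (instructed) query, then a candidate document.
input_texts = ["<query with instruction and examples>", "<document text>"]
max_length = 4096

with torch.no_grad():
    batch = tokenizer(input_texts, max_length=max_length, padding=True,
                      truncation=True, return_tensors="pt")
    outputs = model(**batch)
    embeddings = last_token_pool(outputs.last_hidden_state, batch["attention_mask"])
    embeddings = F.normalize(embeddings, p=2, dim=1)  # cosine similarity via dot product
    scores = embeddings[:1] @ embeddings[1:].T
    print(scores)
```

Last-token pooling suits decoder-only backbones, where the final non-padding token has attended to the whole sequence; mean pooling is the common alternative for encoder-style models.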