khaimaitien committed on
Commit d619e17
1 Parent(s): 75a2026

Create README.md

Files changed (1):
  1. README.md +42 -0
---
language:
- en
pipeline_tag: text-generation
---
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

This model aims to handle **Multi-hop Question Answering** by splitting a multi-hop question into a sequence of single-hop questions, answering each of these single questions, and then summarizing the retrieved information to produce the final answer.
## Model Details
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the dataset [khaimaitien/qa-expert-multi-hop-qa-V1.0](https://huggingface.co/datasets/khaimaitien/qa-expert-multi-hop-qa-V1.0).

You can find more information about how to **use/train** the model in this repo: https://github.com/khaimt/qa_expert

### Model Sources

<!-- Provide the basic links for the model. -->

- **Repository:** [https://github.com/khaimt/qa_expert](https://github.com/khaimt/qa_expert)
## How to Get Started with the Model
First, you need to clone the repo: https://github.com/khaimt/qa_expert

Then install the requirements:

```shell
pip install -r requirements.txt
```
Here is the example code:

```python
from qa_expert import get_inference_model, InferenceType

def retrieve(query: str) -> str:
    # You need to implement this retrieval function: its input is a query and
    # its output is a context string containing the relevant information.
    # It can be treated as the function to call in OpenAI-style function calling.
    context = ...  # look up relevant text for the query here
    return context

model_inference = get_inference_model(InferenceType.hf, "khaimaitien/qa-expert-7B-V1.0")
question = "..."  # your multi-hop question
answer, messages = model_inference.generate_answer(question, retrieve)
```
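As an illustration of the `retrieve` contract, here is a minimal sketch backed by a toy in-memory document list with keyword-overlap scoring. The documents and the scoring scheme are hypothetical placeholders; in practice you would plug in your own search index or vector store:

```python
# Hypothetical example: a toy keyword-overlap retriever.
# Replace the document list and scoring with a real search index or vector store.
documents = [
    "Paris is the capital of France.",
    "The Eiffel Tower is located in Paris and was completed in 1889.",
]

def retrieve(query: str) -> str:
    # Score each document by how many query words it contains,
    # then return the best-matching document as the context string.
    query_words = set(query.lower().split())

    def score(doc: str) -> int:
        return len(query_words & set(doc.lower().split()))

    return max(documents, key=score)
```

Any callable with the same signature (`str -> str`) can be passed as the retrieval function to `generate_answer`.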