GGUF · English · llama · medical · Inference Endpoints
turquise committed · Commit 6ef7d60 · verified · 1 Parent(s): a5da965

Create README.md

Files changed (1): README.md ADDED (+155, -0)
---
datasets:
- keivalya/MedQuad-MedicalQnADataset
- turquise/Comprehensive_Medical_QA_Dataset
language:
- en
base_model:
- meta-llama/Llama-3.1-8B-Instruct
tags:
- medical
---
# Fine-Tuning Llama-3.1 with Comprehensive Medical Q&A Dataset

This project fine-tunes the **Llama-3.1 8B Model** using the **Comprehensive Medical Q&A Dataset** to build a specialized model capable of answering medical questions.

---

## 🚀 Features

- Fine-tuned on a diverse dataset of over **43,000 medical Q&A pairs**.
- Supports **31 distinct types of medical queries**, including treatments, chronic diseases, and protocols.
- Provides answers sourced from doctors, nurses, and pharmacists.

---

## 📂 Dataset Overview

### **Comprehensive Medical Q&A Dataset**

- **Source:** [Hugging Face Hub](https://huggingface.co/datasets/keivalya/MedQuad-MedicalQnADataset)
- **License:** CC0 1.0 Universal (Public Domain Dedication)

#### **Key Details**
- **Total Questions:** 43,000+
- **Categories:** 31 medical question types (`qtype`)
- **Columns:**
  - `qtype`: Type of medical question (e.g., Treatment, Symptoms).
  - `Question`: Patient's medical question.
  - `Answer`: Expert response (from doctors, nurses, and pharmacists).

### **How the Dataset is Used**
- **Filtering:** Questions are filtered by `qtype` for domain-specific fine-tuning.
- **Analysis:** Queries are analyzed to understand patterns, such as correlations between treatments and chronic conditions.
- **Applications:** Insights can be applied to build medical educational tools, predictive models, and virtual assistants.

For more details, check the [dataset documentation](https://huggingface.co/datasets/keivalya/MedQuad-MedicalQnADataset).
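
As a quick illustration of the filtering step described above, here is a minimal, hypothetical sketch (not part of the original fine-tuning code) that loads the MedQuad dataset with the `datasets` library and selects a single `qtype`. The split name and the exact `qtype` label strings are assumptions; print the unique values first and adjust.

```python
# Hypothetical sketch: load the MedQuad Q&A data and filter it by question type.
# Assumes a "train" split and a qtype label such as "treatment"; verify both first.
from datasets import load_dataset

ds = load_dataset("keivalya/MedQuad-MedicalQnADataset", split="train")

# Inspect the 31 question categories (`qtype`) before filtering.
print(ds.unique("qtype"))

# Keep only one category for domain-specific fine-tuning or analysis.
treatment_qa = ds.filter(lambda row: row["qtype"] == "treatment")
print(treatment_qa[0]["Question"])
print(treatment_qa[0]["Answer"])
```

The same pattern extends to the analysis step, e.g., counting questions per `qtype` to see which categories dominate the 43,000+ examples.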

---

## 💻 How to Use This Model

The fine-tuned model is available on Hugging Face in the [`turquise/MedQA_q4`](https://huggingface.co/turquise/MedQA_q4) repository. Below are several ways to use it:

### **Using the llama-cpp-python Library**
```python
from llama_cpp import Llama

# Load the model
llm = Llama.from_pretrained(
    repo_id="turquise/MedQA_q4",
    filename="MedQA.Q4_K_M.gguf",
)

# Query the model
output = llm(
    "What is Medullary Sponge Kidney?",
    max_tokens=512,
    echo=True
)
print(output)
```

### **Using llama.cpp**
#### **Install via Homebrew**
```bash
brew install llama.cpp

llama-cli \
  --hf-repo "turquise/MedQA_q4" \
  --hf-file MedQA.Q4_K_M.gguf \
  -p "What is Medullary Sponge Kidney?"
```
#### **Use a Pre-Built Binary**
```bash
# Download a pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

./llama-cli \
  --hf-repo "turquise/MedQA_q4" \
  --hf-file MedQA.Q4_K_M.gguf \
  -p "What is Medullary Sponge Kidney?"
```
#### **Build from Source**
```bash
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build -DLLAMA_CURL=ON
cmake --build build -j --target llama-cli

./build/bin/llama-cli \
  --hf-repo "turquise/MedQA_q4" \
  --hf-file MedQA.Q4_K_M.gguf \
  -p "What is Medullary Sponge Kidney?"
```

---

## 🤖 Example Usages
This model can assist with the following tasks:

- Answering medical questions:
  ```python
  question = "What are the symptoms of diabetes?"
  output = llm(question, max_tokens=512)
  print(output)
  ```
- Providing insights for healthcare education, e.g., answering queries about diseases, treatments, and chronic conditions.
- Supporting virtual assistants by handling frequently asked healthcare-related questions (see the chat-style sketch below).
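
For the virtual-assistant use case above, a simple chat loop can be built on llama-cpp-python's `create_chat_completion` API. This is a minimal sketch under stated assumptions, not the project's own serving code: it assumes the GGUF file carries a chat template that `llama_cpp` can apply automatically, so check the formatting of the replies on your setup.

```python
# Hypothetical assistant-style loop using llama-cpp-python's chat API.
# Assumes the GGUF includes a usable chat template; adjust if replies look malformed.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="turquise/MedQA_q4",
    filename="MedQA.Q4_K_M.gguf",
)

messages = [
    {
        "role": "system",
        "content": "You answer general medical questions. You do not give medical advice.",
    },
]

while True:
    user_question = input("Question (blank to quit): ").strip()
    if not user_question:
        break
    messages.append({"role": "user", "content": user_question})
    reply = llm.create_chat_completion(messages=messages, max_tokens=512)
    answer = reply["choices"][0]["message"]["content"]
    print(answer)
    # Keep the exchange in context for follow-up questions.
    messages.append({"role": "assistant", "content": answer})
```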
---

## ⚠️ Disclaimer

- This model **does not provide medical advice** and should not replace professional medical consultation.
- For any health-related questions or concerns, please consult a doctor or a licensed healthcare professional.

---

## 🤖 Applications

This fine-tuned model can be used to:
- Build **virtual assistants** and chatbots for healthcare-related queries.
- Assist healthcare professionals by handling routine inquiries.
- Enhance **medical education platforms** with AI-powered insights.

---

## 📜 Acknowledgements

- Dataset: [Hugging Face Hub - MedQuad](https://huggingface.co/datasets/keivalya/MedQuad-MedicalQnADataset).
- Fine-tuning framework: [Unsloth](https://github.com/unslothai/unsloth).

If you use this project or dataset in your research, please credit the original authors.

---

## 📝 License

This project is open-sourced under the **CC0 1.0 Universal License**. See the dataset [license details](https://creativecommons.org/publicdomain/zero/1.0/).

---

## 📧 Contact

For questions or collaboration, reach out via the [HF Model Community](https://huggingface.co/turquise/MedQA_q4/discussions).