---
datasets:
- keivalya/MedQuad-MedicalQnADataset
- turquise/Comprehensive_Medical_QA_Dataset
language:
- en
base_model:
- meta-llama/Llama-3.1-8B-Instruct
tags:
- medical
---
# Fine-Tuning Llama-3.1 with Comprehensive Medical Q&A Dataset

This project fine-tunes **Llama-3.1-8B-Instruct** on the **Comprehensive Medical Q&A Dataset** to build a specialized model capable of answering medical questions. The fine-tuned weights are released in GGUF format (Q4_K_M quantization) for use with llama.cpp.

---

## πŸš€ Features

- Fine-tuned on a diverse dataset of over **43,000 medical Q&A pairs**.
- Supports **31 distinct types of medical queries**, including treatments, chronic diseases, and protocols.
- Provides answers sourced from doctors, nurses, and pharmacists.

---

## πŸ“‚ Dataset Overview

### **Comprehensive Medical Q&A Dataset**

- **Source:** [Hugging Face Hub](https://huggingface.co./datasets/keivalya/MedQuad-MedicalQnADataset)
- **License:** CC0 1.0 Universal (Public Domain Dedication)

#### **Key Details**
- **Total Questions:** 43,000+  
- **Categories:** 31 medical question types (`qtype`)  
- **Columns:**  
  - `qtype`: Type of medical question (e.g., Treatment, Symptoms).  
  - `Question`: Patient's medical question.  
  - `Answer`: Expert response (from doctors, nurses, and pharmacists).  

### **How the Dataset is Used**
- **Filtering:** Questions are filtered by `qtype` for domain-specific fine-tuning (a minimal filtering sketch follows this list).  
- **Analysis:** Queries are analyzed to understand patterns, such as correlations between treatments and chronic conditions.  
- **Applications:** Insights can be applied to build medical educational tools, predictive models, and virtual assistants.
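
As a reference, here is a minimal sketch of loading the dataset with the `datasets` library and filtering it by `qtype`. The column names follow the table above; the `"treatment"` value and the `train` split name are illustrative assumptions, not guarantees about the dataset layout.

```python
# Minimal sketch: load MedQuad and filter by question type (qtype).
# Assumes `pip install datasets`; the "treatment" value below is illustrative.
from datasets import load_dataset

dataset = load_dataset("keivalya/MedQuad-MedicalQnADataset", split="train")

# Keep only questions whose qtype matches a target category (case-insensitive).
treatment_qa = dataset.filter(lambda row: row["qtype"].lower() == "treatment")

print(f"{len(treatment_qa)} treatment-related Q&A pairs")
print(treatment_qa[0]["Question"], "->", treatment_qa[0]["Answer"][:200])
```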

For more details, check the [dataset documentation](https://huggingface.co./datasets/keivalya/MedQuad-MedicalQnADataset).

---

## πŸ’» How to Use This Model

The fine-tuned model is available on Hugging Face under the repository: [`turquise/MedQA_q4`](https://huggingface.co./turquise/MedQA_q4). Below are several ways to use the model:

### **Using llama-cpp-python Library**
```python
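# Requires the llama-cpp-python package (e.g. `pip install llama-cpp-python`);
# Llama.from_pretrained also needs huggingface_hub installed to download the GGUF file.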
from llama_cpp import Llama

# Load the model
llm = Llama.from_pretrained(
    repo_id="turquise/MedQA_q4",
    filename="MedQA.Q4_K_M.gguf",
)

# Query the model
output = llm(
    "What is Medullary Sponge Kidney?",
    max_tokens=512,
    echo=True
)
print(output)
```
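
Because the base model is Llama-3.1-8B-Instruct, a chat-style call may also work well. The sketch below uses llama-cpp-python's `create_chat_completion`; it assumes the GGUF file ships with a Llama-3.1 chat template, which is not verified here, and the system prompt is only an example.

```python
# Hedged sketch: chat-style inference, assuming the GGUF embeds a chat template.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="turquise/MedQA_q4",
    filename="MedQA.Q4_K_M.gguf",
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You answer general medical questions."},
        {"role": "user", "content": "What is Medullary Sponge Kidney?"},
    ],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```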

### **Using llama.cpp**
#### **Install via Homebrew**
```bash
brew install llama.cpp

llama-cli \
  --hf-repo "turquise/MedQA_q4" \
  --hf-file MedQA.Q4_K_M.gguf \
  -p "What is Medullary Sponge Kidney?"
```
#### **Use Pre-Built Binary**
```bash
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases

./llama-cli \
  --hf-repo "turquise/MedQA_q4" \
  --hf-file MedQA.Q4_K_M.gguf \
  -p "What is Medullary Sponge Kidney?"
```
#### **Build from Source Code**
```bash
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build -DLLAMA_CURL=ON
cmake --build build -j --target llama-cli

./build/bin/llama-cli \
  --hf-repo "turquise/MedQA_q4" \
  --hf-file MedQA.Q4_K_M.gguf \
  -p "What is Medullary Sponge Kidney?"
```

---

## πŸ€– Example Usages
This model can assist with the following tasks:

- Answering medical questions:
```python
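# Assumes `llm` is the Llama instance loaded in the llama-cpp-python example above.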
question = "What are the symptoms of diabetes?"
output = llm(question, max_tokens=512)
print(output)
```
- Providing insights for healthcare education, e.g., answering queries about diseases, treatments, and chronic conditions.
- Supporting virtual assistants by handling frequently asked healthcare-related questions.

---

## ⚠️ Disclaimer

- This model **does not provide medical advice** and should not replace professional medical consultation.
- For any health-related questions or concerns, please consult a doctor or a licensed healthcare professional.

---

## πŸ€– Applications

This fine-tuned model can be used to:
- Build **virtual assistants** and chatbots for healthcare-related queries (a minimal loop sketch follows this list).
- Assist healthcare professionals by handling routine inquiries.
- Enhance **medical education platforms** with AI-powered insights.
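
As an illustration of the first item, here is a minimal command-line Q&A loop built on the llama-cpp-python example above. The prompt text and loop structure are assumptions made for this sketch, not part of the released model.

```python
# Minimal sketch of a command-line medical Q&A assistant (not medical advice).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="turquise/MedQA_q4",
    filename="MedQA.Q4_K_M.gguf",
)

print("Medical Q&A assistant (type 'quit' to exit). Not a substitute for a doctor.")
while True:
    question = input("> ").strip()
    if question.lower() in {"quit", "exit"}:
        break
    output = llm(question, max_tokens=512)
    print(output["choices"][0]["text"])
```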

---

## πŸ“œ Acknowledgements

- Dataset: [MedQuad on the Hugging Face Hub](https://huggingface.co./datasets/keivalya/MedQuad-MedicalQnADataset).
- Fine-tuning framework: [Unsloth](https://github.com/unslothai/unsloth).

If you use this project or dataset in your research, please credit the original authors.

---

## πŸ“ License

This project is open-sourced under the **CC0 1.0 Universal License**. See the dataset [license details](https://creativecommons.org/publicdomain/zero/1.0/).

---

## πŸ“§ Contact

For questions or collaboration, reach out via [HF Model Community](https://huggingface.co./turquise/MedQA_q4/discussions).