---
metrics:
- rouge
- bleu
- bleurt
model-index:
- name: ibleducation/ibl-tutoring-chat-7B
results:
- task:
name: truthfulqa_gen
type: text-generation
dataset:
type: truthful_qa
name: Truthful QA
metrics:
- type: bleurt
name: bleurt_max
value: -0.5572
- type: bleurt
name: bleurt_acc
value: 0.4321
- type: bleurt
name: bleurt_diff
value: -0.0725
- type: bleu
name: bleu_max
value: 22.5935
- type: bleu
name: bleu_acc
value: 0.3758
- type: bleu
name: bleu_diff
value: -2.5541
- type: rouge
name: rouge1_max
value: 50.0851
- type: rouge
name: rouge1_acc
value: 0.3978
- type: rouge
name: rouge1_diff
value: -3.5142
- type: rouge
name: rouge2_max
value: 34.7473
- type: rouge
name: rouge2_acc
value: 0.339
- type: rouge
name: rouge2_diff
value: -4.5082
- type: rouge
name: rougeL_max
value: 46.1054
- type: rouge
name: rougeL_acc
value: 0.3745
- type: rouge
name: rougeL_diff
value: -4.0046
datasets:
- ibleducation/ibl-best-practices-instructor-dataset
library_name: transformers
---
# ibleducation/ibl-tutoring-chat-7B
ibleducation/ibl-tutoring-chat-7B is a model finetuned on top of [openchat/openchat_3.5](https://huggingface.co./openchat/openchat_3.5).
This model is finetuned to give responses in a manner befitting a professional teacher.
It is finetuned to exhibit characteristics and virtues such as compassion, encouragement, and friendliness.
## Example Conversations
1. Question: I'm having trouble understanding this art movement. Can you help me? \
Answer:
```text
Yes I can assist you, that is what I am here for. We can start by analyzing the artists' techniques, their motivations, and find creative ways to connect with the movement. Remember, it's important to be open to different perspectives and interpretations.
```
2. Question: I am thinking of stopping school do you think it is right? \
Answer:
```text
I understand that school can be challenging, but I believe that education is important. I'm here for you. Wherever you have trouble, I will be happy to help.
```
## Model Details
- **Developed by:** [IBL Education](https://ibl.ai)
- **Model type:** [Mistral-7B-v0.1](https://huggingface.co./mistralai/Mistral-7B-v0.1)
- **Base Model:** [OpenChat 3.5](https://huggingface.co./openchat/openchat_3.5)
- **Language:** English
- **Finetuned from weights:** [OpenChat 3.5](https://huggingface.co./openchat/openchat_3.5)
- **Finetuned on data:**
- [ibleducation/ibl-best-practices-instructor-dataset](https://huggingface.co./datasets/ibleducation/ibl-best-practices-instructor-dataset)
- **Model License:** Apache 2.0
## How to Use ibl-tutoring-chat-7B Model from Python Code (HuggingFace transformers)
### Install the necessary packages
Requires: [transformers](https://pypi.org/project/transformers/) 4.35.0 or later, and [accelerate](https://pypi.org/project/accelerate/) 0.23.0 or later.
```shell
pip install "transformers>=4.35.0" "accelerate>=0.23.0"
```
### You can then try the following example code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers

model_id = "ibleducation/ibl-tutoring-chat-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # requires accelerate; places layers across available devices
)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)
prompt = "<s>What makes a good teacher?</s>"
response = pipeline(prompt)
# The pipeline returns a list of generation dicts; take the first result.
print(response[0]["generated_text"])
```
**Important** - Use the prompt template below for ibl-tutoring-chat-7B:
```
<s>{prompt}</s>
``` |
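Applying the template programmatically keeps the wrapping consistent across calls. The sketch below is illustrative; the helper name `build_prompt` is our own and not part of the model's API:

```python
def build_prompt(user_message: str) -> str:
    # Wrap the user's question in the <s>...</s> template the model expects.
    return f"<s>{user_message}</s>"

print(build_prompt("What makes a good teacher?"))
# <s>What makes a good teacher?</s>
```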