---
license: mit
datasets:
- Locutusque/InstructMix
language:
- en
metrics:
- bleu
- perplexity
pipeline_tag: text-generation
widget:
- text: '<|USER|> Design a Neo4j database and Cypher function snippet to Display Extreme Dental hygiene: Using Mouthwash for Analysis for Beginners. Implement if/else or switch/case statements to handle different conditions related to the Consent. Provide detailed comments explaining your control flow and the reasoning behind each decision. <|ASSISTANT|> '
---

# Model Card for Model ID

This is a fine-tuned version of GPT-2 on the Locutusque/InstructMix dataset.

## Model Details

This model performs significantly better than Locutusque/gpt2-conversational-or-qa. Here are the evaluation results from training:

- BLEU: 30
- Perplexity: 5

### Model Description

- **Developed by:** Locutusque
- **Shared by [optional]:** [More Information Needed]
- **Model type:** GPT-2
- **Language(s) (NLP):** English
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** GPT-2

### Model Sources [optional]

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

This model is designed to follow instructions or take part in conversations.

### Direct Use

Instruction following and conversational use.

### Downstream Use [optional]

[More Information Needed]

### Out-of-Scope Use

[More Information Needed]

## Bias, Risks, and Limitations

[More Information Needed]

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

```python
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Load the fine-tuned tokenizer and model.
tokenizer = GPT2Tokenizer.from_pretrained('gpt2-large-conversational-retrain')
model = GPT2LMHeadModel.from_pretrained('gpt2-large-conversational-retrain')

# Move the model to GPU if one is available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

def generate_text(model, tokenizer, prompt, max_length=1024):
    # Wrap the user prompt in the special tokens the model was trained on.
    prompt = f'<|USER|> {prompt} <|ASSISTANT|> '
    input_ids = tokenizer.encode(prompt, add_special_tokens=True, return_tensors="pt").to(device)
    attention_mask = torch.ones_like(input_ids).to(device)
    output = model.generate(input_ids,
                            max_length=max_length,
                            do_sample=True,
                            temperature=0.3,
                            top_k=23,
                            top_p=0.7,
                            repetition_penalty=1.176,
                            pad_token_id=tokenizer.pad_token_id,
                            eos_token_id=tokenizer.eos_token_id,
                            attention_mask=attention_mask)
    # Keep special tokens so the <|USER|>/<|ASSISTANT|> structure is visible in the output.
    return tokenizer.decode(output[0], skip_special_tokens=False)

# Loop to interact with the model.
while True:
    prompt = input("Enter a prompt (or 'q' to quit): ")
    if prompt == "q":
        break
    output_text = generate_text(model, tokenizer, prompt)
    print(output_text)
```

## Training Details

### Training Data

https://huggingface.co./datasets/Locutusque/InstructMix

This model has so far been trained on 10% of the linked dataset, with more training sessions to come.
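For reference, the following is a minimal sketch of how a ~10% subset of the dataset could be drawn with the `datasets` library. The `train` split name and the shuffle-and-select sampling are assumptions for illustration, not the recorded training setup.

```python
from datasets import load_dataset

# Load the InstructMix dataset from the Hugging Face Hub.
dataset = load_dataset("Locutusque/InstructMix")

# ASSUMPTION: the data lives in a "train" split; take a random ~10% subset,
# mirroring the portion of the data used for fine-tuning so far.
train = dataset["train"].shuffle(seed=42)
subset = train.select(range(int(0.1 * len(train))))

print(subset)
```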
### Training Procedure

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** fp16 non-mixed precision

#### Speeds, Sizes, Times [optional]

[More Information Needed]

## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data

[More Information Needed]

#### Factors

[More Information Needed]

#### Metrics

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

[More Information Needed]

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
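Since the evaluation details above are still to be filled in, here is a minimal sketch of how perplexity could be estimated for this model on a single sample. The sample text is a placeholder, not the actual test data, so the resulting number will not match the reported score.

```python
import math
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained('gpt2-large-conversational-retrain')
model = GPT2LMHeadModel.from_pretrained('gpt2-large-conversational-retrain')
model.eval()

# PLACEHOLDER text formatted with the model's conversational tokens.
text = "<|USER|> What is perplexity? <|ASSISTANT|> Perplexity measures how well a language model predicts text."
input_ids = tokenizer.encode(text, return_tensors="pt")

with torch.no_grad():
    # With labels equal to the inputs, the model returns the mean cross-entropy
    # loss over the sequence; perplexity is the exponential of that loss.
    loss = model(input_ids, labels=input_ids).loss

print(f"Perplexity: {math.exp(loss.item()):.2f}")
```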