DeciLM 6B-Instruct

DeciLM 6B-Instruct is a model for short-form instruction following. It is built by LoRA fine-tuning DeciLM 6B on a subset of the OpenOrca dataset.

  • Developed by: Deci
  • Model type: DeciLM is an auto-regressive language model using an optimized transformer decoder architecture that includes variable Grouped-Query Attention (a minimal sketch follows this list).
  • Language(s) (NLP): English
  • License: Llama 2 Community License Agreement, with an extension by Deci regarding hosting service providers.
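
Grouped-Query Attention shares each key/value head across a group of query heads, which shrinks the KV cache relative to standard multi-head attention; in DeciLM the number of KV groups varies per layer. The snippet below is a minimal, generic sketch of grouped-query attention, not DeciLM's actual implementation; the head counts and shapes are illustrative assumptions.

# Generic grouped-query attention sketch (illustrative only, not DeciLM's code).
import torch
import torch.nn.functional as F

def grouped_query_attention(q, k, v):
    # q: (batch, num_q_heads, seq, head_dim)
    # k, v: (batch, num_kv_heads, seq, head_dim), with num_kv_heads < num_q_heads
    num_q_heads, head_dim = q.shape[1], q.shape[-1]
    group_size = num_q_heads // k.shape[1]  # query heads that share one KV head
    # Repeat each KV head so it lines up with its group of query heads.
    k = k.repeat_interleave(group_size, dim=1)
    v = v.repeat_interleave(group_size, dim=1)
    scores = q @ k.transpose(-2, -1) / head_dim ** 0.5
    return F.softmax(scores, dim=-1) @ v

# Toy example: 32 query heads sharing 4 KV heads (numbers are made up).
q = torch.randn(1, 32, 16, 64)
k = torch.randn(1, 4, 16, 64)
v = torch.randn(1, 4, 16, 64)
print(grouped_query_attention(q, k, v).shape)  # torch.Size([1, 32, 16, 64])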

Model Sources

Uses

The model is intended for commercial and research use in English and can be fine-tuned for use in other languages.

How to Get Started with the Model

Use the code below to get started with the model.

# pip install -q transformers

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "Deci/DeciLM-6b-instruct"
device = "cuda"  # for GPU usage or "cpu" for CPU usage

# trust_remote_code=True is required because the repo ships custom model code.
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype=torch.bfloat16, trust_remote_code=True).to(device)

# Tokenize the instruction, generate up to 100 new tokens with nucleus sampling, and decode.
inputs = tokenizer.encode("How do I make french toast? Think through it step by step", return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=100, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0]))
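
Equivalently, the Transformers text-generation pipeline can wrap the same checkpoint. The snippet below is a minimal sketch; the prompt and sampling settings are illustrative and not prescribed by the model card.

import torch
from transformers import pipeline

# Build a text-generation pipeline around the same checkpoint.
# device_map="auto" requires the accelerate package to be installed.
generator = pipeline(
    "text-generation",
    model="Deci/DeciLM-6b-instruct",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

result = generator(
    "How do I make french toast? Think through it step by step",
    max_new_tokens=100,
    do_sample=True,
    top_p=0.95,
)
print(result[0]["generated_text"])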

Training Details

DeciLM 6B was trained on the SlimPajama dataset using advanced proprietary methodologies that allow for fast training. It was then LoRA fine-tuned on a subset of the OpenOrca dataset, giving rise to DeciLM 6B-Instruct.
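
Since the instruct variant was produced by LoRA fine-tuning, here is a minimal sketch of what such a setup could look like with the peft library. This is not Deci's actual recipe: the base checkpoint id, the rank/alpha/dropout values, and the target module names are assumptions for illustration only.

# Illustrative LoRA setup with peft -- not Deci's actual fine-tuning recipe.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Base checkpoint id is assumed here.
base = AutoModelForCausalLM.from_pretrained(
    "Deci/DeciLM-6b", torch_dtype=torch.bfloat16, trust_remote_code=True
)

# Rank, alpha, dropout, and target module names are hypothetical placeholders.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the LoRA adapter weights are trainable
# ... train on an instruction-formatted subset of OpenOrca, then save or merge the adapter.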

Evaluation

Below are DeciLM 6B-Instruct's evaluation results.

Average | ARC Challenge* | ARC Easy* | BoolQ | HellaSwag* | LAMBADA OpenAI | OpenBookQA | PIQA  | TruthfulQA | Winogrande
62.01   | 44.43          | 70.58     | 77.34 | 74.57      | 70.1           | 33         | 77.52 | 43.89      | 67.64

* Accuracy-norm score
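
Benchmarks like these are typically run with EleutherAI's lm-evaluation-harness. The snippet below is a sketch of how a comparable run could be launched with a recent harness version; the task names, harness version, and settings are assumptions and may not match the configuration that produced the numbers above.

# Sketch of re-running similar benchmarks with lm-evaluation-harness (>= 0.4);
# settings here are assumptions, not the configuration behind the reported scores.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Deci/DeciLM-6b-instruct,trust_remote_code=True,dtype=bfloat16",
    tasks=["arc_challenge", "arc_easy", "boolq", "hellaswag",
           "lambada_openai", "openbookqa", "piqa", "truthfulqa_mc2", "winogrande"],
    batch_size=8,
)
print(results["results"])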

Runtime Benchmarks

Inference Tool / Hardware | A10 (tokens/sec)
PyTorch                   | 652.49
Infery LLM                | 2,029.6
  • Throughput (tokens/sec) measured at the optimal batch size for each tool: PyTorch at batch size 64, Infery LLM at batch size 128.
  • To replicate the PyTorch benchmark results, use the code example referenced on the original model card; a generic throughput-measurement sketch follows below.
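
The official benchmark script is the one referenced on the model card and is not reproduced here. As a stand-in, the following is a minimal, hypothetical sketch of measuring generation throughput with plain PyTorch and Transformers; the prompt, generation length, and padding handling are illustrative and will not reproduce the exact numbers above.

# Hypothetical throughput sketch (not the official benchmark script).
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "Deci/DeciLM-6b-instruct"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint, torch_dtype=torch.bfloat16, trust_remote_code=True
).to("cuda")

batch_size = 64   # matches the reported PyTorch batch size
new_tokens = 128  # illustrative generation length
prompts = ["How do I make french toast?"] * batch_size

# Left-pad the batch for decoder-only generation; fall back to EOS as pad token.
tokenizer.padding_side = "left"
tokenizer.pad_token = tokenizer.pad_token or tokenizer.eos_token
inputs = tokenizer(prompts, return_tensors="pt", padding=True).to("cuda")

torch.cuda.synchronize()
start = time.perf_counter()
model.generate(**inputs, max_new_tokens=new_tokens, min_new_tokens=new_tokens, do_sample=False)
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

print(f"{batch_size * new_tokens / elapsed:.1f} tokens/sec")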

Disclaimer

DeciLM 6B-Instruct has not been aligned for safety or trained using RLHF.

How to Cite

Please cite this model using this format.

@misc{DeciFoundationModels,
  title = {DeciLM 6B Instruct},
  author = {DeciAI Research Team},
  year = {2023},
  url = {https://huggingface.co./Deci/DeciLM-6b-instruct},
}