base_model: LDCC/LDCC-SOLAR-10.7B
pipeline_tag: text-generation

msy127/ft-240209-sft

Our Team

Research & Engineering: David Sohn
Product Management: David Sohn

Model Details

Base Model

LDCC/LDCC-SOLAR-10.7B

Trained On

  • OS: Ubuntu 22.04
  • GPU: 1x A100 40GB
  • transformers: v4.37

Implementation Code

This model includes a chat_template that defines its instruction format.
You can load it with the code below.

# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="msy127/ft-240209-sft")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("msy127/ft-240209-sft")
model = AutoModelForCausalLM.from_pretrained("msy127/ft-240209-sft")
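Because the model ships with a chat_template, the snippet below is a minimal sketch of chat-style generation using apply_chat_template; the prompt, dtype, and generation settings are illustrative assumptions, not values from this model card.

# Minimal sketch: chat-style generation via the model's built-in chat_template.
# The prompt, dtype, and generation settings are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("msy127/ft-240209-sft")
model = AutoModelForCausalLM.from_pretrained(
    "msy127/ft-240209-sft", torch_dtype=torch.bfloat16, device_map="auto"
)

# Wrap the user message with the instruction format stored in the tokenizer.
messages = [{"role": "user", "content": "Hello, who are you?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))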

Introduction to our service platform

  • An AI companion service platform that talks with you while looking at your face.
  • You can preview the future of the world's best service, character.ai.
  • https://livetalkingai.com