---
base_model: LDCC/LDCC-SOLAR-10.7B
pipeline_tag: text-generation
---
# msy127/ft-240209-sft
## Our Team
| Research & Engineering | Product Management |
|---|---|
| David Sohn | David Sohn |
## Model Details
### Base Model

- [LDCC/LDCC-SOLAR-10.7B](https://huggingface.co/LDCC/LDCC-SOLAR-10.7B)
### Trained On

- OS: Ubuntu 22.04
- GPU: 1× A100 40GB
- transformers: v4.37
### Implementation Code

This model uses the chat_template instruction format. You can use the code below.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="msy127/ft-240209-sft")
```
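For reference, a call through the pipeline might look like the sketch below; the prompt and generation settings are illustrative assumptions, not part of the model card.

```python
# Minimal sketch of calling the pipeline; the prompt and the
# generation settings here are illustrative, not prescribed by the model card.
result = pipe("Hello! Please introduce yourself.", max_new_tokens=128, do_sample=True)
print(result[0]["generated_text"])
```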
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("msy127/ft-240209-sft")
model = AutoModelForCausalLM.from_pretrained("msy127/ft-240209-sft")
```
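Because the model ships a chat_template, generation with the directly loaded model would typically go through `apply_chat_template`. The following is a minimal sketch assuming that template; the example messages and generation settings are illustrative only.

```python
import torch

# Minimal generation sketch assuming the tokenizer defines a chat_template;
# the messages below are an illustrative example, not from the model card.
messages = [
    {"role": "user", "content": "Introduce yourself in one sentence."},
]

# Build the prompt with the model's chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens (skip the prompt).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```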
## Introduction to our service platform

- An AI companion service platform that talks with you while looking at your face.
- You can preview the future of the world's best character AI service, character.ai.
- https://livetalkingai.com