Base model:
- google/gemma-7b

Datasets:
- ayoubkirouane/Small-Instruct-Alpaca_Format
- yahma/alpaca-cleaned
Get started:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="ayoubkirouane/Gemma_7b_Alpaca")
```
Or load the tokenizer and model directly:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ayoubkirouane/Gemma_7b_Alpaca")
model = AutoModelForCausalLM.from_pretrained("ayoubkirouane/Gemma_7b_Alpaca")
```
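Since the model was fine-tuned on Alpaca-format datasets, prompts are typically wrapped in the Alpaca instruction template before generation. A minimal sketch of that formatting follows; the exact template wording is an assumption based on the standard Alpaca format, not something this card confirms:

```python
# Sketch: build an Alpaca-style prompt for the fine-tuned model.
# The template below follows the common Alpaca layout (assumed, not confirmed here).
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format an instruction (and optional input) in the Alpaca layout."""
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    if input_text:
        return (
            header
            + f"### Instruction:\n{instruction}\n\n"
            + f"### Input:\n{input_text}\n\n"
            + "### Response:\n"
        )
    return header + f"### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_alpaca_prompt("Summarize the benefits of unit testing.")
```

The resulting `prompt` string can then be passed to the pipeline, e.g. `pipe(prompt, max_new_tokens=256)`, and the model's answer read from the text after the `### Response:` marker.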