Transformers
GGUF
mergekit
Merge
Mistral_Star
Mistral_Quiet
Mistral
Mixtral
Question-Answer
Token-Classification
Sequence-Classification
SpydazWeb-AI
chemistry
biology
legal
code
climate
medical
LCARS_AI_StarTrek_Computer
text-generation-inference
chain-of-thought
tree-of-knowledge
forest-of-thoughts
visual-spacial-sketchpad
alpha-mind
knowledge-graph
entity-detection
encyclopedia
wikipedia
stack-exchange
Reddit
Cyber-series
MegaMind
Cybertron
SpydazWeb
Spydaz
LCARS
star-trek
mega-transformers
Mulit-Mega-Merge
Multi-Lingual
Afro-Centric
African-Model
Ancient-One
llama-cpp
gguf-my-repo
Inference Endpoints
conversational
c10x/_Spydaz_Web_AI_ChatQA_Reasoning101_Project-Q4_K_M-GGUF
This model was converted to GGUF format from LeroyDyer/_Spydaz_Web_AI_ChatQA_Reasoning101_Project
using llama.cpp via ggml.ai's GGUF-my-repo space.
Refer to the original model card for more details on the model.
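If you only need the quantized file itself, one option (a sketch, assuming the huggingface_hub CLI is installed) is to download it directly from the repo:
huggingface-cli download c10x/_Spydaz_Web_AI_ChatQA_Reasoning101_Project-Q4_K_M-GGUF _spydaz_web_ai_chatqa_reasoning101_project-q4_k_m.gguf --local-dir .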
Use with llama.cpp
Install llama.cpp through brew (works on macOS and Linux):
brew install llama.cpp
Invoke the llama.cpp server or the CLI.
CLI:
llama-cli --hf-repo c10x/_Spydaz_Web_AI_ChatQA_Reasoning101_Project-Q4_K_M-GGUF --hf-file _spydaz_web_ai_chatqa_reasoning101_project-q4_k_m.gguf -p "The meaning to life and the universe is"
Server:
llama-server --hf-repo c10x/_Spydaz_Web_AI_ChatQA_Reasoning101_Project-Q4_K_M-GGUF --hf-file _spydaz_web_ai_chatqa_reasoning101_project-q4_k_m.gguf -c 2048
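Once the server is up, it exposes an OpenAI-compatible chat completions endpoint. A minimal curl sketch, assuming the default host and port (localhost:8080):
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{"messages": [{"role": "user", "content": "The meaning to life and the universe is"}], "max_tokens": 128}'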
Note: You can also use this checkpoint directly through the usage steps listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
git clone https://github.com/ggerganov/llama.cpp
Step 2: Move into the llama.cpp folder and build it with the LLAMA_CURL=1
flag, along with any other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux).
cd llama.cpp && LLAMA_CURL=1 make
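For example, to enable CUDA on a Linux machine with an Nvidia GPU (assuming the CUDA toolkit is installed), the build command from the step above becomes:
cd llama.cpp && LLAMA_CURL=1 LLAMA_CUDA=1 make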
Step 3: Run inference through the main binary.
./llama-cli --hf-repo c10x/_Spydaz_Web_AI_ChatQA_Reasoning101_Project-Q4_K_M-GGUF --hf-file _spydaz_web_ai_chatqa_reasoning101_project-q4_k_m.gguf -p "The meaning to life and the universe is"
or
./llama-server --hf-repo c10x/_Spydaz_Web_AI_ChatQA_Reasoning101_Project-Q4_K_M-GGUF --hf-file _spydaz_web_ai_chatqa_reasoning101_project-q4_k_m.gguf -c 2048
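If you have already downloaded the GGUF file locally, you can point either binary at it with -m instead of the --hf-repo/--hf-file flags. A minimal sketch, assuming the file sits in the current directory:
./llama-cli -m ./_spydaz_web_ai_chatqa_reasoning101_project-q4_k_m.gguf -p "The meaning to life and the universe is" -n 128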