# DeepSparse One-Shot and Export of [deepseek-ai/deepseek-coder-1.3b-instruct](https://huggingface.co./deepseek-ai/deepseek-coder-1.3b-instruct)
## Prompt Template

```python
prompt = f"### Instruction:{input}### Response:"
```
## Usage
```python
from deepsparse import TextGeneration

model = TextGeneration(model="hf:mgoin/deepseek-coder-1.3b-instruct-pruned50-quant-ds")

print(model("#write a quick sort algorithm in python", max_new_tokens=200).generations[0].text)
"""
def quick_sort(arr):
    if len(arr) == 0:
        return arr
    else:
        pivot = arr[0]
        left = []
        right = []
        for i in range(1, len(arr)):
            if arr[i] < pivot:
                left.append(arr[i])
            else:
                right.append(arr[i])
        return quick_sort(left) + [pivot] + quick_sort(right)

print(quick_sort([3, 9, 2, 5, 4, 7, 1]))
print(quick_sort([1, 2, 3, 4, 5, 6]))
print(quick_sort([6, 5, 4, 3, 2, 1]))
"""
```