nehcgs committed
Commit: f4fba79
Parent: 7a1afda

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -11,7 +11,7 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 
-# katanemolabs/Arch-Function-7B
+# katanemo/Arch-Function-7B
 
 ## Overview
 The Katanemo Arch-Function collection of large language models (LLMs) is a collection state-of-the-art (SOTA) LLMs specifically designed for **function calling** tasks. The models are designed to understand complex function signatures, identify required parameters, and produce accurate function call outputs based on natural language prompts. Achieving performance on par with GPT-4, these models set a new benchmark in the domain of function-oriented tasks, making them suitable for scenarios where automated API interaction and function execution is crucial.
@@ -192,7 +192,7 @@ import json
 from typing import Any, Dict, List
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-model_name = "katanemolabs/Arch-Function-7B"
+model_name = "katanemo/Arch-Function-7B"
 model = AutoModelForCausalLM.from_pretrained(
     model_name, device_map="auto", torch_dtype="auto", trust_remote_code=True
 )
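The snippet in the second hunk loads the model with Transformers under the corrected repository name. For context, here is a minimal sketch of the kind of tool definition such a function-calling model consumes, assuming an OpenAI-style tool schema (the `get_weather` tool and the `required_params` helper below are hypothetical illustrations, not part of this repository; the exact schema and prompt template are defined in the model's README):

```python
import json

# Hypothetical tool definition in the common OpenAI-style schema (assumption:
# the schema actually accepted by the model is specified in its README).
get_weather = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"},
            },
            "required": ["location"],
        },
    },
}

def required_params(tool: dict) -> list[str]:
    # Extract the required parameter names from a tool schema; this is the
    # kind of information the model must identify when producing a call.
    return tool["function"]["parameters"].get("required", [])

# Tools are typically serialized into the prompt as JSON.
tools_json = json.dumps([get_weather], indent=2)

print(required_params(get_weather))  # ['location']
```

The model would then be prompted with `tools_json` plus a user request, and expected to emit a call such as `get_weather(location="Seattle")` with all required parameters filled in.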