usmankhanic committed on
Commit c01a71c · verified · 1 Parent(s): 2744453

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -12,7 +12,7 @@ YAML tags: "coming soon"
 
 ## Overview
 
-The **APEX-E3 Dec-Enc Function-Call Model (v0.011)** is a fine-tuned T5-small model specialized in generating *function call structures* from plain English queries. Rather than producing unstructured text, this model outputs instructions for specific function calls, including all relevant parameters.
+The **APEX-E3 Dec-Enc Function-Call Model (v0.011)** is a fine-tuned T5-small model specialised in generating *function call structures* from plain English queries. Rather than producing unstructured text, this model outputs instructions for specific function calls, including all relevant parameters.
 
 With a special focus on capital market use cases, it was trained on queries that map directly to functions like **`selectStocks`**, **`run_backtest`**, and **`optimizer`**, ensuring precise extraction of parameters needed for advanced trading, backtesting, and portfolio optimization workflows.
 
@@ -25,7 +25,7 @@ This solution is **lightweight**, **highly performant**, and—thanks to our **n
    - Produces direct calls to your functions with minimal overhead.
 
 2. **Private & Agentic**
-   - Ideal for organizations seeking on-premises or private cloud solutions where data control and agentic autonomy are paramount.
+   - Ideal for organisations seeking on-premises or private cloud solutions where data control and agentic autonomy are paramount.
 
 3. **Ultra-Easy Training Approach**
    - Using a simple Python/Flask app and structured JSON, you can re-train or extend the model on your own custom function definitions in *minutes*, no large-scale ML infrastructure required.
@@ -40,7 +40,7 @@ This solution is **lightweight**, **highly performant**, and—thanks to our **n
 ## Model Details
 
 - **Base Model**: [T5-small](https://huggingface.co/t5-small)
-- **Fine-Tuning**: Customized data mapping natural language to function calls, specifically in capital markets contexts.
+- **Fine-Tuning**: Customised data mapping natural language to function calls, specifically in capital markets contexts.
 - **Parameter Count**: ~60M
 - **Tokenizer**: T5 SentencePiece tokenizer
 
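
As a minimal usage sketch for the checkpoint this README describes (the repo id, prompt wording, and output format below are assumptions for illustration, not taken from the model card), the fine-tuned T5-small can be loaded with the standard `transformers` seq2seq classes:

```python
# Sketch only: "usmankhanic/apex-e3-func-call-t5" is a hypothetical repo id, and the
# example query and output format are illustrative -- the card does not pin them down.
from transformers import T5ForConditionalGeneration, T5Tokenizer

MODEL_ID = "usmankhanic/apex-e3-func-call-t5"  # hypothetical; replace with the real repo id

tokenizer = T5Tokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)

# A plain-English query that should map to a call such as run_backtest(...)
query = "Backtest a momentum strategy on US large caps from 2015 to 2020"

inputs = tokenizer(query, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)

# Expected: a function-call structure with the extracted parameters, e.g.
# run_backtest(strategy="momentum", universe="US large caps", start=2015, end=2020)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```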