---
base_model:
- meta-llama/Llama-3.1-8B-Instruct
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- home
- assistant
status: available
---
This model is a fine-tuned version of [Llama-3.1-8B-Instruct](https://huggingface.co./meta-llama/Llama-3.1-8B-Instruct), designed to interact with and control smart home devices via a Home Assistant integration.

The model follows the Llama 3.1 prompt format. Given a system prompt that describes the Home Assistant environment, including the available devices and services, it can answer questions about that environment and generate function-calling output to execute specific tasks within the Home Assistant ecosystem.

Example "system" prompt:

```
You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed or answer the following question with the information provided only.
The current time and date is 08:12 AM on Thursday March 14, 2024
Services: light.turn_off(), light.turn_on(rgb_color,brightness), fan.turn_on(), fan.turn_off()
Devices:
light.office 'Office Light' = on;80%
fan.office 'Office fan' = off
light.kitchen 'Kitchen Light' = on;80%;red
light.bedroom 'Bedroom Light' = off
```
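For reference, the snippet below is a minimal inference sketch using the Hugging Face `transformers` library and the chat template bundled with the tokenizer. The repository id, user request, and generation settings are illustrative assumptions, not values taken from this card.

```python
# Minimal sketch, assuming the `transformers` library is installed; the repository id
# below is a placeholder for this fine-tuned checkpoint, and the user request is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/this-fine-tuned-model"  # replace with the actual repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

system_prompt = (
    "You are 'Al', a helpful AI Assistant that controls the devices in a house. "
    "Complete the following task as instructed or answer the following question "
    "with the information provided only.\n"
    "The current time and date is 08:12 AM on Thursday March 14, 2024\n"
    "Services: light.turn_off(), light.turn_on(rgb_color,brightness), "
    "fan.turn_on(), fan.turn_off()\n"
    "Devices:\n"
    "light.office 'Office Light' = on;80%\n"
    "fan.office 'Office fan' = off\n"
    "light.kitchen 'Kitchen Light' = on;80%;red\n"
    "light.bedroom 'Bedroom Light' = off"
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "turn on the kitchen lights"},
]

# apply_chat_template renders the Llama 3.1 prompt format the model expects.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The decoded text contains the assistant's reply and, when a device action is required, a `homeassistant` code block in the format described below.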
Output from the model consists of a response that should be relayed back to the user, optionally followed by a code block that invokes one of the Home Assistant "services" exposed in the system prompt. The output format for function calling is as follows:

`````
turning on the kitchen lights for you now
```homeassistant
{ "service": "light.turn_on", "target_device": "light.kitchen" }
```
`````
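A downstream integration has to split that output into the text to relay and the service call to execute. The sketch below is one possible approach, assuming Home Assistant's REST API (`POST /api/services/<domain>/<service>` with a long-lived access token); the URL, token, and the direct mapping of `target_device` to `entity_id` are assumptions for illustration rather than part of this card.

````python
# Sketch only: extracts the optional homeassistant code block from the model output
# and forwards it to Home Assistant. HA_URL, HA_TOKEN, and the target_device ->
# entity_id mapping are assumed placeholders for a typical installation.
import json
import re

import requests

HA_URL = "http://homeassistant.local:8123"   # assumed Home Assistant address
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"    # assumed long-lived access token


def handle_model_output(text: str) -> str:
    """Execute the service call, if any, then return the text to relay to the user."""
    match = re.search(r"```homeassistant\s*(\{.*?\})\s*```", text, re.DOTALL)
    if match:
        call = json.loads(match.group(1))
        domain, service = call["service"].split(".", 1)   # e.g. "light", "turn_on"
        payload = {"entity_id": call["target_device"]}    # e.g. "light.kitchen"
        requests.post(
            f"{HA_URL}/api/services/{domain}/{service}",
            headers={"Authorization": f"Bearer {HA_TOKEN}"},
            json=payload,
            timeout=10,
        )
    # Everything before the code block is the response for the user.
    return text.split("```")[0].strip()
````

In an actual integration, a function like `handle_model_output` would be called on each model completion before the reply is spoken or displayed.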