This repository was archived by the owner on Jul 1, 2024. It is now read-only.

Commit c836951: update docs

1 parent d019256

11 files changed

Lines changed: 347 additions & 17 deletions

docs/content/1.getting-started/1.index.md

Lines changed: 12 additions & 1 deletion
@@ -3,4 +3,15 @@ title: Introduction
description: Welcome to Ava, your very own personal assistant.
---

Ava is a personal assistant designed to be easy to use and highly customizable. It can help you with a variety of tasks, such as automating routines, retrieving information, controlling your smart home, and performing custom actions based on your needs.

Ava is built on top of Large Language Models (LLMs) and Text Embedding Models. You can configure Ava to use different LLMs and Text Embedding Models to suit your needs.

You can extend Ava's functionality by adding new skills, which are simply regular HTTP calls with instruction definitions for the LLMs to consider.

Ava can be set up in two ways:

- [Home Assistant Add-on and Component](/installation/haos) - The easiest way to get started with Ava.
- [Standalone Docker Container](/installation/standalone) - Run Ava on any machine with Docker.

API documentation is on the way; for now, you can use the `/v1/chat/completions` endpoint with `POST`, the same way you would use the OpenAI API to interact with your Ava instance. Compatibility is not guaranteed, but it should work for most use cases, except for streaming.
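For example, a chat completion request can be sent with `curl`. A minimal sketch, assuming an instance reachable at `homeassistant.local:3000` and that the `model` field accepts a placeholder value; adjust both for your deployment:

```shell
# Hypothetical address; point this at your own Ava instance.
AVA_URL="http://homeassistant.local:3000"

# OpenAI-style chat completion payload (streaming is not supported).
cat > /tmp/ava-request.json <<'EOF'
{
  "model": "ava",
  "messages": [
    {"role": "user", "content": "What can you help me with?"}
  ]
}
EOF

# This call only succeeds if an Ava instance is reachable at AVA_URL.
curl -sS -X POST "$AVA_URL/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d @/tmp/ava-request.json || echo "Ava instance not reachable"
```

The response follows the OpenAI chat completion shape for most use cases, so existing OpenAI client tooling can usually be pointed at this endpoint.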

docs/content/2.installation/1.haos.md

Lines changed: 7 additions & 8 deletions
@@ -3,19 +3,18 @@ title: Home Assistant OS
description: Get Ava up and running on your Home Assistant OS instance
---

You will need to install the Ava add-on repository in Home Assistant, install the Ollama and Ava Server add-ons, and then configure the Ava custom integration in Home Assistant. The process takes 5-10 minutes, assuming you already have a Home Assistant OS instance running.

### Add-ons

[![Open Home Assistant instance and add repository](https://my.home-assistant.io/badges/supervisor_add_addon_repository.svg)](https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https://github.com/0x77dev/ava)

- **Ollama**: The easiest way to get up and running with large language models locally.
- **Ava Server**: Ava server for Home Assistant integration and an OpenAI chat completion compatible endpoint.

Just install the add-ons from the repository and configure them as needed.

Ava Server has a default configuration with Anthropic as the LLM and the Ollama add-on as the embeddings provider.

### Custom component

@@ -27,8 +26,8 @@

Copy the [`custom_components/ava`](./custom_components/ava) folder to `/config/custom_components` in Home Assistant.

Alternatively, you can download the custom component zip via [GitZip](https://kinolien.github.io/gitzip/?download=https://github.com/0x77dev/ava/tree/main/custom_components/ava).

#### Configuration

Set up the Ava integration in Home Assistant and configure the server URL and API key. It comes with a default configuration for the Home Assistant Ava Server add-on.

docs/content/2.installation/2.standalone.md

Lines changed: 38 additions & 1 deletion
@@ -3,4 +3,41 @@ title: Standalone
description: Get Ava up and running on any machine with Docker
---

If you plan to run Ava on a machine without Home Assistant, you can use the standalone Docker container. This container is designed to be easy to use and highly customizable.

## Docker Image

[GitHub Packages](https://github.com/0x77dev/ava/pkgs/container/ava%2Fserver)

We provide both arm64 and amd64 images for Ava. You can pull the latest image from the GitHub Container Registry by running:

### Release Channels

- `latest` - Latest stable release
- `edge` - Latest development release
- `vX.Y.Z` - Specific version

### Pulling the image

```bash
docker pull ghcr.io/0x77dev/ava/server:edge
```

### Configuration

Refer to the [Configuration section](/configuration) for more information on how to configure Ava using environment variables.

### Running the container

```bash
export LLM='{"namespace":"anthropic","name":"claude-3-opus-20240229","token":"place your token here"}'
export EMBEDDINGS='{"namespace":"ollama","name":"snowflake-arctic-embed","host":"http://host.docker.internal:11434"}'
export SKILLS='[]'
export PORT=3000

docker run -d --name ava-server \
  -p $PORT:$PORT \
  -e LLM -e EMBEDDINGS -e SKILLS -e PORT \
  ghcr.io/0x77dev/ava/server:edge
```
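Because `LLM` and `EMBEDDINGS` are JSON strings, a single stray quote will break the server at startup. A quick sanity check before launching the container (a sketch; `python3` is used here only as a convenient JSON validator):

```shell
export LLM='{"namespace":"anthropic","name":"claude-3-opus-20240229","token":"place your token here"}'
export EMBEDDINGS='{"namespace":"ollama","name":"snowflake-arctic-embed","host":"http://host.docker.internal:11434"}'

# Fail fast on malformed JSON before docker run ever sees it.
for var in LLM EMBEDDINGS; do
  if printf '%s' "${!var}" | python3 -m json.tool > /dev/null; then
    echo "$var: valid JSON"
  else
    echo "$var: invalid JSON" >&2
    exit 1
  fi
done
```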
Lines changed: 200 additions & 1 deletion
@@ -1,5 +1,204 @@
---
title: Configuration
description: Configure Ava to your liking
---

You can configure skills and models in Ava Server by setting their respective environment variables. The following is a list of all available environment variables and their default values.

The configuration consists of four environment variables: `LLM`, `EMBEDDINGS`, `SKILLS`, and `HOMEASSISTANT`.

## Schema

::field{name="LLM" type="JSON String" required}
::collapsible
The `LLM` environment variable is used to set the language model that Ava will use. Examples for different providers:

::tabs
::div
---
label: Anthropic
icon: i-simple-icons-anthropic
---

```json
{
  "namespace": "anthropic",
  "name": "claude-3-opus-20240229",
  "token": "YOUR_API_TOKEN"
}
```
::

::div
---
label: OpenAI
icon: i-simple-icons-openai
---

```json
{
  "namespace": "openai",
  "name": "gpt-3.5-turbo",
  "token": "YOUR_API_TOKEN"
}
```
::

::div
---
label: Ollama
icon: i-ava-ollama-logo
---

```json
{
  "namespace": "ollama",
  "name": "llama3"
}
```
::
::
::

::field{name="EMBEDDINGS" type="JSON String" required}
::collapsible
The `EMBEDDINGS` environment variable is used to set the text embeddings model that Ava will use. Examples for different providers:

::tabs
::div
---
label: OpenAI
icon: i-simple-icons-openai
---

```json
{
  "namespace": "openai",
  "name": "text-embedding-3-small",
  "token": "YOUR_API_TOKEN"
}
```
::

::div
---
label: Ollama
icon: i-ava-ollama-logo
---

```json
{
  "namespace": "ollama",
  "name": "snowflake-arctic-embed:22m",
  // You can also specify the base URL if not localhost
  "baseURL": "http://04c4e5a1-ollama:11434"
}
```
::
::
::

::field{name="SKILLS" type="JSON String"}
::collapsible
The `SKILLS` environment variable is used to set the HTTP skills that Ava will use. Here are both a configuration example and a simple skill implementation:

::tabs
::div
---
label: Configuration
---

```json
[
  {
    "name": "Calculator",
    // You can describe the skill here,
    // maybe add some examples or instructions for the model to consider
    "description": "Performs mathematical calculations, input example: 2 + 2",
    "url": "http://localhost:3000/skills/calculator",
    "returnDirect": true
  },
  {
    "name": "Wikipedia",
    "description": "Searches Wikipedia for information",
    "url": "http://localhost:3000/skills/wikipedia",
    "returnDirect": false
  }
]
```
::

::div
---
label: Node.js Example
icon: i-simple-icons-javascript
---

```javascript
const { createServer } = require('http');

createServer((req, res) => {
  if (req.url === '/skills/calculator') {
    // read the body and perform the calculation;
    // the result will be returned directly to the user
    res.end('2 + 2 = 4');
  } else if (req.url === '/skills/wikipedia') {
    // read the body and perform the search
    res.end('result here will be returned to Ava to be processed further');
  } else {
    res.end('Skill not found');
  }
}).listen(3000);
```
::
::
::

::field{name="HOMEASSISTANT" type="JSON String"}
::collapsible
The `HOMEASSISTANT` environment variable is used to set the Home Assistant instance that Ava will connect to:

::tabs
::div
---
label: Configuration
---

```json
{
  "url": "http://homeassistant.local:8123",
  "token": "Your long lived access token",
  "disabledEntitiesPrefixes": ["update", "device_tracker"],
  // the supervisor option is used only inside the add-on;
  // by default the add-on will configure the HOMEASSISTANT env
  "supervisor": false
}
```
::
::
::

docs/content/4.addons/1.ollama.md

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@
---
title: Ollama
description: The easiest way to get up and running with large language models locally
---

[![Open Home Assistant instance and add repository](https://my.home-assistant.io/badges/supervisor_add_addon_repository.svg)](https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https://github.com/0x77dev/ava)

Models are stored in `/config/ollama`.

Set the `OLLAMA_HOST="http://homeassistant.local:11434"` environment variable to use the Ollama CLI from another machine.

Refer to the [Ollama documentation](https://github.com/ollama/ollama/tree/main/docs) for further details.

Examples of pulling models:

Using the Ollama CLI:
```bash
export OLLAMA_HOST="http://homeassistant.local:11434"
ollama pull nomic-embed-text
```

Using curl:
```bash
curl http://homeassistant.local:11434/api/pull -d '{
  "name": "nomic-embed-text"
}'
```

[Ava Server](/addons/server) will automatically pull models on start when they are requested from Ollama.

docs/content/4.addons/2.server.md

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
---
description: Home Assistant add-on for hosting your instance of Ava
---

[![Open Home Assistant instance and add repository](https://my.home-assistant.io/badges/supervisor_add_addon_repository.svg)](https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https://github.com/0x77dev/ava)

The default configuration uses Anthropic Claude 3 Opus as the language model, snowflake-arctic-embed via [Ollama](/addons/ollama) as the text embedding model, and an empty HTTP skills list.

In the default configuration, Ollama's `baseURL` is set to `http://04c4e5a1-ollama:11434`, which is the default address for the Ollama add-on in Ava's repository.

For more information on how to configure Ava, refer to the [Configuration section](https://ava.0x77.dev/configuration).

By default, the server will pull Ollama models on start when they are requested from Ollama.

docs/content/4.addons/_dir.yml

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
title: Add-ons Repository
description: Home Assistant Add-ons Repository documentation
