
Commit 5dcf1ae

noahkiss and claude committed
Add openai-local provider for custom OpenAI-compatible endpoints
- Add OpenAiLocalProvider class that auto-discovers models via /models endpoint
- Add example entry in llms.json
- Fix config-example docs to use correct npm/api syntax (was type/base_url)
- Add link to main docs site in config-example README

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
1 parent 5dbd9ac · commit 5dcf1ae

3 files changed

Lines changed: 19 additions & 9 deletions

config-example/README.md

Lines changed: 8 additions & 9 deletions
````diff
@@ -1,6 +1,6 @@
 # Custom Configuration Example
 
-This directory shows how to use custom `llms.json` and `providers.json` configuration files with the Docker container.
+This directory shows how to use custom `llms.json` and `providers.json` configuration files with the Docker container. For full documentation, see [llmspy.org/docs/configuration](https://llmspy.org/docs/configuration).
 
 ## Quick Start
 
@@ -93,24 +93,23 @@ docker run -p 8000:8000 \
 
 ### Custom Provider Configuration
 
-Add a custom OpenAI-compatible provider:
+Add a custom OpenAI-compatible provider (LiteLLM, vLLM, text-generation-webui, etc.):
 
 ```json
 {
   "providers": {
-    "my-custom-provider": {
+    "my-backend": {
       "enabled": true,
-      "type": "OpenAiProvider",
-      "base_url": "https://api.example.com/v1",
-      "api_key": "$MY_CUSTOM_API_KEY",
-      "models": {
-        "my-model": "provider-model-name"
-      }
+      "npm": "openai-local",
+      "api": "http://localhost:8000/v1",
+      "api_key": "$MY_BACKEND_API_KEY"
     }
   }
 }
 ```
 
+The `openai-local` provider auto-discovers models via the `/models` endpoint. You can name the provider anything you want and use any environment variable for the API key. (Note: `npm` specifies the provider type, not an npm package.)
+
 ### Enable Only Free Providers
 
 ```json
````

llms/llms.json

Lines changed: 6 additions & 0 deletions
```diff
@@ -247,6 +247,12 @@
     "api": "http://127.0.0.1:1234/v1",
     "models": {}
   },
+  "openai-local": {
+    "enabled": false,
+    "npm": "openai-local",
+    "api": "http://localhost:8000/v1",
+    "api_key": "$OPENAI_LOCAL_API_KEY"
+  },
   "google": {
     "enabled": true,
     "safety_settings": [
```

llms/main.py

Lines changed: 5 additions & 0 deletions
```diff
@@ -1431,6 +1431,10 @@ async def get_models(self):
         return ret
 
 
+class OpenAiLocalProvider(LMStudioProvider):
+    sdk = "openai-local"
+
+
 def get_provider_model(model_name):
     for provider in g_handlers.values():
         provider_model = provider.provider_model(model_name)
@@ -2838,6 +2842,7 @@ def __init__(self, cli_args: argparse.Namespace, extra_args: Dict[str, Any]):
             CodestralProvider,
             OllamaProvider,
             LMStudioProvider,
+            OpenAiLocalProvider,
         ]
         self.aspect_ratios = {
             "1:1": "1024×1024",
```
