
Commit 774e981

Default server host to localhost for improved security

Set the default server host to '127.0.0.1' in the server configuration and update all relevant server start calls to use this value. The README is updated with a security note explaining the default binding and instructions for allowing external connections. This change helps prevent unintended external access by default.

1 parent 9814b58 commit 774e981

2 files changed: 8 additions & 6 deletions
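The localhost-by-default, opt-in-override behavior described in the commit message is a common CLI pattern. A minimal, hypothetical sketch using `argparse` (optillm's actual flag parsing may differ):

```python
import argparse

# Hypothetical sketch of a localhost-by-default --host flag; not optillm's
# real argument parser.
parser = argparse.ArgumentParser()
parser.add_argument('--host', default='127.0.0.1',
                    help="Interface to bind; use 0.0.0.0 to allow external connections")
parser.add_argument('--port', type=int, default=8000)

args = parser.parse_args([])  # no CLI args: fall back to the safe defaults
print(args.host)  # → 127.0.0.1
```

An operator who genuinely needs external access must then opt in explicitly with `--host 0.0.0.0`, which is the point of the change.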

README.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -216,15 +216,15 @@ You can then run the optillm proxy as follows.
 ```bash
 python optillm.py
 2024-09-06 07:57:14,191 - INFO - Starting server with approach: auto
-2024-09-06 07:57:14,191 - INFO - Server configuration: {'approach': 'auto', 'mcts_simulations': 2, 'mcts_exploration': 0.2, 'mcts_depth': 1, 'best_of_n': 3, 'model': 'gpt-4o-mini', 'rstar_max_depth': 3, 'rstar_num_rollouts': 5, 'rstar_c': 1.4, 'base_url': ''}
+2024-09-06 07:57:14,191 - INFO - Server configuration: {'approach': 'auto', 'mcts_simulations': 2, 'mcts_exploration': 0.2, 'mcts_depth': 1, 'best_of_n': 3, 'model': 'gpt-4o-mini', 'rstar_max_depth': 3, 'rstar_num_rollouts': 5, 'rstar_c': 1.4, 'base_url': '', 'host': '127.0.0.1'}
  * Serving Flask app 'optillm'
  * Debug mode: off
 2024-09-06 07:57:14,212 - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
- * Running on all addresses (0.0.0.0)
  * Running on http://127.0.0.1:8000
- * Running on http://192.168.10.48:8000
 2024-09-06 07:57:14,212 - INFO - Press CTRL+C to quit
 ```
+
+> **Security Note**: By default, optillm binds to `127.0.0.1` (localhost only) for security. To allow external connections (e.g., for Docker or remote access), use `--host 0.0.0.0`. Only do this on trusted networks or with proper authentication configured via `--optillm-api-key`.
 ## Usage
 
 Once the proxy is running, you can use it as a drop in replacement for an OpenAI client by setting the `base_url` as `http://localhost:8000/v1`.
````
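The security note added in this diff hinges on the difference between binding to `127.0.0.1` and `0.0.0.0`. A quick standard-library illustration of the two bindings (port 0 lets the OS pick a free port):

```python
import socket

# A socket bound to 127.0.0.1 accepts connections only via the loopback
# interface; one bound to 0.0.0.0 listens on every interface on the machine.
loopback = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback.bind(('127.0.0.1', 0))  # port 0: let the OS choose a free port
print(loopback.getsockname()[0])  # → 127.0.0.1

wildcard = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
wildcard.bind(('0.0.0.0', 0))
print(wildcard.getsockname()[0])  # → 0.0.0.0

loopback.close()
wildcard.close()
```

This is why the old startup log showed the machine's LAN address (`192.168.10.48`) alongside loopback, while the new default reports only `127.0.0.1`.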

optillm/server.py

Lines changed: 5 additions & 3 deletions

```diff
@@ -189,6 +189,7 @@ def count_reasoning_tokens(text: str, tokenizer=None) -> int:
     'base_url': '',
     'optillm_api_key': '',
     'return_full_response': False,
+    'host': '127.0.0.1',  # Default to localhost for security; use 0.0.0.0 to allow external connections
     'port': 8000,
     'log': 'info',
     'ssl_verify': True,
@@ -1264,7 +1265,8 @@ def process_batch_requests(batch_requests):
         import gradio as gr
         # Start server in a separate thread
         import threading
-        server_thread = threading.Thread(target=app.run, kwargs={'host': '0.0.0.0', 'port': port})
+        host = server_config['host']
+        server_thread = threading.Thread(target=app.run, kwargs={'host': host, 'port': port})
         server_thread.daemon = True
         server_thread.start()
```
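The hunk above keeps Flask's `app.run` on a background daemon thread so the main thread stays free for the Gradio UI. The same pattern, sketched with the standard library's `http.server` standing in for Flask (illustrative only, not optillm code):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal handler that answers every GET with "ok".
class Ping(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'ok')

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(('127.0.0.1', 0), Ping)  # port 0: pick any free port
thread = threading.Thread(target=server.serve_forever)
thread.daemon = True  # daemon thread: does not block interpreter exit
thread.start()

# The main thread is free to do other work; here it just pings the server.
port = server.server_address[1]
with urllib.request.urlopen(f'http://127.0.0.1:{port}/') as resp:
    body = resp.read()
print(body.decode())  # → ok
server.shutdown()
```

Marking the thread as a daemon mirrors the diff's `server_thread.daemon = True`: the process can exit without explicitly stopping the embedded server.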

```diff
@@ -1311,12 +1313,12 @@ def chat_with_optillm(message, history):
             description=f"Connected to OptILLM proxy at {base_url}"
         )
         demo.queue()  # Enable queue to handle long operations properly
-        demo.launch(server_name="0.0.0.0", share=False)
+        demo.launch(server_name=host, share=False)
     except ImportError:
         logger.error("Gradio is required for GUI. Install it with: pip install gradio")
         return
 
-    app.run(host='0.0.0.0', port=port)
+    app.run(host=server_config['host'], port=port)
 
 if __name__ == "__main__":
     main()
```
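The final hunk reads the bind address with `server_config['host']`, which is safe here because the diff adds `'host'` to the defaults. When a config dict might omit the key, a guarded lookup with a safe fallback is a common safeguard (hypothetical helper, not from optillm):

```python
# Hypothetical illustration of falling back to a safe localhost default when a
# config dict omits 'host'; optillm itself puts the key in its defaults instead.
DEFAULT_HOST = '127.0.0.1'

def resolve_host(config: dict) -> str:
    return config.get('host', DEFAULT_HOST)

print(resolve_host({'port': 8000}))       # → 127.0.0.1
print(resolve_host({'host': '0.0.0.0'}))  # → 0.0.0.0
```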
