An agent-native canvas where notes, code, charts, visual explainers, widgets, documents and AI work together on one board.
For the ones who think in **connections**, not bullet points.
🌐 Website · 🚀 Live App · 🎥 Watch Demo · 📄 MIT License
⭐ Star if Dim0 is useful to you. It helps others find the project.
Canvas workspace: notes, charts, visual explainers, and AI agent on the same board.
dim0.mp4
Assistant performs multi-reasoning steps with tool calls.
Dim0 is built around one architectural bet: the canvas, not the chat, should be the primary interface for thinking with AI.
Many tools bolt AI onto an existing product. Dim0 is built the other way around: the board is a powerful workspace in its own right, where notes, code, widgets, nested boards, presentation frames, and documents live side by side, and the agent is designed to read that context, use tools, and write results directly back into the workspace.
The agent is not just a chatbot. It can:
- Read live board context and selected nodes
- Reason across multiple steps with tool use
- Search the web and execute code
- Create and edit notes directly on the board
- Generate widgets and visual outputs inside the canvas
play-with-canvas.mp4
Everything on the board is a node. Dim0 supports:
- 🔷 Shape nodes for diagrams and spatial structure
- 📝 Rich text notes for writing and editing inside the board
- 💻 Code sandbox nodes for writing and running code
- 📊 Widget nodes for embedded HTML/JS outputs like charts, visual explainers, and interactive tools
- 📄 Document nodes for uploaded files and retrieval context
- 🗂️ Nested boards for hierarchical organization
- 🖼️ Frame-based presentation directly from the canvas
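As a rough mental model of the node taxonomy above, everything on the board can be thought of as one record type with a kind and optional children. This is a hypothetical sketch only; the names and fields are illustrative, not Dim0's actual schema.

```python
# Hypothetical sketch of a board node; Dim0's real data model may differ.
from dataclasses import dataclass, field

@dataclass
class Node:
    id: str
    kind: str                      # e.g. "shape" | "note" | "code" | "widget" | "document" | "board" | "frame"
    position: tuple[float, float]  # where the node sits on the canvas
    content: str = ""
    children: list["Node"] = field(default_factory=list)  # nested boards hold child nodes

# A nested board is itself a node whose children are other nodes.
board = Node(id="root", kind="board", position=(0.0, 0.0))
board.children.append(Node(id="n1", kind="note", position=(120.0, 40.0), content="Everything is a node"))
print(len(board.children))  # 1
```

Treating every element as a node is what lets the agent read and write the whole board through one uniform interface.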
The assistant layer is built on the OpenAI Agents SDK and board-aware tools. It can work with:
- 🧠 Board context from the current graph and selected nodes
- ✏️ Note creation and editing tools
- 🔎 Web search and fetch tools
- 🧪 Code execution via Daytona-backed sandboxes
- 🎨 Widget and visual output generation
- 🧠 Semantic storage and retrieval backed by Qdrant
Model support includes OpenAI, Anthropic, Google Gemini, Mistral, Moonshot, DeepSeek, Qwen, and Z.ai.
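The tools listed above are resolved by name when the model emits a tool call. The real product wires this through the OpenAI Agents SDK; the registry below is only a minimal stand-alone sketch of the dispatch idea, with hypothetical tool names and behavior.

```python
# Hypothetical sketch of board-aware tool dispatch; Dim0's actual tools
# are registered through the OpenAI Agents SDK, not this registry.
from typing import Callable

TOOLS: dict[str, Callable[..., str]] = {}

def tool(name: str):
    """Register a function as an agent-callable tool."""
    def wrap(fn):
        TOOLS[name] = fn
        return fn
    return wrap

@tool("create_note")
def create_note(text: str) -> str:
    # In the real app this would write a note node onto the board.
    return f"note created: {text}"

@tool("web_search")
def web_search(query: str) -> str:
    # In the real app this would call a search provider.
    return f"results for: {query}"

# The agent loop resolves a tool call by name and executes it:
print(TOOLS["create_note"]("hello board"))  # note created: hello board
```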
- Hosted app: https://app.dim0.net
- Website: https://dim0.net
- Self-host: follow the setup below
This repository contains the full Dim0 product stack:
- `backend/`: API, agent logic, prompts, model integrations, persistence
- `webui/`: React frontend for the canvas, chat, and board UX
- `build/`: Docker Compose and build-related assets
- Node.js (LTS recommended)
- `uv` for Python dependency management
- Docker + Docker Compose (optional, recommended for local services)
Before running Dim0, create a root `.env` from `.env.sample` and review the variables there:

```bash
cp .env.sample .env
```

At minimum, set:

- `OPENAI_API_KEY`
- `MISTRAL_API_KEY`
- `OPENROUTER_API_KEY`
- `LINKUP_API_KEY`

Additional providers and tools can be enabled through the rest of `.env.sample`.
Important notes:
- Backend and frontend both read the root `.env`
- Only variables prefixed with `VITE_` are exposed to the frontend
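The prefix rule matters because backend secrets must never leak into the browser bundle. As a stand-alone illustration (not Dim0's or Vite's actual loading code), here is what filtering by the `VITE_` prefix looks like conceptually:

```python
# Illustration only: a VITE_-style prefix filter over simple KEY=VALUE lines.
# Dim0's real env loading is done by the backend and by Vite itself.

def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping comments and blanks."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def frontend_visible(env: dict[str, str]) -> dict[str, str]:
    """Only VITE_-prefixed variables reach the frontend."""
    return {k: v for k, v in env.items() if k.startswith("VITE_")}

sample = """
# Backend-only secrets stay server-side
OPENAI_API_KEY=sk-placeholder
VITE_API_URL=http://localhost:8081
"""
print(frontend_visible(parse_env(sample)))  # {'VITE_API_URL': 'http://localhost:8081'}
```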
Pull and start the published stack:
```bash
make pull
make run
```

Open http://localhost:3000.
Stop it:
```bash
make down-run
```

Stop it and remove volumes:
```bash
make kill-run
```

If you want to run the source code locally instead of the published images, use the steps below.
Start the local database services, then run the backend:

```bash
make up-db
```

```bash
cd backend
uv sync
uv run python -m topix.api.app
```

The backend uses `API_PORT` from `.env` and defaults to 8081.
```bash
cd webui
npm install
npm run dev
```

The frontend uses `APP_PORT` from `.env` and defaults to 5175.
The root `.env.sample` is the source of truth for available configuration.
It includes app ports and origins, model provider keys, search and image provider keys, local service settings, and backend auth and tracing options.
Use it as a checklist when setting up your local environment.
Deployment and local services are managed through Docker Compose with Makefile shortcuts.
| Command | What it does |
|---|---|
| `make up` | Build if needed and start all services |
| `make up-build` | Rebuild images, then start all services |
| `make build` | Build images only |
| `make rebuild` | Rebuild images without cache |
| `make down` | Stop and remove containers |
| `make kill` | Stop and remove containers, images, and volumes |
| Command | What it does |
|---|---|
| `make ps` | Show service status |
| `make logs` | Tail logs for all services |
| `make logs-s SERVICE=backend-dev` | Tail logs for one service |
| `make up-s SERVICE=backend-dev` | Start one service |
| `make build-s SERVICE=webui-dev` | Build one service |
| `make restart-s SERVICE=backend-dev` | Rebuild and restart one service |
| `make exec SERVICE=backend-dev CMD="bash"` | Open a shell in a service |
| Command | What it does |
|---|---|
| `make up-db` | Start only database services |
| `make down-db` | Stop only database services |
You can override the compose profile and env file at invocation time:
```bash
make up PROFILE=local ENVFILE=.env
```

You can also override ports and origins for quick tests:
```bash
make up PROFILE=dev API_PORT=9090 API_HOST_PORT=9090 API_ORIGIN=http://localhost:9090
```

This repo can publish public Docker Hub images for self-hosting:
- `winlp4ever/dim0-backend`
- `winlp4ever/dim0-webui`
Example:
```bash
docker pull winlp4ever/dim0-backend:0.1.5
docker pull winlp4ever/dim0-webui:0.1.5
```

You can also run the published images locally:
```bash
make pull
make run
make down-run
make kill-run
```

Dim0 uses one shared semantic version for the whole product. The source of truth is the repo-root `VERSION` file, and release tooling syncs that version into:
- `backend/pyproject.toml`
- `webui/package.json`
- `webui/src-tauri/Cargo.toml`
Version bumps use Commitizen with Conventional Commits.
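Under Conventional Commits, the commit type determines which part of the version moves. The sketch below shows the common mapping that tools like Commitizen apply (the exact rules Dim0 configures may differ), with the version string and commit messages as illustrative examples:

```python
# Sketch of the Conventional Commits -> semver mapping; Commitizen's
# configured rules in this repo may differ in detail.

def bump_kind(commit_msg: str) -> str:
    """Decide the bump level from a conventional commit message."""
    header = commit_msg.splitlines()[0]
    if "BREAKING CHANGE" in commit_msg or "!" in header.split(":")[0]:
        return "major"  # breaking changes bump the major version
    if header.startswith("feat"):
        return "minor"  # new features bump the minor version
    return "patch"      # fixes, chores, docs, etc. bump the patch version

def bump(version: str, kind: str) -> str:
    major, minor, patch = map(int, version.split("."))
    if kind == "major":
        return f"{major + 1}.0.0"
    if kind == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

print(bump("0.1.5", bump_kind("feat: add widget nodes")))   # 0.2.0
print(bump("0.1.5", bump_kind("fix: resolve port clash")))  # 0.1.6
```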
Useful commands:
```bash
make version-check
make version-sync
make version-bump
```

The repository also includes GitHub Actions workflows for version checks, releases, and Docker publishing.
- If the frontend cannot reach the API, check `VITE_API_URL` in `.env`
- If ports are already in use, change `API_PORT` or `APP_PORT`
- If env changes are not applied, restart the backend and frontend after editing `.env`
- Use `make config` to inspect the fully resolved Compose configuration
This repository is available under the MIT License.