
Commit d8c4b98

Add Python project structure, models, and tools
Co-authored-by: MinecraftFuns <25814618+MinecraftFuns@users.noreply.github.com>
1 parent 1ffbe53 commit d8c4b98

17 files changed

Lines changed: 2449 additions & 0 deletions

python/langchain/.python-version

Lines changed: 1 addition & 0 deletions
3.12

python/langchain/README.md

Lines changed: 133 additions & 0 deletions
# PromptPipe Agent

LangChain-based agentic conversation flow for PromptPipe. This Python service handles all intelligent conversation processing, while the Go layer acts as a pure message delivery service.

## Architecture

This service implements a 3-bot architecture using LangChain:

- **Coordinator Agent**: Routes conversations and manages overall flow
- **Intake Agent**: Conducts intake conversations and builds user profiles
- **Feedback Agent**: Tracks user feedback and updates profiles

### Tools

Each agent has access to specialized tools:

- **StateTransitionTool**: Manages transitions between conversation states
- **ProfileSaveTool**: Saves and retrieves user profiles
- **SchedulerTool**: Schedules daily habit prompts
- **PromptGeneratorTool**: Generates personalized habit prompts

## Installation

This project uses `uv` for package management:

```bash
# Install dependencies
uv sync

# Install with dev dependencies
uv sync --extra dev
```

## Configuration

Create a `.env` file:

```bash
OPENAI_API_KEY=your_openai_api_key
OPENAI_MODEL=gpt-4o-mini
OPENAI_TEMPERATURE=0.1

# Path to Go application state directory
PROMPTPIPE_STATE_DIR=/var/lib/promptpipe

# Prompt files
INTAKE_BOT_PROMPT_FILE=../../prompts/intake_bot_system.txt
COORDINATOR_PROMPT_FILE=../../prompts/conversation_system_3bot.txt
FEEDBACK_TRACKER_PROMPT_FILE=../../prompts/feedback_tracker_system.txt
PROMPT_GENERATOR_PROMPT_FILE=../../prompts/prompt_generator_system.txt

# Timeouts
FEEDBACK_INITIAL_TIMEOUT=15m
FEEDBACK_FOLLOWUP_DELAY=3h

# API Configuration
API_HOST=0.0.0.0
API_PORT=8001
```

## Running

```bash
# Development mode
uv run uvicorn promptpipe_agent.api.main:app --reload --port 8001

# Production mode
uv run uvicorn promptpipe_agent.api.main:app --host 0.0.0.0 --port 8001
```

## Testing

```bash
# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=promptpipe_agent --cov-report=html

# Run specific test file
uv run pytest tests/unit/test_coordinator_agent.py
```

## Development

```bash
# Format code
uv run black promptpipe_agent tests

# Lint
uv run ruff check promptpipe_agent tests

# Type check
uv run mypy promptpipe_agent
```

## API Endpoints

### POST /process-message

Process a user message through the conversation flow.

**Request:**

```json
{
  "participant_id": "part_abc123",
  "message": "User's message text",
  "phone_number": "+15551234567"
}
```

**Response:**

```json
{
  "response": "Agent's response text",
  "state": "COORDINATOR",
  "metadata": {}
}
```
### GET /health

Health check endpoint.

## Integration with Go Service

The Go service delegates message processing to this Python service via HTTP:

1. Go receives a WhatsApp message
2. Go calls the Python service's `/process-message` endpoint
3. Python processes the message through the appropriate agent
4. Python returns the response
5. Go sends the response via WhatsApp
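The steps above can be sketched end to end. This is an illustrative sketch only, not code from either service: the function name `relay_message` is hypothetical, and the HTTP call and WhatsApp send are passed in as callables standing in for the real clients.

```python
# Illustrative sketch of the delivery-layer flow; names are hypothetical.
# `post` stands in for an HTTP client calling the Python service, and
# `send_whatsapp` stands in for the WhatsApp sender in the Go layer.
def relay_message(participant_id: str, phone_number: str, text: str,
                  post, send_whatsapp) -> str:
    payload = {
        "participant_id": participant_id,
        "message": text,
        "phone_number": phone_number,
    }
    result = post("/process-message", payload)       # steps 2-4
    send_whatsapp(phone_number, result["response"])  # step 5
    return result["state"]
```

Injecting the two side effects keeps the flow itself trivial to exercise with fakes in tests.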

python/langchain/promptpipe_agent/__init__.py

Whitespace-only changes.

python/langchain/promptpipe_agent/agents/__init__.py

Whitespace-only changes.

python/langchain/promptpipe_agent/api/__init__.py

Whitespace-only changes.
Lines changed: 68 additions & 0 deletions
"""Configuration management for PromptPipe Agent."""

from pydantic import Field
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    """Application settings loaded from environment variables."""

    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        case_sensitive=False,
        extra="ignore",
    )

    # OpenAI Configuration
    openai_api_key: str = Field(..., description="OpenAI API key")
    openai_model: str = Field(default="gpt-4o-mini", description="OpenAI model to use")
    openai_temperature: float = Field(default=0.1, description="OpenAI temperature")

    # State Directory
    promptpipe_state_dir: str = Field(
        default="/var/lib/promptpipe", description="PromptPipe state directory"
    )

    # Prompt Files (relative to project root or absolute paths)
    intake_bot_prompt_file: str = Field(
        default="../../prompts/intake_bot_system.txt",
        description="Path to intake bot system prompt",
    )
    coordinator_prompt_file: str = Field(
        default="../../prompts/conversation_system_3bot.txt",
        description="Path to coordinator system prompt",
    )
    feedback_tracker_prompt_file: str = Field(
        default="../../prompts/feedback_tracker_system.txt",
        description="Path to feedback tracker system prompt",
    )
    prompt_generator_prompt_file: str = Field(
        default="../../prompts/prompt_generator_system.txt",
        description="Path to prompt generator system prompt",
    )

    # Timeouts
    feedback_initial_timeout: str = Field(
        default="15m", description="Initial feedback timeout (e.g., 15m)"
    )
    feedback_followup_delay: str = Field(
        default="3h", description="Follow-up feedback delay (e.g., 3h)"
    )

    # Chat History
    chat_history_limit: int = Field(
        default=-1,
        description="Limit for chat history (-1: unlimited, 0: none, N: last N messages)",
    )

    # API Configuration
    api_host: str = Field(default="0.0.0.0", description="API host")
    api_port: int = Field(default=8001, description="API port")

    # Debug Mode
    debug: bool = Field(default=False, description="Enable debug mode")


# Global settings instance
settings = Settings()
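The two timeout settings hold Go-style duration strings ("15m", "3h"). A hypothetical helper, not part of this commit, could convert the simple single-unit forms to `timedelta` on the Python side:

```python
import re
from datetime import timedelta

# Hypothetical helper, not part of this commit: converts simple
# single-unit Go-style durations ("15m", "3h") to timedelta.
# Compound forms like "1h30m" are deliberately not handled here.
_UNITS = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days"}


def parse_duration(value: str) -> timedelta:
    match = re.fullmatch(r"(\d+)([smhd])", value.strip())
    if match is None:
        raise ValueError(f"unrecognized duration: {value!r}")
    amount, unit = match.groups()
    return timedelta(**{_UNITS[unit]: int(amount)})
```

With this, `parse_duration(settings.feedback_initial_timeout)` would yield a usable `timedelta` for scheduling.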

python/langchain/promptpipe_agent/models/__init__.py

Whitespace-only changes.
Lines changed: 89 additions & 0 deletions
"""Data models for PromptPipe Agent."""

from datetime import datetime
from enum import Enum
from typing import Any, Optional

from pydantic import BaseModel, Field


class ConversationState(str, Enum):
    """Conversation flow states."""

    COORDINATOR = "COORDINATOR"
    INTAKE = "INTAKE"
    FEEDBACK = "FEEDBACK"
    CONVERSATION_ACTIVE = "CONVERSATION_ACTIVE"


class MessageRole(str, Enum):
    """Message roles in conversation."""

    USER = "user"
    ASSISTANT = "assistant"
    SYSTEM = "system"


class ConversationMessage(BaseModel):
    """A single message in the conversation history."""

    role: MessageRole
    content: str
    timestamp: datetime = Field(default_factory=datetime.now)


class ConversationHistory(BaseModel):
    """Full conversation history for a participant."""

    messages: list[ConversationMessage] = Field(default_factory=list)


class UserProfile(BaseModel):
    """User profile containing personalization data."""

    participant_id: str
    habit_domain: Optional[str] = None
    prompt_anchor: Optional[str] = None
    motivational_frame: Optional[str] = None
    preferred_time: Optional[str] = None
    other_personalization: Optional[str] = None
    created_at: datetime = Field(default_factory=datetime.now)
    updated_at: datetime = Field(default_factory=datetime.now)

    def to_context_string(self) -> str:
        """Convert profile to a context string for the LLM."""
        parts = []
        if self.habit_domain:
            parts.append(f"Habit Domain: {self.habit_domain}")
        if self.prompt_anchor:
            parts.append(f"Prompt Anchor: {self.prompt_anchor}")
        if self.motivational_frame:
            parts.append(f"Motivational Frame: {self.motivational_frame}")
        if self.preferred_time:
            parts.append(f"Preferred Time: {self.preferred_time}")
        if self.other_personalization:
            parts.append(f"Other Personalization: {self.other_personalization}")
        return "\n".join(parts) if parts else "No profile data available."


class ProcessMessageRequest(BaseModel):
    """Request to process a user message."""

    participant_id: str
    message: str
    phone_number: str


class ProcessMessageResponse(BaseModel):
    """Response from processing a user message."""

    response: str
    state: ConversationState
    metadata: dict[str, Any] = Field(default_factory=dict)


class HealthResponse(BaseModel):
    """Health check response."""

    status: str
    version: str = "0.1.0"
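The conversation history above pairs with the `chat_history_limit` setting, documented as "-1: unlimited, 0: none, N: last N messages". A minimal sketch of those semantics, applied to a plain message list (the helper name is hypothetical and the committed code may trim differently):

```python
# Hypothetical sketch, not part of this commit: applies the documented
# chat_history_limit semantics to a list of messages.
def apply_history_limit(messages: list, limit: int) -> list:
    if limit < 0:             # -1 (or any negative): keep everything
        return list(messages)
    if limit == 0:            # 0: keep no history
        return []
    return messages[-limit:]  # N: keep only the last N messages
```

A function like this would be called on `ConversationHistory.messages` just before the history is handed to an agent.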
