Commit b738fd9
feat(examples): migrate AI examples to OpenTelemetry instrumentation (#482)
## Problem

Our AI examples use PostHog's direct SDK wrappers (`posthog.ai.openai`, `posthog.ai.anthropic`, etc.) to track LLM calls. We want to silently deprecate these in favor of standard OpenTelemetry auto-instrumentation, which is more portable and follows industry conventions.

## Changes

Migrates the Python AI examples from PostHog wrappers to OpenTelemetry auto-instrumentation:

- **OpenAI-compatible providers** (Groq, DeepSeek, Mistral, xAI, Together AI, Ollama, Cohere, Hugging Face, Perplexity, Cerebras, Fireworks AI, OpenRouter, Helicone, Vercel AI Gateway, Portkey) → `opentelemetry-instrumentation-openai-v2`
- **OpenAI** (all files), **Azure OpenAI**, **Instructor**, **Autogen**, **Mirascope**, **Semantic Kernel**, **smolagents** → `opentelemetry-instrumentation-openai-v2`
- **Anthropic** (chat, streaming, extended thinking) → `opentelemetry-instrumentation-anthropic`
- **LangChain**, **LangGraph** → `opentelemetry-instrumentation-langchain`
- **LlamaIndex** → `opentelemetry-instrumentation-llamaindex`
- **Gemini** → `opentelemetry-instrumentation-google-generativeai`

All OTel-based examples set resource attributes to demonstrate the full feature set:

```python
resource = Resource(
    attributes={
        SERVICE_NAME: "example-groq-app",
        "posthog.distinct_id": "example-user",
        "foo": "bar",
        "conversation_id": "abc-123",
    }
)
```

These map to `distinct_id` and custom event properties via PostHog's OTLP ingestion endpoint.

**Kept as-is:**

- CrewAI (uses LiteLLM callbacks and internally manages its own TracerProvider)
- LiteLLM/DSPy (use LiteLLM's built-in PostHog callback)
- OpenAI Agents (uses the dedicated `posthog.ai.openai_agents.instrument()`)
- Pydantic AI (already OTel via `Agent.instrument_all()`)
- AWS Bedrock (already OTel via `opentelemetry-instrumentation-botocore`)
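The attribute-to-property mapping described above can be sketched as a small hypothetical helper. This is an illustration only, not PostHog's actual ingestion code: the function name `map_resource_attributes` and the reserved-key set are assumptions made for the example.

```python
# Hypothetical helper -- NOT PostHog's real ingestion code -- illustrating the
# described mapping: "posthog.distinct_id" becomes the event's distinct_id,
# and the remaining non-reserved resource attributes become custom properties.
RESERVED_KEYS = {"posthog.distinct_id", "service.name"}


def map_resource_attributes(attributes):
    """Split OTel resource attributes into (distinct_id, custom properties)."""
    distinct_id = attributes.get("posthog.distinct_id")
    properties = {k: v for k, v in attributes.items() if k not in RESERVED_KEYS}
    return distinct_id, properties


distinct_id, props = map_resource_attributes(
    {
        "service.name": "example-groq-app",
        "posthog.distinct_id": "example-user",
        "foo": "bar",
        "conversation_id": "abc-123",
    }
)
print(distinct_id)  # example-user
print(props)  # {'foo': 'bar', 'conversation_id': 'abc-123'}
```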
Key implementation details:

- Uses `SimpleSpanProcessor` instead of `BatchSpanProcessor`, so spans export immediately without needing `provider.shutdown()`
- `# noqa: E402` on intentional late imports after `Instrumentor().instrument()` calls
- The Azure OpenAI example uses the generic `gpt-4o` model name instead of a deployment-specific one

## How did you test this code?

Manually ran the examples against real provider API keys via `llm-analytics-apps/run-examples.sh` to verify that:

1. Each script runs successfully end-to-end
2. Traces arrive at PostHog as `$ai_generation` events
3. Resource attributes (`posthog.distinct_id`, `foo`, `conversation_id`) flow through as event properties
4. `distinct_id` is correctly set on each event

All examples pass `ruff format` and `ruff check`.

This is an agent-authored PR: I haven't manually tested every provider end-to-end beyond spot checks, though all examples follow the same pattern and the migration was verified against several providers.

## Publish to changelog?

No

## Docs update

The onboarding docs will be updated separately in PostHog/posthog#53668 and PostHog/posthog.com#16236.

## 🤖 LLM context

Co-authored with Claude Code. Related PRs:

- PostHog/posthog-js#3349 (Node.js examples)
- PostHog/posthog#53668 (in-app onboarding docs)
- PostHog/posthog.com#16236 (docs.posthog.com TOC updates)
1 parent 794cf07 commit b738fd9

101 files changed

Lines changed: 11434 additions & 2807 deletions

examples/example-ai-anthropic/chat.py

Lines changed: 27 additions & 10 deletions
```diff
@@ -1,16 +1,36 @@
-"""Anthropic chat with tool calling, tracked by PostHog."""
+"""Anthropic chat with tool calling, tracked via OpenTelemetry."""
 
 import os
 import json
 import urllib.request
-from posthog import Posthog
-from posthog.ai.anthropic import Anthropic
+from opentelemetry import trace
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.resources import Resource, SERVICE_NAME
+from posthog.ai.otel import PostHogSpanProcessor
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
 
-posthog = Posthog(
-    os.environ["POSTHOG_API_KEY"],
-    host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
+resource = Resource(
+    attributes={
+        SERVICE_NAME: "example-anthropic-app",
+        "posthog.distinct_id": "example-user",
+        "foo": "bar",
+        "conversation_id": "abc-123",
+    }
+)
+provider = TracerProvider(resource=resource)
+provider.add_span_processor(
+    PostHogSpanProcessor(
+        api_key=os.environ["POSTHOG_API_KEY"],
+        host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
+    )
 )
-client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"], posthog_client=posthog)
+trace.set_tracer_provider(provider)
+
+AnthropicInstrumentor().instrument()
+
+import anthropic  # noqa: E402
+
+client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
 
 tools = [
     {
@@ -40,7 +60,6 @@ def get_weather(latitude: float, longitude: float, location_name: str) -> str:
 message = client.messages.create(
     model="claude-sonnet-4-5-20250929",
     max_tokens=1024,
-    posthog_distinct_id="example-user",
     tools=tools,
     messages=[{"role": "user", "content": "What's the weather like in San Francisco?"}],
 )
@@ -53,5 +72,3 @@ def get_weather(latitude: float, longitude: float, location_name: str) -> str:
         elif block.type == "tool_use":
             result = get_weather(**block.input)
             print(result)
-
-posthog.shutdown()
```

examples/example-ai-anthropic/extended_thinking.py

Lines changed: 27 additions & 10 deletions
```diff
@@ -1,22 +1,41 @@
-"""Anthropic extended thinking, tracked by PostHog.
+"""Anthropic extended thinking, tracked via OpenTelemetry.
 
 Extended thinking lets Claude show its reasoning process before responding.
 """
 
 import os
-from posthog import Posthog
-from posthog.ai.anthropic import Anthropic
+from opentelemetry import trace
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.resources import Resource, SERVICE_NAME
+from posthog.ai.otel import PostHogSpanProcessor
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
 
-posthog = Posthog(
-    os.environ["POSTHOG_API_KEY"],
-    host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
+resource = Resource(
+    attributes={
+        SERVICE_NAME: "example-anthropic-app",
+        "posthog.distinct_id": "example-user",
+        "foo": "bar",
+        "conversation_id": "abc-123",
+    }
 )
-client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"], posthog_client=posthog)
+provider = TracerProvider(resource=resource)
+provider.add_span_processor(
+    PostHogSpanProcessor(
+        api_key=os.environ["POSTHOG_API_KEY"],
+        host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
+    )
+)
+trace.set_tracer_provider(provider)
+
+AnthropicInstrumentor().instrument()
+
+import anthropic  # noqa: E402
+
+client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
 
 message = client.messages.create(
     model="claude-sonnet-4-5-20250929",
     max_tokens=16000,
-    posthog_distinct_id="example-user",
     thinking={"type": "enabled", "budget_tokens": 10000},
     messages=[
         {
@@ -31,5 +50,3 @@
         print(f"Thinking: {block.thinking}\n")
     elif block.type == "text":
         print(f"Answer: {block.text}")
-
-posthog.shutdown()
```

examples/example-ai-anthropic/pyproject.toml

Lines changed: 7 additions & 2 deletions
```diff
@@ -3,6 +3,11 @@ name = "example-ai-anthropic"
 version = "0.1.0"
 requires-python = ">=3.10"
 dependencies = [
-    "posthog==7.9.12",
-    "anthropic==0.86.0",
+    "anthropic>=0.80.0",
+    "opentelemetry-instrumentation-anthropic>=0.24.0",
+    "opentelemetry-sdk>=1.30.0",
+    "posthog[otel]>=7.11.0",
 ]
+
+[tool.uv.sources]
+posthog = { path = "../.." }
```

examples/example-ai-anthropic/streaming.py

Lines changed: 27 additions & 9 deletions
```diff
@@ -1,19 +1,38 @@
-"""Anthropic streaming chat, tracked by PostHog."""
+"""Anthropic streaming chat, tracked via OpenTelemetry."""
 
 import os
-from posthog import Posthog
-from posthog.ai.anthropic import Anthropic
+from opentelemetry import trace
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.resources import Resource, SERVICE_NAME
+from posthog.ai.otel import PostHogSpanProcessor
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
 
-posthog = Posthog(
-    os.environ["POSTHOG_API_KEY"],
-    host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
+resource = Resource(
+    attributes={
+        SERVICE_NAME: "example-anthropic-app",
+        "posthog.distinct_id": "example-user",
+        "foo": "bar",
+        "conversation_id": "abc-123",
+    }
 )
-client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"], posthog_client=posthog)
+provider = TracerProvider(resource=resource)
+provider.add_span_processor(
+    PostHogSpanProcessor(
+        api_key=os.environ["POSTHOG_API_KEY"],
+        host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
+    )
+)
+trace.set_tracer_provider(provider)
+
+AnthropicInstrumentor().instrument()
+
+import anthropic  # noqa: E402
+
+client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
 
 stream = client.messages.create(
     model="claude-sonnet-4-5-20250929",
     max_tokens=1024,
-    posthog_distinct_id="example-user",
    messages=[{"role": "user", "content": "Write a haiku about observability."}],
    stream=True,
 )
@@ -24,4 +43,3 @@
         print(event.delta.text, end="", flush=True)
 
 print()
-posthog.shutdown()
```
