Commit 7223c52
docs: add AI provider examples for all LLM analytics integrations (#478)

* docs: add AI provider examples for all LLM analytics integrations

  Add 26 new example directories covering every provider documented in our LLM analytics onboarding docs. This includes OpenAI-compatible providers (Groq, DeepSeek, Mistral, xAI, Together AI, Ollama, Cohere, Hugging Face, Perplexity, Cerebras, Fireworks AI, OpenRouter, Helicone, Vercel AI Gateway, Azure OpenAI, Portkey) and custom integrations (Instructor, LangGraph, AWS Bedrock, CrewAI, Mirascope, LlamaIndex, DSPy, Semantic Kernel, smolagents, AutoGen).

* fix: use max_tokens instead of max_completion_tokens for Mistral example

  Mistral's API doesn't support max_completion_tokens, causing 422 errors.

* fix: remove unsupported posthog_distinct_id kwarg from Mirascope example

  Mirascope doesn't pass extra kwargs through to the OpenAI client. Tracking still works via the wrapped client.

* fix: update Bedrock model ID and response parsing, fix Cerebras model
1 parent: d234b53

156 files changed: 29,531 additions & 0 deletions
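The Mistral fix in the commit message above is a parameter rename. A minimal sketch of the corrected request shape, assuming the Mistral example uses the OpenAI-compatible chat completions API (`build_request` and the model name here are illustrative, not taken from the diff):

```python
def build_request(prompt: str) -> dict:
    """Build chat-completion kwargs for Mistral's OpenAI-compatible API.

    Mistral's API rejects max_completion_tokens with a 422 error,
    so the example was changed to send max_tokens instead.
    """
    return {
        "model": "mistral-small-latest",  # illustrative model name
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,  # not max_completion_tokens
    }


kwargs = build_request("Tell me a fun fact about hedgehogs.")
assert "max_completion_tokens" not in kwargs
```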

`.env.example` (3 additions):

```
POSTHOG_API_KEY=phc_your_project_api_key
POSTHOG_HOST=https://us.i.posthog.com
OPENAI_API_KEY=sk-your_api_key
```
`README.md` (22 additions):

````markdown
# AutoGen + PostHog AI Examples

Track AutoGen agent LLM calls with PostHog.

## Setup

```bash
cp .env.example .env
# Fill in your API keys in .env
uv sync
```

## Examples

- **agent.py** - AutoGen agent with PostHog tracking via OpenAI wrapper

## Run

```bash
source .env
uv run python agent.py
```
````
`agent.py` (31 additions):

```python
"""AutoGen with PostHog tracking via OpenAI wrapper."""

import os
import asyncio

from posthog import Posthog
from posthog.ai.openai import OpenAI
from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient

posthog = Posthog(
    os.environ["POSTHOG_API_KEY"],
    host=os.environ.get("POSTHOG_HOST", "https://us.i.posthog.com"),
)
openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"], posthog_client=posthog)

model_client = OpenAIChatCompletionClient(
    model="gpt-4o-mini",
    openai_client=openai_client,
)

agent = AssistantAgent("assistant", model_client=model_client)


async def main():
    result = await agent.run(task="Tell me a fun fact about hedgehogs.")
    print(result)
    await model_client.close()


asyncio.run(main())
posthog.shutdown()
```
`pyproject.toml` (9 additions):

```toml
[project]
name = "example-ai-autogen"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "posthog==7.9.12",
    "autogen-agentchat>=0.4.0",
    "autogen-ext[openai]>=0.4.0",
]
```

`examples/example-ai-autogen/uv.lock` (993 additions; generated lockfile, not rendered)
One-line uv configuration file (1 addition):

```toml
exclude-newer = "7 days"
```
`.env.example` (5 additions):

```
POSTHOG_API_KEY=phc_your_project_api_key
POSTHOG_HOST=https://us.i.posthog.com
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
```
`README.md` (22 additions):

````markdown
# AWS Bedrock + PostHog AI Examples

Track AWS Bedrock LLM calls with PostHog via OpenTelemetry instrumentation.

## Setup

```bash
cp .env.example .env
# Fill in your API keys in .env
uv sync
```

## Examples

- **chat.py** - Bedrock Converse API with OpenTelemetry tracing to PostHog

## Run

```bash
source .env
uv run python chat.py
```
````
`chat.py` (43 additions):

```python
"""AWS Bedrock chat with OpenTelemetry instrumentation, tracked by PostHog."""

import os

import boto3
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk.resources import Resource, SERVICE_NAME
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.botocore import BotocoreInstrumentor

resource = Resource(attributes={SERVICE_NAME: "example-bedrock-app"})

exporter = OTLPSpanExporter(
    endpoint=f"{os.environ.get('POSTHOG_HOST', 'https://us.i.posthog.com')}/i/v0/ai/otel",
    headers={"Authorization": f"Bearer {os.environ['POSTHOG_API_KEY']}"},
)

provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

BotocoreInstrumentor().instrument()

client = boto3.client(
    "bedrock-runtime",
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
)

response = client.converse(
    modelId="openai.gpt-oss-20b-1:0",
    messages=[
        {
            "role": "user",
            "content": [{"text": "Tell me a fun fact about hedgehogs."}],
        }
    ],
)

for block in response["output"]["message"]["content"]:
    if "text" in block:
        print(block["text"])
        break
```
`pyproject.toml` (11 additions):

```toml
[project]
name = "example-ai-aws-bedrock"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "posthog==7.9.12",
    "boto3>=1.35.0",
    "opentelemetry-instrumentation-botocore>=0.49b0",
    "opentelemetry-sdk>=1.30.0",
    "opentelemetry-exporter-otlp-proto-http>=1.30.0",
]
```

0 commit comments