
Commit 0ebf804

Add OTel example
1 parent 4c7147b

7 files changed: 173 additions & 62 deletions


.devcontainer/devcontainer.json

Lines changed: 3 additions & 2 deletions
```diff
@@ -5,7 +5,8 @@
     "context": ".."
   },
   "features": {
-    "ghcr.io/azure/azure-dev/azd:latest": {}
+    "ghcr.io/azure/azure-dev/azd:latest": {},
+    "ghcr.io/devcontainers/features/docker-in-docker:latest": {}
   },
   "customizations": {
     "vscode": {
@@ -22,4 +23,4 @@
     }
   },
   "remoteUser": "vscode"
-}
+}
```

.env.sample

Lines changed: 1 addition & 0 deletions
```diff
@@ -9,3 +9,4 @@ OPENAI_MODEL=gpt-3.5-turbo
 # Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
 GITHUB_MODEL=gpt-5-mini
 GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
+OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```
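The added variable follows the same `KEY=VALUE` dotenv convention as the rest of the file. As a rough illustration of how such a file maps onto environment settings (the examples themselves use `python-dotenv`; `parse_env_lines` here is a hypothetical simplified parser, not part of the repo):

```python
def parse_env_lines(lines):
    """Parse KEY=VALUE pairs, skipping blank lines and # comments (simplified dotenv)."""
    env = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # comments and blank lines carry no configuration
        key, _, value = line.partition("=")  # split on the first '=' only
        env[key.strip()] = value.strip()
    return env


sample = [
    "# Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)",
    "GITHUB_MODEL=gpt-5-mini",
    "OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317",
]
print(parse_env_lines(sample)["OTEL_EXPORTER_OTLP_ENDPOINT"])
```

Splitting only on the first `=` matters here, since URL values can themselves contain `=` in query strings.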

.github/prompts/review_pr_comments.prompt.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -133,4 +133,4 @@ The thread ID starts with `PRRT_` and can be found in the GraphQL query response
 Note: This skill can be removed once the GitHub MCP server has added built-in support for replying to PR review comments and resolving threads.
 See:
 https://github.com/github/github-mcp-server/issues/1323
-https://github.com/github/github-mcp-server/issues/1768
+https://github.com/github/github-mcp-server/issues/1768
```

README.md

Lines changed: 16 additions & 16 deletions
````diff
@@ -173,46 +173,46 @@ You can run the examples in this repository by executing the scripts in the `exa
 | [agent_mcp_local.py](examples/agent_mcp_local.py) | An agent connected to a local MCP server (e.g. for expense logging). |
 | [openai_tool_calling.py](examples/openai_tool_calling.py) | Tool calling with the low-level OpenAI SDK, showing manual tool dispatch. |
 | [workflow_basic.py](examples/workflow_basic.py) | A workflow-based agent. |
-| [agent_otel_aspire.py](examples/agent_otel_aspire.py) | An agent with OpenTelemetry tracing, metrics, and structured logs exported to the [.NET Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |
+| [agent_otel_aspire.py](examples/agent_otel_aspire.py) | An agent with OpenTelemetry tracing, metrics, and structured logs exported to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |
 
 ## Using the Aspire Dashboard for telemetry
 
-The [agent_otel_aspire.py](examples/agent_otel_aspire.py) example can export OpenTelemetry traces, metrics, and structured logs to the [.NET Aspire Dashboard](https://aspire.dev/dashboard/standalone/), a free standalone container for visualizing telemetry.
+The [agent_otel_aspire.py](examples/agent_otel_aspire.py) example can export OpenTelemetry traces, metrics, and structured logs to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/).
 
 1. Start the Aspire Dashboard:
 
-   ```shell
-   docker run --rm -it -d \
-     -p 18888:18888 \
-     -p 4317:18889 \
-     --name aspire-dashboard \
-     mcr.microsoft.com/dotnet/aspire-dashboard:latest
+   ```sh
+   docker run --rm -it -d -p 18888:18888 -p 4317:18889 --name aspire-dashboard mcr.microsoft.com/dotnet/aspire-dashboard:latest
    ```
 
 2. Get the login token from the container logs:
 
-   ```shell
+   ```sh
    docker logs aspire-dashboard
    ```
 
   Look for the line containing `Login to the dashboard at http://localhost:18888/login?t=<TOKEN>`. Copy the token or open the URL directly.
 
-3. Run the example with telemetry export enabled:
+3. Add the OTLP endpoint to your `.env` file:
 
-   ```shell
-   OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 uv run python examples/agent_otel_aspire.py
+   ```sh
+   OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
    ```
 
-4. Open the dashboard at <http://localhost:18888> and explore:
+4. Run the example:
+
+   ```sh
+   uv run examples/agent_otel_aspire.py
+   ```
+
+5. Open the dashboard at <http://localhost:18888> and explore:
 
    * **Traces**: See the full span tree — agent invocation → chat completion → tool execution
    * **Metrics**: View token usage and operation duration histograms
    * **Structured Logs**: Browse conversation messages (system, user, assistant, tool)
    * **GenAI visualizer**: Select a chat completion span to see the rendered conversation
 
-   Without `OTEL_EXPORTER_OTLP_ENDPOINT` set, the example runs normally with no telemetry export and no errors.
-
-5. When done, stop the dashboard:
+6. When done, stop the dashboard:
 
    ```shell
    docker stop aspire-dashboard
````

examples/agent_otel_aspire.py

Lines changed: 3 additions & 43 deletions
```diff
@@ -1,45 +1,3 @@
-"""
-OpenTelemetry + Aspire Dashboard example.
-
-Demonstrates a tool-calling agent that exports OpenTelemetry traces, metrics,
-and structured logs to the .NET Aspire Dashboard via OTLP/gRPC.
-
-Telemetry is only exported when the OTEL_EXPORTER_OTLP_ENDPOINT environment
-variable is set. Without it, the agent runs normally with no telemetry export.
-
-To start the Aspire Dashboard:
-
-    docker run --rm -it -d \
-        -p 18888:18888 \
-        -p 4317:18889 \
-        --name aspire-dashboard \
-        mcr.microsoft.com/dotnet/aspire-dashboard:latest
-
-The dashboard UI is at http://localhost:18888.
-Get the login token from the container logs:
-
-    docker logs aspire-dashboard
-
-Look for: "Login to the dashboard at http://localhost:18888/login?t=<TOKEN>"
-
-Then run this example with telemetry export enabled:
-
-    OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 python examples/agent_otel_aspire.py
-
-In the Aspire Dashboard you will see:
-- Traces: agent -> chat completion -> tool execution spans
-- Metrics: token usage and operation duration histograms
-- Structured Logs: conversation messages (system, user, assistant, tool)
-- GenAI telemetry visualizer: full conversation view on chat spans
-
-To stop the dashboard:
-
-    docker stop aspire-dashboard
-
-For the full Python + Aspire guide, see:
-https://aspire.dev/dashboard/standalone-for-python/
-"""
-
 import asyncio
 import logging
 import os
@@ -70,7 +28,9 @@
     configure_otel_providers(enable_sensitive_data=True)
     logger.info(f"OpenTelemetry export enabled — sending to {otlp_endpoint}")
 else:
-    logger.info("Set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 to export telemetry to the Aspire Dashboard")
+    logger.info(
+        "Set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 in .env to export telemetry to the Aspire Dashboard"
+    )
 
 # Configure OpenAI client based on environment
 load_dotenv(override=True)
```
examples/spanish/README.md

Lines changed: 46 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -174,6 +174,52 @@ Puedes ejecutar los ejemplos en este repositorio ejecutando los scripts en el di
174174
| [agent_mcp_local.py](agent_mcp_local.py) | Un agente conectado a un servidor MCP local (ej. para registro de gastos). |
175175
| [openai_tool_calling.py](openai_tool_calling.py) | Llamadas a funciones con el SDK de OpenAI de bajo nivel, mostrando despacho manual de herramientas. |
176176
| [workflow_basic.py](workflow_basic.py) | Usa Agent Framework para crear un agente basado en flujo de trabajo. |
177+
| [agent_otel_aspire.py](agent_otel_aspire.py) | Un agente con trazas, métricas y logs estructurados de OpenTelemetry exportados al [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |
178+
179+
## Usar el Aspire Dashboard para telemetría
180+
181+
El ejemplo [agent_otel_aspire.py](agent_otel_aspire.py) puede exportar trazas, métricas y logs estructurados de OpenTelemetry a un [Aspire Dashboard](https://aspire.dev/dashboard/standalone/).
182+
183+
1. Inicia el Aspire Dashboard:
184+
185+
```sh
186+
docker run --rm -it -d -p 18888:18888 -p 4317:18889 --name aspire-dashboard mcr.microsoft.com/dotnet/aspire-dashboard:latest
187+
```
188+
189+
2. Obtén el token de inicio de sesión de los logs del contenedor:
190+
191+
```sh
192+
docker logs aspire-dashboard
193+
```
194+
195+
Busca la línea que contiene `Login to the dashboard at http://localhost:18888/login?t=<TOKEN>`. Copia el token o abre la URL directamente.
196+
197+
3. Agrega el endpoint OTLP a tu archivo `.env`:
198+
199+
```sh
200+
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
201+
```
202+
203+
4. Ejecuta el ejemplo:
204+
205+
```sh
206+
uv run agent_otel_aspire.py
207+
```
208+
209+
5. Abre el dashboard en <http://localhost:18888> y explora:
210+
211+
* **Traces**: Ve el árbol completo de spans — invocación del agente → completado del chat → ejecución de herramientas
212+
* **Metrics**: Consulta histogramas de uso de tokens y duración de operaciones
213+
* **Structured Logs**: Navega los mensajes de la conversación (sistema, usuario, asistente, herramienta)
214+
* **Visualizador GenAI**: Selecciona un span de completado del chat para ver la conversación renderizada
215+
216+
6. Cuando termines, detén el dashboard:
217+
218+
```sh
219+
docker stop aspire-dashboard
220+
```
221+
222+
Para la guía completa de Python + Aspire, consulta [Usar el Aspire Dashboard con apps de Python](https://aspire.dev/dashboard/standalone-for-python/).
177223

178224
## Recursos
179225

Lines changed: 103 additions & 0 deletions
```diff
@@ -0,0 +1,103 @@
+import asyncio
+import logging
+import os
+import random
+from datetime import datetime, timezone
+from typing import Annotated
+
+from agent_framework import ChatAgent
+from agent_framework.observability import configure_otel_providers
+from agent_framework.openai import OpenAIChatClient
+from azure.identity.aio import DefaultAzureCredential, get_bearer_token_provider
+from dotenv import load_dotenv
+from pydantic import Field
+from rich import print
+from rich.logging import RichHandler
+
+# Configure logging
+handler = RichHandler(show_path=False, rich_tracebacks=True, show_level=False)
+logging.basicConfig(level=logging.WARNING, handlers=[handler], force=True, format="%(message)s")
+logger = logging.getLogger(__name__)
+logger.setLevel(logging.INFO)
+
+# Configure OpenTelemetry export to the Aspire Dashboard (if the endpoint is set)
+otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
+if otlp_endpoint:
+    os.environ.setdefault("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc")
+    os.environ.setdefault("OTEL_SERVICE_NAME", "agent-framework-demo")
+    configure_otel_providers(enable_sensitive_data=True)
+    logger.info(f"Exportación OpenTelemetry habilitada — enviando a {otlp_endpoint}")
+else:
+    logger.info(
+        "Configura OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317 en .env"
+        " para exportar telemetría al Aspire Dashboard"
+    )
+
+# Configure the client to use Azure OpenAI, GitHub Models, or OpenAI
+load_dotenv(override=True)
+API_HOST = os.getenv("API_HOST", "github")
+
+async_credential = None
+if API_HOST == "azure":
+    async_credential = DefaultAzureCredential()
+    token_provider = get_bearer_token_provider(async_credential, "https://cognitiveservices.azure.com/.default")
+    client = OpenAIChatClient(
+        base_url=f"{os.environ['AZURE_OPENAI_ENDPOINT']}/openai/v1/",
+        api_key=token_provider,
+        model_id=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
+    )
+elif API_HOST == "github":
+    client = OpenAIChatClient(
+        base_url="https://models.github.ai/inference",
+        api_key=os.environ["GITHUB_TOKEN"],
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+    )
+else:
+    client = OpenAIChatClient(
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+    )
+
+
+def get_weather(
+    city: Annotated[str, Field(description="City name, spelled out fully")],
+) -> dict:
+    """Return weather data for a city: temperature and description."""
+    logger.info(f"Obteniendo el clima para {city}")
+    weather_options = [
+        {"temperature": 22, "description": "Soleado"},
+        {"temperature": 15, "description": "Lluvioso"},
+        {"temperature": 13, "description": "Nublado"},
+        {"temperature": 7, "description": "Ventoso"},
+    ]
+    return random.choice(weather_options)
+
+
+def get_current_time(
+    timezone_name: Annotated[
+        str, Field(description="Timezone name, e.g. 'US/Eastern', 'America/Mexico_City', 'UTC'")
+    ],
+) -> str:
+    """Return the current date and time in UTC (timezone_name is only for display context)."""
+    logger.info(f"Obteniendo la hora actual para {timezone_name}")
+    now = datetime.now(timezone.utc)
+    return f"La hora actual en {timezone_name} es aproximadamente {now.strftime('%Y-%m-%d %H:%M:%S')} UTC"
+
+
+agent = ChatAgent(
+    name="agente-clima-hora",
+    chat_client=client,
+    instructions="Eres un asistente útil que puede consultar información del clima y la hora.",
+    tools=[get_weather, get_current_time],
+)
+
+
+async def main():
+    response = await agent.run("¿Cómo está el clima en Ciudad de México y qué hora es en Buenos Aires?")
+    print(response.text)
+
+    if async_credential:
+        await async_credential.close()
+
+
+if __name__ == "__main__":
+    asyncio.run(main())
```
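The `get_weather` tool above picks one of four canned payloads, so the return shape (a dict with `temperature` and `description`) is deterministic even though the value is random. A standalone sketch of that pattern, with names reused from the diff but stripped of the agent wiring:

```python
import random

# Canned payloads: random value, fixed shape (sketch mirroring the diff, English labels)
WEATHER_OPTIONS = [
    {"temperature": 22, "description": "Sunny"},
    {"temperature": 15, "description": "Rainy"},
    {"temperature": 13, "description": "Cloudy"},
    {"temperature": 7, "description": "Windy"},
]


def get_weather(city: str) -> dict:
    """Return a randomly chosen canned weather payload for the given city."""
    return random.choice(WEATHER_OPTIONS)


report = get_weather("Mexico City")
print(report["description"])
```

Keeping the shape fixed is what lets the agent's tool-calling loop format any result without branching on which payload was drawn.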
