Commit f27b916

Merge branch 'main' into workflows
2 parents: a13c53d + 4a98192

162 files changed
Lines changed: 1865 additions & 3505 deletions

File tree: some content in this large commit is hidden by default; only a subset of the changed files is expanded below.

.devcontainer/Dockerfile
Lines changed: 7 additions & 0 deletions

```diff
@@ -3,6 +3,13 @@ FROM mcr.microsoft.com/devcontainers/python:3.12-bookworm
 # Required for cloning agent-framework from GitHub (it uses Git LFS)
 RUN apt-get update && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*
 
+# Required for building Rust-based Python packages (e.g. base2048 via pyrit/azure-ai-evaluation[redteam])
+ENV RUSTUP_HOME="/home/vscode/.rustup" CARGO_HOME="/home/vscode/.cargo"
+ENV PATH="/home/vscode/.cargo/bin:${PATH}"
+RUN mkdir -p /home/vscode/.rustup /home/vscode/.cargo \
+    && curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
+    && chown -R vscode:vscode /home/vscode/.rustup /home/vscode/.cargo
+
 COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /usr/local/bin/
 
 COPY pyproject.toml uv.lock /tmp/uv-tmp/
```
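The `ENV PATH` line in the Dockerfile works because command lookup walks PATH left to right, so the prepended `/home/vscode/.cargo/bin` takes precedence. A stdlib-only sketch of that precedence (the stub executable `mytool` is a stand-in for cargo, not part of the commit):

```python
import os
import shutil
import stat
import tempfile

# Create a throwaway directory holding a stub executable named "mytool".
stubdir = tempfile.mkdtemp()
stub = os.path.join(stubdir, "mytool")
with open(stub, "w") as f:
    f.write("#!/bin/sh\necho stub\n")
os.chmod(stub, os.stat(stub).st_mode | stat.S_IEXEC)

# Prepend the directory, as the Dockerfile does with /home/vscode/.cargo/bin.
os.environ["PATH"] = stubdir + os.pathsep + os.environ["PATH"]

# Lookup resolves inside stubdir because it is first on PATH.
found = shutil.which("mytool")
print(found)
```

The same mechanism lets `cargo` and `rustup` resolve for the `vscode` user without any shell profile changes.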

.devcontainer/devcontainer.json
Lines changed: 3 additions & 1 deletion

```diff
@@ -12,7 +12,8 @@
   "customizations": {
     "vscode": {
       "settings": {
-        "python.defaultInterpreterPath": "/usr/local/bin/python",
+        "python.defaultInterpreterPath": "${workspaceFolder}/.venv/bin/python",
+        "python.terminal.activateEnvironment": true,
         "remote.autoForwardPorts": false,
         "python.createEnvironment.trigger": "off",
         "python.analysis.typeCheckingMode": "off"
@@ -23,5 +24,6 @@
       ]
     }
   },
+  "postCreateCommand": "uv sync",
   "remoteUser": "vscode"
 }
```

.env.sample
Lines changed: 2 additions & 1 deletion

```diff
@@ -7,10 +7,11 @@ AZURE_OPENAI_CHAT_DEPLOYMENT=YOUR-AZURE-DEPLOYMENT-NAME
 OPENAI_API_KEY=YOUR-OPENAI-KEY
 OPENAI_MODEL=gpt-3.5-turbo
 # Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
-GITHUB_MODEL=gpt-5-mini
+GITHUB_MODEL=gpt-4.1-mini
 GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
 # Configure for Redis (used by agent_history_redis.py, defaults to dev container Redis):
 REDIS_URL=redis://localhost:6379
+# Configure OTLP exporter (not needed in devcontainer, which sets these via docker-compose):
 OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
 OTEL_EXPORTER_OTLP_PROTOCOL=grpc
 # Or use console exporters instead of OTLP:
```
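The new comment notes that the dev container supplies the OTLP variables itself; the example scripts can read any of these with stdlib fallbacks. A minimal sketch of that pattern (variable names from the sample above; the defaults mirror the sample values and are an assumption about how the scripts fall back):

```python
import os

# Fall back to the dev-container defaults when a variable is unset.
redis_url = os.getenv("REDIS_URL", "redis://localhost:6379")
otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4317")
otlp_protocol = os.getenv("OTEL_EXPORTER_OTLP_PROTOCOL", "grpc")

print(redis_url, otlp_endpoint, otlp_protocol)
```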

README.md
Lines changed: 2 additions & 4 deletions

````diff
@@ -117,13 +117,13 @@ If you want to run the scripts locally, you need to set up the `GITHUB_TOKEN` en
    export GITHUB_TOKEN=your_personal_access_token
    ```
 
-10. Optionally, you can use a model other than "gpt-5-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-5-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
+10. Optionally, you can use a model other than "gpt-4.1-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-4.1-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
 
 ## Using Azure AI Foundry models
 
 You can run all examples in this repository using GitHub Models. If you want to run the examples using models from Azure AI Foundry instead, you need to provision the Azure AI resources, which will incur costs.
 
-This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-5-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
+This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-4.1-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
 
 1. Make sure the [Azure Developer CLI (azd)](https://aka.ms/install-azd) is installed.
@@ -194,8 +194,6 @@ You can run the examples in this repository by executing the scripts in the `exa
 | [agent_with_subagent.py](examples/agent_with_subagent.py) | Context isolation with sub-agents to keep prompts focused on relevant tools. |
 | [agent_without_subagent.py](examples/agent_without_subagent.py) | Context bloat example where one agent carries all tool schemas in a single prompt. |
 | [agent_summarization.py](examples/agent_summarization.py) | Context compaction via summarization middleware to reduce token usage in long conversations. |
-| [workflow_magenticone.py](examples/workflow_magenticone.py) | A MagenticOne multi-agent workflow. |
-| [workflow_hitl.py](examples/workflow_hitl.py) | Human-in-the-loop (HITL) for tool-enabled agents with human feedback. |
 | [agent_middleware.py](examples/agent_middleware.py) | Agent, chat, and function middleware for logging, timing, and blocking. |
 | [agent_knowledge_aisearch.py](examples/agent_knowledge_aisearch.py) | Knowledge retrieval (RAG) using Azure AI Search with AgentFrameworkAzureAISearchRAG. |
 | [agent_knowledge_sqlite.py](examples/agent_knowledge_sqlite.py) | Knowledge retrieval (RAG) using a custom context provider with SQLite FTS5. |
````
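The README's step 10 amounts to an environment-variable override that the example scripts pick up with an `os.getenv` fallback. A minimal sketch (the in-process assignment stands in for `export GITHUB_MODEL=...` in the shell; `gpt-4o-mini` is just one model from the list above):

```python
import os

# Equivalent of `export GITHUB_MODEL=gpt-4o-mini` before running a script.
os.environ["GITHUB_MODEL"] = "gpt-4o-mini"

# The scripts fall back to the repo default when the variable is unset.
model = os.getenv("GITHUB_MODEL", "gpt-4.1-mini")
print(f"Using model: {model}")  # → Using model: gpt-4o-mini
```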

chat_history.db (−20 KB; binary file not shown)

examples/agent_basic.py
Lines changed: 2 additions & 2 deletions

```diff
@@ -24,11 +24,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 agent = Agent(client=client, instructions="You're an informational agent. Answer questions cheerfully.")
```
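The hunk above changes the defaults inside a client-selection pattern: use GitHub Models when `GITHUB_TOKEN` is set, otherwise fall back to the OpenAI API. A stdlib-only sketch of that branching (a plain dict stands in for `OpenAIChatClient`, which is not constructed here; `pick_client_config` is a hypothetical helper name):

```python
import os

def pick_client_config(env: dict) -> dict:
    """Choose GitHub Models when a token is present, else the OpenAI API."""
    if "GITHUB_TOKEN" in env:
        return {
            "base_url": "https://models.github.ai/inference",
            "api_key": env["GITHUB_TOKEN"],
            "model_id": env.get("GITHUB_MODEL", "openai/gpt-4.1-mini"),
        }
    return {
        "api_key": env["OPENAI_API_KEY"],
        "model_id": env.get("OPENAI_MODEL", "gpt-4.1-mini"),
    }

print(pick_client_config({"GITHUB_TOKEN": "tok"})["model_id"])  # → openai/gpt-4.1-mini
```

Keeping the default in one `os.getenv` fallback per branch is why this commit only needs a two-line diff per file to swap models.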

examples/agent_evaluation.py
Lines changed: 4 additions & 4 deletions

```diff
@@ -48,22 +48,22 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
     eval_model_config = OpenAIModelConfiguration(
         type="openai",
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-5-mini",
+        model="openai/gpt-4.1-mini",
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
     eval_model_config = OpenAIModelConfiguration(
         type="openai",
         api_key=os.environ["OPENAI_API_KEY"],
-        model=os.environ.get("OPENAI_MODEL", "gpt-5-mini"),
+        model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
     )
 
 
```
examples/agent_evaluation_batch.py
Lines changed: 2 additions & 2 deletions

```diff
@@ -46,13 +46,13 @@
         type="openai",
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-5-mini",
+        model="openai/gpt-4.1-mini",
     )
 else:
     model_config = OpenAIModelConfiguration(
         type="openai",
         api_key=os.environ["OPENAI_API_KEY"],
-        model=os.environ.get("OPENAI_MODEL", "gpt-5-mini"),
+        model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
     )
 
 # Optional: Set AZURE_AI_PROJECT in .env to log results to Azure AI Foundry.
```

examples/agent_evaluation_generate.py
Lines changed: 2 additions & 2 deletions

```diff
@@ -45,11 +45,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 
```
examples/agent_history_redis.py
Lines changed: 8 additions & 11 deletions

```diff
@@ -38,11 +38,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 
@@ -63,7 +63,7 @@ async def example_persistent_session() -> None:
     session_id = str(uuid.uuid4())
 
     # Phase 1: Start a conversation with a Redis-backed history provider
-    print("[dim]--- Phase 1: Starting conversation ---[/dim]")
+    print("[bold]--- Phase 1: Starting conversation ---[/bold]")
     redis_provider = RedisHistoryProvider(source_id="redis_chat", redis_url=REDIS_URL)
 
     agent = Agent(
@@ -84,7 +84,7 @@ async def example_persistent_session() -> None:
     print(f"[green]Agent:[/green] {response.text}")
 
     # Phase 2: Simulate an application restart — reconnect using the same session ID in Redis
-    print("\n[dim]--- Phase 2: Resuming after 'restart' ---[/dim]")
+    print("\n[bold]--- Phase 2: Resuming after 'restart' ---[/bold]")
     redis_provider2 = RedisHistoryProvider(source_id="redis_chat", redis_url=REDIS_URL)
 
     agent2 = Agent(
@@ -99,7 +99,6 @@ async def example_persistent_session() -> None:
     print("[blue]User:[/blue] Which of the cities I asked about had better weather?")
     response = await agent2.run("Which of the cities I asked about had better weather?", session=session2)
     print(f"[green]Agent:[/green] {response.text}")
-    print("[dim]Note: The agent remembered the conversation from Phase 1 via Redis persistence.[/dim]")
 
 
 async def main() -> None:
@@ -111,17 +110,15 @@ async def main() -> None:
     try:
         r.ping()
     except Exception as e:
-        print(f"[red]Cannot connect to Redis at {REDIS_URL}: {e}[/red]")
-        print(
-            "[red]Ensure Redis is running (e.g. via the dev container"
-            " or 'docker run -p 6379:6379 redis:7-alpine').[/red]"
+        logger.error(f"Cannot connect to Redis at {REDIS_URL}: {e}")
+        logger.error(
+            "Ensure Redis is running (e.g. via the dev container"
+            " or 'docker run -p 6379:6379 redis:7-alpine')."
         )
         return
     finally:
         r.close()
 
-    print("[dim]Redis connection verified.[/dim]")
-
     await example_persistent_session()
 
     if async_credential:
```
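The last hunk swaps Rich-markup `print` calls for `logger.error` in the Redis connection preflight. A stdlib-only sketch of that check-and-bail pattern (the `ping` callable and `redis_available` helper are stand-ins introduced for illustration; no Redis server or `redis` package is used here):

```python
import logging
from typing import Callable

logger = logging.getLogger("agent_history_redis")

def redis_available(ping: Callable[[], object], url: str) -> bool:
    """Return True when the ping succeeds; log the failure and bail otherwise."""
    try:
        ping()
    except Exception as e:
        logger.error(f"Cannot connect to Redis at {url}: {e}")
        logger.error(
            "Ensure Redis is running (e.g. via the dev container"
            " or 'docker run -p 6379:6379 redis:7-alpine')."
        )
        return False
    return True

def boom() -> None:
    raise ConnectionError("refused")

print(redis_available(lambda: None, "redis://localhost:6379"))  # → True
print(redis_available(boom, "redis://localhost:6379"))          # → False
```

Routing the failure through `logging` instead of console markup keeps the diagnostics visible even when the script runs without a Rich console attached.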
