Commit 6dbe1a1 (merge commit, 2 parents: af0abc2 + 111130e)

Merge branch 'main' into workflowbuilders

115 files changed: 1381 additions & 3539 deletions


.devcontainer/Dockerfile

Lines changed: 7 additions & 0 deletions
@@ -3,6 +3,13 @@ FROM mcr.microsoft.com/devcontainers/python:3.12-bookworm
 # Required for cloning agent-framework from GitHub (it uses Git LFS)
 RUN apt-get update && apt-get install -y git-lfs && git lfs install && rm -rf /var/lib/apt/lists/*
 
+# Required for building Rust-based Python packages (e.g. base2048 via pyrit/azure-ai-evaluation[redteam])
+ENV RUSTUP_HOME="/home/vscode/.rustup" CARGO_HOME="/home/vscode/.cargo"
+ENV PATH="/home/vscode/.cargo/bin:${PATH}"
+RUN mkdir -p /home/vscode/.rustup /home/vscode/.cargo \
+    && curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \
+    && chown -R vscode:vscode /home/vscode/.rustup /home/vscode/.cargo
+
 COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /usr/local/bin/
 
 COPY pyproject.toml uv.lock /tmp/uv-tmp/

.env.sample

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ AZURE_OPENAI_CHAT_DEPLOYMENT=YOUR-AZURE-DEPLOYMENT-NAME
 OPENAI_API_KEY=YOUR-OPENAI-KEY
 OPENAI_MODEL=gpt-3.5-turbo
 # Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
-GITHUB_MODEL=gpt-5-mini
+GITHUB_MODEL=gpt-4.1-mini
 GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
 # Configure for Redis (used by agent_history_redis.py, defaults to dev container Redis):
 REDIS_URL=redis://localhost:6379
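The example scripts resolve the model name from these variables at runtime. A minimal sketch of that lookup, assuming the variables above have already been loaded into the process environment (the fallback literal mirrors the new default in this diff):

```python
import os

# Resolve the GitHub Models deployment, falling back to the repo's new
# default when GITHUB_MODEL is not set in the environment or .env file.
model = os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini")
print(model)
```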

README.md

Lines changed: 8 additions & 2 deletions
@@ -117,13 +117,13 @@ If you want to run the scripts locally, you need to set up the `GITHUB_TOKEN` en
 export GITHUB_TOKEN=your_personal_access_token
 ```
 
-10. Optionally, you can use a model other than "gpt-5-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-5-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
+10. Optionally, you can use a model other than "gpt-4.1-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-4.1-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
 
 ## Using Azure AI Foundry models
 
 You can run all examples in this repository using GitHub Models. If you want to run the examples using models from Azure AI Foundry instead, you need to provision the Azure AI resources, which will incur costs.
 
-This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-5-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
+This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-4.1-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
 
 1. Make sure the [Azure Developer CLI (azd)](https://aka.ms/install-azd) is installed.
 
@@ -191,11 +191,14 @@ You can run the examples in this repository by executing the scripts in the `exa
 | [agent_memory_redis.py](examples/agent_memory_redis.py) | Long-term memory with RedisContextProvider, storing and retrieving conversational context from Redis. |
 | [agent_memory_mem0.py](examples/agent_memory_mem0.py) | Long-term memory with Mem0 OSS, extracting and recalling distilled user facts across sessions. |
 | [agent_supervisor.py](examples/agent_supervisor.py) | A supervisor orchestrating activity and recipe sub-agents. |
+<<<<<<< HEAD
 | [agent_with_subagent.py](examples/agent_with_subagent.py) | Context isolation with sub-agents to keep prompts focused on relevant tools. |
 | [agent_without_subagent.py](examples/agent_without_subagent.py) | Context bloat example where one agent carries all tool schemas in a single prompt. |
 | [agent_summarization.py](examples/agent_summarization.py) | Context compaction via summarization middleware to reduce token usage in long conversations. |
 | [workflow_magenticone.py](examples/workflow_magenticone.py) | A MagenticOne multi-agent workflow. |
 | [workflow_hitl.py](examples/workflow_hitl.py) | Human-in-the-loop (HITL) for tool-enabled agents with human feedback. |
+=======
+>>>>>>> main
 | [agent_middleware.py](examples/agent_middleware.py) | Agent, chat, and function middleware for logging, timing, and blocking. |
 | [agent_knowledge_aisearch.py](examples/agent_knowledge_aisearch.py) | Knowledge retrieval (RAG) using Azure AI Search with AgentFrameworkAzureAISearchRAG. |
 | [agent_knowledge_sqlite.py](examples/agent_knowledge_sqlite.py) | Knowledge retrieval (RAG) using a custom context provider with SQLite FTS5. |
@@ -205,6 +208,7 @@ You can run the examples in this repository by executing the scripts in the `exa
 | [agent_mcp_remote.py](examples/agent_mcp_remote.py) | An agent using a remote MCP server (Microsoft Learn) for documentation search. |
 | [agent_mcp_local.py](examples/agent_mcp_local.py) | An agent connected to a local MCP server (e.g. for expense logging). |
 | [openai_tool_calling.py](examples/openai_tool_calling.py) | Tool calling with the low-level OpenAI SDK, showing manual tool dispatch. |
+<<<<<<< HEAD
 | [workflow_rag_ingest.py](examples/workflow_rag_ingest.py) | A RAG ingestion pipeline using plain Python executors: fetch a document with markitdown, split into chunks, and embed with an OpenAI model. |
 | [workflow_agents.py](examples/workflow_agents.py) | A workflow with AI agents as executors: a Writer drafts content and a Reviewer provides feedback. |
 | [workflow_agents_sequential.py](examples/workflow_agents_sequential.py) | A sequential orchestration using `SequentialBuilder`: Writer and Reviewer run in order while sharing full conversation history. |
@@ -215,6 +219,8 @@ You can run the examples in this repository by executing the scripts in the `exa
 | [workflow_conditional_state_isolated.py](examples/workflow_conditional_state_isolated.py) | The stateful conditional workflow using a `create_workflow(...)` factory to build fresh agents/workflow per task for state isolation and thread safety. |
 | [workflow_switch_case.py](examples/workflow_switch_case.py) | A workflow with switch-case routing: a Classifier agent uses structured outputs to categorize a message and route to a specialized handler. |
 | [workflow_converge.py](examples/workflow_converge.py) | A branch-and-converge workflow: Reviewer routes to Publisher or Editor, then converges before final summary output. |
+=======
+>>>>>>> main
 | [agent_otel_aspire.py](examples/agent_otel_aspire.py) | An agent with OpenTelemetry tracing, metrics, and structured logs exported to the [Aspire Dashboard](https://aspire.dev/dashboard/standalone/). |
 | [agent_otel_appinsights.py](examples/agent_otel_appinsights.py) | An agent with OpenTelemetry tracing, metrics, and structured logs exported to [Azure Application Insights](https://learn.microsoft.com/azure/azure-monitor/app/app-insights-overview). Requires Azure provisioning via `azd provision`. |
 | [agent_evaluation_generate.py](examples/agent_evaluation_generate.py) | Generate synthetic evaluation data for the travel planner agent. |

chat_history.db

Binary file deleted (−20 KB); contents not shown.

examples/agent_basic.py

Lines changed: 2 additions & 2 deletions
@@ -24,11 +24,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 agent = Agent(client=client, instructions="You're an informational agent. Answer questions cheerfully.")

examples/agent_evaluation.py

Lines changed: 4 additions & 4 deletions
@@ -48,22 +48,22 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
     eval_model_config = OpenAIModelConfiguration(
         type="openai",
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-5-mini",
+        model="openai/gpt-4.1-mini",
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
     eval_model_config = OpenAIModelConfiguration(
         type="openai",
         api_key=os.environ["OPENAI_API_KEY"],
-        model=os.environ.get("OPENAI_MODEL", "gpt-5-mini"),
+        model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
     )
 
 
examples/agent_evaluation_batch.py

Lines changed: 2 additions & 2 deletions
@@ -46,13 +46,13 @@
         type="openai",
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-5-mini",
+        model="openai/gpt-4.1-mini",
     )
 else:
     model_config = OpenAIModelConfiguration(
         type="openai",
         api_key=os.environ["OPENAI_API_KEY"],
-        model=os.environ.get("OPENAI_MODEL", "gpt-5-mini"),
+        model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
     )
 
 # Optional: Set AZURE_AI_PROJECT in .env to log results to Azure AI Foundry.
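The evaluation scripts build the evaluator configuration with the same branching as the chat client. A minimal sketch of that shared shape, using a plain dict as a hypothetical stand-in for `OpenAIModelConfiguration` (the field names and values are copied from the diff, not from the library's API):

```python
import os

def eval_model_config() -> dict:
    # Plain-dict stand-in for OpenAIModelConfiguration, with the same
    # field names the diff passes to the real class.
    if os.environ.get("GITHUB_TOKEN"):
        return {
            "type": "openai",
            "base_url": "https://models.github.ai/inference",
            "api_key": os.environ["GITHUB_TOKEN"],
            "model": "openai/gpt-4.1-mini",
        }
    return {
        "type": "openai",
        "api_key": os.environ["OPENAI_API_KEY"],
        "model": os.environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
    }
```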

examples/agent_evaluation_generate.py

Lines changed: 2 additions & 2 deletions
@@ -45,11 +45,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 
examples/agent_history_redis.py

Lines changed: 8 additions & 11 deletions
@@ -38,11 +38,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 
@@ -63,7 +63,7 @@ async def example_persistent_session() -> None:
     session_id = str(uuid.uuid4())
 
     # Phase 1: Start a conversation with a Redis-backed history provider
-    print("[dim]--- Phase 1: Starting conversation ---[/dim]")
+    print("[bold]--- Phase 1: Starting conversation ---[/bold]")
     redis_provider = RedisHistoryProvider(source_id="redis_chat", redis_url=REDIS_URL)
 
     agent = Agent(
@@ -84,7 +84,7 @@ async def example_persistent_session() -> None:
     print(f"[green]Agent:[/green] {response.text}")
 
     # Phase 2: Simulate an application restart — reconnect using the same session ID in Redis
-    print("\n[dim]--- Phase 2: Resuming after 'restart' ---[/dim]")
+    print("\n[bold]--- Phase 2: Resuming after 'restart' ---[/bold]")
     redis_provider2 = RedisHistoryProvider(source_id="redis_chat", redis_url=REDIS_URL)
 
     agent2 = Agent(
@@ -99,7 +99,6 @@ async def example_persistent_session() -> None:
     print("[blue]User:[/blue] Which of the cities I asked about had better weather?")
     response = await agent2.run("Which of the cities I asked about had better weather?", session=session2)
     print(f"[green]Agent:[/green] {response.text}")
-    print("[dim]Note: The agent remembered the conversation from Phase 1 via Redis persistence.[/dim]")
 
 
 async def main() -> None:
@@ -111,17 +110,15 @@ async def main() -> None:
     try:
         r.ping()
     except Exception as e:
-        print(f"[red]Cannot connect to Redis at {REDIS_URL}: {e}[/red]")
-        print(
-            "[red]Ensure Redis is running (e.g. via the dev container"
-            " or 'docker run -p 6379:6379 redis:7-alpine').[/red]"
+        logger.error(f"Cannot connect to Redis at {REDIS_URL}: {e}")
+        logger.error(
+            "Ensure Redis is running (e.g. via the dev container"
+            " or 'docker run -p 6379:6379 redis:7-alpine')."
         )
         return
     finally:
         r.close()
 
-    print("[dim]Redis connection verified.[/dim]")
-
     await example_persistent_session()
 
 if async_credential:
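The last hunk above swaps rich-markup `print` calls for `logger.error` calls, but the logger setup is outside the hunk; a module-level logger is assumed to exist elsewhere in the script. A minimal sketch of what such a logger produces (the handler wiring here is illustrative, and `REDIS_URL` is the value from `.env.sample`):

```python
import io
import logging

# Assumed module-level logger, as implied by the diff's logger.error calls.
logger = logging.getLogger("agent_history_redis")
logger.setLevel(logging.ERROR)

# Capture output in-memory just to demonstrate the resulting message.
stream = io.StringIO()
logger.addHandler(logging.StreamHandler(stream))

REDIS_URL = "redis://localhost:6379"
logger.error(f"Cannot connect to Redis at {REDIS_URL}: connection refused")
print(stream.getvalue().strip())
```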

examples/agent_history_sqlite.py

Lines changed: 4 additions & 9 deletions
@@ -38,11 +38,11 @@
     client = OpenAIChatClient(
         base_url="https://models.github.ai/inference",
         api_key=os.environ["GITHUB_TOKEN"],
-        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-5-mini"),
+        model_id=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
     )
 else:
     client = OpenAIChatClient(
-        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-5-mini")
+        api_key=os.environ["OPENAI_API_KEY"], model_id=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
     )
 
 
@@ -111,7 +111,7 @@ async def main() -> None:
 
     # Phase 1: Start a conversation with a SQLite-backed history provider
     print("\n[bold]=== Persistent SQLite Session ===[/bold]")
-    print("[dim]--- Phase 1: Starting conversation ---[/dim]")
+    print("[bold]--- Phase 1: Starting conversation ---[/bold]")
 
     sqlite_provider = SQLiteHistoryProvider(db_path=db_path)
 
@@ -132,12 +132,8 @@ async def main() -> None:
     response = await agent.run("How about Paris?", session=session)
     print(f"[green]Agent:[/green] {response.text}")
 
-    messages = await sqlite_provider.get_messages(session_id)
-    print(f"[dim]Messages stored in SQLite: {len(messages)}[/dim]")
-    sqlite_provider.close()
-
     # Phase 2: Simulate an application restart — reconnect to the same session ID in SQLite
-    print("\n[dim]--- Phase 2: Resuming after 'restart' ---[/dim]")
+    print("\n[bold]--- Phase 2: Resuming after 'restart' ---[/bold]")
     sqlite_provider2 = SQLiteHistoryProvider(db_path=db_path)
 
     agent2 = Agent(
@@ -152,7 +148,6 @@ async def main() -> None:
     print("[blue]User:[/blue] Which of the cities I asked about had better weather?")
     response = await agent2.run("Which of the cities I asked about had better weather?", session=session2)
     print(f"[green]Agent:[/green] {response.text}")
-    print("[dim]Note: The agent remembered the conversation from Phase 1 via SQLite persistence.[/dim]")
 
     sqlite_provider2.close()
 
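The removed lines counted stored messages via `sqlite_provider.get_messages(...)`. An equivalent check can be made directly against SQLite; the table and column names below are hypothetical, since the provider's actual schema is not shown in this diff:

```python
import sqlite3

# Hypothetical schema: the real SQLiteHistoryProvider table may differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (session_id TEXT, role TEXT, content TEXT)")
conn.executemany(
    "INSERT INTO messages VALUES (?, ?, ?)",
    [("s1", "user", "How about Paris?"), ("s1", "assistant", "Paris is mild.")],
)

# Count the messages persisted for one session, as the removed code did.
(count,) = conn.execute(
    "SELECT COUNT(*) FROM messages WHERE session_id = ?", ("s1",)
).fetchone()
print(f"Messages stored in SQLite: {count}")
conn.close()
```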
