Commit 400eb86

Remove GitHub Models support; it doesn't support the Responses API
1 parent 9431dd1 commit 400eb86

111 files changed

Lines changed: 134 additions & 913 deletions

Some changed files are hidden by default in this large commit; only a subset of the diffs appears below.
.env.sample

Lines changed: 1 addition & 4 deletions
```diff
@@ -1,14 +1,11 @@
-# API_HOST can be either azure, openai, or github:
+# API_HOST can be either azure or openai:
 API_HOST=azure
 # Configure for Azure:
 AZURE_OPENAI_ENDPOINT=https://YOUR-AZURE-OPENAI-SERVICE-NAME.openai.azure.com
 AZURE_OPENAI_CHAT_DEPLOYMENT=YOUR-AZURE-DEPLOYMENT-NAME
 # Configure for OpenAI.com:
 OPENAI_API_KEY=YOUR-OPENAI-KEY
 OPENAI_MODEL=gpt-3.5-turbo
-# Configure for GitHub models: (GITHUB_TOKEN already exists inside Codespaces)
-GITHUB_MODEL=gpt-4.1-mini
-GITHUB_TOKEN=YOUR-GITHUB-PERSONAL-ACCESS-TOKEN
 # Configure for Redis (used by agent_history_redis.py, defaults to dev container Redis):
 REDIS_URL=redis://localhost:6379
 # Configure OTLP exporter (not needed in devcontainer, which sets these via docker-compose):
```
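With the GitHub branch gone, the sample `.env` leaves only two hosts to distinguish. A minimal, hypothetical sketch of resolving these variables (stdlib only; the real scripts construct Agent Framework clients rather than dicts):

```python
def resolve_provider(env: dict) -> dict:
    """Pick provider settings from environment-style variables.

    Hypothetical helper mirroring the two remaining API_HOST values.
    """
    if env.get("API_HOST", "azure") == "azure":
        return {
            "host": "azure",
            "endpoint": env["AZURE_OPENAI_ENDPOINT"],
            "deployment": env["AZURE_OPENAI_CHAT_DEPLOYMENT"],
        }
    # Any other value falls back to OpenAI.com, as in the examples.
    return {
        "host": "openai",
        "api_key": env["OPENAI_API_KEY"],
        "model": env.get("OPENAI_MODEL", "gpt-3.5-turbo"),
    }

print(resolve_provider({"API_HOST": "openai", "OPENAI_API_KEY": "sk-test"})["model"])  # gpt-3.5-turbo
```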

README.md

Lines changed: 4 additions & 29 deletions
````diff
@@ -1,7 +1,7 @@
 <!--
 ---
 name: Python Agent Framework Demos
-description: Collection of Python examples for Microsoft Agent Framework using GitHub Models or Azure AI Foundry.
+description: Collection of Python examples for Microsoft Agent Framework using Microsoft Foundry.
 languages:
 - python
 products:
@@ -17,14 +17,13 @@ urlFragment: python-agentframework-demos
 [![Open in GitHub Codespaces](https://img.shields.io/static/v1?style=for-the-badge&label=GitHub+Codespaces&message=Open&color=brightgreen&logo=github)](https://codespaces.new/Azure-Samples/python-agentframework-demos)
 [![Open in Dev Containers](https://img.shields.io/static/v1?style=for-the-badge&label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/Azure-Samples/python-agentframework-demos)
 
-This repository provides examples of [Microsoft Agent Framework](https://learn.microsoft.com/agent-framework/) using LLMs from [GitHub Models](https://github.com/marketplace/models), [Azure AI Foundry](https://learn.microsoft.com/azure/ai-foundry/), or other model providers. GitHub Models are free to use for anyone with a GitHub account, up to a [daily rate limit](https://docs.github.com/github-models/prototyping-with-ai-models#rate-limits).
+This repository provides examples of [Microsoft Agent Framework](https://learn.microsoft.com/agent-framework/) using LLMs from [Azure AI Foundry](https://learn.microsoft.com/azure/ai-foundry/) or other model providers.
 
 * [Getting started](#getting-started)
   * [GitHub Codespaces](#github-codespaces)
   * [VS Code Dev Containers](#vs-code-dev-containers)
   * [Local environment](#local-environment)
 * [Configuring model providers](#configuring-model-providers)
-  * [Using GitHub Models](#using-github-models)
   * [Using Azure AI Foundry models](#using-azure-ai-foundry-models)
   * [Using OpenAI.com models](#using-openaicom-models)
 * [Running the Python examples](#running-the-python-examples)
@@ -95,35 +94,11 @@ The dev container includes a Redis server, which is used by the `agent_history_r
 
 ## Configuring model providers
 
-These examples can be run with Azure AI Foundry, OpenAI.com, or GitHub Models, depending on the environment variables you set. All the scripts reference the environment variables from a `.env` file, and an example `.env.sample` file is provided. Host-specific instructions are below.
-
-## Using GitHub Models
-
-If you open this repository in GitHub Codespaces, you can run the scripts for free using GitHub Models without any additional steps, as your `GITHUB_TOKEN` is already configured in the Codespaces environment.
-
-If you want to run the scripts locally, you need to set up the `GITHUB_TOKEN` environment variable with a GitHub personal access token (PAT). You can create a PAT by following these steps:
-
-1. Go to your GitHub account settings.
-2. Click on "Developer settings" in the left sidebar.
-3. Click on "Personal access tokens" in the left sidebar.
-4. Click on "Tokens (classic)" or "Fine-grained tokens" depending on your preference.
-5. Click on "Generate new token".
-6. Give your token a name and select the scopes you want to grant. For this project, you don't need any specific scopes.
-7. Click on "Generate token".
-8. Copy the generated token.
-9. Set the `GITHUB_TOKEN` environment variable in your terminal or IDE:
-
-```shell
-export GITHUB_TOKEN=your_personal_access_token
-```
-
-10. Optionally, you can use a model other than "gpt-4.1-mini" by setting the `GITHUB_MODEL` environment variable. Use a model that supports function calling, such as: `gpt-5`, `gpt-4.1-mini`, `gpt-4o`, `gpt-4o-mini`, `o3-mini`, `AI21-Jamba-1.5-Large`, `AI21-Jamba-1.5-Mini`, `Codestral-2501`, `Cohere-command-r`, `Ministral-3B`, `Mistral-Large-2411`, `Mistral-Nemo`, `Mistral-small`
+These examples can be run with Azure AI Foundry or OpenAI.com, depending on the environment variables you set. All the scripts reference the environment variables from a `.env` file, and an example `.env.sample` file is provided. Host-specific instructions are below.
 
 ## Using Azure AI Foundry models
 
-You can run all examples in this repository using GitHub Models. If you want to run the examples using models from Azure AI Foundry instead, you need to provision the Azure AI resources, which will incur costs.
-
-This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-4.1-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
+This project includes infrastructure as code (IaC) to provision Azure OpenAI deployments of "gpt-5-mini" and "text-embedding-3-large" via Azure AI Foundry. The IaC is defined in the `infra` directory and uses the Azure Developer CLI to provision the resources.
 
 1. Make sure the [Azure Developer CLI (azd)](https://aka.ms/install-azd) is installed.
````

examples/agent_basic.py

Lines changed: 1 addition & 7 deletions
```diff
@@ -9,7 +9,7 @@
 
 # Configure OpenAI client based on environment
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 
 async_credential = None
 if API_HOST == "azure":
@@ -20,12 +20,6 @@
         api_key=token_provider,
         model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
 else:
     client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```
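After the removal, each example keeps just two branches. A sketch of the resulting control flow, using a plain dataclass as a hypothetical stand-in for `OpenAIChatClient` (the real class requires the agent framework package and a credential):

```python
from dataclasses import dataclass

@dataclass
class ChatClientConfig:
    """Stand-in for OpenAIChatClient arguments; hypothetical, for illustration."""
    model: str
    api_key: str

def build_client(environ: dict) -> ChatClientConfig:
    # Post-commit logic: "azure", or fall through to OpenAI.com.
    if environ.get("API_HOST", "azure") == "azure":
        return ChatClientConfig(
            model=environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
            api_key="<azure-ad-token>",  # real code passes a token provider instead
        )
    return ChatClientConfig(
        model=environ.get("OPENAI_MODEL", "gpt-4.1-mini"),
        api_key=environ["OPENAI_API_KEY"],
    )

print(build_client({"API_HOST": "openai", "OPENAI_API_KEY": "sk-test"}).model)  # gpt-4.1-mini
```

Note the fallback mirrors the diff: any `API_HOST` other than `azure` now reaches the OpenAI.com branch, including a stale `github` value from an old `.env`.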

examples/agent_evaluation.py

Lines changed: 1 addition & 13 deletions
```diff
@@ -28,7 +28,7 @@
 logger.setLevel(logging.INFO)
 
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 
 async_credential = None
 if API_HOST == "azure":
@@ -44,18 +44,6 @@
         azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
         azure_deployment=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
-    eval_model_config = OpenAIModelConfiguration(
-        type="openai",
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-4.1-mini",
-    )
 else:
     client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```

examples/agent_evaluation_batch.py

Lines changed: 1 addition & 8 deletions
```diff
@@ -33,21 +33,14 @@
 logger.setLevel(logging.INFO)
 
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 
 if API_HOST == "azure":
     model_config = AzureOpenAIModelConfiguration(
         type="azure_openai",
         azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
         azure_deployment=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    model_config = OpenAIModelConfiguration(
-        type="openai",
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model="openai/gpt-4.1-mini",
-    )
 else:
     model_config = OpenAIModelConfiguration(
         type="openai",
```

examples/agent_evaluation_generate.py

Lines changed: 1 addition & 7 deletions
```diff
@@ -29,7 +29,7 @@
 logger.setLevel(logging.INFO)
 
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 
 async_credential = None
 if API_HOST == "azure":
@@ -40,12 +40,6 @@
         api_key=token_provider,
         model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
 else:
     client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```

examples/agent_history_redis.py

Lines changed: 1 addition & 7 deletions
```diff
@@ -21,7 +21,7 @@
 
 # Configure OpenAI client based on environment
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379")
 
 async_credential = None
@@ -33,12 +33,6 @@
         api_key=token_provider,
         model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
 else:
     client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```
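Unchanged by this commit, `agent_history_redis.py` also reads `REDIS_URL`, defaulting to the dev container's `redis://localhost:6379`. The stdlib can parse such a URL directly, which is handy when checking a `.env` override:

```python
from urllib.parse import urlparse

# Default from the example script; a real deployment would override REDIS_URL.
url = urlparse("redis://localhost:6379")
print(url.scheme, url.hostname, url.port)  # redis localhost 6379
```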

examples/agent_history_sqlite.py

Lines changed: 1 addition & 7 deletions
```diff
@@ -23,7 +23,7 @@
 
 # Configure OpenAI client based on environment
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 
 async_credential = None
 if API_HOST == "azure":
@@ -34,12 +34,6 @@
         api_key=token_provider,
         model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
 else:
     client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```

examples/agent_knowledge_aisearch.py

Lines changed: 2 additions & 8 deletions
```diff
@@ -22,7 +22,7 @@
 
 Requires:
 - An Azure AI Search service with a Knowledge Base
-- An OpenAI-compatible model endpoint (Azure OpenAI, GitHub Models, or OpenAI)
+- An OpenAI-compatible model endpoint (Azure OpenAI or OpenAI)
 
 Environment variables:
 - AZURE_SEARCH_ENDPOINT: Your Azure AI Search endpoint
@@ -55,7 +55,7 @@
 
 # ── Configuration ────────────────────────────────────────────────────
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 
 SEARCH_ENDPOINT = os.environ["AZURE_SEARCH_ENDPOINT"]
 KNOWLEDGE_BASE_NAME = os.environ["AZURE_SEARCH_KNOWLEDGE_BASE_NAME"]
@@ -71,12 +71,6 @@
         api_key=token_provider,
         model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
     )
-elif API_HOST == "github":
-    client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
 else:
     client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```

examples/agent_knowledge_pg.py

Lines changed: 2 additions & 13 deletions
```diff
@@ -20,7 +20,7 @@
 
 Requires:
 - PostgreSQL with pgvector extension (see docker-compose.yml)
-- An embedding model (GitHub Models, Azure OpenAI, or OpenAI)
+- An embedding model (Azure OpenAI or OpenAI)
 
 See also: agent_knowledge_sqlite.py for a simpler SQLite-only (keyword search) version.
 """
@@ -51,7 +51,7 @@
 
 # ── OpenAI clients (chat + embeddings) ───────────────────────────────
 load_dotenv(override=True)
-API_HOST = os.getenv("API_HOST", "github")
+API_HOST = os.getenv("API_HOST", "azure")
 POSTGRES_URL = os.getenv("POSTGRES_URL", "postgresql://admin:LocalPasswordOnly@db:5432/postgres")
 EMBEDDING_DIMENSIONS = 256  # Smaller dimension for efficiency
 
@@ -75,17 +75,6 @@
         api_key=sync_token_provider(),
     )
     embed_model = os.environ.get("AZURE_OPENAI_EMBEDDING_DEPLOYMENT", "text-embedding-3-small")
-elif API_HOST == "github":
-    chat_client = OpenAIChatClient(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-        model=os.getenv("GITHUB_MODEL", "openai/gpt-4.1-mini"),
-    )
-    embed_client = OpenAI(
-        base_url="https://models.github.ai/inference",
-        api_key=os.environ["GITHUB_TOKEN"],
-    )
-    embed_model = "text-embedding-3-small"
 else:
     chat_client = OpenAIChatClient(
         api_key=os.environ["OPENAI_API_KEY"], model=os.environ.get("OPENAI_MODEL", "gpt-4.1-mini")
```
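The pgvector example configures an embedding model alongside the chat client, with a fixed 256-dimension setting. A hypothetical stdlib-only sketch of that selection after the commit (the real script builds `OpenAIChatClient`/`OpenAI` clients; the non-Azure model name here is an assumption based on the repository's defaults):

```python
def embed_settings(environ: dict) -> tuple[str, int]:
    """Hypothetical helper: choose the embedding deployment/model name.

    Mirrors the two post-commit branches; 256 matches EMBEDDING_DIMENSIONS
    in the example script.
    """
    if environ.get("API_HOST", "azure") == "azure":
        model = environ.get("AZURE_OPENAI_EMBEDDING_DEPLOYMENT", "text-embedding-3-small")
    else:
        model = "text-embedding-3-small"  # assumed OpenAI.com default
    return model, 256

model, dims = embed_settings({"API_HOST": "azure"})
print(model, dims)  # text-embedding-3-small 256
```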
