Commit 4eb6fe6

remove systemPrompt from chat message

1 parent cbd175c, commit 4eb6fe6
2 files changed: 0 additions & 2 deletions

llms/extensions/app/__init__.py (0 additions & 1 deletion)

```diff
@@ -199,7 +199,6 @@ async def queue_chat_handler(request):
         "model": thread.get("model"),
         "messages": thread.get("messages"),
         "modalities": thread.get("modalities"),
-        "systemPrompt": thread.get("systemPrompt"),
         "tools": thread.get("tools"),  # tools request
         "metadata": metadata,
     }
```

llms/extensions/providers/cerebras.py (0 additions & 1 deletion)

```diff
@@ -31,7 +31,6 @@ async def chat(self, chat, context=None):
         clean_chat["messages"].append(new_msg)

         clean_chat.pop("modalities", None)
-        clean_chat.pop("systemPrompt", None)
         return await super().chat(clean_chat, context)

 ctx.add_provider(CerebrasProvider)
```
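The pattern the two hunks touch, building a chat payload from a thread and stripping keys a downstream provider does not accept, can be sketched as follows. This is a hypothetical reconstruction: `build_chat_payload`, `sanitize_for_provider`, and `UNSUPPORTED_KEYS` are illustrative names, not identifiers from the repo.

```python
# Hypothetical sketch of the payload-cleaning pattern this commit simplifies.
# After the commit, "systemPrompt" is never copied into the payload, so
# provider-side cleanup (as in CerebrasProvider.chat) no longer needs to pop it.

UNSUPPORTED_KEYS = ("modalities",)  # keys the downstream API rejects (assumed)


def build_chat_payload(thread, metadata):
    # Mirrors the dict built in queue_chat_handler after this commit:
    # no "systemPrompt" entry.
    return {
        "model": thread.get("model"),
        "messages": thread.get("messages"),
        "modalities": thread.get("modalities"),
        "tools": thread.get("tools"),  # tools request
        "metadata": metadata,
    }


def sanitize_for_provider(chat):
    # Copy first so the caller's dict is untouched, then drop unsupported
    # keys before delegating to the underlying provider call.
    clean_chat = dict(chat)
    for key in UNSUPPORTED_KEYS:
        clean_chat.pop(key, None)
    return clean_chat
```

Popping with a default (`pop(key, None)`) keeps the cleanup idempotent: it is safe whether or not the key is present, which is why the remaining `clean_chat.pop("modalities", None)` line needs no guard.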
