Support for Multi-Turn Conversation Templates #2967
Hi TensorZero team, I'm working on migrating a complex prompt chain from our codebase to TensorZero. Our current implementation uses a multi-turn conversation format: a system message, followed by four user/assistant message pairs as few-shot examples, and then a final user message with dynamic content. TensorZero's function/variant system currently seems to be designed around single request-response pairs, so I'm wondering how best to express this pattern.
Here's a simplified example of what I'm trying to achieve:

```toml
[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "openai::gpt-4"
# Ideally something like:
messages = [
  { role = "system", template = "system.minijinja" },
  { role = "user", template = "example1_user.minijinja" },
  { role = "assistant", template = "example1_assistant.minijinja" },
  { role = "user", template = "example2_user.minijinja" },
  { role = "assistant", template = "example2_assistant.minijinja" },
  # ... more examples ...
  { role = "user", template = "final_user.minijinja", use_schema = true },
]
```

Would love to hear your thoughts on the best way to handle this pattern within TensorZero's architecture. Thanks for building such a thoughtful tool!
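For reference, here is the message sequence those templates would render to, assembled in plain Python (the role names follow the standard chat-completion format; the example content is purely illustrative):

```python
# Assemble the multi-turn prompt the templates above would produce:
# a system message, N few-shot user/assistant pairs, then the final user turn.
def build_messages(system, examples, final_user):
    messages = [{"role": "system", "content": system}]
    for user_text, assistant_text in examples:
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    messages.append({"role": "user", "content": final_user})
    return messages

msgs = build_messages(
    "You are a helpful classifier.",
    [("input A", "label A"), ("input B", "label B")],
    "input C",
)
```

With four few-shot pairs as in the config above, this yields a ten-message list: one system, eight alternating user/assistant turns, and the final dynamic user message.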
Hi @seansica - thank you for the question. We don't support multiple templates for the same role in a single variant (or inference request), but there are typical approaches we've seen and recommend for the cases you've mentioned.

If these approaches don't fit your use case, let us know! We'd be open to exploring solutions for arbitrary prompts/schemas in variants. Notably, the solution here would likely be "named templates/schemas" (with the ones for roles like …). Thank you!
The new release supports this use case:
https://www.tensorzero.com/docs/gateway/create-a-prompt-template