Add SFT optimization guide and example #5633
Merged
GabrielBianconi merged 6 commits into `main` from `…andrew/sft-docs` on Feb 6, 2026
Conversation
GabrielBianconi previously approved these changes · Feb 5, 2026
Cursor Bugbot has reviewed your changes and found 1 potential issue.
`examples/docs/guides/optimization/supervised-fine-tuning/main.py` — review comment outdated and resolved.
virajmehta previously approved these changes · Feb 5, 2026
GabrielBianconi approved these changes · Feb 6, 2026
This pull request adds documentation and a runnable example for Supervised Fine-Tuning (SFT) optimization.
Changes
- Documentation: Adds `docs/optimization/supervised-fine-tuning.mdx`, a guide explaining how to fine-tune LLMs using TensorZero, with updated navigation in `docs.json` and a link from `docs/optimization/index.mdx`
- Runnable Example: Provides a complete example in `examples/docs/guides/optimization/supervised-fine-tuning/` featuring a named entity recognition task, including `main.py` for execution (a rough sketch of the demonstration step appears after this list)
- Configuration Files: Supplies `tensorzero.toml`, `docker-compose.yml`, `pyproject.toml`, `system_template.minijinja`, and `output_schema.json` for the example
- Provider Reference: Documents configuration parameters for all four SFT providers (OpenAI, Fireworks, Together, GCP Vertex AI Gemini), including required `provider_types` settings
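The example's `main.py` itself isn't reproduced in this description, but the demonstration-generation step it refers to might look roughly like the sketch below. This is a minimal sketch, assuming a locally running gateway, a JSON function named `extract_entities`, and an entity-list output schema; the function name, payload shapes, and ground-truth value are illustrative assumptions rather than the example's actual code, and the client calls (`AsyncTensorZeroGateway.build_http`, `inference`, `feedback`) should be checked against your TensorZero Python client version.

```python
# Minimal sketch (not the example's actual main.py): run one inference through
# a locally running TensorZero gateway and attach a ground-truth demonstration
# to it, so the pair can later be rendered into an SFT training sample.
# The function name "extract_entities" and the output shape are assumptions.
import asyncio

from tensorzero import AsyncTensorZeroGateway


async def main() -> None:
    async with await AsyncTensorZeroGateway.build_http(
        gateway_url="http://localhost:3000"  # assumed gateway address
    ) as client:
        # Run one NER inference (hypothetical function and input text).
        response = await client.inference(
            function_name="extract_entities",
            input={
                "messages": [
                    {"role": "user", "content": "Jane Doe joined Acme Corp in Berlin."}
                ]
            },
        )

        # Record the curated output as "demonstration" feedback on that inference.
        await client.feedback(
            metric_name="demonstration",
            inference_id=response.inference_id,
            value={
                "people": ["Jane Doe"],
                "organizations": ["Acme Corp"],
                "locations": ["Berlin"],
            },
        )


if __name__ == "__main__":
    asyncio.run(main())
```

Demonstration feedback is the curation step the guide builds on: inferences with a demonstration attached are what get rendered into supervised training samples for the provider fine-tune.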
Important
Adds a guide and runnable example for Supervised Fine-Tuning (SFT) using TensorZero, including documentation, example code, and configuration files.
- Adds `supervised-fine-tuning.mdx` guide for SFT in `docs/optimization/`.
- Updates `docs.json` to include the new guide in navigation.
- Links to the guide from `index.mdx` in `docs/optimization/`.
- Adds a runnable example in `examples/docs/guides/optimization/supervised-fine-tuning/` for named entity recognition.
- Includes `main.py` for execution and `docker-compose.yml` for setup.
- Supplies `tensorzero.toml`, `system_template.minijinja`, and `output_schema.json` for example configuration.

This description was created automatically for cdc84ca and updates as commits are pushed.
Note
Low Risk
Docs- and example-only changes: only documentation and navigation content is modified, so no production runtime logic is affected.
Overview
- Adds a new Supervised Fine-Tuning (SFT) optimization guide (`docs/optimization/supervised-fine-tuning-sft.mdx`) describing the end-to-end workflow (curating demonstrations/metrics, rendering samples, launching and polling provider fine-tunes), plus a provider config reference for OpenAI, Vertex Gemini, Fireworks, and Together.
- Updates the Optimization docs IA to surface SFT: `docs/docs.json` adds the page to the Optimization nav, and `docs/optimization/index.mdx` is rewritten to include an SFT section/link plus refreshed model/prompt/inference-time optimization descriptions; `gateway/guides/inference-time-optimizations.mdx` trims a DICL example callout.
- Introduces a complete runnable SFT example under `examples/docs/guides/optimization/supervised-fine-tuning/` (NER pipeline), including Docker Compose + `tensorzero.toml` config, prompt + schema files, and an async `main.py` that generates demonstrations, renders samples, launches an OpenAI SFT job, polls for completion, and prints the config snippet to use the fine-tuned model (a rough sketch of this launch-and-poll loop follows below).

Written by Cursor Bugbot for commit 1443d8b.