
Conversation


@aibrahim-oai aibrahim-oai commented Dec 8, 2025

This is a step towards removing the need to know the model when constructing the config. First, we no longer need to know model_info; we just respect it if the user has already set it. Next step: we won't need to know the model at all unless the user explicitly sets it in config.toml.

@aibrahim-oai aibrahim-oai changed the title move Remove legacy ModelInfo and merge it with ModelFamily Dec 8, 2025
@aibrahim-oai (Collaborator, Author)

@codex review this

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines 1155 to +1159

    let config = Self {
        model,
        review_model,
-       model_context_window,
-       model_auto_compact_token_limit,
+       model_context_window: cfg.model_context_window,
+       model_auto_compact_token_limit: cfg.model_auto_compact_token_limit,

P1: Preserve derived defaults for model context limits

Loading config no longer derives model_context_window and model_auto_compact_token_limit from the selected model family; these fields now mirror only the raw TOML/profile values. In the provided fixtures (e.g., the o3, gpt3, and gpt5 profiles in create_test_fixture) those settings are omitted, so Config::load_from_base_config_with_overrides now returns None for both fields where the tests and callers previously received model-specific defaults (200_000/180_000 for o3, 16_385/14_746 for gpt‑3.5, 272_000/244_800 for gpt‑5.1). This will break the precedence tests around lines 2923–3235 and removes context-window metadata from telemetry for default profiles. Please restore deriving these defaults from the model family when the user has not overridden them.
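The fix the P1 comment asks for can be sketched as a fallback from the explicit TOML/profile value to the model-family default. This is a minimal illustration, not the actual codex types; `FamilyDefaults` and both resolver functions are assumed names:

```rust
// Hypothetical sketch: restore model-family defaults when the user has
// not overridden the limits in config.toml (names assumed, not real codex API).
#[derive(Clone, Copy)]
struct FamilyDefaults {
    context_window: Option<u64>,
    auto_compact_token_limit: Option<i64>,
}

fn resolve_context_window(user_value: Option<u64>, family: FamilyDefaults) -> Option<u64> {
    // An explicit config.toml/profile value wins; otherwise use the family default.
    user_value.or(family.context_window)
}

fn resolve_auto_compact_limit(user_value: Option<i64>, family: FamilyDefaults) -> Option<i64> {
    user_value.or(family.auto_compact_token_limit)
}
```

With this shape, the o3 fixture (no explicit values) would again resolve to the family's 200_000/180_000 defaults, while a user override still takes precedence.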


(context_window * 9) / 10
}

fn finalize_auto_compact_default(mut self) -> Self {
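The `(context_window * 9) / 10` expression in the diff is what produces the default limits cited in the P1 comment. A standalone sketch of that integer arithmetic (the function name here is assumed):

```rust
// Sketch of the derived default seen in the diff: the auto-compact
// limit defaults to nine tenths of the context window, in integer math.
fn default_auto_compact_limit(context_window: u64) -> u64 {
    (context_window * 9) / 10
}
```

This reproduces the pairs from the review comment: 200_000 → 180_000, 272_000 → 244_800, and 16_385 → 14_746 (integer division floors the result).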
@pakrym-oai (Collaborator) commented Dec 8, 2025
Can you make auto_compact_token_limit a getter instead?
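The getter approach being suggested could look something like the following. This is a hypothetical sketch with assumed field names, not the actual codex `Config`; the point is computing the effective limit on demand instead of finalizing a stored field:

```rust
// Hypothetical sketch of the getter suggestion (field names assumed):
// derive the effective limit lazily rather than storing a finalized value.
struct Config {
    model_context_window: Option<u64>,
    model_auto_compact_token_limit: Option<i64>,
}

impl Config {
    fn auto_compact_token_limit(&self) -> Option<i64> {
        // An explicit user setting wins; otherwise derive 90% of the window.
        self.model_auto_compact_token_limit
            .or_else(|| self.model_context_window.map(|cw| ((cw * 9) / 10) as i64))
    }
}
```

A getter keeps the stored config a faithful mirror of what the user wrote, which fits the PR's goal of not needing model knowledge at construction time.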

@pakrym-oai (Collaborator) left a comment

very nice!

    review_model: OPENAI_DEFAULT_REVIEW_MODEL.to_string(),
-   model_context_window: Some(272_000),
-   model_auto_compact_token_limit: Some(244_800),
+   model_context_window: None,
Collaborator

Do we have coverage for this?

Collaborator

Mostly interested in ensuring model_auto_compact_token_limit is set correctly for well-known models.

Collaborator Author

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn multiple_auto_compact_per_task_runs_after_token_limit_hit() {
    skip_if_no_network!();
    let server = start_mock_server().await;
    let codex = test_codex()
        .build(&server)
        .await
        .expect("build codex")
        .codex;
    // user message

Collaborator Author

This should cover it, although it's not very precise.

@aibrahim-oai aibrahim-oai merged commit 382f047 into main Dec 8, 2025
26 checks passed
@aibrahim-oai aibrahim-oai deleted the move-model-info branch December 8, 2025 23:29
@github-actions github-actions bot locked and limited conversation to collaborators Dec 8, 2025

3 participants