feat(mimiclaw): add MiniMax as first-class LLM provider #544
octo-patch wants to merge 1 commit into
Add MiniMax AI as a third LLM provider alongside OpenAI and Anthropic in the MimiClaw embedded AI agent. MiniMax uses an OpenAI-compatible API format, so this change introduces provider_uses_openai_format() to share the OpenAI message/auth/response path while routing to MiniMax's own endpoint (api.minimax.io/v1/chat/completions).

Changes:
- llm_proxy.c: add provider_is_minimax(), provider_uses_openai_format(), separate endpoint/cert cache for MiniMax, three-way provider routing
- mimi_config.h: add MIMI_MINIMAX_API_URL with ifndef guard
- serial_cli.c: update CLI help text and default URL routing for minimax
- mimi_secrets.h.example: document minimax as a provider option
- README.md/README_zh.md: add MiniMax to provider documentation
- tests/: add 24 unit + 3 integration tests validating provider routing and MiniMax API compatibility

Models: MiniMax-M2.7 (latest, 1M context), MiniMax-M2.5-highspeed (204K)

Co-Authored-By: Octopus <liyuan851277048@icloud.com>
Summary
Add MiniMax AI as a third LLM provider in the MimiClaw embedded AI agent, alongside OpenAI and Anthropic.
MiniMax offers an OpenAI-compatible API, so this change introduces a
provider_uses_openai_format() helper to share the existing OpenAI message format, Bearer auth, and response parsing path, while routing requests to MiniMax's own endpoint (api.minimax.io/v1/chat/completions).

Changes

- llm_proxy.c: add provider_is_minimax() and provider_uses_openai_format(), a separate endpoint/cert cache for MiniMax, and three-way provider routing in endpoint URL selection, cert management, and HTTP auth
- mimi_config.h: add MIMI_MINIMAX_API_URL with an #ifndef guard
- serial_cli.c: update the CLI help text (anthropic|openai|minimax) and default URL routing

Usage
Available models: MiniMax-M2.7 (latest, 1M context), MiniMax-M2.5-highspeed (204K context)
Design
MiniMax uses the same OpenAI-compatible format for:

- the chat message/request format
- Bearer token authentication
- response parsing
This means all existing OpenAI-format code paths are reused via provider_uses_openai_format(), keeping the diff minimal. Only endpoint routing and TLS cert caching needed three-way provider dispatch.

Test Plan

- 24 unit tests and 3 integration tests validating provider routing and MiniMax API compatibility