
feat(mimiclaw): add MiniMax as first-class LLM provider #544

Open

octo-patch wants to merge 1 commit into tuya:master from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax AI as a third LLM provider in the MimiClaw embedded AI agent, alongside OpenAI and Anthropic.

MiniMax offers an OpenAI-compatible API, so this change introduces a provider_uses_openai_format() helper to share the existing OpenAI message format, Bearer auth, and response parsing path, while routing requests to MiniMax's own endpoint (api.minimax.io/v1/chat/completions).
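
A minimal sketch of how such a helper might look. Only the function names `provider_is_minimax()` and `provider_uses_openai_format()` come from the PR; the enum, its values, and the signatures are illustrative assumptions, not taken from llm_proxy.c:

```c
#include <assert.h>

/* Illustrative provider representation; the real llm_proxy.c may
 * store the provider as a string or config field instead. */
typedef enum {
    PROVIDER_ANTHROPIC,
    PROVIDER_OPENAI,
    PROVIDER_MINIMAX,
} mimi_provider_t;

/* MiniMax-specific branches (endpoint URL, cert cache) check this. */
static int provider_is_minimax(mimi_provider_t p)
{
    return p == PROVIDER_MINIMAX;
}

/* MiniMax speaks the OpenAI wire format, so request building, Bearer
 * auth, and choices[].message.content parsing can treat OpenAI and
 * MiniMax identically and branch only on this predicate. */
static int provider_uses_openai_format(mimi_provider_t p)
{
    return p == PROVIDER_OPENAI || p == PROVIDER_MINIMAX;
}
```

Keeping the format check in one predicate means a future OpenAI-compatible provider only needs to be added here, not in every request/response path.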

Changes

  • llm_proxy.c: Add provider_is_minimax(), provider_uses_openai_format(), a separate endpoint/cert cache for MiniMax, and three-way provider routing for endpoint URLs, cert management, and HTTP auth
  • mimi_config.h: Add MIMI_MINIMAX_API_URL with #ifndef guard
  • serial_cli.c: Update CLI help text (anthropic|openai|minimax) and default URL routing
  • mimi_secrets.h.example: Document MiniMax as a provider option
  • README.md / README_zh.md: Add MiniMax to provider documentation with model names
  • tests/test_minimax_provider.py: 24 unit tests (source-level validation) + 3 integration tests (real API calls)
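
To illustrate the shared HTTP auth mentioned above: OpenAI and MiniMax both use an `Authorization: Bearer` header, while Anthropic uses its `x-api-key` header, so auth-header construction can branch on the format predicate rather than per provider. This is a hedged sketch; `build_auth_header()` and its buffer handling are hypothetical, not the PR's actual code:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

typedef enum {
    PROVIDER_ANTHROPIC,
    PROVIDER_OPENAI,
    PROVIDER_MINIMAX,
} mimi_provider_t;

static int provider_uses_openai_format(mimi_provider_t p)
{
    return p == PROVIDER_OPENAI || p == PROVIDER_MINIMAX;
}

/* Hypothetical helper: format one auth header line for the request.
 * OpenAI-format providers (OpenAI, MiniMax) share the Bearer path. */
static void build_auth_header(mimi_provider_t p, const char *api_key,
                              char *out, size_t out_len)
{
    if (provider_uses_openai_format(p))
        snprintf(out, out_len, "Authorization: Bearer %s", api_key);
    else
        snprintf(out, out_len, "x-api-key: %s", api_key);
}
```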

Usage

Available models: MiniMax-M2.7 (latest, 1M context), MiniMax-M2.5-highspeed (204K context)

Design

MiniMax uses the same OpenAI-compatible format for:

  • Message structure (system/user/assistant roles)
  • Bearer token authentication
  • Response format (choices[].message.content)
  • Tool calling (function type)

This means all existing OpenAI format code paths are reused via provider_uses_openai_format(), keeping the diff minimal. Only endpoint routing and TLS cert caching needed three-way provider dispatch.
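
The remaining three-way dispatch could look roughly like this. Only `MIMI_MINIMAX_API_URL`, its `#ifndef` guard, and the MiniMax URL are named in the PR; the Anthropic and OpenAI defaults shown are those providers' well-known public endpoints, and `provider_default_endpoint()` is an assumed name:

```c
#include <assert.h>
#include <string.h>

typedef enum {
    PROVIDER_ANTHROPIC,
    PROVIDER_OPENAI,
    PROVIDER_MINIMAX,
} mimi_provider_t;

/* Per the PR, mimi_config.h guards the default so a build flag or
 * mimi_secrets.h can override it. */
#ifndef MIMI_MINIMAX_API_URL
#define MIMI_MINIMAX_API_URL "https://api.minimax.io/v1/chat/completions"
#endif

/* Three-way endpoint routing; in the real code each provider also has
 * its own cached TLS cert alongside the endpoint. */
static const char *provider_default_endpoint(mimi_provider_t p)
{
    switch (p) {
    case PROVIDER_MINIMAX:
        return MIMI_MINIMAX_API_URL;
    case PROVIDER_ANTHROPIC:
        return "https://api.anthropic.com/v1/messages"; /* assumed default */
    default:
        return "https://api.openai.com/v1/chat/completions"; /* assumed default */
    }
}
```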

Test Plan

  • 24 unit tests validate C source-level correctness (provider routing, config, CLI, docs)
  • 3 integration tests verify real MiniMax API compatibility (chat, tool calling, auth)
  • All 27 tests passing

Add MiniMax AI as a third LLM provider alongside OpenAI and Anthropic in
the MimiClaw embedded AI agent. MiniMax uses OpenAI-compatible API format,
so this change introduces provider_uses_openai_format() to share the
OpenAI message/auth/response path while routing to MiniMax's own endpoint
(api.minimax.io/v1/chat/completions).

Changes:
- llm_proxy.c: add provider_is_minimax(), provider_uses_openai_format(),
  separate endpoint/cert cache for MiniMax, three-way provider routing
- mimi_config.h: add MIMI_MINIMAX_API_URL with ifndef guard
- serial_cli.c: update CLI help text and default URL routing for minimax
- mimi_secrets.h.example: document minimax as a provider option
- README.md/README_zh.md: add MiniMax to provider documentation
- tests/: add 24 unit + 3 integration tests validating provider routing
  and MiniMax API compatibility

Models: MiniMax-M2.7 (latest, 1M context), MiniMax-M2.5-highspeed (204K)

Co-Authored-By: Octopus <liyuan851277048@icloud.com>
