feat: add MiniMax as a first-class AI provider#61

Open
octo-patch wants to merge 1 commit into TraderAlice:master from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class AI provider alongside Anthropic, OpenAI, and Google. MiniMax offers an OpenAI-compatible API with models that have a 204K-token context window, at competitive pricing.

What's included

  • Backend: New minimax case in model factory using the existing @ai-sdk/openai adapter with https://api.minimax.io/v1 as the default base URL
  • Config: minimax added to the apiKeys schema so users can store their API key in ai-provider-manager.json
  • Web UI: MiniMax appears in the AI Provider page's provider selector with two model presets (MiniMax-M2.5, MiniMax-M2.5-highspeed), and in the per-channel LLM Provider dropdown
  • Tests: Unit tests for MiniMax model creation, custom base URL, and API key resolution
  • Docs: README updated to reference MiniMax
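The backend change described above can be sketched roughly as follows. This is an illustrative sketch, not the actual code in `model-factory.ts`: the identifiers (`resolveMinimaxSettings`, `ProviderSettings`) are hypothetical, and the real implementation would pass the resolved values to `@ai-sdk/openai` rather than return them; only the key and base-URL resolution is shown so the snippet stays self-contained.

```typescript
// Hypothetical sketch of the MiniMax provider-config resolution.
// In the real code these values would feed @ai-sdk/openai's provider setup.

interface ProviderSettings {
  baseUrl: string;
  apiKey: string;
}

// Default international endpoint, as stated in the PR description.
const MINIMAX_DEFAULT_BASE_URL = "https://api.minimax.io/v1";

function resolveMinimaxSettings(
  apiKeys: Record<string, string | undefined>,
  baseUrl?: string,
): ProviderSettings {
  const apiKey = apiKeys["minimax"];
  if (!apiKey) {
    throw new Error("No API key configured for provider 'minimax'");
  }
  // Fall back to the default endpoint when no custom base URL is configured.
  return { baseUrl: baseUrl ?? MINIMAX_DEFAULT_BASE_URL, apiKey };
}
```

A custom `baseUrl` argument (for example the China mainland endpoint) simply overrides the default, which matches the "custom base URL" case the unit tests cover.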

Configuration

```json
{
  "backend": "vercel-ai-sdk",
  "provider": "minimax",
  "model": "MiniMax-M2.5",
  "apiKeys": { "minimax": "your-api-key" }
}
```

Available models

| Model | Context | Input | Output |
| --- | --- | --- | --- |
| MiniMax-M2.5 | 204K tokens | $0.3/M | $1.2/M |
| MiniMax-M2.5-highspeed | 204K tokens | $0.6/M | $2.4/M |
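To make the pricing concrete, here is a small illustrative cost helper derived from the rates above. It is not part of the PR; the rate table and function name are only for this example.

```typescript
// Illustrative per-request cost estimate from the pricing table above.
// Rates are USD per million tokens; not part of the PR's code.
const RATES = {
  "MiniMax-M2.5": { input: 0.3, output: 1.2 },
  "MiniMax-M2.5-highspeed": { input: 0.6, output: 2.4 },
} as const;

function estimateCostUsd(
  model: keyof typeof RATES,
  inputTokens: number,
  outputTokens: number,
): number {
  const r = RATES[model];
  return (inputTokens * r.input + outputTokens * r.output) / 1_000_000;
}

// e.g. a 100K-token prompt with a 2K-token reply on MiniMax-M2.5:
// 100_000 * 0.3/1e6 + 2_000 * 1.2/1e6 = 0.03 + 0.0024 = $0.0324
```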

China mainland users can set baseUrl to https://api.minimaxi.com/v1.
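For example, the config above could then look like this. Note the `baseUrl` field name is an assumption inferred from the "custom base URL" unit test mentioned earlier; check the actual schema before relying on it.

```json
{
  "backend": "vercel-ai-sdk",
  "provider": "minimax",
  "model": "MiniMax-M2.5",
  "baseUrl": "https://api.minimaxi.com/v1",
  "apiKeys": { "minimax": "your-api-key" }
}
```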

Test plan

  • Unit tests pass (pnpm test — 9 new tests, all green)
  • Full test suite passes (775/775 tests pass)
  • Backend builds cleanly (pnpm build)
  • UI builds cleanly with no TypeScript errors
  • MiniMax API integration verified (successful chat completion with MiniMax-M2.5)

Add MiniMax (https://www.minimax.io) support via the Vercel AI SDK's
OpenAI-compatible adapter. MiniMax offers MiniMax-M2.5 (204K context)
and MiniMax-M2.5-highspeed models through an OpenAI-compatible API.

Changes:
- model-factory.ts: add minimax provider case using @ai-sdk/openai
  with compatibility mode and chat completions endpoint
- config.ts: add minimax to apiKeys Zod schema
- config.ts (web routes): expose minimax API key status
- types.ts (UI): add minimax to ApiKeys type
- AIProviderPage.tsx: add MiniMax to provider list with model presets
- README.md: mention MiniMax in AI Provider and api-keys docs

Tested with both MiniMax-M2.5 and MiniMax-M2.5-highspeed models.
All 766 existing tests continue to pass.
@octo-patch force-pushed the feature/add-minimax-provider branch from 22e116b to 8ad0645 on March 16, 2026 at 00:22
@luokerenx4
Contributor

Hi @octo-patch, thanks for the contribution! We appreciate the effort.

However, due to security considerations, we're unable to directly accept external PRs at this time.

That said, MiniMax support is something we're thinking about. We're currently considering two possible approaches:

  1. Configure MiniMax's Claude-compatible endpoint via Anthropic's Claude Agent SDK
  2. Configure MiniMax's Claude-compatible endpoint via the Vercel AI SDK's Claude provider

We haven't decided yet how to best surface this in the frontend for quick switching between providers. Once we've figured that out, we'll close this PR and include your name in the commit as the original proposer.

Thanks again for your interest in the project!

@octo-patch
Author

Thanks for the response, @luokerenx4! Completely understand the security considerations. Great to hear that MiniMax support is on the roadmap — happy to close this PR if you'd prefer to handle it internally. Let me know if there's anything else I can help with!

