feat: add MiniMax as a first-class AI provider #61

octo-patch wants to merge 1 commit into TraderAlice:master
Conversation
Add MiniMax (https://www.minimax.io) support via the Vercel AI SDK's OpenAI-compatible adapter. MiniMax offers `MiniMax-M2.5` (204K context) and `MiniMax-M2.5-highspeed` models through an OpenAI-compatible API.

Changes:

- `model-factory.ts`: add `minimax` provider case using `@ai-sdk/openai` with compatibility mode and the chat completions endpoint
- `config.ts`: add `minimax` to the `apiKeys` Zod schema
- `config.ts` (web routes): expose `minimax` API key status
- `types.ts` (UI): add `minimax` to the `ApiKeys` type
- `AIProviderPage.tsx`: add MiniMax to the provider list with model presets
- `README.md`: mention MiniMax in the AI Provider and api-keys docs

Tested with both `MiniMax-M2.5` and `MiniMax-M2.5-highspeed` models. All 766 existing tests continue to pass.
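As a rough sketch of the model-factory change described above: the `minimax` case amounts to assembling OpenAI-compatible provider settings and handing them to the adapter. The identifiers below (`ProviderSettings`, `resolveMinimaxSettings`) are illustrative, not the repository's actual names; only the base URL, the `apiKeys.minimax` key, and the use of the Vercel AI SDK's compatibility mode come from the PR description.

```typescript
// Illustrative sketch only; types and names are hypothetical.
interface ProviderSettings {
  name: string;
  baseURL: string;
  apiKey: string;
  // The Vercel AI SDK's OpenAI adapter supports a "compatible" mode that
  // relaxes OpenAI-specific request handling for third-party endpoints.
  compatibility: "strict" | "compatible";
}

const MINIMAX_DEFAULT_BASE_URL = "https://api.minimax.io/v1";

function resolveMinimaxSettings(
  apiKey: string,
  baseURL?: string,
): ProviderSettings {
  if (!apiKey) {
    // Surfaced as a config error rather than a failed request later.
    throw new Error("MiniMax selected but apiKeys.minimax is not set");
  }
  return {
    name: "minimax",
    baseURL: baseURL ?? MINIMAX_DEFAULT_BASE_URL,
    compatibility: "compatible",
    apiKey,
  };
}
```

In the actual factory, settings like these would be passed to `createOpenAI(...)` from `@ai-sdk/openai`, and the returned provider's chat-completions interface used with the selected model id.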
Hi @octo-patch, thanks for the contribution! We appreciate the effort. However, due to security considerations, we're unable to directly accept external PRs at this time. That said, MiniMax support is something we're thinking about, and there are two possible approaches under consideration. We haven't decided yet how best to surface this in the frontend for quick switching between providers. Once we've figured that out, we'll close this PR and include your name in the commit as the original proposer. Thanks again for your interest in the project!
Thanks for the response, @luokerenx4! Completely understand the security considerations. Great to hear that MiniMax support is on the roadmap — happy to close this PR if you'd prefer to handle it internally. Let me know if there's anything else I can help with!
Summary
Add MiniMax as a first-class AI provider alongside Anthropic, OpenAI, and Google. MiniMax offers an OpenAI-compatible API with 204K context window models at competitive pricing.
What's included
- `minimax` case in the model factory using the existing `@ai-sdk/openai` adapter with `https://api.minimax.io/v1` as the default base URL
- `minimax` added to the `apiKeys` schema so users can store their API key in `ai-provider-manager.json`
- MiniMax added to the provider list with model presets (`MiniMax-M2.5`, `MiniMax-M2.5-highspeed`), and to the per-channel LLM Provider dropdown

Configuration
```json
{
  "backend": "vercel-ai-sdk",
  "provider": "minimax",
  "model": "MiniMax-M2.5",
  "apiKeys": {
    "minimax": "your-api-key"
  }
}
```

Available models

- `MiniMax-M2.5` (204K context)
- `MiniMax-M2.5-highspeed`
China mainland users can set `baseUrl` to `https://api.minimaxi.com/v1`.

Test plan
- `pnpm test` — 9 new tests, all green
- `pnpm build` passes
- Manually tested with `MiniMax-M2.5` and `MiniMax-M2.5-highspeed`
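To illustrate the config change, here is a minimal hand-rolled validation of the configuration shape shown above. The PR itself extends a Zod schema in `config.ts`; the names `AIProviderConfig` and `parseConfig` below are hypothetical stand-ins, not the repository's actual code.

```typescript
// Hypothetical sketch; the real project validates this with Zod.
interface AIProviderConfig {
  backend: string;
  provider: string;
  model: string;
  apiKeys: Record<string, string>;
}

function parseConfig(raw: unknown): AIProviderConfig {
  const c = raw as Partial<AIProviderConfig>;
  if (typeof c?.provider !== "string" || typeof c?.model !== "string") {
    throw new Error("config must set provider and model");
  }
  // Mirrors the schema addition: a MiniMax key is required when the
  // minimax provider is selected.
  if (c.provider === "minimax" && !c.apiKeys?.["minimax"]) {
    throw new Error('apiKeys.minimax is required when provider is "minimax"');
  }
  return c as AIProviderConfig;
}
```

With the JSON example from the Configuration section, `parseConfig` accepts the object; removing `apiKeys.minimax` makes it throw, which is the kind of early failure a schema check provides.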