feat(cli): add codemie models list command (EPMCDME-11631) #239
Merged
TarasSpashchenko merged 2 commits into main on Apr 7, 2026
Conversation
Introduce a new `codemie models list` sub-command that displays all AI models available under the user's current provider/auth configuration.

- Add `src/cli/commands/models.ts` with a `createModelsCommand()` factory; resolves the active provider via `ConfigLoader`, fetches models through `ProviderRegistry.getModelProxy()`, and prints a padded-column table
- Register the command in `src/cli/index.ts`
- Fix `BedrockModelProxy.fetchModels()` to honour runtime config fields (`awsRegion`, `awsProfile`, `apiKey`, `awsSecretAccessKey`) instead of the hardcoded singleton defaults
- Register `LiteLLMModelProxy` in `litellm/index.ts` so litellm profiles are served by the new command
- Make `LiteLLMModelProxy.fetchModels()` use `config.baseUrl` / `config.apiKey` so runtime profile values override the empty singleton defaults

Generated with AI
Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
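The padded-column table output described above can be sketched roughly as follows. This is a minimal illustration; `formatTable` and the `ModelRow` shape are assumptions for this sketch, not the PR's actual code.

```typescript
// Hypothetical helper illustrating padded-column table formatting:
// each column is padded to the width of its longest cell.
interface ModelRow {
  id: string;
  name: string;
  description: string;
}

function formatTable(rows: ModelRow[]): string {
  const headers = ["ID", "NAME", "DESCRIPTION"];
  const cells = rows.map((r) => [r.id, r.name, r.description]);
  const all = [headers, ...cells];
  // Column width = longest cell in that column.
  const widths = headers.map((_, col) =>
    Math.max(...all.map((row) => row[col].length))
  );
  return all
    .map((row) =>
      row.map((cell, i) => cell.padEnd(widths[i])).join("  ").trimEnd()
    )
    .join("\n");
}
```

A two-space gutter between padded columns keeps the output readable without pulling in a table-rendering dependency.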
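The "runtime config over singleton defaults" fix applied to both proxies can be illustrated like this. The `LiteLLMConfig` type, the defaults object, and `resolveLiteLLMConfig` are illustrative names for this sketch, not the repository's actual API.

```typescript
// Sketch of the precedence rule described above: runtime profile values
// win; the (empty) singleton defaults apply only when a field is unset.
interface LiteLLMConfig {
  baseUrl?: string;
  apiKey?: string;
}

// Hardcoded singleton defaults (empty, as the PR description notes).
const SINGLETON_DEFAULTS: Required<LiteLLMConfig> = { baseUrl: "", apiKey: "" };

function resolveLiteLLMConfig(runtime: LiteLLMConfig): Required<LiteLLMConfig> {
  return {
    baseUrl: runtime.baseUrl ?? SINGLETON_DEFAULTS.baseUrl,
    apiKey: runtime.apiKey ?? SINGLETON_DEFAULTS.apiKey,
  };
}
```

Using `??` rather than `||` means an explicitly set empty string in the runtime profile would still override the default, which matters if a provider treats "unset" and "empty" differently.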
8nevil8 approved these changes on Apr 7, 2026
Summary
Adds a new `codemie models list` CLI command that displays all AI models available to the user based on their current provider/auth configuration. Resolves EPMCDME-11631.

Changes

- `codemie models list` — loads the active profile via `ConfigLoader`, resolves the provider's `ProviderModelFetcher` through `ProviderRegistry.getModelProxy()`, and prints a formatted table of model ID, name, and description
- `BedrockModelProxy.fetchModels()` now honours runtime config fields (`awsRegion`, `awsProfile`, `apiKey`, `awsSecretAccessKey`) instead of the hardcoded singleton defaults, so users with non-default regions or profiles get correct results
- `LiteLLMModelProxy` is now registered in `litellm/index.ts`; `fetchModels()` uses `config.baseUrl` and `config.apiKey` so the runtime profile values are respected

Impact
Before: no way to discover available models without leaving the CLI.
After: `codemie models list` prints a table of the models available to the active profile's provider.
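The resolution flow the command relies on (active profile → registry → per-provider model proxy) can be sketched as follows. The interfaces here are assumptions based on the names mentioned in this PR (`ProviderModelFetcher`, `ProviderRegistry.getModelProxy()`), not the repository's actual definitions.

```typescript
// Illustrative sketch of provider-to-proxy resolution; shapes are
// assumptions, not the repository's real types.
interface ModelInfo {
  id: string;
  name: string;
  description: string;
}

interface ProviderModelFetcher {
  fetchModels(config: Record<string, string>): Promise<ModelInfo[]>;
}

class ProviderRegistry {
  private proxies = new Map<string, ProviderModelFetcher>();

  // Each provider module (bedrock, litellm, ...) registers its proxy once.
  register(provider: string, proxy: ProviderModelFetcher): void {
    this.proxies.set(provider, proxy);
  }

  // The models command looks up the proxy for the active profile's provider.
  getModelProxy(provider: string): ProviderModelFetcher {
    const proxy = this.proxies.get(provider);
    if (!proxy) {
      throw new Error(`No model proxy registered for provider "${provider}"`);
    }
    return proxy;
  }
}
```

A failed lookup throws rather than returning `undefined`, so a profile pointing at an unregistered provider (the LiteLLM bug fixed here) surfaces as a clear error instead of a silent empty result.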
Supported providers: Ollama, AWS Bedrock, LiteLLM, CodeMie SSO (`ai-run-sso`).

Checklist