
feat: add Gemini 3 Pro to Google provider model list (#405)#521

Open
mvanhorn wants to merge 1 commit into AsyncFuncAI:main from mvanhorn:feat/405-deepwiki-405-gemini-3-pro

Conversation

@mvanhorn

Adds gemini-3-pro to the Google provider model list per #405.

Changes:

  • api/config/generator.json - new gemini-3-pro entry in providers.google.models with the same temperature/top_p/top_k shape as the existing 2.5 entries
  • api/api.py:217-221 - mirror the addition in the exception-path fallback ModelConfig so 3 Pro still surfaces in the dropdown if load_generator_config() raises
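A minimal sketch of what the new generator.json entry might look like, assuming the provider/model layout the description implies; the parameter values shown are illustrative placeholders copied in the "same temperature/top_p/top_k shape as the existing 2.5 entries," not the actual values from the repository:

```json
{
  "providers": {
    "google": {
      "default_model": "gemini-2.5-flash",
      "models": {
        "gemini-2.5-flash": { "temperature": 0.7, "top_p": 0.8, "top_k": 20 },
        "gemini-3-pro": { "temperature": 0.7, "top_p": 0.8, "top_k": 20 }
      }
    }
  }
}
```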

default_model for the Google provider stays at gemini-2.5-flash, so existing users see no behavior change. Users opt into 3 Pro explicitly via the model picker.

Pattern matches PR #420 (Bedrock model additions) and the original 2.5 family additions in PR #318.

Fixes #405

Adds gemini-3-pro to providers.google.models in api/config/generator.json
with the same temperature/top_p/top_k shape as the existing 2.5 entries,
and mirrors the addition in the fallback ModelConfig in api/api.py:220
so the new model surfaces in the dropdown even if config load fails.

default_model remains gemini-2.5-flash so existing users see no behavior
change. Users opt into 3 Pro explicitly via the model picker.

Fixes AsyncFuncAI#405
Contributor

@gemini-code-assist (bot) left a comment


Code Review

This pull request adds support for the Gemini 3 Pro model by updating the model list in the API and providing its configuration in generator.json. Feedback was provided to reorder the models in the API fallback list to ensure consistency with the configuration file and prevent potential issues with default model selection in the frontend.

Comment thread api/api.py
Comment on lines +220 to 221
Model(id="gemini-3-pro", name="Gemini 3 Pro"),
Model(id="gemini-2.5-flash", name="Gemini 2.5 Flash")
Contributor


medium

The order of models in the fallback list is inconsistent with the main configuration in generator.json. In the configuration file, gemini-2.5-flash is the first entry and the designated default model for the Google provider. Placing gemini-3-pro at the top of the list in this fallback path might lead to it being incorrectly selected as the default model by the frontend if the configuration fails to load. It is better to maintain a consistent order across both the success and fallback paths.

Suggested change
- Model(id="gemini-3-pro", name="Gemini 3 Pro"),
- Model(id="gemini-2.5-flash", name="Gemini 2.5 Flash")
+ Model(id="gemini-2.5-flash", name="Gemini 2.5 Flash"),
+ Model(id="gemini-3-pro", name="Gemini 3 Pro")
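Why the ordering matters can be sketched in a small self-contained example. The `Model` shape follows the snippet quoted above; the `fallback_google_models` list name and the "frontend picks the first entry" behavior are assumptions for illustration, not code from the repository:

```python
from dataclasses import dataclass


@dataclass
class Model:
    id: str
    name: str


# Fallback list in the order the review suggests: the configured
# default (gemini-2.5-flash) stays first, matching generator.json.
fallback_google_models = [
    Model(id="gemini-2.5-flash", name="Gemini 2.5 Flash"),
    Model(id="gemini-3-pro", name="Gemini 3 Pro"),
]

# If the frontend treats the first entry as the default when
# load_generator_config() fails, this ordering preserves the
# existing default instead of silently switching users to 3 Pro.
default_model_id = fallback_google_models[0].id
```

With the original ordering (`gemini-3-pro` first), `default_model_id` would become `"gemini-3-pro"` on the error path, diverging from the configured default.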

@jiaxi-xu-fsx

Small follow-up: I think gemini-3-pro is already superseded by gemini-3.1-pro-preview in the latest Gemini API docs. We should probably switch to gemini-3.1-pro-preview (or include both for compatibility).

Reference: https://ai.google.dev/gemini-api/docs/models/gemini-3.1-pro-preview


Development

Successfully merging this pull request may close these issues: Add Gemini 3 Pro to LLM List

2 participants