Summary
MeticAI should be fully usable without a Gemini API key. Users who just want to manage profiles, control their machine, browse shot history, and import existing profiles should be able to do so without configuring (or paying for) AI services. AI features should gracefully degrade — disabled in the UI, guarded on the backend — rather than blocking the entire app.
Motivation
- Lower the barrier to entry for new users
- Not everyone wants/needs AI-generated profiles — many users have profiles they already like
- Reduces cost for users who only want machine control + profile management
- The app already has significant non-AI value: Control Center, shot history, profile catalogue, scheduling, live telemetry
Current State: AI Dependency Audit
Hard AI dependencies (must disable when no key)
| Feature | Endpoint | File |
| --- | --- | --- |
| Coffee bag analysis | `POST /api/analyze_coffee` | `apps/server/api/routes/coffee.py` |
| AI profile creation | `POST /api/analyze_and_profile` | `apps/server/api/routes/coffee.py` |
| Description format conversion | `POST /api/profile/convert-description` | `apps/server/api/routes/profiles.py` |
| AI image generation | `POST /api/profile/{name}/generate-image` | `apps/server/api/routes/profiles.py` |
Soft AI dependencies (can gracefully degrade)
| Feature | Endpoint | Fallback |
| --- | --- | --- |
| Profile import description | `POST /api/profile/import` | Skip generation, use static description (see below) |
| Bulk profile import | `POST /api/profile/import-all` | Same — already catches generation failures |
| LLM shot analysis | `POST /api/shots/analyze-llm` | Local algorithmic analysis already exists in `_analyze_shot_local()` |
| LLM cache check | `GET /api/shots/{id}/llm-analysis` | Returns `null` — no issue |
No AI dependency (already works)
- Control Center (all machine commands, live telemetry, WebSocket)
- Profile catalogue browsing, manual image uploads
- Shot history with local/algorithmic analysis
- Scheduling, settings, machine profiles listing
- Health check, version endpoint
Proposed Changes
1. Backend: is_ai_available() helper
Add a simple check in `services/gemini_service.py`:

```python
import os

def is_ai_available() -> bool:
    """Check if the Gemini API key is configured."""
    return bool(os.environ.get("GEMINI_API_KEY", "").strip())
```
2. Backend: Guard AI endpoints
All hard-dependency endpoints should return HTTP 503 with a clear message when AI is unavailable:
```json
{
  "status": "error",
  "error": "ai_not_configured",
  "message": "This feature requires a Gemini API key. Configure one in Settings."
}
```
3. Backend: Expose AI status in settings
The `GET /api/settings` response already includes `gemini_configured: bool`. The frontend can use this to conditionally render UI.
4. Frontend: Conditional AI UI
Components that need changes:
| Component | Change |
| --- | --- |
| `CoffeeAnalyzer.tsx` | Disable "Create Profile" / "Analyze" buttons when `!gemini_configured`. Show a hint linking to Settings. |
| `ProfileImportDialog.tsx` | Default `generate_description: false` when AI unavailable. Hide the "Generate descriptions" toggle or show it as disabled. |
| `HistoryView.tsx` | Hide "Generate Image" and "Convert Description" buttons when AI unavailable. Manual image upload remains enabled. |
| `ShotHistoryView.tsx` | Hide "AI Analysis" tab/button when AI unavailable. Local analysis tab remains. |
| `SettingsView.tsx` | Already handles missing key gracefully. Could add a note: "AI features are optional — MeticAI works without a Gemini API key." |
5. Static profile descriptions (for AI-free imports)
When importing profiles without AI, generate a deterministic local description from the profile JSON instead of calling the LLM. This should be a pure Python function, e.g.:
```python
def generate_static_description(profile: dict) -> str:
    """Generate a human-readable description from profile JSON without AI."""
    name = profile.get("name", "Unknown")
    temp = profile.get("temperature")
    weight = profile.get("final_weight")
    stages = profile.get("stages", [])
    # Build stage summary
    stage_names = [s.get("name", f"Stage {i+1}") for i, s in enumerate(stages)]
    parts = [f"**Profile Created**: {name}\n"]
    parts.append(f"**Stages**: {len(stages)}-stage profile")
    if stage_names:
        parts.append(f"({', '.join(stage_names)})")
    if temp:
        parts.append(f"\n**Temperature**: {temp}°C")
    if weight:
        parts.append(f"\n**Target Weight**: {weight}g")
    return "\n".join(parts)
```
Additionally, if the profile JSON contains a `description` field (some Meticulous profiles do), use that as the description text directly.
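That preference order could be sketched as follows (`description_for_import` is a hypothetical name, and the one-line fallback stands in for the full `generate_static_description()` sketch):

```python
def description_for_import(profile: dict) -> str:
    """Hypothetical helper: prefer the profile's own description text.

    Falls back to a minimal static line standing in for the fuller
    generate_static_description() function.
    """
    existing = (profile.get("description") or "").strip()
    if existing:
        return existing
    return f"**Profile Created**: {profile.get('name', 'Unknown')}"
```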
6. Install scripts: Make API key optional
Both `scripts/install.sh` and `scripts/install.ps1` currently require the Gemini API key input. Change to:

```text
Enter your Gemini API key (press Enter to skip — AI features will be disabled):
```
7. MCP server / Gemini CLI: Conditional startup
The `mcp-server` s6 service and Gemini CLI are only useful when AI is configured. Make the s6 service check for the key and exit cleanly if not set, saving container resources:

```sh
#!/command/execlineb -P
if { test -n "${GEMINI_API_KEY}" }
# start the MCP server
```
8. Docker image size consideration (optional / future)
The Gemini CLI (`@google/gemini-cli`) adds significant size to the Docker image. A future optimization could create a lighter "no-AI" image variant, but this is not required for the initial implementation.
Acceptance Criteria
- The app starts and all non-AI features work with no `GEMINI_API_KEY` set
- Hard-dependency AI endpoints return HTTP 503 with the `ai_not_configured` error when the key is missing
- AI controls in the frontend are hidden or disabled when `gemini_configured` is false
- Profile import succeeds without AI, using the static description (or the profile's own `description` field)
- Install scripts accept an empty API key input and complete without error
Out of Scope
- Manual profile creation UI (separate feature request)
- "No-AI" Docker image variant (future optimization)
- Alternative AI providers (future feature)