Description
Running openshell inference set --provider nvidia-nim --model nvidia/nemotron-3-super-120b-a12b
successfully updates the gateway inference route. openshell inference get shows the new provider and model, but nemoclaw status still reports the sandbox with the previous model (qwen/qwen3.5-397b-a17b), making it appear that the sandbox inference configuration was not updated.
Reproduction Steps
- Open https://build.nvidia.com/nemoclaw and click the "Try Early Preview" button to open a Brev launchable such as https://brev.nvidia.com/launchable/deploy/now?launchableID=env-3Azt0aYgVNFEuz7opyx3gscmowS
- Click the "Deploy Launchable" button
- Click "Code-Server" when the instance is ready
- During sandbox onboarding, set the inference model to qwen/qwen3.5-397b-a17b
- Then follow the "Switch inference providers" guide and run: openshell inference set --provider nvidia-nim --model nvidia/nemotron-3-super-120b-a12b
Environment
- Device: Brev instance Ubuntu 22.04.5 LTS
- Node.js: v22.22.1
- Docker: Docker Engine 29.3.0
- OpenShell CLI: 0.0.10
- NemoClaw: v0.1.0 (6e1208c)
- OpenClaw: 2026.3.11 (29dc654)
Debug Output
Logs
ubuntu@brev-1gfvh46jl:~$ openshell inference set --provider nvidia-nim --model nvidia/nemotron-3-super-120b-a12b
Gateway inference configured:
Route: inference.local
Provider: nvidia-nim
Model: nvidia/nemotron-3-super-120b-a12b
Version: 2
Validated Endpoints:
- https://integrate.api.nvidia.com/v1/chat/completions (openai_chat_completions)
ubuntu@brev-1gfvh46jl:~$ openshell inference get
Gateway inference:
Provider: nvidia-nim
Model: nvidia/nemotron-3-super-120b-a12b
Version: 2
ubuntu@brev-1gfvh46jl:~$ nemoclaw brev-test status
Sandbox: brev-test
Model: qwen/qwen3.5-397b-a17b
Provider: nvidia-nim
GPU: no
Policies: pypi, npm
Sandbox:
Id: c6df6f45-8c00-40e4-8be8-2c3b39fe58ce
Name: brev-test
Namespace: openshell
Phase: Ready
Checklist
[NVB#6014112]