sunilp/jam-cli

Β 
    ██╗  █████╗  ███╗   ███╗
    ██║ ██╔══██╗ ████╗ ████║
    ██║ ███████║ ██╔████╔██║
██  ██║ ██╔══██║ ██║╚██╔╝██║
╚████╔╝ ██║  ██║ ██║ ╚═╝ ██║
 ╚═══╝  ╚═╝  ╚═╝ ╚═╝     ╚═╝

jam

The developer-first AI CLI. Cross-language code intelligence from your terminal.

Trace call graphs across Java, SQL, Python, and TypeScript. Impact analysis. AI-powered agentic execution. Works with Copilot & Gemini. Windows, macOS, Linux.


Docs · Install · VSCode Extension

jam CLI: trace, git wtf, agent

What Jam Does

Jam isn't a generic AI assistant. It's the senior dev who's seen everything: direct, opinionated, and warm. Every message, error, and prompt speaks with the same voice: concise, specific, developer-aligned.

  • πŸ” Call graph tracing β€” trace any symbol's callers, callees, and upstream chain across languages
  • πŸ’₯ Impact analysis β€” "if I change this, what breaks?" with column-level SQL dependency tracking
  • πŸ€– Agentic execution β€” jam go (interactive) and jam run (one-shot) decompose tasks into parallel subtasks
  • πŸ’¬ AI chat & ask β€” streaming responses, multi-turn sessions, stdin/pipe support
  • 🩹 Patch workflow β€” generate diffs, validate, preview, apply with confirmation
  • πŸ“Š Code intelligence β€” explain files, search code, review diffs, generate Mermaid diagrams
  • πŸ”§ Git toolkit β€” wtf explains state, undo reverses mistakes, standup shows your work
  • βœ… Verification β€” scan for secrets, lint, type-check before you commit
  • 🧰 19 zero-LLM utilities β€” ports, stats, deps, todo, hash, json, env, and more
  • πŸ”Œ Any provider β€” Copilot, Gemini, Ollama, OpenAI, Anthropic, Groq β€” or bring your own
  • 🏠 Local-first β€” your code never leaves your machine unless you choose a remote provider
  • πŸ–₯️ Cross-platform β€” Windows (PowerShell, cmd), macOS, Linux
  • πŸ”— MCP + plugins β€” connect to Model Context Protocol servers, drop in custom commands

Install

# npm
npm install -g @sunilp-org/jam-cli

# Homebrew
brew tap sunilp/tap && brew install jam-cli

# Try without installing
npx @sunilp-org/jam-cli doctor

Jam auto-detects the best available AI provider; no API keys are needed if you have Copilot or Gemini:

Priority  Provider              Setup
1         GitHub Copilot        VSCode extension (zero config)
2         Gemini Code Assist    VSCode extension (zero config)
3         Anthropic             export ANTHROPIC_API_KEY=sk-ant-...
4         OpenAI                export OPENAI_API_KEY=sk-...
5         Ollama (default)      ollama serve + ollama pull llama3.2
jam doctor           # verify everything works
jam models list      # see available models
jam models set gpt-4o  # set your preferred model

Cookbook

Ask & Chat

jam ask "explain the builder pattern in Go"

# pipe anything
cat schema.sql | jam ask "what tables have no foreign keys?"
git log --since="1 week" -p | jam ask "summarize this week's changes"

# interactive chat with history
jam chat

Agent Engine

# interactive agent console: reads, writes, runs commands
jam go
jam> add retry logic to the HTTP client with exponential backoff

# one-shot autonomous task
jam run "add input validation to all API endpoints" --yes

# fully autonomous with parallel workers
jam run "refactor auth module into separate files" --auto --workers 4
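With `--yes`, `jam run` needs no interaction, so it can be scripted. A minimal sketch, assuming `jam doctor` exits non-zero when no provider is configured (the docs above don't state its exit codes, and the task text is only an example):

```shell
#!/bin/sh
# Preflight, then fire a one-shot agent task non-interactively.
set -e

jam doctor    # assumed to stop the script here if no provider is set up

# --yes skips confirmation prompts, as in the one-shot example above
jam run "add input validation to all API endpoints" --yes
```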

Code Intelligence

# trace a function's call graph
jam trace createProvider
jam trace updateBalance --impact       # what breaks if this changes?
jam trace handleRequest --mermaid      # output as Mermaid diagram
jam trace PROC_PAYMENT --depth 8       # deeper upstream chain

# explain any file
jam explain src/auth/middleware.ts

# search with AI understanding
jam search "where is the rate limiter configured?"

# generate architecture diagram from code
jam diagram
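Assuming `--mermaid` prints the diagram to stdout, a trace can be captured straight into your docs; the `docs/` path here is arbitrary:

```shell
# Save a call-graph trace as a Mermaid file alongside the code it documents.
mkdir -p docs
jam trace handleRequest --mermaid > docs/handleRequest.mmd
```

Wrapped in a `mermaid` code fence, the saved diagram renders directly in GitHub markdown.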

Git Toolkit

jam git wtf          # "3 files staged, 2 conflicts, 1 stash. Here's what happened..."
jam git undo         # undo last commit, last stash, or last merge
jam git standup      # your commits from the last 3 days
jam git cleanup      # preview and delete merged branches
jam git oops         # fix common mistakes (wrong branch, bad commit message)
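The helpers you reach for daily can be shortened with ordinary shell aliases; nothing jam-specific here, just a `~/.bashrc` fragment (alias names are arbitrary):

```shell
# ~/.bashrc fragment: one-word shortcuts for the jam git helpers above.
alias wtf='jam git wtf'
alias standup='jam git standup'
```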

Dev Utilities (zero LLM)

jam stats            # LOC, languages, complexity hotspots
jam deps             # import dependency graph
jam todo             # find all TODO/FIXME/HACK comments
jam verify           # pre-commit checks: secrets, lint, types
jam ports            # what's listening on which port
jam env              # environment variable diff between shells
jam hash <file>      # MD5/SHA1/SHA256 of any file
jam json <file>      # validate, format, query JSON
jam recent           # recently modified files
jam convert 5kg lb   # unit conversions
jam http GET /users  # quick HTTP requests
jam pack             # analyze npm/pip/cargo package size
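`jam verify` slots naturally into a git pre-commit hook. A sketch, assuming it exits non-zero when a check fails (the docs above don't state its exit codes):

```shell
#!/bin/sh
# .git/hooks/pre-commit (make executable: chmod +x .git/hooks/pre-commit)
# Aborts the commit if jam's secret/lint/type checks fail.
jam verify || {
  echo "jam verify failed; commit aborted" >&2
  exit 1
}
```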

Patch & Review

# AI-powered diff summary
jam diff

# code review with risk assessment
jam review

# generate and apply a patch
jam patch "add error handling to the database module"

# auto-generate commit message matching your project's convention
jam commit

VSCode Extension

Install from Marketplace

  • All commands in the Command Palette
  • @jam chat participant in GitHub Copilot Chat
  • TODO tree in the sidebar with click-to-navigate
  • Copilot or Gemini auto-detected: zero configuration, no API keys
  • Status bar indicator shows connection state
  • Keeps jam-cli updated automatically

Configuration

jam init              # interactive setup wizard
jam config show       # show resolved config
// .jamrc (per-project)
{
  "defaultProfile": "work",
  "profiles": {
    "work": { "provider": "anthropic", "model": "claude-sonnet-4-20250514" },
    "local": { "provider": "ollama", "model": "llama3.2" }
  }
}
jam ask "hello" --profile work     # use Anthropic
jam ask "hello" --profile local    # use Ollama

Supports HTTP proxy (HTTP_PROXY), custom CA certificates (tlsCaPath), configurable timeouts, MCP servers, and plugin loading. Full configuration docs →
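For the proxy case, the standard environment variable is enough for a one-off invocation. The proxy host below is a placeholder, and `HTTPS_PROXY` is the conventional companion variable (an assumption; only `HTTP_PROXY` is listed above):

```shell
# Route provider traffic through a proxy for a single jam invocation.
HTTP_PROXY=http://proxy.example.com:8080 \
HTTPS_PROXY=http://proxy.example.com:8080 \
jam ask "hello"
```

Custom CA certificates go through the `tlsCaPath` setting noted above, not an environment variable.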


Links

Contributing

See CONTRIBUTING.md. PRs welcome.

License

MIT