# SPARK CLI Reference

The complete reference for all SPARK CLI commands, options, and features. For installation instructions, see the main README.

## Table of Contents
- Getting Started
- Global Options
- Workspace Management
- Authentication
- Task Management
- Backlog Management
- Board Operations
- OKR Management
- Tag Management
- Proposal System
- Configuration
- AI Integration
- Examples
## Getting Started

```bash
# Check if SPARK CLI is installed
spark --version

# Get general help
spark --help

# Get help for specific commands
spark <command> --help
```

```bash
# Initialize your first SPARK workspace
spark init

# This will:
# - Create workspace configuration
# - Set up authentication
# - Guide you through OKR creation
# - Create your first SPARK board
```

## Global Options

Available with all commands:
- `--help`, `-h` - Show help information
- `--version`, `-v` - Show version information
- `--json` - Output in JSON format
- `--yaml` - Output in YAML format
- `--no-color` - Disable colored output
- `--verbose` - Enable verbose logging
- `--config <path>` - Use custom config file
- `--workspace <id>` - Use specific workspace

## Workspace Management

### `spark init`

Initialize a new SPARK workspace.
```bash
spark init                                    # Interactive setup
spark init --name "My Team"                   # Set workspace name
spark init --skip-okrs                        # Skip OKR setup
spark init --server "https://my-server.com"   # Custom server URL
```

Options:

- `--name <name>` - Set workspace name
- `--skip-okrs` - Skip OKR creation during setup
- `--server <url>` - Custom server URL
### `spark workspace`

Manage workspace settings and members.

```bash
spark workspace show       # Show workspace details
spark workspace edit       # Edit workspace settings
spark workspace members    # List workspace members
spark workspace invite     # Invite new members
spark workspace settings   # Configure workspace settings
```

Subcommands:

- `show` - Display workspace information
- `edit` - Modify workspace settings
- `members` - Manage workspace members
- `invite` - Send member invitations
- `settings` - Configure workspace options
## Authentication

### `spark auth login`

Log in using the OAuth device flow.

```bash
spark auth login                    # Use default server
spark auth login --server <url>     # Custom server URL
spark auth login --workspace <id>   # Login to specific workspace
```

Options:

- `--server <url>` - Custom server URL
- `--workspace <id>` - Target workspace ID
### `spark auth register`

Register a new account.

```bash
spark auth register                             # Interactive registration
spark auth register --name "John Doe"           # Set display name
spark auth register --email "john@example.com"  # Set email
spark auth register --json                      # Register with JSON input
```

Options:

- `--name <name>` - Display name
- `--email <email>` - Email address
- `--json` - Accept JSON input
### `spark auth status`

Check authentication status.

```bash
spark auth status            # Current auth status
spark auth status --json     # Structured output
spark auth status --verbose  # Detailed information
```

Options:

- `--json` - JSON output format
- `--verbose` - Show detailed information
### `spark auth logout`

Log out from the server.

```bash
spark auth logout        # Logout from current workspace
spark auth logout --all  # Logout from all workspaces
```

Options:

- `--all` - Logout from all workspaces
### `spark auth refresh`

Refresh authentication tokens.

```bash
spark auth refresh   # Refresh current tokens
```

## Task Management

### `spark task create`

Create a new task.
```bash
spark task create "Fix login bug"                             # Simple task
spark task create "Add feature" --description "Details here"  # With description
spark task create "Task" --tags "bug,urgent"                  # With tags
spark task create "Task" --assignee "john"                    # With assignee
spark task create --json                                      # JSON input
spark task create --yaml                                      # YAML input
spark task create --editor                                    # External editor
spark task create --file tasks.json                           # From file
```

Options:

- `--description <text>` - Task description
- `--tags <tags>` - Comma-separated tags
- `--assignee <user>` - Assign to user
- `--json` - Accept JSON input
- `--yaml` - Accept YAML input
- `--editor` - Use external editor
- `--file <path>` - Create from file

JSON Format:

```json
{
  "title": "Task title",
  "description": "Task description",
  "tags": ["tag1", "tag2"],
  "assignee": "username",
  "priority": "high|medium|low"
}
```

### `spark task edit`

Edit an existing task.
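A payload in the task JSON format above can be syntax-checked before being piped to `spark task create --json`; a minimal sketch, assuming `python3` is available (this validates JSON syntax only, not SPARK's task schema):

```shell
# Hypothetical payload in the documented task shape
payload='{"title":"Fix login bug","tags":["bug","urgent"],"priority":"high"}'

# Syntax-check the JSON before handing it to the CLI (python3 assumed available)
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload ok"

# Then, e.g.: echo "$payload" | spark task create --json
```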
```bash
spark task edit <taskId>           # Interactive edit
spark task edit <taskId> --editor  # External editor
spark task edit <taskId> --json    # JSON input
spark task edit <taskId> --yaml    # YAML input
```

Options:

- `--editor` - Use external editor
- `--json` - Accept JSON input
- `--yaml` - Accept YAML input
### `spark task assign`

Assign tasks to team members.

```bash
spark task assign <taskId> --to "john"  # Assign to user
spark task assign <taskId> --unassign   # Remove assignment
```

Options:

- `--to <user>` - Assign to specific user
- `--unassign` - Remove current assignment
### `spark task review`

Manage task reviews.

```bash
spark task review <taskId>                   # Start review process
spark task review <taskId> --approve         # Approve task
spark task review <taskId> --reject          # Reject task
spark task review <taskId> --comment "text"  # Add review comment
```

Options:

- `--approve` - Approve the task
- `--reject` - Reject the task
- `--comment <text>` - Add review comment
### `spark task pull`

Pull an available task from the backlog.

```bash
spark task pull                     # Interactive selection
spark task pull --assignee "john"   # Assign to specific user
spark task pull --tag "bug"         # Pull tasks with specific tag
spark task pull --priority "high"   # Pull high priority tasks
```

Options:

- `--assignee <user>` - Assign to specific user
- `--tag <tag>` - Filter by tag
- `--priority <level>` - Filter by priority level
### `spark task list`

List tasks with advanced filtering.

```bash
spark task list                     # All tasks
spark task list --status backlog    # Filter by status
spark task list --assignee "jane"   # Filter by assignee
spark task list --tag "bug"         # Filter by tag
spark task list --priority "high"   # Filter by priority
spark task list --json              # JSON output
spark task list --sort "priority"   # Sort by field
spark task list --limit 10          # Limit results
```

Options:

- `--status <status>` - Filter by status (backlog, in_progress, review, done)
- `--assignee <user>` - Filter by assignee
- `--tag <tag>` - Filter by tag
- `--priority <level>` - Filter by priority
- `--sort <field>` - Sort by field (priority, created, updated)
- `--limit <number>` - Limit number of results
- `--json` - JSON output format
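Combined with `--json`, task lists can be post-processed by any JSON-aware tool. The sketch below summarizes tasks per status; the inline `tasks` sample stands in for live `spark task list --json` output, and the array-of-objects-with-`status` schema is an assumption, not documented behavior:

```shell
# Simulated `spark task list --json` output (assumed schema: array of objects with "status")
tasks='[{"id":"t1","status":"done"},{"id":"t2","status":"backlog"},{"id":"t3","status":"done"}]'

# Count tasks per status; against a live workspace, pipe `spark task list --json` instead
echo "$tasks" | python3 -c '
import collections, json, sys
counts = collections.Counter(t["status"] for t in json.load(sys.stdin))
for status, n in sorted(counts.items()):
    print(f"{status}: {n}")
'
# Prints:
#   backlog: 1
#   done: 2
```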
### `spark task move`

Move a task between columns.

```bash
spark task move <taskId>            # Move to next column
spark task move <taskId> --to done  # Move to specific column
```

Options:

- `--to <column>` - Move to specific column (backlog, in_progress, review, done)
### `spark task delete`

Delete a task.

```bash
spark task delete <taskId>          # Delete with confirmation
spark task delete <taskId> --force  # Force delete without confirmation
```

Options:

- `--force` - Skip confirmation prompt
## Backlog Management

Browse and manage the backlog.

```bash
spark backlog                     # View backlog
spark backlog --tag "bug"         # Filter by tag
spark backlog --assignee "john"   # Filter by assignee
spark backlog --priority "high"   # Filter by priority
spark backlog --json              # JSON output
spark backlog --limit 10          # Limit results
spark backlog --sort "priority"   # Sort by field
```

Options:

- `--tag <tag>` - Filter by tag
- `--assignee <user>` - Filter by assignee
- `--priority <level>` - Filter by priority
- `--limit <number>` - Limit results
- `--sort <field>` - Sort by field
- `--json` - JSON output format
## Board Operations

View and manage the SPARK board.

```bash
spark board                     # View current board
spark board --refresh           # Refresh from server
spark board --json              # JSON output
spark board --filter "tag:bug"  # Filter by tag
spark board --assignee "john"   # Filter by assignee
spark board --no-colors         # Disable colors
```

Options:

- `--refresh` - Refresh data from server
- `--json` - JSON output format
- `--filter <filter>` - Apply filter (tag:name, assignee:user)
- `--assignee <user>` - Filter by assignee
- `--no-colors` - Disable colored output
## OKR Management

### `spark okr create`

Create quarterly OKRs with multiple input methods.

```bash
spark okr create                   # Interactive creation
spark okr create --editor          # External editor
spark okr create --json            # JSON input
spark okr create --yaml            # YAML input
spark okr create --file okrs.json  # From file
```

Options:

- `--editor` - Use external editor
- `--json` - Accept JSON input
- `--yaml` - Accept YAML input
- `--file <path>` - Create from file

JSON Format:

```json
{
  "objective": "Improve user experience",
  "key_results": [
    {
      "title": "Reduce load time by 50%",
      "target": 2.5,
      "unit": "seconds"
    },
    {
      "title": "Increase user satisfaction to 90%",
      "target": 90,
      "unit": "percent"
    }
  ]
}
```

### `spark okr edit`

Edit existing OKRs.
```bash
spark okr edit <okrId>           # Interactive edit
spark okr edit <okrId> --editor  # External editor
spark okr edit <okrId> --json    # JSON input
spark okr edit <okrId> --yaml    # YAML input
```

Options:

- `--editor` - Use external editor
- `--json` - Accept JSON input
- `--yaml` - Accept YAML input
### `spark okr list`

List all OKRs with filtering.

```bash
spark okr list                  # All OKRs
spark okr list --quarter Q1     # Filter by quarter
spark okr list --status active  # Filter by status
spark okr list --json           # JSON output
```

Options:

- `--quarter <quarter>` - Filter by quarter (Q1, Q2, Q3, Q4)
- `--status <status>` - Filter by status (active, completed, archived)
- `--json` - JSON output format
### `spark okr status`

View OKR progress with visual indicators.

```bash
spark okr status          # All OKRs progress
spark okr status <okrId>  # Specific OKR progress
spark okr status --json   # Structured output
```

Options:

- `--json` - JSON output format
### `spark okr update`

Update OKR progress.

```bash
spark okr update <okrId>                # Interactive update
spark okr update <okrId> --progress 75  # Set progress percentage
spark okr update <okrId> --value 85     # Set current value
```

Options:

- `--progress <percentage>` - Set progress percentage
- `--value <number>` - Set current value
## Tag Management

Manage task tags.

```bash
spark tags list                # List all tags
spark tags create "bug"        # Create new tag
spark tags delete "old-tag"    # Delete tag
spark tags rename "old" "new"  # Rename tag
spark tags stats               # Show tag statistics
```

Subcommands:

- `list` - List all tags
- `create <name>` - Create new tag
- `delete <name>` - Delete tag
- `rename <old> <new>` - Rename tag
- `stats` - Show tag usage statistics
## Proposal System

Manage team proposals.

```bash
spark proposal create                       # Create new proposal
spark proposal list                         # List all proposals
spark proposal show <proposalId>            # Show proposal details
spark proposal approve <proposalId>         # Approve proposal
spark proposal reject <proposalId>          # Reject proposal
spark proposal comment <proposalId> "text"  # Add comment
```

Subcommands:

- `create` - Create new proposal
- `list` - List all proposals
- `show <id>` - Show proposal details
- `approve <id>` - Approve proposal
- `reject <id>` - Reject proposal
- `comment <id> <text>` - Add comment
## Configuration

### Directory Layout

```text
.spark/
├── config.json          # Team/workspace config (committed)
└── local/
    ├── config.json      # User auth config (gitignored)
    ├── cache/           # Command cache (gitignored)
    └── logs/            # Command logs (gitignored)
```
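Matching the annotations above, a workspace `.gitignore` entry along these lines keeps the shared config committed while excluding per-user state (a sketch; adapt to your repository):

```gitignore
# Shared .spark/config.json stays committed; everything under local/ is per-user
.spark/local/
```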
### Configuration Precedence

Settings are resolved in this order:

1. Command-line arguments (highest priority)
2. Environment variables
3. Local config (`.spark/local/config.json`)
4. Workspace config (`.spark/config.json`)
5. Global config (`~/.spark/config.json`)
6. Default values (lowest priority)
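The chain behaves like ordinary shell fallback: the first layer that defines a value wins. A minimal sketch of the idea (not SPARK's actual implementation; only `SPARK_SERVER_URL` is a real SPARK variable, and the config-file layers are omitted):

```shell
# Illustrative values: a default, an env var, and an (unset) command-line flag
default_url="https://api.sparkpm.dev"
SPARK_SERVER_URL="https://my-server.com"   # environment variable (middle priority)
flag_url=""                                # would hold a --server value (highest priority)

# First non-empty value wins: flag, then env var, then default
url="${flag_url:-${SPARK_SERVER_URL:-$default_url}}"
echo "$url"   # https://my-server.com
```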
### Environment Variables

```bash
export SPARK_SERVER_URL="https://my-server.com"  # Override server URL
export SPARK_CONFIG_DIR="/custom/path"           # Custom config directory
export SPARK_WORKSPACE_ID="workspace-123"        # Override workspace ID
export SPARK_AUTH_TOKEN="token-here"             # Override auth token
export SPARK_EDITOR="vim"                        # Override default editor
export SPARK_NO_COLOR=1                          # Disable colored output
export SPARK_LOG_LEVEL="debug"                   # Set logging level
export SPARK_CACHE_TTL=300                       # Cache time-to-live
```

### Config File Format

```json
{
  "server": {
    "url": "https://api.sparkpm.dev",
    "timeout": 30000,
    "retry": 3
  },
  "ui": {
    "colors": true,
    "table_style": "grid",
    "max_width": 120,
    "pager": "auto"
  },
  "editor": {
    "command": "$EDITOR",
    "temp_dir": "/tmp/spark"
  },
  "cache": {
    "enabled": true,
    "ttl": 300
  }
}
```

## AI Integration

SPARK CLI is designed for seamless AI integration:
### JSON Input/Output

```bash
# Create task from JSON
echo '{"title":"Fix bug","tags":["bug","urgent"]}' | spark task create --json

# Get board as JSON
spark board --json

# Process tasks with jq
spark task list --json | jq '.[] | select(.tags[] == "bug")'
```

### YAML Input/Output

```bash
# Create OKR from YAML
spark okr create --yaml <<EOF
objective: "Improve performance"
key_results:
  - title: "Reduce load time"
    target: 2.0
    unit: "seconds"
EOF

# Get OKR status as YAML
spark okr status --yaml
```

### Non-Interactive Mode

```bash
# All commands support non-interactive mode
spark task create "title" --description "desc" --tags "tag1,tag2"
spark task assign task-123 --to "user"
spark okr update okr-456 --progress 75
```

### Error Handling

```bash
# Commands return appropriate exit codes
spark task create "title" || echo "Failed"

# Structured error messages
spark task create --json 2>&1 | jq '.error'
```

## Examples

### Basic Workflow

```bash
# 1. Initialize workspace
spark init --name "My Project"

# 2. Create some tasks
spark task create "Set up development environment"
spark task create "Implement user authentication" --tags "feature,auth"
spark task create "Fix login bug" --tags "bug,urgent"

# 3. View the board
spark board

# 4. Pull a task to work on
spark task pull

# 5. Move task through workflow
spark task move task-123 --to in_progress
spark task move task-123 --to review
spark task move task-123 --to done
```

### OKR Workflow

```bash
# Create quarterly OKRs
spark okr create --yaml <<EOF
objective: "Improve user experience"
key_results:
  - title: "Reduce page load time to under 2 seconds"
    target: 2.0
    unit: "seconds"
  - title: "Achieve 95% user satisfaction"
    target: 95
    unit: "percent"
EOF

# Track progress
spark okr status
spark okr update okr-123 --progress 60
```

### Tag Workflow

```bash
# Create and organize tags
spark tags create "bug"
spark tags create "feature"
spark tags create "urgent"

# Filter tasks by tags
spark task list --tag "bug"
spark backlog --tag "urgent"
```

### AI-Assisted Workflows

```bash
# Create tasks from AI-generated JSON
cat ai_tasks.json | spark task create --json

# Get structured data for AI processing
spark board --json | ai-processor

# Batch operations
spark task list --json | jq '.[].id' | xargs -I {} spark task assign {} --to "developer"
```

### Advanced Filtering

```bash
# Complex board filtering
spark board --filter "tag:bug" --assignee "john"

# Backlog management
spark backlog --priority "high" --limit 5 --sort "created"

# Task reporting
spark task list --status done --json | jq 'length'
```

## Tips

- Use `spark --help` for general help
- Use `spark <command> --help` for command-specific help
- Use the `--json` flag for structured output
- Use the `--verbose` flag for detailed logging
- Check the main README for installation and getting started
This documentation covers all SPARK CLI features and commands. For framework methodology, see the SPARK Framework Guide.