derek-etherton-opslevel commented on Feb 4, 2026

Bump default opslevel.workerHigh.resources from small (200m CPU / 500Mi memory requests) to medium (1 CPU / 2Gi memory requests) so high-priority Sidekiq workers have enough CPU and memory to avoid restarts and degraded deployments on typical self-hosted installs.

Problem

We are seeing degraded performance of one customer's worker-high deployment.

Solution

We already use resourcesMedium for our worker-low deployment; there's no reason we should be allocating less to our high-priority workloads. This bumps resourcing from small (200m CPU / 500Mi memory requests) to medium (1 CPU / 2Gi memory requests).
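As a minimal sketch, the change amounts to a values override along these lines (the exact key paths are taken from the PR title; the comment annotations and layout are illustrative, and the real chart structure may differ):

```yaml
# values.yaml (illustrative; actual chart keys may differ)
opslevel:
  workerHigh:
    # was: small (200m CPU / 500Mi memory requests)
    resources: medium  # 1 CPU / 2Gi memory requests, matching worker-low
```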

I had AI help outline our current resource requests:

| Component | Pods | CPU (req) | Memory (req) |
|---|---|---|---|
| OpsLevel web | 3 | 3 | 6 Gi |
| worker-high | 2 | 2 (was 0.4) | 4 Gi (was 1 Gi) |
| worker-low | 2 | 2 | 4 Gi |
| scheduler | 1 | 1 | 2 Gi |
| worker-faktory | 1 | 1 | 2 Gi |
| worker-search | 1 | 1 | 2 Gi |
| Opssight web | 1 | 1 | 2 Gi |
| Opssight worker | 1 | 0.2 | 0.5 Gi |
| MySQL | 1 | 1 | 2 Gi |
| Postgres | 1 | 1 | 2 Gi |
| Redis | 1 | 1 | 2 Gi |
| Elasticsearch | 1 | 2 | 2 Gi |
| MinIO | 4 | 4 | 8 Gi |
| Faktory | 1 | 1 | 2 Gi |
| **Total** | **21** | **~21.2 (was ~19.6)** | **~40.5 Gi (was ~37.5 Gi)** |

Comparing this back to our documented Resource Requirements:

The Kubernetes cluster must meet the following minimum specifications:

    CPU: 16 cores
    Memory: 20 GB RAM
    Pods: Capacity for at least 20 pods
    Disk Storage: Minimum of 120 GiB (may be lower if the production database is hosted elsewhere)

It seems like our recommended minimums are well below the current requirements. Am I missing something here, or do the docs need updating?

@jasonopslevel
docs definitely need updating!

derek-etherton-opslevel merged commit 13ae2bf into main on Feb 4, 2026