From a83862838ddd712a2b413be90b2435459ce70073 Mon Sep 17 00:00:00 2001 From: Filip Seman Date: Sun, 8 Mar 2026 07:32:34 +0100 Subject: [PATCH 1/2] feat: multi-workflow PR comment merging MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Split comment.ts into findComment() + upsertComment(existingId) so the caller can look up the existing comment before building the body. Add embedCoverageData() / extractCoverageData() to render.ts — tool reports are stored as base64 JSON inside a hidden HTML comment at the end of every PR comment body. On each run the action reads the stored data, merges it with the current tool reports (current run wins on conflict), and rewrites the comment with updated combined data. This lets separate workflows (e.g. TypeScript and Go quality checks) contribute to the same sticky comment without coordination. Add a GitHub Callout note when at least one tool has no cached baseline, pointing users to the bootstrapping steps instead of silently showing empty diff tables. Update README to v0.3.0 with multi-workflow setup and bootstrapping documentation. 
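For illustration, a rendered comment then ends with a hidden payload shaped
like this (the marker text and truncated base64 here are illustrative, not
the exact constants):

    ## Coverage report
    ...
    <!-- coverage-data:eyJ0b29scyI6W3sidG9vbCI6ImJ1biIs... -->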
--- README.md | 104 +++++++++++++++++++------ src/comment.ts | 36 ++++++--- src/index.ts | 43 ++++++++++- src/render.ts | 60 ++++++++++++++- src/render_test.ts | 185 ++++++++++++++++++++++++++++++++++++++++++++- 5 files changed, 385 insertions(+), 43 deletions(-) diff --git a/README.md b/README.md index 3dd5115..6137551 100644 --- a/README.md +++ b/README.md @@ -24,8 +24,11 @@ multiple languages, and shows meaningful per-file diffs without external depende - Supports any LCOV-producing tool (Bun, Node.js, Jest, c8, nyc, Istanbul, PHPUnit, …) and Go coverage - Shows per-file coverage deltas against base branch - Single sticky PR comment (updates existing, no spam) +- Multi-workflow merging — separate workflows contribute to the same comment automatically - Uses `@actions/cache` for cross-run comparison +- Supports explicit PR number overrides and optional commit links in the comment header - Optional thresholds and fail-on-decrease +- Omits the top-level comparison block when a full baseline is not available for every tool - No external services or tokens required ## Output example @@ -35,7 +38,7 @@ multiple languages, and shows meaningful per-file diffs without external depende ## Usage ```yaml -- uses: xseman/coverage@v0.2.0 +- uses: xseman/coverage@v0.3.0 with: coverage-artifact-paths: bun:coverage/lcov.info ``` @@ -43,7 +46,7 @@ multiple languages, and shows meaningful per-file diffs without external depende With multiple tools and thresholds: ```yaml -- uses: xseman/coverage@v0.2.0 +- uses: xseman/coverage@v0.3.0 with: coverage-artifact-paths: | bun:coverage/lcov.info @@ -67,11 +70,37 @@ jobs: - run: bun install - run: bun test --coverage --coverage-reporter=lcov - - uses: xseman/coverage@v0.2.0 + - uses: xseman/coverage@v0.3.0 with: coverage-artifact-paths: bun:coverage/lcov.info ``` +### Multi-workflow setup + +When TypeScript and Go (or any other combination) tests run in separate workflows, +use the **same `update-comment-marker`** value in both. 
The second workflow to finish
+will find the first comment, read its embedded tool data, merge the results, and
+update the comment in place — producing one combined report.
+
+```yaml
+# typescript-quality.yml
+- uses: xseman/coverage@v0.3.0
+  with:
+    update-comment-marker: "<!-- quality-coverage -->"
+    coverage-artifact-paths: bun:typescript/coverage/lcov.info
+
+# go-quality.yml
+- uses: xseman/coverage@v0.3.0
+  with:
+    update-comment-marker: "<!-- quality-coverage -->"
+    coverage-artifact-paths: go:go/coverage.out
+```
+
+If both workflows run at the same time and there is no existing comment yet, both
+may create their own comment. On the next commit push they will converge to one.
+Use workflow dependencies (`needs:`) or `concurrency` groups if immediate
+convergence on the first push is required.
+
 ## How it works
 
 ```mermaid
@@ -82,17 +111,13 @@ config:
     fontFamily: monospace
     fontSize: "10px"
 ---
-sequenceDiagram
-    participant F as Filesystem
-    participant A as Action
-    participant C as Cache
-    participant G as GitHub
-
-    F->>A: read coverage file
-    A->>C: restore base coverage
-    A->>A: compute deltas
-    A->>G: post PR comment
-    A->>C: save new coverage
+flowchart LR
+    A[Read coverage artifacts] --> B[Parse reports by tool]
+    B --> C[Restore cached base snapshot]
+    C --> D[Compute file deltas and summaries]
+    D --> E[Post or update one sticky PR comment]
+    E --> F[Save current snapshot for later comparisons]
 ```
 
 Each `tool:path` entry goes through this pipeline independently. Results
@@ -100,16 +125,56 @@ are combined into one PR comment.
 
 The action caches parsed coverage as JSON via `@actions/cache` using key
 `{prefix}-{tool}-{branch}-{sha}`, restoring by prefix match to find the
 latest base-branch snapshot.
 
+When the same `update-comment-marker` is used across multiple workflows, each
+run reads the previously embedded tool reports from the existing comment, merges
+its own results in (current tool takes priority), and rewrites the comment with
+the combined data.
+ +If every tool has a comparable base snapshot, the comment also includes an +overall base vs head summary. If some tools do not have cached base data yet, +the action still shows the per-tool sections and any available file deltas, +but skips the top-level comparison block so partial baselines do not distort +the summary. A note in the comment identifies which tools are missing a +baseline. + +### Bootstrapping the cache + +The diff table compares head coverage against a cached snapshot from the target +branch. On the first run (or when introducing a new tool) there is nothing to +compare against, so deltas are omitted. The cache is seeded automatically when +the workflow runs on a push to the base branch. + +To get diffs working immediately: + +1. Make sure the workflow triggers on **push** to the base branch (not just + `pull_request`), so coverage is cached after each merge. +2. For a cold start, trigger the workflow manually on the base branch with + `workflow_dispatch`: + +```yaml +on: + push: + branches: [master] + pull_request: + branches: [master] + workflow_dispatch: {} +``` + +Then run the workflow from the Actions tab on the base branch. The next PR +will find the cached snapshot and show full deltas. 
 ## Inputs
 
 | Input                     | Default                             | Description                                        |
 | ------------------------- | ----------------------------------- | -------------------------------------------------- |
 | `coverage-artifact-paths` | _(required)_                        | Newline or comma-separated `tool:path` entries     |
+| `pull-request-number`     | auto-detected                       | Explicit PR number override for comment updates    |
+| `show-commit-link`        | `on`                                | Include commit link(s) at the top of the comment   |
 | `base-branch`             | PR base ref                         | Branch for delta comparison                        |
 | `cache-key`               | `coverage-reporter`                 | Cache key prefix                                   |
 | `update-comment-marker`   | ``                                  | HTML marker for sticky comment                     |
 | `colorize`                | `on`                                | `[+]`/`[-]` delta markers (`on`/`off`)             |
-| `fail-on-decrease`        | `false`                             | Fail if coverage decreases                         |
+| `fail-on-decrease`        | `false`                             | Fail if any file coverage decreases                |
 | `coverage-threshold`      | `0`                                 | Minimum overall coverage % (0 = disabled)          |
 | `github-token`            | `${{ github.token }}`               | Token for PR comments                              |
 
@@ -147,15 +212,6 @@ node --test \
 go test -coverprofile=coverage.out ./...
 ```
 
-## Development
-
-```bash
-bun install      # install dependencies
-bun test         # run tests
-bun run lint     # typecheck + format check
-bun run build    # bundle to lib/index.mjs
-```
-
 ## Related
 
 - [@actions/cache](https://github.com/actions/cache)
diff --git a/src/comment.ts b/src/comment.ts
index 7c84f1a..2eab98f 100644
--- a/src/comment.ts
+++ b/src/comment.ts
@@ -5,22 +5,22 @@ export interface CommentResult {
 	created: boolean;
 }
 
+export interface ExistingComment {
+	id: number;
+	body: string;
+}
+
 /**
- * Find an existing PR comment containing the given marker string,
- * then create or update accordingly.
+ * Find an existing PR comment containing the given marker string.
  */
-export async function upsertComment(
+export async function findComment(
 	token: string,
 	marker: string,
-	body: string,
 	prNumber: number,
-): Promise<CommentResult> {
+): Promise<ExistingComment | null> {
 	const octokit = github.getOctokit(token);
 	const { owner, repo } = github.context.repo;
 
-	// Paginate through existing comments to find the one with our marker
-	let existingCommentId: number | null = null;
-
 	for await (
 		const response of octokit.paginate.iterator(
 			octokit.rest.issues.listComments,
@@ -29,13 +29,27 @@
 	) {
 		for (const comment of response.data) {
 			if (comment.body && comment.body.includes(marker)) {
-				existingCommentId = comment.id;
-				break;
+				return { id: comment.id, body: comment.body };
 			}
 		}
-		if (existingCommentId) break;
 	}
 
+	return null;
+}
+
+/**
+ * Find an existing PR comment containing the given marker string,
+ * then create or update accordingly.
+ */
+export async function upsertComment(
+	token: string,
+	body: string,
+	prNumber: number,
+	existingCommentId?: number,
+): Promise<CommentResult> {
+	const octokit = github.getOctokit(token);
+	const { owner, repo } = github.context.repo;
+
 	if (existingCommentId) {
 		await octokit.rest.issues.updateComment({
 			owner,
diff --git a/src/index.ts b/src/index.ts
index ed0830a..363e6be 100644
--- a/src/index.ts
+++ b/src/index.ts
@@ -9,7 +9,10 @@ import {
 	restoreBaseArtifact,
 	saveArtifact,
 } from "./cache.js";
-import { upsertComment } from "./comment.js";
+import {
+	findComment,
+	upsertComment,
+} from "./comment.js";
 import {
 	resolveBaseBranch,
 	resolveCurrentBranch,
@@ -28,7 +31,10 @@ import {
 	formatPercent,
 	formatPercentValue,
 } from "./percent.js";
-import { renderReport } from "./render.js";
+import {
+	extractCoverageData,
+	renderReport,
+} from "./render.js";
 import type {
 	ArtifactInput,
 	FileCoverage,
@@ -170,6 +176,36 @@ async function run(): Promise<void> {
 		}
 	}
 
+	// Resolve PR number early so we can look up the existing comment for merging
+	const prNumber = await resolvePrNumber(prNumberInput, token);
+
+	// Merge with previously stored tool reports from the same sticky comment.
+	// This allows separate workflows (e.g. TS and Go) to contribute to one comment.
+	let existingCommentId: number | undefined;
+	if (prNumber && token) {
+		try {
+			const existing = await findComment(token, marker, prNumber);
+			if (existing) {
+				existingCommentId = existing.id;
+				const stored = extractCoverageData(existing.body);
+				if (stored) {
+					const currentTools = new Set(toolReports.map((r) => r.tool));
+					for (const prev of stored.tools) {
+						if (!currentTools.has(prev.tool)) {
+							toolReports.push(prev);
+						}
+					}
+					if (!baseSha && stored.baseSha) {
+						baseSha = stored.baseSha;
+					}
+					core.info(`Merged ${stored.tools.length} stored tool report(s) from existing comment`);
+				}
+			}
+		} catch {
+			core.warning("Could not read existing comment for merging");
+		}
+	}
+
 	// Build full report
 	const fullReport = buildFullReport(toolReports);
@@ -183,11 +219,10 @@ async function run(): Promise<void> {
 	core.setOutput("coverage-decreased", anyDecrease ? "true" : "false");
 
 	// Post / update PR comment
-	const prNumber = await resolvePrNumber(prNumberInput, token);
 	if (prNumber && token) {
 		try {
 			core.info(`Upserting comment on PR #${prNumber}`);
-			const result = await upsertComment(token, marker, markdown, prNumber);
+			const result = await upsertComment(token, markdown, prNumber, existingCommentId);
 			core.setOutput("comment-id", result.commentId.toString());
 			core.info(
 				result.created
diff --git a/src/render.ts b/src/render.ts
index fefb1a9..0084c0e 100644
--- a/src/render.ts
+++ b/src/render.ts
@@ -1,3 +1,5 @@
+import { Buffer } from "node:buffer";
+
 import {
 	formatPercent,
 	formatPercentValue,
@@ -7,6 +9,42 @@ import type {
 	ToolCoverageReport,
 } from "./types.js";
 
+// Hidden HTML-comment markers wrapping the base64 JSON payload
+// (exact marker text reconstructed here; treat as illustrative).
+const COVERAGE_DATA_PREFIX = "<!-- coverage-data:";
+const COVERAGE_DATA_SUFFIX = " -->";
+
+export interface EmbeddedCoverageData {
+	tools: ToolCoverageReport[];
+	baseSha?: string;
+}
+
+export function embedCoverageData(
+	markdown: string,
+	data: EmbeddedCoverageData,
+): string {
+	const json = JSON.stringify(data);
+	const encoded = Buffer.from(json).toString("base64");
+	return markdown + "\n" + COVERAGE_DATA_PREFIX + encoded + COVERAGE_DATA_SUFFIX;
+}
+
+export function extractCoverageData(
+	body: string,
+): EmbeddedCoverageData | null {
+	const start = body.indexOf(COVERAGE_DATA_PREFIX);
+	if (start === -1) return null;
+	const dataStart = start + COVERAGE_DATA_PREFIX.length;
+	const end = body.indexOf(COVERAGE_DATA_SUFFIX, dataStart);
+	if (end === -1) return null;
+	const encoded = body.slice(dataStart, end);
+	try {
+		const json = Buffer.from(encoded, "base64").toString("utf-8");
+		const parsed = JSON.parse(json);
+		if (!parsed || !Array.isArray(parsed.tools)) return null;
+		return parsed as EmbeddedCoverageData;
+	} catch {
+		return null;
+	}
+}
+
 export interface CommitInfo {
 	sha: string;
 	baseSha?: string;
@@ -222,6 +260,19 @@ export function renderReport(
 		parts.push("```\n");
 	}
 
+	// Note tools missing baseline data
+	const toolsWithoutBase = report.tools.filter(
+		(t) => t.files.length > 0 && t.summary.baseTotalLines === null,
+	);
+	if (toolsWithoutBase.length > 0) {
+		const names = toolsWithoutBase.map((t) => `**${t.tool}**`).join(", ");
+		parts.push(
+			`> [!NOTE]\n> No cached baseline for ${names}. ` +
+				"Per-file deltas and the diff table will appear once the target branch has coverage cached. " +
+				"Push to the base branch or trigger the workflow manually to seed the cache.\n",
+		);
+	}
+
 	for (const tool of report.tools) {
 		parts.push("```");
 		parts.push(renderToolSection(tool, colorize));
@@ -240,5 +291,12 @@
 		`Generated at ${report.generatedAt} by coverage`,
 	);
 
-	return parts.join("\n");
+	const markdown = parts.join("\n");
+
+	const embedded: EmbeddedCoverageData = {
+		tools: report.tools,
+		baseSha: commitInfo?.baseSha,
+	};
+
+	return embedCoverageData(markdown, embedded);
 }
diff --git a/src/render_test.ts b/src/render_test.ts
index 2ddbf26..17db438 100644
--- a/src/render_test.ts
+++ b/src/render_test.ts
@@ -8,11 +8,19 @@ import {
 	buildFullReport,
 	buildToolReport,
 } from "./diff";
-import { renderReport } from "./render";
-import type { CommitInfo } from "./render";
+import type {
+	CommitInfo,
+	EmbeddedCoverageData,
+} from "./render";
+import {
+	embedCoverageData,
+	extractCoverageData,
+	renderReport,
+} from "./render";
 import type {
 	CoverageArtifact,
 	FileCoverage,
+	ToolCoverageReport,
 } from "./types";
 
 describe("renderReport", () => {
@@ -67,8 +75,9 @@ describe("renderReport", () => {
 		expect(md).not.toContain("[-]");
 		expect(md).toContain("Go Coverage: 50.00%");
 		expect(md).toContain("Project coverage is 50.00%.");
-		// No base — no diff table
+		// No base — no diff table, but note about missing baseline
 		expect(md).not.toContain("Coverage Diff");
+		expect(md).toContain("No cached baseline for **go**");
 	});
 
 	test("renders warning rows", () => {
@@ -128,6 +137,42 @@ describe("renderReport", () => {
 		expect(md).not.toContain("Coverage Diff");
 		expect(md).toContain("Bun Coverage: 80.00% [+] +10.00%");
expect(md).toContain("Go Coverage: 80.00%"); + // Should note which tool is missing baseline + expect(md).toContain("No cached baseline for **go**"); + expect(md).not.toContain("**bun**"); + }); + + test("shows missing-baseline note for single tool without base", () => { + const report = buildToolReport( + "go", + [{ file: "b.go", coveredLines: 8, totalLines: 10, percent: 80 }], + null, + [], + ); + const fullReport = buildFullReport([report]); + const md = renderReport(fullReport, "", true); + + expect(md).toContain("No cached baseline for **go**"); + expect(md).toContain("seed the cache"); + }); + + test("omits missing-baseline note when all tools have base data", () => { + const report = buildToolReport( + "bun", + [{ file: "a.ts", coveredLines: 8, totalLines: 10, percent: 80 }], + { + tool: "bun", + files: [{ file: "a.ts", coveredLines: 7, totalLines: 10, percent: 70 }], + commitSha: "abc", + branch: "main", + timestamp: "2025-01-01T00:00:00Z", + }, + [], + ); + const fullReport = buildFullReport([report]); + const md = renderReport(fullReport, "", true); + + expect(md).not.toContain("No cached baseline"); }); test("renders project coverage with base and head commit links", () => { @@ -331,3 +376,137 @@ describe("renderReport", () => { expect(headerLine.indexOf("+/-")).toBe(coverageLine.indexOf("+10.00%")); }); }); + +describe("embedCoverageData / extractCoverageData", () => { + test("round-trips tool reports through embed and extract", () => { + const report = buildToolReport( + "bun", + [{ file: "a.ts", coveredLines: 8, totalLines: 10, percent: 80 }], + null, + [], + ); + const data: EmbeddedCoverageData = { tools: [report], baseSha: "abc123" }; + const markdown = embedCoverageData("## Report\nsome content", data); + const extracted = extractCoverageData(markdown); + + expect(extracted).not.toBeNull(); + expect(extracted!.tools).toHaveLength(1); + expect(extracted!.tools[0].tool).toBe("bun"); + expect(extracted!.tools[0].summary.percent).toBe(80); + 
+		expect(extracted!.baseSha).toBe("abc123");
+	});
+
+	test("returns null when no embedded data present", () => {
+		expect(extractCoverageData("## Report\nno data here")).toBeNull();
+	});
+
+	test("returns null for malformed embedded data", () => {
+		// marker present but payload is not valid base64 JSON
+		expect(extractCoverageData("<!-- coverage-data:%%%not-base64%%% -->")).toBeNull();
+	});
+
+	test("renderReport output contains extractable data", () => {
+		const head: FileCoverage[] = [
+			{ file: "a.ts", coveredLines: 5, totalLines: 10, percent: 50 },
+		];
+		const report = buildToolReport("bun", head, null, []);
+		const fullReport = buildFullReport([report]);
+		const md = renderReport(fullReport, "", true);
+
+		const extracted = extractCoverageData(md);
+		expect(extracted).not.toBeNull();
+		expect(extracted!.tools).toHaveLength(1);
+		expect(extracted!.tools[0].tool).toBe("bun");
+	});
+});
+
+describe("merge workflow", () => {
+	test("second workflow merges stored tool into combined report", () => {
+		// Simulate first workflow: Bun produces a comment
+		const bunReport = buildToolReport(
+			"bun",
+			[{ file: "a.ts", coveredLines: 8, totalLines: 10, percent: 80 }],
+			null,
+			[],
+		);
+		const firstReport = buildFullReport([bunReport]);
+		const firstMd = renderReport(firstReport, "", true);
+
+		// Simulate second workflow: Go extracts stored Bun data, merges
+		const stored = extractCoverageData(firstMd);
+		expect(stored).not.toBeNull();
+
+		const goReport = buildToolReport(
+			"go",
+			[{ file: "b.go", coveredLines: 6, totalLines: 10, percent: 60 }],
+			null,
+			[],
+		);
+
+		// Merge: current tool reports + stored tools not in current run
+		const mergedTools: ToolCoverageReport[] = [goReport];
+		const currentTools = new Set(mergedTools.map((r) => r.tool));
+		for (const prev of stored!.tools) {
+			if (!currentTools.has(prev.tool)) {
+				mergedTools.push(prev);
+			}
+		}
+
+		const mergedReport = buildFullReport(mergedTools);
+		const mergedMd = renderReport(mergedReport, "", true);
+
+		// Should contain both tools
+		expect(mergedMd).toContain("Go Coverage: 60.00%");
expect(mergedMd).toContain("Bun Coverage: 80.00%"); + expect(mergedMd).toContain("**Total Coverage: 70.00%**"); + + // Embedded data should contain both tools + const reExtracted = extractCoverageData(mergedMd); + expect(reExtracted!.tools).toHaveLength(2); + }); + + test("current run overrides stored tool with same name", () => { + // First run: Bun at 80% + const bunOld = buildToolReport( + "bun", + [{ file: "a.ts", coveredLines: 8, totalLines: 10, percent: 80 }], + null, + [], + ); + const firstReport = buildFullReport([bunOld]); + const firstMd = renderReport(firstReport, "", true); + + // Second run: Bun at 90% (same tool, new data) + const stored = extractCoverageData(firstMd)!; + const bunNew = buildToolReport( + "bun", + [{ file: "a.ts", coveredLines: 9, totalLines: 10, percent: 90 }], + null, + [], + ); + + const mergedTools: ToolCoverageReport[] = [bunNew]; + const currentTools = new Set(mergedTools.map((r) => r.tool)); + for (const prev of stored.tools) { + if (!currentTools.has(prev.tool)) { + mergedTools.push(prev); + } + } + + expect(mergedTools).toHaveLength(1); + expect(mergedTools[0].summary.percent).toBe(90); + }); + + test("merge preserves baseSha from stored data when current has none", () => { + const report = buildToolReport( + "bun", + [{ file: "a.ts", coveredLines: 8, totalLines: 10, percent: 80 }], + null, + [], + ); + const data: EmbeddedCoverageData = { tools: [report], baseSha: "base123" }; + const md = embedCoverageData("## Report", data); + + const extracted = extractCoverageData(md)!; + expect(extracted.baseSha).toBe("base123"); + }); +}); From 5e970698c5b7ffb177043a28a5c0f898277caa19 Mon Sep 17 00:00:00 2001 From: Filip Seman Date: Sun, 8 Mar 2026 09:00:54 +0100 Subject: [PATCH 2/2] build(release): Update artifact commit process and tagging --- .github/workflows/release.yml | 51 ++++++++++++++++++++--------------- 1 file changed, 30 insertions(+), 21 deletions(-) diff --git a/.github/workflows/release.yml 
b/.github/workflows/release.yml index 7423d8f..32295e0 100644 --- a/.github/workflows/release.yml +++ b/.github/workflows/release.yml @@ -26,10 +26,26 @@ permissions: pull-requests: write jobs: - build: - name: Build and Commit Artifact - runs-on: ubuntu-latest + release-please: + name: Release Please if: github.event_name == 'push' + runs-on: ubuntu-latest + outputs: + release_created: ${{ steps.release.outputs.release_created }} + tag_name: ${{ steps.release.outputs.tag_name }} + steps: + - uses: googleapis/release-please-action@v4 + id: release + with: + token: ${{ secrets.RELEASE_PLEASE_TOKEN }} + config-file: .github/.release-config.json + manifest-file: .github/.release-manifest.json + + build-release: + name: Build Release Artifact + needs: release-please + if: needs.release-please.outputs.release_created == 'true' + runs-on: ubuntu-latest permissions: contents: write steps: @@ -44,7 +60,14 @@ jobs: - run: bun install --frozen-lockfile - run: bun run build - - name: Commit built artifact + # GitHub Actions consume the checked-in bundle from action.yml (`lib/index.js`). + # We only want that generated artifact to change for published releases, not for + # every commit on master. This job runs only after release-please creates a new + # release tag, then commits the freshly built bundle and re-points the tag so the + # published tag includes the exact artifact that users get via `uses: ...@vX.Y.Z`. 
+ - name: Commit artifact and update release tag + env: + TAG: ${{ needs.release-please.outputs.tag_name }} run: | if git diff --quiet lib/index.js; then echo "No changes to lib/index.js" @@ -54,21 +77,7 @@ jobs: git config user.name "github-actions[bot]" git config user.email "github-actions[bot]@users.noreply.github.com" git add lib/index.js - git commit -m "build: update lib/index.js" + git commit -m "build: update lib/index.js for ${TAG}" git push origin master - - release-please: - name: Release Please - needs: build - if: github.event_name == 'push' - runs-on: ubuntu-latest - outputs: - release_created: ${{ steps.release.outputs.release_created }} - tag_name: ${{ steps.release.outputs.tag_name }} - steps: - - uses: googleapis/release-please-action@v4 - id: release - with: - token: ${{ secrets.RELEASE_PLEASE_TOKEN }} - config-file: .github/.release-config.json - manifest-file: .github/.release-manifest.json + git tag -fa "${TAG}" -m "Release ${TAG}" + git push origin "${TAG}" --force
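As a standalone illustration, the embed, extract, and merge cycle introduced in patch 1 can be sketched in isolation. The marker strings and the reduced report shape below are simplified assumptions for this sketch, not the action's actual constants or types.

```typescript
// Sketch of the comment-payload round trip: embed tool reports as base64 JSON
// inside a hidden HTML comment, extract them on the next run, and merge with
// the current run's reports (current run wins on tool-name conflicts).
import { Buffer } from "node:buffer";

// Hypothetical marker strings; the real action defines its own constants.
const PREFIX = "<!-- coverage-data:";
const SUFFIX = " -->";

interface ToolReport {
	tool: string;
	percent: number;
}

interface EmbeddedData {
	tools: ToolReport[];
	baseSha?: string;
}

function embed(markdown: string, data: EmbeddedData): string {
	const encoded = Buffer.from(JSON.stringify(data)).toString("base64");
	return markdown + "\n" + PREFIX + encoded + SUFFIX;
}

function extract(body: string): EmbeddedData | null {
	const start = body.indexOf(PREFIX);
	if (start === -1) return null;
	const dataStart = start + PREFIX.length;
	const end = body.indexOf(SUFFIX, dataStart);
	if (end === -1) return null;
	try {
		const decoded = Buffer.from(body.slice(dataStart, end), "base64").toString("utf-8");
		const parsed = JSON.parse(decoded);
		return Array.isArray(parsed?.tools) ? (parsed as EmbeddedData) : null;
	} catch {
		return null;
	}
}

// First workflow posts a comment carrying only its bun report.
const firstComment = embed("## Coverage", { tools: [{ tool: "bun", percent: 80 }] });

// Second workflow reads the stored bun report and merges in its go report;
// tools from the current run take priority over stored ones with the same name.
const stored = extract(firstComment);
const current: ToolReport[] = [{ tool: "go", percent: 60 }];
const seen = new Set(current.map((r) => r.tool));
const merged = current.concat((stored?.tools ?? []).filter((t) => !seen.has(t.tool)));
```

Re-embedding `merged` into the rewritten comment body keeps the cycle going, which is why any number of workflows can contribute without coordinating.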