diff --git a/.changeset/generalize-fork-management.md b/.changeset/generalize-fork-management.md deleted file mode 100644 index 0c21aab..0000000 --- a/.changeset/generalize-fork-management.md +++ /dev/null @@ -1,4 +0,0 @@ ---- ---- - -Generalize ccc-dev/ into multi-repo fork management tool diff --git a/.devcontainer/devcontainer.json b/.devcontainer/devcontainer.json index ccd342d..e8a5209 100644 --- a/.devcontainer/devcontainer.json +++ b/.devcontainer/devcontainer.json @@ -37,7 +37,7 @@ // Clipboard support for wl-copy/wl-paste (used by CLAUDE.md) "clipboard": "sudo apt-get update -qq && sudo apt-get install -y -qq --no-install-recommends wl-clipboard", // Install vscode focus timer extension - "focus-timer": "d=~/focus-timer-ext && git clone https://github.com/phroi/vscode-focus-timer.git $d && ln -sfn $d ~/.vscode-server/extensions/phroi.focus" + "focus-timer": "d=~/focus-timer-ext && git clone --depth 1 https://github.com/phroi/vscode-focus-timer.git $d && ln -sfn $d ~/.vscode-server/extensions/phroi.focus" }, // Run each time the container starts. Ensures updated dependencies are installed inside the container user environment. diff --git a/.planning/PROJECT.md b/.planning/PROJECT.md index 0520498..3659620 100644 --- a/.planning/PROJECT.md +++ b/.planning/PROJECT.md @@ -56,7 +56,7 @@ Clean, CCC-aligned library packages published to npm that frontends can depend o **Migration status:** Library packages are on CCC. Apps split: faucet/sampler already migrated; bot/interface/tester still on legacy Lumos (`@ckb-lumos/*`, `@ickb/lumos-utils@1.4.2`, `@ickb/v1-core@1.4.2`). -**Local CCC dev build:** `ccc-fork/` supports using local CCC builds for testing. `.pnpmfile.cjs` transparently rewires `@ckb-ccc/*` to local packages. `fork-scripts/patch.sh` rewrites exports to `.ts` source. This enables testing upstream changes before they're published. +**Local CCC dev build:** `forks/ccc/` supports using local CCC builds for testing. 
`.pnpmfile.cjs` transparently rewires `@ckb-ccc/*` to local packages. `forks/forker/patch.sh` rewrites exports to `.ts` source. This enables testing upstream changes before they're published. ## Constraints diff --git a/.planning/REQUIREMENTS.md b/.planning/REQUIREMENTS.md index d78278c..9a68233 100644 --- a/.planning/REQUIREMENTS.md +++ b/.planning/REQUIREMENTS.md @@ -86,7 +86,7 @@ Which phases cover which requirements. Updated during roadmap creation. | SMTX-03 | Phase 6 | Pending | | | SMTX-04 | Phase 1 | Complete | getHeader()/HeaderKey removed, CCC client calls inlined | | SMTX-05 | Phase 1, 4, 5 | Complete | addUdtHandlers() replaced with tx.addCellDeps(udtHandler.cellDeps) (01-03); UdtHandler/UdtManager replacement deferred to Phase 4-5 | -| SMTX-06 | Phase 1 | Complete | DAO check contributed to CCC core via ccc-fork/ (01-01) | +| SMTX-06 | Phase 1 | Complete | DAO check contributed to CCC core via forks/ccc/ (01-01) | | SMTX-07 | Phase 5 | Pending | | | SMTX-08 | Phase 6 | Pending | | | SMTX-09 | Phase 7 | Pending | | diff --git a/.planning/ROADMAP.md b/.planning/ROADMAP.md index e6887c6..27dc965 100644 --- a/.planning/ROADMAP.md +++ b/.planning/ROADMAP.md @@ -30,7 +30,7 @@ Decimal phases appear between their surrounding integers in numeric order. 1. `SmartTransaction` class and `CapacityManager` class no longer exist in `@ickb/utils` source or exports 2. `UdtHandler` interface and `UdtManager` class remain in `@ickb/utils` with method signatures updated from `SmartTransaction` to `ccc.TransactionLike` (full replacement deferred to Phase 3+) 3. `getHeader()` function and `HeaderKey` type are removed from `@ickb/utils`; all call sites across dao/core/sdk inline CCC client calls (`client.getTransactionWithHeader()`, `client.getHeaderByNumber()`); `SmartTransaction.addHeaders()` call sites in DaoManager/LogicManager push to `tx.headerDeps` directly - 4. 
A 64-output NervosDAO limit check exists in CCC core (via `ccc-fork/`): `completeFee()` safety net, standalone async utility, and `ErrorNervosDaoOutputLimit` error class; all 6+ scattered checks across dao/core packages are replaced with calls to this CCC utility + 4. A 64-output NervosDAO limit check exists in CCC core (via `forks/ccc/`): `completeFee()` safety net, standalone async utility, and `ErrorNervosDaoOutputLimit` error class; all 6+ scattered checks across dao/core packages are replaced with calls to this CCC utility 5. ALL manager method signatures across ALL 5 library packages accept `ccc.TransactionLike` instead of `SmartTransaction`, following CCC's convention (TransactionLike input, Transaction output with `Transaction.from()` conversion at entry point) 6. `pnpm check:full` passes after each feature-slice removal step — no intermediate broken states **Plans**: 3 plans diff --git a/.planning/STATE.md b/.planning/STATE.md index d35dd07..2d2714f 100644 --- a/.planning/STATE.md +++ b/.planning/STATE.md @@ -47,10 +47,10 @@ Recent decisions affecting current work: - [Roadmap]: Phase 1 uses feature-slice approach -- each removal chased across all packages, build stays green after every step. SMTX-01 (all signatures to TransactionLike) completed in Phase 1, not Phase 5. 
- [Roadmap]: UDT investigation (Phase 3) is a design phase that produces a decision document; its outcome determines UdtHandler/UdtManager replacement pattern used in Phases 4-5 - [Roadmap]: Phases 4-5 reduced in scope: Phase 4 focuses on deprecated API replacement + UDT pattern finalization in dao/order; Phase 5 focuses on IckbUdt implementation + conservation law in core -- [Phase 1 Context]: DAO 64-output limit check contributed to CCC core via ccc-fork/, CCC PR submitted during Phase 1 +- [Phase 1 Context]: DAO 64-output limit check contributed to CCC core via forks/ccc/, CCC PR submitted during Phase 1 - [Phase 1 Context]: getHeader()/HeaderKey removed entirely -- inline CCC client calls at read-only call sites; addHeaders() call sites in DaoManager/LogicManager push to tx.headerDeps directly - [Phase 1 Context]: Script comparison must always use full Script.eq(), never just codeHash comparison -- [01-01]: Added ccc-fork local patch mechanism for deterministic replay of CCC modifications (now multi-file format: manifest + res-N.resolution + local-*.patch) +- [01-01]: Added forks/ccc local patch mechanism for deterministic replay of CCC modifications (now multi-file format: manifest + res-N.resolution + local-*.patch) - [01-01]: DaoManager.requestWithdrawal/withdraw client parameter placed before optional options for cleaner API - [01-01]: assertDaoOutputLimit uses early return when outputs <= 64 for zero-cost common case - [01-02]: Moved getHeader/HeaderKey to transaction.ts as non-exported internals (deleted alongside SmartTransaction in 01-03) @@ -80,7 +80,7 @@ None yet. - Resolved: CCC's `Transaction.getInputsCapacity()` handles DAO profit natively via `getInputsCapacityExtra()` -> `CellInput.getExtraCapacity()` -> `Cell.getDaoProfit()` (verified in STACK.md from CCC source). No standalone utility needed. - Resolved: CCC Udt `getInputsInfo()` resolves inputs to `Cell` objects (which have `outPoint`) before passing to `infoFrom()`. 
`infoFrom()`'s `CellAnyLike` parameter has `outPoint?: OutPointLike | null` — optional, not absent. Input cells have outPoint (for header fetches), output cells don't. Both `infoFrom` and `getInputsInfo/getOutputsInfo` are viable override points for IckbUdt (verified during Phase 3 discuss-phase). - Resolved: STACK.md research correction applied — `client.getHeaderByTxHash()` (non-existent) replaced with `client.getTransactionWithHeader()` in STACK.md, ROADMAP.md Phase 3 success criterion #2, and REQUIREMENTS.md UDT-02. -- Resolved: PR #328 stance updated during Phase 3 context — user decision is to design around PR #328 as target architecture (overrides research recommendation to "not wait for #328"). PR #328 is now integrated into `ccc-fork/ccc` via pins; FeePayer classes available at `ccc-fork/ccc/packages/core/src/signer/feePayer/`. The separate `reference/ccc-fee-payer` clone is no longer needed. +- Resolved: PR #328 stance updated during Phase 3 context — user decision is to design around PR #328 as target architecture (overrides research recommendation to "not wait for #328"). PR #328 is now integrated into `forks/ccc/` via pins; FeePayer classes available at `forks/ccc/packages/core/src/signer/feePayer/`. The separate ccc-fee-payer reference clone is no longer needed. - Resolved: `CellAny` has `capacityFree` getter (CCC transaction.ts:404-405) — 03-RESEARCH.md corrected (previously claimed `CellAny` lacked it). ## Session Continuity diff --git a/.planning/codebase/ARCHITECTURE.md b/.planning/codebase/ARCHITECTURE.md index 2fb579e..c2c375d 100644 --- a/.planning/codebase/ARCHITECTURE.md +++ b/.planning/codebase/ARCHITECTURE.md @@ -47,7 +47,7 @@ The iCKB protocol solves NervosDAO illiquidity by pooling DAO deposits and issui ## On-Chain Contracts (Rust) -Three production smart contracts (in `reference/contracts/` reference repo) implement the protocol on CKB L1. 
Each TS package in `packages/` corresponds to contract logic: +Three production smart contracts (in `forks/contracts/` reference repo) implement the protocol on CKB L1. Each TS package in `packages/` corresponds to contract logic: | Contract | Script Type | TS Package | Purpose | |---|---|---|---| @@ -70,7 +70,7 @@ Receipts convert to UDT; deposits stay as deposits or convert to UDT. No iCKB ca **Foundation: CCC Framework** - Purpose: Provide blockchain primitives and client interface -- Location: `@ckb-ccc/core` (npm or local `ccc-fork/ccc/`) +- Location: `@ckb-ccc/core` (npm or local `forks/ccc/`) - Contains: CKB RPC clients, transaction builders, signers, Molecule codec, UDT support, Epoch handling - Used by: All packages and applications - Note: CCC now includes UDT and Epoch features contributed by this project's maintainer diff --git a/.planning/codebase/CONCERNS.md b/.planning/codebase/CONCERNS.md index c1ae06f..c616afe 100644 --- a/.planning/codebase/CONCERNS.md +++ b/.planning/codebase/CONCERNS.md @@ -38,28 +38,28 @@ ### Local UDT Handling May Overlap CCC Upstream (Medium) -- Issue: CCC now has a dedicated `@ckb-ccc/udt` package (at `ccc-fork/ccc/packages/udt/`). The local `packages/utils/src/udt.ts` and `packages/core/src/udt.ts` implement custom UDT handling (`UdtHandler` interface, `IckbUdtManager` class). While the local UDT handling is iCKB-specific (custom balance calculation accounting for DAO deposits), the generic UDT operations like `ccc.udtBalanceFrom()` are still being used from CCC upstream in `packages/utils/src/udt.ts` (4 locations). +- Issue: CCC now has a dedicated `@ckb-ccc/udt` package (at `forks/ccc/packages/udt/`). The local `packages/utils/src/udt.ts` and `packages/core/src/udt.ts` implement custom UDT handling (`UdtHandler` interface, `IckbUdtManager` class). 
While the local UDT handling is iCKB-specific (custom balance calculation accounting for DAO deposits), the generic UDT operations like `ccc.udtBalanceFrom()` are still being used from CCC upstream in `packages/utils/src/udt.ts` (4 locations). - Files: - `packages/utils/src/udt.ts` - `UdtHandler` interface, `UdtManager` class (~370 lines) - `packages/core/src/udt.ts` - `IckbUdtManager` extending UDT handling for iCKB-specific logic - - `ccc-fork/ccc/packages/udt/src/` - CCC upstream UDT package + - `forks/ccc/packages/udt/src/` - CCC upstream UDT package - Usage of `ccc.udtBalanceFrom()`: `packages/utils/src/udt.ts` lines 169, 197, 323, 368 - Impact: There may be duplicated utility code for standard UDT operations (finding cells, calculating balances). The iCKB-specific extensions (e.g., `IckbUdtManager` which modifies balance calculations based on DAO deposit/withdrawal state) are domain-specific and unlikely to be in CCC. - Fix approach: Audit the CCC `@ckb-ccc/udt` package to identify which local utilities can be replaced. Keep iCKB-specific extensions but delegate standard UDT operations (cell finding, basic balance) to CCC where possible. ### Fragile CCC Local Override Mechanism (Medium) -- Issue: The `.pnpmfile.cjs` hook and `fork-scripts/record.sh` script create a fragile mechanism for overriding published CCC packages with local builds. The `.pnpmfile.cjs` `readPackage` hook intercepts pnpm's dependency resolution to redirect `@ckb-ccc/*` packages to local paths under `ccc-fork/ccc/packages/*/`. +- Issue: The `.pnpmfile.cjs` hook and `forks/forker/record.sh` script create a fragile mechanism for overriding published CCC packages with local builds. The `.pnpmfile.cjs` `readPackage` hook intercepts pnpm's dependency resolution to redirect `@ckb-ccc/*` packages to local paths under `forks/ccc/packages/*/`. 
- Files: - `.pnpmfile.cjs` - pnpm hook that overrides `@ckb-ccc/*` package resolutions - - `fork-scripts/record.sh` - generic fork record script (clones repo, merges refs, builds locally) - - `ccc-fork/config.json` - CCC fork configuration (upstream URL, refs, workspace config) - - `pnpm-workspace.yaml` - includes `ccc-fork/ccc/packages/*` in workspace (auto-generated section) - - `ccc-fork/ccc/` - local CCC checkout (when present) + - `forks/forker/record.sh` - generic fork record script (clones repo, merges refs, builds locally) + - `forks/config.json` - CCC fork configuration (upstream URL, refs, workspace config) + - `pnpm-workspace.yaml` - includes `forks/ccc/packages/*` in workspace (auto-generated section) + - `forks/ccc/` - local CCC checkout (when present) - Impact: Multiple fragility points: - 1. The local CCC repo at `ccc-fork/ccc/` must be manually cloned and kept in sync with a specific branch/commit. + 1. The local CCC repo at `forks/ccc/` must be manually cloned and kept in sync with a specific branch/commit. 2. The `readPackage` hook modifies `dependencies` objects at install time, which can silently break if CCC reorganizes its packages. - 3. CI/CD (`fork-scripts/replay.sh`) must run this setup before `pnpm install`, creating an ordering dependency. + 3. CI/CD (`forks/forker/replay.sh`) must run this setup before `pnpm install`, creating an ordering dependency. 4. The override mechanism is invisible to developers who don't read `.pnpmfile.cjs`, leading to confusion when packages resolve differently than expected from `package.json`. - Fix approach: Now that UDT and Epoch PRs have been merged into CCC upstream, evaluate whether the local overrides are still needed. If CCC publishes releases containing the merged features, switch to published versions and remove the override mechanism. @@ -154,7 +154,7 @@ - Current capacity: Maximum 64 output cells per transaction containing NervosDAO operations. - Limit: Enforced by the NervosDAO script itself. 
Consolidated into CCC core in Phase 1. - Files: - - `ccc-fork/ccc/packages/core/src/ckb/transaction.ts` — `assertDaoOutputLimit` utility + `completeFee` safety net (contributed to CCC core in Phase 1) + - `forks/ccc/packages/core/src/ckb/transaction.ts` — `assertDaoOutputLimit` utility + `completeFee` safety net (contributed to CCC core in Phase 1) - `packages/dao/src/dao.ts` — calls `assertDaoOutputLimit` - `packages/core/src/owned_owner.ts` — calls `assertDaoOutputLimit` - `apps/bot/src/index.ts`, line 414 (limits to 58 outputs to reserve 6 for change) @@ -217,19 +217,19 @@ ### TS Exchange Rate Must Match Rust Contract Logic -- What's not tested: The TypeScript exchange rate calculation (`packages/core/src/udt.ts`) must produce identical results to the Rust contract's `deposit_to_ickb()` function (`reference/contracts/scripts/contracts/ickb_logic/src/entry.rs`). Any discrepancy would cause transactions to be rejected on-chain. +- What's not tested: The TypeScript exchange rate calculation (`packages/core/src/udt.ts`) must produce identical results to the Rust contract's `deposit_to_ickb()` function (`forks/contracts/scripts/contracts/ickb_logic/src/entry.rs`). Any discrepancy would cause transactions to be rejected on-chain. 
- Key formula: `iCKB = capacity * AR_0 / AR_m` with soft cap penalty `amount - (amount - 100000) / 10` when `amount > ICKB_SOFT_CAP_PER_DEPOSIT` - Contract constants that TS must match: - `CKB_MINIMUM_UNOCCUPIED_CAPACITY_PER_DEPOSIT = 1,000 * 100_000_000` (1,000 CKB) - `CKB_MAXIMUM_UNOCCUPIED_CAPACITY_PER_DEPOSIT = 1,000,000 * 100_000_000` (1,000,000 CKB) - `ICKB_SOFT_CAP_PER_DEPOSIT = 100,000 * 100_000_000` (100,000 iCKB) - `GENESIS_ACCUMULATED_RATE = 10_000_000_000_000_000` (10^16) -- Reference: `reference/contracts/scripts/contracts/ickb_logic/src/entry.rs` function `deposit_to_ickb()` +- Reference: `forks/contracts/scripts/contracts/ickb_logic/src/entry.rs` function `deposit_to_ickb()` - Fix approach: Add cross-validation tests with known inputs/outputs derived from the Rust contract logic ### TS Molecule Codecs Must Match Contract Schemas -- What's not tested: The TypeScript Molecule codec definitions (`@ccc.codec` decorators in `packages/order/src/entities.ts`, `packages/core/src/entities.ts`) must produce byte-identical encodings to the Molecule schema at `reference/contracts/schemas/encoding.mol`. Field order, sizes, and endianness must match exactly. +- What's not tested: The TypeScript Molecule codec definitions (`@ccc.codec` decorators in `packages/order/src/entities.ts`, `packages/core/src/entities.ts`) must produce byte-identical encodings to the Molecule schema at `forks/contracts/schemas/encoding.mol`. Field order, sizes, and endianness must match exactly. - Key schemas: - `ReceiptData { deposit_quantity: Uint32, deposit_amount: Uint64 }` = 12 bytes - `OwnedOwnerData { owned_distance: Int32 }` = 4 bytes diff --git a/.planning/codebase/CONVENTIONS.md b/.planning/codebase/CONVENTIONS.md index 1f7795f..36d2ceb 100644 --- a/.planning/codebase/CONVENTIONS.md +++ b/.planning/codebase/CONVENTIONS.md @@ -13,7 +13,7 @@ - All `@ckb-lumos/*` packages are **DEPRECATED** -- Lumos is being replaced by CCC. 
- CCC PRs for UDT and Epochs have been **MERGED** upstream -- those features now exist in CCC itself. - `SmartTransaction` was **DELETED** in Phase 1 in favor of CCC's client cache for header caching. Headers are now fetched inline via CCC client calls (`client.getTransactionWithHeader()`, `client.getHeaderByNumber()`). All manager method signatures now accept `ccc.TransactionLike` and return `ccc.Transaction` directly. -- CCC is sometimes overridden locally via `fork-scripts/record.sh` and `.pnpmfile.cjs` for testing unpublished changes. +- CCC is sometimes overridden locally via `bash forks/forker/record.sh` and `.pnpmfile.cjs` for testing unpublished changes. **When writing new code:** Use CCC (`@ckb-ccc/core`) types and patterns exclusively in `packages/`. Never introduce new Lumos dependencies. @@ -270,7 +270,7 @@ export * from "./utils.js"; ## Molecule / Codec Patterns -**TS codecs must match the Molecule schema** at `reference/contracts/schemas/encoding.mol`. The on-chain contracts use Molecule for serialization; the TS packages must produce byte-identical encodings. +**TS codecs must match the Molecule schema** at `forks/contracts/schemas/encoding.mol`. The on-chain contracts use Molecule for serialization; the TS packages must produce byte-identical encodings. **Entity classes** use CCC's `ccc.Entity.Base` with decorator-based codec definition: ```typescript diff --git a/.planning/codebase/INTEGRATIONS.md b/.planning/codebase/INTEGRATIONS.md index f5e2795..1c068b6 100644 --- a/.planning/codebase/INTEGRATIONS.md +++ b/.planning/codebase/INTEGRATIONS.md @@ -104,7 +104,7 @@ All interaction with the Nervos CKB Layer 1 blockchain happens via JSON-RPC 2.0. ## Smart Contracts (On-Chain Scripts) -The project interacts with several on-chain CKB scripts defined in `packages/sdk/src/constants.ts`. The Rust source code is available in the `reference/contracts/` reference repo (clone via `pnpm reference`). 
Protocol design is documented in the `reference/whitepaper/` reference repo. +The project interacts with several on-chain CKB scripts defined in `packages/sdk/src/constants.ts`. The Rust source code is available in the `forks/contracts/` reference repo (auto-cloned via `pnpm install`). Protocol design is documented in the `forks/whitepaper/` reference repo. **NervosDAO:** - Code hash: `0x82d76d1b75fe2fd9a27dfbaa65a039221a380d76c926f378d3f81cf3e7e13f2e` @@ -129,7 +129,7 @@ The project interacts with several on-chain CKB scripts defined in `packages/sdk - Hash type: `data1` - Purpose: Core iCKB deposit/receipt logic (type script) - Managed by: `LogicManager` in `packages/core/src/logic.ts` -- Contract source: `reference/contracts/scripts/contracts/ickb_logic/` +- Contract source: `forks/contracts/scripts/contracts/ickb_logic/` - Validation rules: - Empty args required (prevents reuse with different configurations) - Cell classification: Deposit (iCKB lock + DAO type), Receipt (any lock + iCKB type), UDT (any lock + xUDT type) @@ -146,7 +146,7 @@ The project interacts with several on-chain CKB scripts defined in `packages/sdk - Hash type: `data1` - Purpose: Withdrawal ownership tracking (lock script) - Managed by: `OwnedOwnerManager` in `packages/core/src/owned_owner.ts` -- Contract source: `reference/contracts/scripts/contracts/owned_owner/` +- Contract source: `forks/contracts/scripts/contracts/owned_owner/` - Design: Solves NervosDAO constraint that deposit lock and withdrawal lock must have equal size - Mechanism: Owner cell (type=owned_owner) contains `owned_distance: i32 LE` (4 bytes) pointing to its paired owned cell (lock=owned_owner) - Validation rules: @@ -161,7 +161,7 @@ The project interacts with several on-chain CKB scripts defined in `packages/sdk - Hash type: `data1` - Purpose: On-chain limit orders for CKB/UDT exchange (lock script) - Managed by: `OrderManager` in `packages/order/src/order.ts` -- Contract source: 
`reference/contracts/scripts/contracts/limit_order/` +- Contract source: `forks/contracts/scripts/contracts/limit_order/` - Lifecycle: Mint (create order + master cell) -> Match (partial/full fill) -> Melt (destroy fulfilled order) - Order cell data layout (88-89 bytes): - `[0:16]` UDT amount (u128 LE) @@ -184,7 +184,7 @@ The project interacts with several on-chain CKB scripts defined in `packages/sdk - Cannot modify already-fulfilled orders - Error codes: NotEmptyArgs(5), DuplicatedMaster(6), InvalidAction(7), NonZeroPadding(8), InvalidRatio(9), InvalidCkbMinMatchLog(10), ConcaveRatio(11), BothRatioNull(12), MissingUdtType(13), SameMaster(14), ScriptMisuse(15), DifferentInfo(16), InvalidMatch(17), DecreasingValue(18), AttemptToChangeFulfilled(19), InsufficientMatch(20), InvalidConfiguration(21) -**Molecule Schema (`reference/contracts/schemas/encoding.mol`):** +**Molecule Schema (`forks/contracts/schemas/encoding.mol`):** ```molecule struct ReceiptData { deposit_quantity: Uint32, deposit_amount: Uint64 } struct OwnedOwnerData { owned_distance: Int32 } diff --git a/.planning/codebase/STACK.md b/.planning/codebase/STACK.md index 64cf396..711fcfc 100644 --- a/.planning/codebase/STACK.md +++ b/.planning/codebase/STACK.md @@ -8,10 +8,10 @@ - TypeScript 5.9.3 - All source code across packages and apps **On-Chain (reference):** -- Rust 2021 edition - On-chain CKB smart contracts in `reference/contracts/` reference repo (3 contracts + shared utils, ~1,163 lines). Built with Capsule v0.10.5, `no_std` + alloc-only runtime, targeting RISC-V. Uses `ckb-std 0.15.3` and `primitive_types` crate for C256 safe math. +- Rust 2021 edition - On-chain CKB smart contracts in `forks/contracts/` reference repo (3 contracts + shared utils, ~1,163 lines). Built with Capsule v0.10.5, `no_std` + alloc-only runtime, targeting RISC-V. Uses `ckb-std 0.15.3` and `primitive_types` crate for C256 safe math. 
**Secondary:** -- Bash - `fork-scripts/record.sh`, `fork-scripts/replay.sh` for local fork dev build setup +- Bash - `forks/forker/record.sh`, `forks/forker/replay.sh` for local fork dev build setup - JavaScript (CJS) - `.pnpmfile.cjs` for pnpm hook overrides, `prettier.config.cjs` ## Runtime @@ -73,15 +73,15 @@ packages: - packages/* - apps/* - # @generated begin fork-workspaces — auto-generated by fork-scripts/record.sh - - ccc-fork/ccc/packages/* - - "!ccc-fork/ccc/packages/demo" - - "!ccc-fork/ccc/packages/docs" - - "!ccc-fork/ccc/packages/examples" - - "!ccc-fork/ccc/packages/faucet" - - "!ccc-fork/ccc/packages/playground" - - "!ccc-fork/ccc/packages/tests" - # @generated end fork-workspaces + # @generated begin forker-workspaces — auto-generated by forks/forker/record.sh + - forks/ccc/packages/* + - "!forks/ccc/packages/demo" + - "!forks/ccc/packages/docs" + - "!forks/ccc/packages/examples" + - "!forks/ccc/packages/faucet" + - "!forks/ccc/packages/playground" + - "!forks/ccc/packages/tests" + # @generated end forker-workspaces catalog: '@ckb-ccc/core': ^1.12.2 @@ -90,7 +90,7 @@ catalog: minimumReleaseAge: 1440 ``` -**Note:** The `fork-workspaces` section between `@generated` markers is auto-generated by `pnpm fork:record` from `*-fork/config.json` files. Manual edits to that section are overwritten on re-record. +**Note:** The `forker-workspaces` section between `@generated` markers is auto-generated by `bash forks/forker/record.sh` from `forks/config.json`. Manual edits to that section are overwritten on re-record. **Internal dependency graph (new CCC-based packages):** ``` @@ -107,19 +107,19 @@ minimumReleaseAge: 1440 ## Local CCC Dev Build Override Mechanism -The repo supports using a local development build of CCC for testing unpublished upstream changes. 
This is controlled by the generic fork management framework (`fork-scripts/`) and the CCC-specific configuration (`ccc-fork/config.json`): +The repo supports using a local development build of CCC for testing unpublished upstream changes. This is controlled by the generic fork management framework (`forks/forker/`) and the CCC-specific configuration (`forks/config.json`): -**`fork-scripts/record.sh ccc-fork`:** -- Clones the CCC repo (upstream URL from `ccc-fork/config.json`) into `./ccc-fork/ccc/` -- Merges refs listed in `ccc-fork/config.json` onto a `wip` branch (uses AI Coworker CLI for merge conflict resolution) -- Patches CCC exports for source-level types via `fork-scripts/patch.sh` -- Run via: `pnpm fork:record ccc-fork` -- The `ccc-fork/ccc/` directory is gitignored -- Aborts if `ccc-fork/ccc/` has pending work (any changes vs pinned commit, diverged HEAD, or untracked files) +**`bash forks/forker/record.sh ccc`:** +- Clones the CCC repo (upstream URL from `forks/config.json`) into `./forks/ccc/` +- Merges refs listed in `forks/config.json` onto a `wip` branch (uses AI Coworker CLI for merge conflict resolution) +- Patches CCC exports for source-level types via `forks/forker/patch.sh` +- Run via: `bash forks/forker/record.sh ccc` +- The `forks/ccc/` directory is gitignored +- Aborts if `forks/ccc/` has pending work (any changes vs pinned commit, diverged HEAD, or untracked files) **`.pnpmfile.cjs`:** -- A pnpm `readPackage` hook that auto-discovers all packages in `ccc-fork/ccc/packages/*/package.json` -- When `ccc-fork/ccc/` exists, overrides all `@ckb-ccc/*` dependency versions in the workspace with `workspace:*` (CCC packages are listed in `pnpm-workspace.yaml`, but catalog specifiers resolve to semver ranges before workspace linking, so the hook forces `workspace:*` to ensure local packages are used) +- A pnpm `readPackage` hook that auto-discovers all packages in `forks/ccc/packages/*/package.json` +- When `forks/ccc/` exists, overrides all 
`@ckb-ccc/*` dependency versions in the workspace with `workspace:*` (CCC packages are listed in `pnpm-workspace.yaml`, but catalog specifiers resolve to semver ranges before workspace linking, so the hook forces `workspace:*` to ensure local packages are used) - Applies to `dependencies`, `devDependencies`, and `optionalDependencies` - Effect: all workspace packages transparently use the local CCC build instead of npm versions diff --git a/.planning/codebase/STRUCTURE.md b/.planning/codebase/STRUCTURE.md index 8a7b47b..ddf5b9d 100644 --- a/.planning/codebase/STRUCTURE.md +++ b/.planning/codebase/STRUCTURE.md @@ -67,39 +67,20 @@ │ │ ├── utils.ts # Helper utilities (160 lines) │ │ └── vite-env.d.ts # Vite type definitions │ └── public/ # Static assets -├── fork-scripts/ # Generic fork management scripts (accept fork dir as argument) -│ ├── lib.sh # Shared shell functions for fork scripts -│ ├── patch.sh # Rewrites fork exports to .ts source, creates deterministic commit -│ ├── push.sh # Pushes local fork changes upstream -│ ├── record.sh # Records new pins with AI Coworker conflict resolution -│ ├── replay.sh # Deterministically rebuilds clone from pins -│ ├── replay-all.sh # Replays all *-fork/ directories -│ ├── save.sh # Captures local work as a patch in pins/ -│ ├── status.sh # Checks if fork clone has pending custom work -│ ├── status-all.sh # Checks status of all *-fork/ directories -│ ├── clean.sh # Removes fork clone (guarded against pending work) -│ ├── clean-all.sh # Cleans all *-fork/ directories -│ ├── reset.sh # Resets fork to published packages (guarded) -│ └── tsgo-filter.sh # Wrapper around tsgo filtering fork diagnostics -├── ccc-fork/ # CCC fork configuration and pins -│ ├── config.json # Upstream URL, fork URL, refs to merge, workspace config -│ ├── pins/ # Committed: manifest + counted resolutions + local patches -│ │ ├── HEAD # Expected final SHA after full replay -│ │ ├── manifest # Base SHA + merge refs (TSV, one per line) -│ │ ├── 
res-N.resolution # Conflict resolution for merge step N (counted format) -│ │ └── local-*.patch # Local development patches (applied after merges) -│ ├── ccc/ # Gitignored: ephemeral CCC clone (auto-generated) -│ └── README.md # Fork documentation -├── reference/ # Read-only reference repos (git-ignored, clone via `pnpm reference`) -│ ├── contracts/ # Rust on-chain contracts -│ │ └── scripts/contracts/ -│ │ ├── ickb_logic/ # Type script: iCKB UDT minting/validation -│ │ ├── limit_order/ # Lock script: peer-to-peer limit orders -│ │ ├── owned_owner/ # Lock script: DAO withdrawal delegation -│ │ └── utils/ # Shared: DAO helpers, C256 safe math, MetaPoint -│ └── whitepaper/ # iCKB protocol design -│ ├── README.md # Complete protocol specification (~49KB) -│ └── 2024_overview.md # Project overview and timeline +├── tsgo-filter.sh # Wrapper around tsgo filtering fork diagnostics +├── forks/ # Unified fork management directory +│ ├── .gitignore # Track only .pin/ and config.json +│ ├── config.json # Unified config, all entries keyed by name +│ ├── .pin/ # Committed: computed state per entry +│ │ └── ccc/ +│ │ ├── HEAD # Expected final SHA after full replay +│ │ ├── manifest # Base SHA + merge refs (TSV, one per line) +│ │ ├── res-N.resolution # Conflict resolution for merge step N (counted format) +│ │ └── local-*.patch # Local development patches (applied after merges) +│ ├── forker/ # Gitignored: fork management tool (self-hosting clone) +│ ├── ccc/ # Gitignored: CCC fork clone (auto-replayed) +│ ├── contracts/ # Gitignored: reference clone (Rust on-chain contracts) +│ └── whitepaper/ # Gitignored: reference clone (iCKB protocol design) ├── .planning/ # GSD analysis documents │ └── codebase/ │ ├── ARCHITECTURE.md # Architecture and data flows @@ -109,9 +90,6 @@ │ ├── CONCERNS.md # Technical debt and issues │ ├── STACK.md # Technology stack │ └── INTEGRATIONS.md # External services and APIs -├── scripts/ # Developer scripts -│ ├── pr.sh # Open GitHub PR creation page -│ 
└── review.sh # Fetch PR review comments ├── .github/ # GitHub configuration │ └── workflows/ # CI/CD pipeline definitions ├── .devcontainer/ # Dev container configuration @@ -201,17 +179,14 @@ - Data flow: React Query for L1 state, @ickb/v1-core for TX building - Styling: TailwindCSS with inline classes -**fork-scripts/:** -- Purpose: Generic fork management scripts for record/replay mechanism -- All scripts accept a fork directory as their first argument (e.g., `fork-scripts/record.sh ccc-fork`) -- Called via pnpm aliases: `pnpm fork:record ccc-fork`, `pnpm fork:status ccc-fork`, etc. - -**ccc-fork/:** -- Purpose: CCC fork configuration and pins for local development against unpublished upstream changes -- `config.json`: Upstream URL, fork URL, refs to merge, workspace include/exclude -- `pins/`: Committed directory with manifest + counted resolutions + local patches (multi-file format) -- `ccc/`: Gitignored, ephemeral clone auto-generated by `fork-scripts/replay.sh` -- Activation: `.pnpmfile.cjs` auto-triggers replay on `pnpm install` and redirects @ckb-ccc/* deps +**forks/:** +- Purpose: Unified fork management directory (managed forks and reference-only clones) +- `config.json`: Single source of truth for all entries, keyed by name +- `.pin/<entry>/`: Committed pin state per entry (manifest + counted resolutions + local patches) +- `forker/`: Gitignored self-hosting clone of the fork management tool +- `ccc/`: Gitignored CCC fork clone, auto-replayed from `.pin/ccc/` on `pnpm install` +- `contracts/`, `whitepaper/`: Gitignored reference clones, shallow-cloned on `pnpm install` +- Activation: `.pnpmfile.cjs` bootstraps forker, replays pins, and overrides @ckb-ccc/* deps **.planning/codebase/:** - Purpose: GSD codebase analysis documents @@ -332,13 +307,13 @@ **Dependencies:** - Internal package: `"@ickb/package": "workspace:*"` in package.json -- Internal CCC (local dev): Automatic via `.pnpmfile.cjs` override when `ccc-fork/ccc/` exists +- Internal CCC (local dev): 
Automatic via `.pnpmfile.cjs` override when `forks/ccc/` exists - External package: `pnpm add @vendor/package` from workspace root - Catalog versions: Reference via `"@vendor/package": "catalog:"` in pnpm-workspace.yaml -**reference/contracts/ (reference repo):** +**forks/contracts/ (reference entry):** - Purpose: Rust on-chain smart contracts for the iCKB protocol (3 production contracts + shared utils) -- Cloned via: `pnpm reference` (git-ignored, read-only reference) +- Auto-cloned via `pnpm install` (git-ignored, shallow clone) - Key paths: - `scripts/contracts/ickb_logic/` - Type script: iCKB UDT minting, deposit/receipt validation, conservation law - `scripts/contracts/limit_order/` - Lock script: peer-to-peer limit order matching (mint/match/melt lifecycle) @@ -349,9 +324,9 @@ - Build: Capsule v0.10.5, Rust 2021, `no_std` + alloc-only, RISC-V target - Audit: Scalebit (2024-09-11) -**reference/whitepaper/ (reference repo):** +**forks/whitepaper/ (reference entry):** - Purpose: iCKB protocol design specification -- Cloned via: `pnpm reference` (git-ignored, read-only reference) +- Auto-cloned via `pnpm install` (git-ignored, shallow clone) - Key files: - `README.md` (~49KB) - Complete protocol specification: deposit/withdrawal phases, exchange rate mechanics, soft cap penalty, pooled deposit model, ancillary scripts (owned owner, limit order), deployment details, attack mitigations - `2024_overview.md` - Project timeline and milestones @@ -359,25 +334,25 @@ ## Special Directories -**fork-scripts/:** +**forks/forker/:** - Purpose: Generic fork management framework for deterministic, conflict-free builds - System: Record/replay mechanism using pins (manifest + counted resolutions + local patches) -- All scripts accept a `-fork` directory as their first argument -- Commands (using `ccc-fork` as example): - - Record: `pnpm fork:record ccc-fork` (requires AI Coworker CLI) - - Status: `pnpm fork:status ccc-fork` (check for pending work in clone) - - Save: `pnpm 
fork:save ccc-fork [description]` (capture local work as patch in pins/) - - Push: `pnpm fork:push ccc-fork` (cherry-pick commits onto a PR branch) - - Rebuild: `pnpm install` (automatic when pins/ exists but clone does not) - - Clean (re-replay): `pnpm fork:clean ccc-fork && pnpm install` (guarded) - - Reset (published): `pnpm fork:reset ccc-fork && pnpm install` (guarded) - -**ccc-fork/:** -- Purpose: CCC fork configuration for local development against unpublished upstream changes -- `config.json`: Upstream/fork URLs, merge refs, workspace include/exclude -- `pins/`: Committed merge instructions and conflict resolutions -- `ccc/`: Generated from pins; auto-deleted and rebuilt on `pnpm install` -- Activation: `.pnpmfile.cjs` hook triggers `fork-scripts/replay.sh` and overrides package resolution +- All scripts accept an entry name as their first argument (e.g., `ccc`) +- Commands (using `ccc` as example): + - Record: `bash forks/forker/record.sh ccc` (requires AI Coworker CLI) + - Status: `bash forks/forker/status.sh ccc` (check for pending work in clone) + - Save: `bash forks/forker/save.sh ccc [description]` (capture local work as patch in .pin/) + - Push: `bash forks/forker/push.sh ccc` (cherry-pick commits onto a PR branch) + - Rebuild: `pnpm install` (automatic when .pin/ exists but clone does not) + - Clean (re-replay): `bash forks/forker/clean.sh ccc && pnpm install` (guarded) + - Reset (published): `bash forks/forker/reset.sh ccc && pnpm install` (guarded) + +**forks/ccc/:** +- Purpose: CCC fork clone for local development against unpublished upstream changes +- Configuration: `forks/config.json` (unified config, entry keyed by `ccc`) +- Pin state: `forks/.pin/ccc/` (committed manifest + counted resolutions + local patches) +- Clone: `forks/ccc/` (gitignored, generated from pins; auto-replayed on `pnpm install`) +- Activation: `.pnpmfile.cjs` hook triggers `forks/forker/replay.sh` and overrides package resolution **node_modules/:** - Purpose: Installed 
npm/pnpm dependencies diff --git a/.planning/codebase/TESTING.md b/.planning/codebase/TESTING.md index 23da58b..5acab03 100644 --- a/.planning/codebase/TESTING.md +++ b/.planning/codebase/TESTING.md @@ -265,9 +265,9 @@ pnpm test:cov # Generates V8 coverage report **Contract-Alignment Tests (critical):** - Scope: Verify TS logic produces identical results to Rust contract validation - Priority targets: - 1. Exchange rate: `iCKB = capacity * AR_0 / AR_m` with soft cap penalty -- must match `reference/contracts/scripts/contracts/ickb_logic/src/entry.rs` `deposit_to_ickb()` - 2. Molecule encoding: `ReceiptData`, `OwnedOwnerData`, `Ratio`, `OrderInfo`, `MintOrderData`, `MatchOrderData` -- must match `reference/contracts/schemas/encoding.mol` - 3. Order value conservation: `in_ckb * ckb_mul + in_udt * udt_mul <= out_ckb * ckb_mul + out_udt * udt_mul` -- must match `reference/contracts/scripts/contracts/limit_order/src/entry.rs` `validate()` + 1. Exchange rate: `iCKB = capacity * AR_0 / AR_m` with soft cap penalty -- must match `forks/contracts/scripts/contracts/ickb_logic/src/entry.rs` `deposit_to_ickb()` + 2. Molecule encoding: `ReceiptData`, `OwnedOwnerData`, `Ratio`, `OrderInfo`, `MintOrderData`, `MatchOrderData` -- must match `forks/contracts/schemas/encoding.mol` + 3. Order value conservation: `in_ckb * ckb_mul + in_udt * udt_mul <= out_ckb * ckb_mul + out_udt * udt_mul` -- must match `forks/contracts/scripts/contracts/limit_order/src/entry.rs` `validate()` 4. Concavity check: `c2u.ckb_mul * u2c.udt_mul >= c2u.udt_mul * u2c.ckb_mul` -- must match limit_order contract 5. Deposit size bounds: min 1,000 CKB, max 1,000,000 CKB unoccupied capacity 6. 
Owned owner distance calculation: TS MetaPoint arithmetic must match contract's `extract_owned_metapoint()` diff --git a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-PLAN.md b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-PLAN.md index 0193523..962b7f3 100644 --- a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-PLAN.md +++ b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-PLAN.md @@ -5,10 +5,10 @@ type: execute wave: 1 depends_on: [] files_modified: - - ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts - - ccc-fork/ccc/packages/core/src/ckb/transaction.ts - - ccc-fork/ccc/packages/core/src/ckb/index.ts - - ccc-fork/pins/ + - forks/ccc/packages/core/src/ckb/transactionErrors.ts + - forks/ccc/packages/core/src/ckb/transaction.ts + - forks/ccc/packages/core/src/ckb/index.ts + - forks/.pin/ccc/ - packages/dao/src/dao.ts - packages/core/src/logic.ts - packages/core/src/owned_owner.ts @@ -24,23 +24,23 @@ must_haves: - "The `ErrorNervosDaoOutputLimit` error class exists in CCC `transactionErrors.ts` with count and limit fields" - "`pnpm check:full` passes after DAO check consolidation" artifacts: - - path: "ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts" + - path: "forks/ccc/packages/core/src/ckb/transactionErrors.ts" provides: "ErrorNervosDaoOutputLimit error class" contains: "ErrorNervosDaoOutputLimit" - - path: "ccc-fork/ccc/packages/core/src/ckb/transaction.ts" + - path: "forks/ccc/packages/core/src/ckb/transaction.ts" provides: "assertDaoOutputLimit utility function and completeFee safety net" contains: "assertDaoOutputLimit" key_links: - from: "packages/dao/src/dao.ts" - to: "ccc-fork/ccc/packages/core/src/ckb/transaction.ts" + to: "forks/ccc/packages/core/src/ckb/transaction.ts" via: "import and call assertDaoOutputLimit" pattern: "assertDaoOutputLimit" - from: "packages/core/src/logic.ts" - to: "ccc-fork/ccc/packages/core/src/ckb/transaction.ts" + to: 
"forks/ccc/packages/core/src/ckb/transaction.ts" via: "import and call assertDaoOutputLimit" pattern: "assertDaoOutputLimit" - from: "packages/core/src/owned_owner.ts" - to: "ccc-fork/ccc/packages/core/src/ckb/transaction.ts" + to: "forks/ccc/packages/core/src/ckb/transaction.ts" via: "import and call assertDaoOutputLimit" pattern: "assertDaoOutputLimit" --- @@ -50,7 +50,7 @@ Build the 64-output NervosDAO limit check as a CCC core utility and replace all Purpose: Consolidate duplicated DAO output limit logic into a single source of truth in CCC core, preparing for SmartTransaction deletion by removing one of its responsibilities. This is the first feature-slice step (purely additive, nothing breaks). -Output: `ErrorNervosDaoOutputLimit` error class + `assertDaoOutputLimit` utility in CCC core; all iCKB packages calling the utility instead of inline checks; ccc-fork pins recorded. +Output: `ErrorNervosDaoOutputLimit` error class + `assertDaoOutputLimit` utility in CCC core; all iCKB packages calling the utility instead of inline checks; forks/ccc pins recorded. 
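For orientation while reading these plan hunks: the DAO-limit utility the plan builds can be sketched in isolation. This is a simplified, hypothetical stand-in, not the actual CCC implementation — the real utility is async, resolves the NervosDAO script via `client.getKnownScript(KnownScript.NervosDao)`, compares with full `Script.eq()`, and (per the summary below) inspects inputs as well as outputs. Here a plain hash string and an outputs-only check illustrate the shape; the `count`/`limit` error fields and the `<= 64` early return are taken from the plan text.

```typescript
// Simplified sketch (NOT the actual CCC code): a hash string stands in for a
// full ccc.Script, and only outputs are checked.
const DAO_OUTPUT_LIMIT = 64;

class ErrorNervosDaoOutputLimit extends Error {
  constructor(
    public readonly count: number,
    public readonly limit: number = DAO_OUTPUT_LIMIT,
  ) {
    super(`${count} NervosDAO outputs exceed the limit of ${limit}`);
  }
}

interface OutputSketch {
  typeHash?: string; // stand-in for the output's type script
}

function assertDaoOutputLimit(
  outputs: OutputSketch[],
  daoTypeHash: string,
): void {
  // Early return when outputs.length <= 64: zero cost in the common case.
  if (outputs.length <= DAO_OUTPUT_LIMIT) return;
  const count = outputs.filter((o) => o.typeHash === daoTypeHash).length;
  if (count > DAO_OUTPUT_LIMIT) throw new ErrorNervosDaoOutputLimit(count);
}
```

Note how a transaction with more than 64 outputs still passes as long as at most 64 of them carry the DAO type script — the limit is on DAO cells, not on outputs overall.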
@@ -67,9 +67,9 @@ Output: `ErrorNervosDaoOutputLimit` error class + `assertDaoOutputLimit` utility @.planning/codebase/ARCHITECTURE.md Key source files to read: -@ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts (error class patterns) -@ccc-fork/ccc/packages/core/src/ckb/transaction.ts (completeFee method, where safety net goes) -@ccc-fork/ccc/packages/core/src/ckb/index.ts (barrel exports) +@forks/ccc/packages/core/src/ckb/transactionErrors.ts (error class patterns) +@forks/ccc/packages/core/src/ckb/transaction.ts (completeFee method, where safety net goes) +@forks/ccc/packages/core/src/ckb/index.ts (barrel exports) @packages/utils/src/transaction.ts (SmartTransaction.completeFee with DAO check at lines 85-95) @packages/dao/src/dao.ts (DaoManager with 3 DAO checks at lines 100, 174, 245) @packages/core/src/logic.ts (LogicManager.deposit with DAO check at line 106) @@ -81,14 +81,14 @@ Key source files to read: Task 1: Build ErrorNervosDaoOutputLimit and assertDaoOutputLimit in CCC core - ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts - ccc-fork/ccc/packages/core/src/ckb/transaction.ts - ccc-fork/ccc/packages/core/src/ckb/index.ts + forks/ccc/packages/core/src/ckb/transactionErrors.ts + forks/ccc/packages/core/src/ckb/transaction.ts + forks/ccc/packages/core/src/ckb/index.ts **Step 1: Add error class to transactionErrors.ts** -Read `ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts`. Add `ErrorNervosDaoOutputLimit` following the existing `ErrorTransactionInsufficientCapacity` pattern: +Read `forks/ccc/packages/core/src/ckb/transactionErrors.ts`. Add `ErrorNervosDaoOutputLimit` following the existing `ErrorTransactionInsufficientCapacity` pattern: ```typescript export class ErrorNervosDaoOutputLimit extends Error { @@ -107,7 +107,7 @@ export class ErrorNervosDaoOutputLimit extends Error { **Step 2: Add assertDaoOutputLimit to transaction.ts** -Read `ccc-fork/ccc/packages/core/src/ckb/transaction.ts`. 
Add the standalone utility function as a module-level export (NOT a method on Transaction). Place it after the Transaction class definition. Import `ErrorNervosDaoOutputLimit` from `./transactionErrors.js`, `KnownScript` from the appropriate module, and `Script` if not already imported: +Read `forks/ccc/packages/core/src/ckb/transaction.ts`. Add the standalone utility function as a module-level export (NOT a method on Transaction). Place it after the Transaction class definition. Import `ErrorNervosDaoOutputLimit` from `./transactionErrors.js`, `KnownScript` from the appropriate module, and `Script` if not already imported: ```typescript /** @@ -153,20 +153,20 @@ In the `completeFee` method of the Transaction class (around line 2183), add a c **Step 4: Export from barrel** -Read `ccc-fork/ccc/packages/core/src/ckb/index.ts`. Add exports for `ErrorNervosDaoOutputLimit` (from `./transactionErrors.js`) and `assertDaoOutputLimit` (from `./transaction.js`). Follow existing export patterns in the file. +Read `forks/ccc/packages/core/src/ckb/index.ts`. Add exports for `ErrorNervosDaoOutputLimit` (from `./transactionErrors.js`) and `assertDaoOutputLimit` (from `./transaction.js`). Follow existing export patterns in the file. -**Step 5: Record ccc-fork pins** +**Step 5: Record forks/ccc pins** -Run: `pnpm fork:record` +Run: `bash forks/forker/record.sh ccc` -This updates `ccc-fork/pins/` to reflect the new CCC state. +This updates `forks/.pin/ccc/` to reflect the new CCC state. -Verify: `pnpm fork:status` should exit 0 (no pending work). +Verify: `bash forks/forker/status.sh ccc` should exit 0 (no pending work). -1. `pnpm fork:status` exits 0 -2. `grep -r "ErrorNervosDaoOutputLimit" ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts` finds the class -3. `grep -r "assertDaoOutputLimit" ccc-fork/ccc/packages/core/src/ckb/transaction.ts` finds the function +1. `bash forks/forker/status.sh ccc` exits 0 +2. 
`grep -r "ErrorNervosDaoOutputLimit" forks/ccc/packages/core/src/ckb/transactionErrors.ts` finds the class +3. `grep -r "assertDaoOutputLimit" forks/ccc/packages/core/src/ckb/transaction.ts` finds the function 4. `pnpm check:full` passes (purely additive change, nothing should break) @@ -243,9 +243,9 @@ All 7 inline DAO output checks are removed. Every location now calls `await asse 1. `grep -rn "outputs.length > 64" packages/` -- should return 0 results -2. `grep -rn "assertDaoOutputLimit" packages/ ccc-fork/ccc/packages/core/src/ckb/` -- should show usage in 4 iCKB files + definition in CCC -3. `grep -rn "ErrorNervosDaoOutputLimit" ccc-fork/ccc/packages/core/src/ckb/` -- should show class definition + export -4. `pnpm fork:status` -- should exit 0 +2. `grep -rn "assertDaoOutputLimit" packages/ forks/ccc/packages/core/src/ckb/` -- should show usage in 4 iCKB files + definition in CCC +3. `grep -rn "ErrorNervosDaoOutputLimit" forks/ccc/packages/core/src/ckb/` -- should show class definition + export +4. `bash forks/forker/status.sh ccc` -- should exit 0 5. `pnpm check:full` -- should pass clean @@ -255,7 +255,7 @@ All 7 inline DAO output checks are removed. 
Every location now calls `await asse - completeFee has DAO safety net - All 7 scattered inline checks replaced with centralized utility calls - No intermediate broken build state -- ccc-fork pins recorded +- forks/ccc pins recorded diff --git a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-SUMMARY.md b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-SUMMARY.md index 72b8256..96c6573 100644 --- a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-SUMMARY.md +++ b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-01-SUMMARY.md @@ -10,34 +10,34 @@ provides: - ErrorNervosDaoOutputLimit error class in CCC core - assertDaoOutputLimit centralized utility function in CCC core - completeFee safety net for DAO output limit in CCC core - - ccc-fork local patch mechanism for deterministic builds + - forks/ccc local patch mechanism for deterministic builds affects: [01-02, 01-03, 01-04] # Tech tracking tech-stack: added: [] - patterns: [centralized-dao-limit-check, ccc-fork-local-patches] + patterns: [centralized-dao-limit-check, forks/ccc-local-patches] key-files: created: - - ccc-fork/pins/local-001-dao-output-limit.patch (now part of pins/ multi-file format) + - forks/.pin/ccc/local-001-dao-output-limit.patch (now part of .pin/ multi-file format) modified: - - ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts - - ccc-fork/ccc/packages/core/src/ckb/transaction.ts - - ccc-fork/record.sh - - ccc-fork/replay.sh + - forks/ccc/packages/core/src/ckb/transactionErrors.ts + - forks/ccc/packages/core/src/ckb/transaction.ts + - forks/forker/record.sh + - forks/forker/replay.sh - packages/dao/src/dao.ts - packages/core/src/logic.ts - packages/core/src/owned_owner.ts - packages/utils/src/transaction.ts key-decisions: - - "Added ccc-fork local patch mechanism (pins/local-*.patch) to support deterministic replay of CCC source modifications" + - "Added forks/ccc local patch mechanism (.pin/ccc/local-*.patch) to support deterministic 
replay of CCC source modifications" - "Moved client parameter before optional options in DaoManager.requestWithdrawal and DaoManager.withdraw signatures" - "assertDaoOutputLimit uses early return when outputs <= 64 for zero-cost in common case" patterns-established: - - "Local CCC patches: pins/local-*.patch files applied after standard merge+patch cycle" + - "Local CCC patches: .pin/ccc/local-*.patch files applied after standard merge+patch cycle" - "DAO output limit: always use ccc.assertDaoOutputLimit(tx, client) instead of inline checks" requirements-completed: [SMTX-06] @@ -64,7 +64,7 @@ completed: 2026-02-22 - Built assertDaoOutputLimit utility function that checks both inputs and outputs for DAO type script using full Script.eq() comparison - Added completeFee safety net in CCC Transaction class (both return paths) - Replaced all 7 inline DAO output checks across 4 files with centralized utility calls -- Added local patch mechanism to ccc-fork record/replay for deterministic builds of CCC modifications +- Added local patch mechanism to forks/ccc record/replay for deterministic builds of CCC modifications ## Task Commits @@ -74,18 +74,18 @@ Each task was committed atomically: 2. 
**Task 2: Replace all 7 scattered DAO checks with assertDaoOutputLimit calls** - `2decd06` (refactor) ## Files Created/Modified -- `ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts` - ErrorNervosDaoOutputLimit error class -- `ccc-fork/ccc/packages/core/src/ckb/transaction.ts` - assertDaoOutputLimit utility + completeFee safety net -- `ccc-fork/pins/` - Updated pins for deterministic replay -- `ccc-fork/record.sh` - Added local patch preservation and application -- `ccc-fork/replay.sh` - Added local patch application after standard merge+patch +- `forks/ccc/packages/core/src/ckb/transactionErrors.ts` - ErrorNervosDaoOutputLimit error class +- `forks/ccc/packages/core/src/ckb/transaction.ts` - assertDaoOutputLimit utility + completeFee safety net +- `forks/.pin/ccc/` - Updated pins for deterministic replay +- `forks/forker/record.sh` - Added local patch preservation and application +- `forks/forker/replay.sh` - Added local patch application after standard merge+patch - `packages/dao/src/dao.ts` - DaoManager.deposit/requestWithdrawal/withdraw now async with client param - `packages/core/src/logic.ts` - LogicManager.deposit now async with client param - `packages/core/src/owned_owner.ts` - OwnedOwnerManager.requestWithdrawal/withdraw now async with client param - `packages/utils/src/transaction.ts` - SmartTransaction.completeFee DAO check replaced ## Decisions Made -- Added ccc-fork local patch mechanism (pins/local-*.patch) because the existing record/replay infrastructure had no way to persist source-level CCC modifications through the clean/replay cycle. This was a necessary blocking-issue fix (Rule 3). +- Added forks/ccc local patch mechanism (.pin/ccc/local-*.patch) because the existing record/replay infrastructure had no way to persist source-level CCC modifications through the clean/replay cycle. This was a necessary blocking-issue fix (Rule 3). 
- Placed `client: ccc.Client` parameter before optional `options` parameters in DaoManager signatures for cleaner API design (required params before optional). - assertDaoOutputLimit uses early return when `outputs.length <= 64` so the common-case path has zero async overhead. @@ -93,11 +93,11 @@ Each task was committed atomically: ### Auto-fixed Issues -**1. [Rule 3 - Blocking] Added ccc-fork local patch mechanism** +**1. [Rule 3 - Blocking] Added forks/ccc local patch mechanism** - **Found during:** Task 1 (CCC core changes) -- **Issue:** ccc-fork record/replay infrastructure had no way to persist local CCC source modifications. Running `pnpm fork:record` or `pnpm check:full` would wipe changes because replay clones fresh from upstream. -- **Fix:** Added `pins/local-*.patch` mechanism. Modified `record.sh` to preserve local patches during re-recording and apply them after standard merge+patch. Modified `replay.sh` to apply local patches after standard replay. Both use deterministic git identity/timestamps for reproducible HEAD SHAs. -- **Files modified:** `ccc-fork/record.sh`, `ccc-fork/replay.sh`, `ccc-fork/pins/local-001-dao-output-limit.patch` +- **Issue:** forks/ccc record/replay infrastructure had no way to persist local CCC source modifications. Running `bash forks/forker/record.sh ccc` or `pnpm check:full` would wipe changes because replay clones fresh from upstream. +- **Fix:** Added `.pin//local-*.patch` mechanism. Modified `record.sh` to preserve local patches during re-recording and apply them after standard merge+patch. Modified `replay.sh` to apply local patches after standard replay. Both use deterministic git identity/timestamps for reproducible HEAD SHAs. 
+- **Files modified:** `forks/forker/record.sh`, `forks/forker/replay.sh`, `forks/.pin/ccc/local-001-dao-output-limit.patch` - **Verification:** `pnpm check:full` passes (clean wipe + replay + build cycle) - **Committed in:** 7081869 (Task 1 commit) @@ -119,7 +119,7 @@ None - no external service configuration required. ## Self-Check: PASSED -- FOUND: ccc-fork/pins/ (local patch integrated into multi-file format) +- FOUND: forks/.pin/ccc/ (local patch integrated into multi-file format) - FOUND: 01-01-SUMMARY.md - FOUND: commit 7081869 (Task 1) - FOUND: commit 2decd06 (Task 2) diff --git a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-CONTEXT.md b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-CONTEXT.md index 5589ebd..98c0dcc 100644 --- a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-CONTEXT.md +++ b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-CONTEXT.md @@ -13,9 +13,9 @@ Delete SmartTransaction class and its infrastructure across all packages; contri ## Implementation Decisions -### CCC DAO Contribution (via ccc-fork/) +### CCC DAO Contribution (via forks/ccc/) - Build the 64-output NervosDAO limit check **in CCC core**, not in @ickb/utils -- Develop in `ccc-fork/ccc/`, record pins, use immediately via workspace override while waiting for upstream merge +- Develop in `forks/ccc/`, record pins, use immediately via workspace override while waiting for upstream merge - **Submit the upstream CCC PR during Phase 1 execution** - CCC PR includes three components: 1. 
**`completeFee()` safety net** — async check using `client.getKnownScript(KnownScript.NervosDao)` with full `Script.eq()` comparison diff --git a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-RESEARCH.md b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-RESEARCH.md index fe713b1..f4e5b64 100644 --- a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-RESEARCH.md +++ b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-RESEARCH.md @@ -6,7 +6,7 @@ ## Summary -Phase 1 removes `SmartTransaction`, `CapacityManager`, `getHeader()`/`HeaderKey`, and 7 scattered 64-output DAO limit checks. It contributes the DAO check to CCC core via `ccc-fork/`, updates all manager method signatures across all 5 library packages from `SmartTransaction` to `ccc.TransactionLike`, and keeps the build green after every step. +Phase 1 removes `SmartTransaction`, `CapacityManager`, `getHeader()`/`HeaderKey`, and 7 scattered 64-output DAO limit checks. It contributes the DAO check to CCC core via `forks/ccc/`, updates all manager method signatures across all 5 library packages from `SmartTransaction` to `ccc.TransactionLike`, and keeps the build green after every step. The codebase is well-structured: SmartTransaction has exactly 9 consumer files across 5 packages; `getHeader` has 5 standalone call sites plus 4 instance method call sites; the 64-output DAO check appears in 7 locations across 4 files. CCC's native `Transaction` already handles DAO profit in `getInputsCapacity()` via `getInputsCapacityExtra()` -> `CellInput.getExtraCapacity()` -> `Cell.getDaoProfit()`, making SmartTransaction's `getInputsCapacity` override redundant. CCC's `Transaction.from(txLike)` provides the `TransactionLike` -> `Transaction` entry-point conversion pattern that all updated method signatures will follow. 
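The `Transaction.from(txLike)` entry-point convention the research summary keeps referring to can be sketched standalone. All names below are illustrative mocks, not CCC's real classes — the actual pattern is `ccc.Transaction.from(txLike)` at the top of each manager method, combined with the `.some()` dedup before `headerDeps.push()` quoted later in this document.

```typescript
// Illustrative mock of the "TransactionLike in, Transaction out" convention.
// TxLike/Tx stand in for ccc.TransactionLike / ccc.Transaction.
interface TxLike {
  headerDeps?: string[];
}

class Tx {
  headerDeps: string[] = [];

  // from() normalizes: an existing Tx passes through, a plain object is lifted.
  static from(txLike: TxLike | Tx): Tx {
    if (txLike instanceof Tx) return txLike;
    const tx = new Tx();
    tx.headerDeps = [...(txLike.headerDeps ?? [])];
    return tx;
  }
}

// Manager methods convert once at the entry point and return the Tx.
function addHeaderDep(txLike: TxLike, hash: string): Tx {
  const tx = Tx.from(txLike);
  // Dedup before pushing, mirroring the headerDeps pattern in dao.ts/logic.ts.
  if (!tx.headerDeps.some((h) => h === hash)) {
    tx.headerDeps.push(hash);
  }
  return tx;
}
```

The design point: callers may hand in plain object literals or already-built transactions, and every method pays the conversion cost exactly once, at its boundary.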
@@ -17,7 +17,7 @@ The codebase is well-structured: SmartTransaction has exactly 9 consumer files a ### Locked Decisions - Build the 64-output NervosDAO limit check **in CCC core**, not in @ickb/utils -- Develop in `ccc-fork/ccc/`, record pins, use immediately via workspace override while waiting for upstream merge +- Develop in `forks/ccc/`, record pins, use immediately via workspace override while waiting for upstream merge - **Submit the upstream CCC PR during Phase 1 execution** - CCC PR includes three components: 1. **`completeFee()` safety net** -- async check using `client.getKnownScript(KnownScript.NervosDao)` with full `Script.eq()` comparison @@ -76,14 +76,14 @@ The codebase is well-structured: SmartTransaction has exactly 9 consumer files a |---------|---------|---------|--------------| | `@ckb-ccc/core` | catalog: (^1.12.2) | CKB blockchain SDK | Project's core dependency; `Transaction`, `TransactionLike`, `Client`, `Script.eq()` | | TypeScript | ^5.9.3 (strict mode) | Type safety | `noUncheckedIndexedAccess`, `verbatimModuleSyntax`, `noImplicitOverride` | -| tsgo | native-preview | Type checking | Used via `ccc-fork/tsgo-filter.sh` when CCC is cloned | +| tsgo | native-preview | Type checking | Used via `tsgo-filter.sh` (repo root) when CCC is cloned | | vitest | ^3.2.4 | Testing | CCC's test framework; tests for the CCC PR | | pnpm | 10.30.1 | Package management | Workspace protocol, catalog specifiers | ### Supporting | Library | Version | Purpose | When to Use | |---------|---------|---------|-------------| -| `ccc-fork/` system | local | Local CCC development | Building/testing CCC DAO contribution before upstream merge | +| `forks/ccc/` system | local | Local CCC development | Building/testing CCC DAO contribution before upstream merge | | `@changesets/cli` | ^2.29.8 | Versioning | After API changes, run `pnpm changeset` | ### Alternatives Considered @@ -115,7 +115,7 @@ packages/ ├── sdk/src/ │ ├── sdk.ts # IckbSdk (SmartTransaction -> 
TransactionLike, CapacityManager removed) │ └── constants.ts # getConfig (CapacityManager removed) -ccc-fork/ccc/packages/core/src/ckb/ +forks/ccc/packages/core/src/ckb/ ├── transactionErrors.ts # + ErrorNervosDaoOutputLimit (new) └── transaction.ts # + completeFee safety net + assertDaoOutputLimit (new) ``` @@ -182,7 +182,7 @@ if (!tx.headerDeps.some((h) => h === hash)) { **When to use:** The new `ErrorNervosDaoOutputLimit` **Example:** ```typescript -// Source: ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts +// Source: forks/ccc/packages/core/src/ckb/transactionErrors.ts // Follow the ErrorTransactionInsufficientCapacity pattern: export class ErrorNervosDaoOutputLimit extends Error { public readonly count: number; @@ -310,17 +310,17 @@ tx.addCellDeps(this.udtHandler.cellDeps); **How to avoid:** Replace `tx.getHeader(client, { type: "txHash", value: outPoint.txHash })` with inlined CCC client calls. The headerDeps validation from the old `getHeader` instance method was a runtime check that headers were pre-populated -- after removal, the client call fetches headers directly. **Warning signs:** TypeScript error `Property 'getHeader' does not exist on type 'Transaction'`. -### Pitfall 8: ccc-fork pins must be recorded after CCC changes -**What goes wrong:** Making changes to `ccc-fork/ccc/` without running `pnpm fork:record` means the pins don't reflect the new state. -**Why it happens:** `ccc-fork/pins/` contains an integrity check. If ccc code changes but pins don't update, replay won't reproduce the same state. -**How to avoid:** After developing the DAO utility in `ccc-fork/ccc/`, run `pnpm fork:record` to update pins. Check `pnpm fork:status` to verify. -**Warning signs:** `pnpm fork:status` reports exit code 1 (pending work). +### Pitfall 8: forks/ccc pins must be recorded after CCC changes +**What goes wrong:** Making changes to `forks/ccc/` without running `bash forks/forker/record.sh ccc` means the pins don't reflect the new state. 
+**Why it happens:** `forks/.pin/ccc/` contains an integrity check. If ccc code changes but pins don't update, replay won't reproduce the same state. +**How to avoid:** After developing the DAO utility in `forks/ccc/`, run `bash forks/forker/record.sh ccc` to update pins. Check `bash forks/forker/status.sh ccc` to verify. +**Warning signs:** `bash forks/forker/status.sh ccc` reports exit code 1 (pending work). ## Code Examples ### Complete DAO Check Replacement Pattern ```typescript -// Source: Verified from ccc-fork/ccc/packages/core/src/ckb/transaction.ts +// Source: Verified from forks/ccc/packages/core/src/ckb/transaction.ts // and packages/dao/src/dao.ts // Before (scattered in 7 locations): @@ -392,7 +392,7 @@ for (const lock of unique(this.bots)) { ### CCC Vitest Test Pattern ```typescript -// Source: ccc-fork/ccc/packages/core/src/ckb/transaction.test.ts +// Source: forks/ccc/packages/core/src/ckb/transaction.test.ts import { beforeEach, describe, expect, it, vi } from "vitest"; import { ccc } from "../index.js"; @@ -547,7 +547,7 @@ This interface must be moved to a surviving file before `transaction.ts` is dele ### Primary (HIGH confidence) - Codebase source files in `/workspaces/stack/packages/` -- all SmartTransaction consumers, getHeader call sites, DAO checks inventoried directly -- CCC source in `/workspaces/stack/ccc-fork/ccc/packages/core/src/` -- Transaction class, TransactionLike type, error patterns, completeFee implementation, getInputsCapacity, test patterns +- CCC source in `/workspaces/stack/forks/ccc/packages/core/src/` -- Transaction class, TransactionLike type, error patterns, completeFee implementation, getInputsCapacity, test patterns - `.planning/phases/01-ickb-utils-smarttransaction-removal/01-CONTEXT.md` -- User decisions and constraints - `.planning/REQUIREMENTS.md` -- Requirement definitions and traceability diff --git a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-VERIFICATION.md 
b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-VERIFICATION.md index 3a874d1..1b8a54c 100644 --- a/.planning/phases/01-ickb-utils-smarttransaction-removal/01-VERIFICATION.md +++ b/.planning/phases/01-ickb-utils-smarttransaction-removal/01-VERIFICATION.md @@ -23,7 +23,7 @@ re_verification: false | 1 | `SmartTransaction` class and `CapacityManager` class no longer exist in `@ickb/utils` source or exports | VERIFIED | `packages/utils/src/transaction.ts` and `packages/utils/src/capacity.ts` are deleted; `packages/utils/src/index.ts` exports only `codec.js`, `heap.js`, `udt.js`, `utils.js`; `grep SmartTransaction packages/ apps/` returns zero results | | 2 | `UdtHandler` interface and `UdtManager` class remain in `@ickb/utils` with method signatures updated from `SmartTransaction` to `ccc.TransactionLike` | VERIFIED | `packages/utils/src/udt.ts` exports both `UdtHandler` interface and `UdtManager` class; all methods accept `txLike: ccc.TransactionLike` and convert with `ccc.Transaction.from(txLike)` at entry | | 3 | `getHeader()` function and `HeaderKey` type are removed from `@ickb/utils`; all call sites inline CCC client calls; `SmartTransaction.addHeaders()` call sites push to `tx.headerDeps` directly | VERIFIED | `grep getHeader packages/utils/src/` returns zero results; `grep HeaderKey packages/` returns zero results; `grep addHeaders packages/` returns zero results; all 7 call sites replaced with `client.getTransactionWithHeader()` / `client.getHeaderByNumber()` with null-check-and-throw; 3 `headerDeps.push()` sites with `.some()` dedup in `dao/dao.ts` and `core/logic.ts` | -| 4 | A 64-output NervosDAO limit check exists in CCC core: `completeFee()` safety net, standalone async utility, and `ErrorNervosDaoOutputLimit` error class; all 6+ scattered checks replaced | VERIFIED | `ErrorNervosDaoOutputLimit` in `ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts` with `count` and `limit` fields; `assertDaoOutputLimit` exported from 
`ccc-fork/ccc/packages/core/src/ckb/transaction.ts`; called at lines 2257 and 2285 in `completeFee`; called in `packages/dao/src/dao.ts` (3×), `packages/core/src/logic.ts` (1×), `packages/core/src/owned_owner.ts` (2×); `grep "outputs.length > 64" packages/` returns zero results | +| 4 | A 64-output NervosDAO limit check exists in CCC core: `completeFee()` safety net, standalone async utility, and `ErrorNervosDaoOutputLimit` error class; all 6+ scattered checks replaced | VERIFIED | `ErrorNervosDaoOutputLimit` in `forks/ccc/packages/core/src/ckb/transactionErrors.ts` with `count` and `limit` fields; `assertDaoOutputLimit` exported from `forks/ccc/packages/core/src/ckb/transaction.ts`; called at lines 2257 and 2285 in `completeFee`; called in `packages/dao/src/dao.ts` (3×), `packages/core/src/logic.ts` (1×), `packages/core/src/owned_owner.ts` (2×); `grep "outputs.length > 64" packages/` returns zero results | | 5 | ALL manager method signatures across ALL 5 library packages accept `ccc.TransactionLike` instead of `SmartTransaction`, following CCC's convention (TransactionLike input, Transaction output with `Transaction.from()` conversion at entry point) | VERIFIED | `txLike: ccc.TransactionLike` present in dao, core/logic, core/owned_owner, core/udt, order, sdk, and utils/udt; `ccc.Transaction.from(txLike)` at entry in all 15 confirmed conversion points; `return tx;` present at all method exits across dao, core, order, sdk; `addUdtHandlers` fully removed, replaced with `tx.addCellDeps(this.udtHandler.cellDeps)` at 7 sites | | 6 | `pnpm check` passes after each feature-slice removal step — no intermediate broken states | VERIFIED | All 5 plans committed atomically with individual task commits (7081869, 2decd06, 85ead3a, 2e832ae, de8f4a7); `pnpm check` passes on current state (confirmed by build execution: all 5 packages compile clean) | @@ -33,8 +33,8 @@ re_verification: false | Artifact | Expected | Status | Details | |----------|----------|--------|---------| -| 
`ccc-fork/ccc/packages/core/src/ckb/transactionErrors.ts` | `ErrorNervosDaoOutputLimit` error class with `count` and `limit` fields | VERIFIED | Class exists, `public readonly count: number` and `public readonly limit: number` confirmed | -| `ccc-fork/ccc/packages/core/src/ckb/transaction.ts` | `assertDaoOutputLimit` utility + `completeFee` safety net | VERIFIED | Function at line 2465, called in `completeFee` at lines 2257 and 2285 | +| `forks/ccc/packages/core/src/ckb/transactionErrors.ts` | `ErrorNervosDaoOutputLimit` error class with `count` and `limit` fields | VERIFIED | Class exists, `public readonly count: number` and `public readonly limit: number` confirmed | +| `forks/ccc/packages/core/src/ckb/transaction.ts` | `assertDaoOutputLimit` utility + `completeFee` safety net | VERIFIED | Function at line 2465, called in `completeFee` at lines 2257 and 2285 | | `packages/utils/src/utils.ts` | `TransactionHeader` type preserved; `getHeader` and `HeaderKey` absent | VERIFIED | `TransactionHeader` interface at line 19; no `getHeader` function or `HeaderKey` type found | | `packages/utils/src/index.ts` | Barrel exports without `transaction.js` or `capacity.js` | VERIFIED | Exports only `codec.js`, `heap.js`, `udt.js`, `utils.js` | | `packages/utils/src/udt.ts` | `UdtHandler` interface and `UdtManager` class with `TransactionLike` signatures | VERIFIED | Both present; all methods accept `txLike: ccc.TransactionLike` | @@ -48,7 +48,7 @@ re_verification: false | `packages/sdk/src/sdk.ts` | TransactionLike signatures + findCellsOnChain (replacing CapacityManager) | VERIFIED | 2× `txLike: ccc.TransactionLike`; `findCellsOnChain` at line 373 with `scriptLenRange` filter; `getTransactionWithHeader` with null check at line 401 | | `packages/utils/src/transaction.ts` | DELETED | VERIFIED | File does not exist | | `packages/utils/src/capacity.ts` | DELETED | VERIFIED | File does not exist | -| `ccc-fork/pins/` | Local patches for deterministic CCC replay | VERIFIED | Pins 
directory with multi-file format (manifest + resolutions + patches) | +| `forks/.pin/ccc/` | Local patches for deterministic CCC replay | VERIFIED | Pins directory with multi-file format (manifest + resolutions + patches) | ### Key Link Verification (from PLAN frontmatter) diff --git a/.planning/phases/02-ccc-utility-adoption/02-RESEARCH.md b/.planning/phases/02-ccc-utility-adoption/02-RESEARCH.md index d266ab5..08801cb 100644 --- a/.planning/phases/02-ccc-utility-adoption/02-RESEARCH.md +++ b/.planning/phases/02-ccc-utility-adoption/02-RESEARCH.md @@ -6,7 +6,7 @@ ## Summary -Phase 2 replaces five local utility functions in `@ickb/utils` (`max`, `min`, `gcd`, `isHex`, `hexFrom`) with their CCC equivalents, then deletes the local implementations. The CCC equivalents (`ccc.numMax`, `ccc.numMin`, `ccc.gcd`, `ccc.isHex`, `ccc.numToHex`, `ccc.hexFrom`) are all verified to exist in the CCC core barrel at `@ckb-ccc/core` (verified against `ccc-fork/ccc/packages/core/src/`). +Phase 2 replaces five local utility functions in `@ickb/utils` (`max`, `min`, `gcd`, `isHex`, `hexFrom`) with their CCC equivalents, then deletes the local implementations. The CCC equivalents (`ccc.numMax`, `ccc.numMin`, `ccc.gcd`, `ccc.isHex`, `ccc.numToHex`, `ccc.hexFrom`) are all verified to exist in the CCC core barrel at `@ckb-ccc/core` (verified against `forks/ccc/packages/core/src/`). The main complexity is that the replacements are not all 1:1 drop-ins. The local `max()`/`min()` is generic `` and both current call sites pass `number` (not `bigint`), while `ccc.numMax()`/`ccc.numMin()` return `bigint`. The local `hexFrom()` accepts `bigint | Entity | BytesLike`, while CCC's `hexFrom()` only accepts `HexLike` (= `BytesLike`). All external `hexFrom` call sites pass `ccc.Entity` instances, which have a `.toHex()` method that produces the same result. The `gcd` and `isHex` replacements are straightforward. Seven iCKB-unique utilities are confirmed to have no CCC equivalents and remain unchanged. 
@@ -48,7 +48,7 @@ No additional libraries needed. This phase only rearranges existing imports. **When to use:** Anywhere the local `hexFrom(entity)` was used with a `ccc.Entity` argument. **Example:** ```typescript -// Source: ccc-fork/ccc/packages/core/src/codec/entity.ts:135-137 +// Source: forks/ccc/packages/core/src/codec/entity.ts:135-137 // Before (local hexFrom): const key = hexFrom(cell.cellOutput.lock); @@ -61,7 +61,7 @@ const key = cell.cellOutput.lock.toHex(); **When to use:** For `gcd`, where the CCC equivalent is a direct function call. **Example:** ```typescript -// Source: ccc-fork/ccc/packages/core/src/utils/index.ts:276-285 +// Source: forks/ccc/packages/core/src/utils/index.ts:276-285 // Before: import { gcd } from "@ickb/utils"; const g = gcd(aScale, bScale); @@ -149,7 +149,7 @@ return Math.ceil(Math.log2(1 + Math.max(1, ...bins))); ## Code Examples -Verified patterns from CCC source (`ccc-fork/ccc/packages/core/src/`): +Verified patterns from CCC source (`forks/ccc/packages/core/src/`): ### numMax / numMin (from num/index.ts:30-62) ```typescript @@ -263,11 +263,11 @@ const hex2 = outPoint.toHex(); // OutPoint -> Hex ## Sources ### Primary (HIGH confidence) -- `ccc-fork/ccc/packages/core/src/num/index.ts` -- `numMax`, `numMin`, `numFrom`, `numToHex` signatures and implementations -- `ccc-fork/ccc/packages/core/src/utils/index.ts` -- `gcd` signature and implementation -- `ccc-fork/ccc/packages/core/src/hex/index.ts` -- `isHex`, `hexFrom` signatures and implementations -- `ccc-fork/ccc/packages/core/src/codec/entity.ts` -- `Entity.toHex()` method -- `ccc-fork/ccc/packages/core/src/barrel.ts` -- confirms all functions exported via CCC barrel +- `forks/ccc/packages/core/src/num/index.ts` -- `numMax`, `numMin`, `numFrom`, `numToHex` signatures and implementations +- `forks/ccc/packages/core/src/utils/index.ts` -- `gcd` signature and implementation +- `forks/ccc/packages/core/src/hex/index.ts` -- `isHex`, `hexFrom` signatures and implementations +- 
`forks/ccc/packages/core/src/codec/entity.ts` -- `Entity.toHex()` method +- `forks/ccc/packages/core/src/barrel.ts` -- confirms all functions exported via CCC barrel - `packages/utils/src/utils.ts` -- local implementations being replaced - All call sites verified via ripgrep across `packages/` and `apps/` diff --git a/.planning/phases/03-ccc-udt-integration-investigation/03-01-INVESTIGATION.md b/.planning/phases/03-ccc-udt-integration-investigation/03-01-INVESTIGATION.md index b9859f9..2141f01 100644 --- a/.planning/phases/03-ccc-udt-integration-investigation/03-01-INVESTIGATION.md +++ b/.planning/phases/03-ccc-udt-integration-investigation/03-01-INVESTIGATION.md @@ -1,14 +1,14 @@ # Phase 3 Plan 1: CCC Udt Integration Investigation **Investigated:** 2026-02-24 -**Source base:** ccc-fork/ccc (local fork with PR #328 integrated) +**Source base:** forks/ccc (local fork with PR #328 integrated) **Purpose:** Trace CCC Udt class internals end-to-end, verify infoFrom override feasibility, resolve all open questions from 03-RESEARCH.md ## CCC Udt Method Chain Trace ### Udt Constructor -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:412-425` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:412-425` ```typescript constructor( @@ -35,7 +35,7 @@ constructor( ### infoFrom (Override Target) -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:624-641` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:624-641` ```typescript async infoFrom( @@ -70,7 +70,7 @@ async infoFrom( ### isUdt -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:1063-1069` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:1063-1069` ```typescript isUdt(cellLike: ccc.CellAnyLike) { @@ -90,7 +90,7 @@ isUdt(cellLike: ccc.CellAnyLike) { ### balanceFromUnsafe -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:590-593` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:590-593` ```typescript static balanceFromUnsafe(outputData: ccc.HexLike): ccc.Num { @@ -106,7 +106,7 @@ static 
balanceFromUnsafe(outputData: ccc.HexLike): ccc.Num { ### getInputsInfo -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:1099-1108` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:1099-1108` ```typescript async getInputsInfo( @@ -130,7 +130,7 @@ async getInputsInfo( ### getOutputsInfo -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:1178-1184` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:1178-1184` ```typescript async getOutputsInfo( @@ -150,7 +150,7 @@ async getOutputsInfo( ### completeInputsByBalance -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:1394-1446` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:1394-1446` ```typescript async completeInputsByBalance( @@ -203,7 +203,7 @@ async completeInputsByBalance( ### completeInputs (Low-Level) -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:1309-1331` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:1309-1331` ```typescript async completeInputs( @@ -224,7 +224,7 @@ async completeInputs( ### CellAnyLike type -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:313-318` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:313-318` ```typescript export type CellAnyLike = { @@ -239,7 +239,7 @@ export type CellAnyLike = { ### CellAny class -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:331-457` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:331-457` ```typescript export class CellAny { @@ -260,7 +260,7 @@ export class CellAny { ### CellAny.from() factory -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:374-386` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:374-386` ```typescript static from(cell: CellAnyLike): CellAny { @@ -278,7 +278,7 @@ static from(cell: CellAnyLike): CellAny { ### CellAny.capacityFree -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:404-405` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:404-405` ```typescript get capacityFree() { @@ -290,7 +290,7 @@ get 
capacityFree() { ### Cell class (extends CellAny) -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:488-503` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:488-503` ```typescript export class Cell extends CellAny { @@ -309,7 +309,7 @@ export class Cell extends CellAny { ### CellInput.getCell() -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:861-872` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:861-872` ```typescript async getCell(client: Client): Promise<Cell> { @@ -329,7 +329,7 @@ async getCell(client: Client): Promise<Cell> { ### tx.outputCells getter -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:1715-1728` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:1715-1728` ```typescript get outputCells(): Iterable<CellAny> { @@ -362,7 +362,7 @@ get outputCells(): Iterable<CellAny> { ### UdtInfo structure -**File:** `ccc-fork/ccc/packages/udt/src/udt/index.ts:218-292` +**File:** `forks/ccc/packages/udt/src/udt/index.ts:218-292` ```typescript export class UdtInfo { @@ -408,7 +408,7 @@ Key difference: `addAssign` mutates in place and returns `this`.
The override mu ### client.getTransactionWithHeader() -**File:** `ccc-fork/ccc/packages/core/src/client/client.ts:631-661` +**File:** `forks/ccc/packages/core/src/client/client.ts:631-661` ```typescript async getTransactionWithHeader( @@ -448,7 +448,7 @@ async getTransactionWithHeader( ### client.getCellWithHeader() -**File:** `ccc-fork/ccc/packages/core/src/client/client.ts:212-234` +**File:** `forks/ccc/packages/core/src/client/client.ts:212-234` ```typescript async getCellWithHeader( @@ -476,7 +476,7 @@ This is transparent to the `infoFrom` override -- just call `client.getTransacti ### FeePayer abstract class -**File:** `ccc-fork/ccc/packages/core/src/signer/feePayer/feePayer.ts:14-72` +**File:** `forks/ccc/packages/core/src/signer/feePayer/feePayer.ts:14-72` ```typescript export abstract class FeePayer { @@ -502,7 +502,7 @@ export abstract class FeePayer { ### Transaction.completeByFeePayer() -**File:** `ccc-fork/ccc/packages/core/src/ckb/transaction.ts:2264-2275` +**File:** `forks/ccc/packages/core/src/ckb/transaction.ts:2264-2275` ```typescript async completeByFeePayer(...feePayers: FeePayer[]): Promise { diff --git a/.planning/phases/03-ccc-udt-integration-investigation/03-01-PLAN.md b/.planning/phases/03-ccc-udt-integration-investigation/03-01-PLAN.md index edc0fb9..16ac974 100644 --- a/.planning/phases/03-ccc-udt-integration-investigation/03-01-PLAN.md +++ b/.planning/phases/03-ccc-udt-integration-investigation/03-01-PLAN.md @@ -23,7 +23,7 @@ must_haves: provides: "Detailed source code trace findings with exact line references and code snippets" contains: "## CCC Udt Method Chain Trace" key_links: - - from: "ccc-fork/ccc/packages/udt/src/udt/index.ts" + - from: "forks/ccc/packages/udt/src/udt/index.ts" to: "packages/core/src/udt.ts" via: "infoFrom override replacing getInputsUdtBalance" pattern: "infoFrom.*CellAnyLike" @@ -58,7 +58,7 @@ Output: `03-01-INVESTIGATION.md` -- structured findings with exact line referenc Read and trace the following CCC source 
files end-to-end to verify research findings: -1. **Udt class and infoFrom** (`ccc-fork/ccc/packages/udt/src/udt/index.ts`): +1. **Udt class and infoFrom** (`forks/ccc/packages/udt/src/udt/index.ts`): - Trace `infoFrom()` signature, return type, how it accumulates `UdtInfo` - Trace `getInputsInfo()` -- how it resolves `CellInput` to `Cell` via `input.getCell(client)`, confirm output `Cell` objects have `outPoint` set - Trace `getOutputsInfo()` -- how it iterates `tx.outputCells`, confirm yielded `CellAny` objects lack `outPoint` @@ -67,23 +67,23 @@ Read and trace the following CCC source files end-to-end to verify research find - Check `isUdt()` method -- how it differs from the current `UdtManager.isUdt()` in `@ickb/utils` - Check `Udt.balanceFromUnsafe()` -- how it reads UDT balance from output data -2. **CellAny and Cell types** (`ccc-fork/ccc/packages/core/src/ckb/transaction.ts`): +2. **CellAny and Cell types** (`forks/ccc/packages/core/src/ckb/transaction.ts`): - Verify `CellAny.outPoint` is `OutPoint | undefined` (not nullable) - Verify `CellAny` vs `Cell` -- confirm `CellAny` has `capacityFree` getter (pre-resolved: transaction.ts:404-405) - Check `Cell.capacityFree` -- exact computation (capacity - occupiedSize) - Check `CellInput.getCell(client)` -- confirm returned Cell has outPoint set - Check `tx.outputCells` getter -- confirm yielded values are `CellAny` without outPoint -3. **UdtInfo type** (`ccc-fork/ccc/packages/udt/src/udt/index.ts` or related): +3. **UdtInfo type** (`forks/ccc/packages/udt/src/udt/index.ts` or related): - Trace `UdtInfo` fields: balance, capacity, count - Trace `UdtInfo.addAssign()` method - Compare with current `IckbUdtManager.getInputsUdtBalance()` return type `[FixedPoint, FixedPoint]` -- map the migration -4. **Header access** (`ccc-fork/ccc/packages/core/src/client/client.ts`): +4. 
**Header access** (`forks/ccc/packages/core/src/client/client.ts`): - Verify `client.getTransactionWithHeader(txHash)` returns `{ transaction, header? }` where header has `.dao.ar` - Confirm CCC Client.cache handles repeated calls transparently -5. **PR #328 compatibility check** (`ccc-fork/ccc/packages/core/src/signer/feePayer/`): +5. **PR #328 compatibility check** (`forks/ccc/packages/core/src/signer/feePayer/`): - Check whether PR #328's FeePayer changes affect `infoFrom` or `getInputsInfo`/`getOutputsInfo` method signatures - Confirm `infoFrom` override is compatible with both current and PR #328 architectures diff --git a/.planning/phases/03-ccc-udt-integration-investigation/03-CONTEXT.md b/.planning/phases/03-ccc-udt-integration-investigation/03-CONTEXT.md index b4c09d4..79a0925 100644 --- a/.planning/phases/03-ccc-udt-integration-investigation/03-CONTEXT.md +++ b/.planning/phases/03-ccc-udt-integration-investigation/03-CONTEXT.md @@ -17,7 +17,7 @@ Assess feasibility of subclassing CCC's `udt.Udt` class for iCKB's multi-represe - CCC alignment is the primary driver — iCKB should feel native to CCC users and benefit from upstream improvements - Upstream CCC PRs are explicitly on the table if CCC's Udt class needs small, targeted changes to accommodate iCKB's multi-representation value - No concern about CCC upgrade risk — if we contribute to CCC's Udt, we co-own the design -- PR #328 (FeePayer abstraction by ashuralyk) is the target architecture — investigation should design around it and identify improvements that would better fit iCKB's needs. Now integrated into `ccc-fork/ccc` (available at `ccc-fork/ccc/packages/core/src/signer/feePayer/`) +- PR #328 (FeePayer abstraction by ashuralyk) is the target architecture — investigation should design around it and identify improvements that would better fit iCKB's needs. 
Now integrated into `forks/ccc` (available at `forks/ccc/packages/core/src/signer/feePayer/`) - Investigation should cover both cell discovery and balance calculation, not just balance - Design upstream: if CCC Udt changes are needed, design them generically as a "composite UDT" pattern that benefits other CKB tokens beyond iCKB @@ -56,7 +56,7 @@ Assess feasibility of subclassing CCC's `udt.Udt` class for iCKB's multi-represe ## Specific Ideas - `infoFrom` can detect input vs output cells via outpoint presence — investigate this as a cleaner override strategy. Note: STACK.md research incorrectly claimed `CellAnyLike` lacks `outPoint`; it actually has `outPoint?: OutPointLike | null`. `getInputsInfo()` passes `Cell` objects (always have outPoint) to `infoFrom()`, while `getOutputsInfo()` passes `CellAny` from `tx.outputCells` (no outPoint). Both override points are viable. -- PR #328's `completeInputs(tx, filter, accumulator)` pattern (now in `ccc-fork/ccc/packages/core/src/signer/feePayer/feePayer.ts`) could be the hook for auto-fetching iCKB receipt/deposit cells during transaction completion. Note: STACK.md research recommended `client.getHeaderByTxHash()` which does not exist in CCC — the correct API is `client.getTransactionWithHeader()` as used in the current codebase. +- PR #328's `completeInputs(tx, filter, accumulator)` pattern (now in `forks/ccc/packages/core/src/signer/feePayer/feePayer.ts`) could be the hook for auto-fetching iCKB receipt/deposit cells during transaction completion. Note: STACK.md research recommended `client.getHeaderByTxHash()` which does not exist in CCC — the correct API is `client.getTransactionWithHeader()` as used in the current codebase. 
- The `ickbValue()` function (core/udt.ts:151) and `convert()` function (core/udt.ts:179) are the core exchange rate calculation — these must work within the Udt override context - Current `IckbUdtManager.getInputsUdtBalance()` (core/udt.ts:66) is the reference implementation for multi-representation balance calculation — three cell types: xUDT cells, receipt cells (type = logicScript), deposit cells (lock = logicScript + isDeposit) diff --git a/.planning/phases/03-ccc-udt-integration-investigation/03-DECISION.md b/.planning/phases/03-ccc-udt-integration-investigation/03-DECISION.md index 9cc5670..94048c4 100644 --- a/.planning/phases/03-ccc-udt-integration-investigation/03-DECISION.md +++ b/.planning/phases/03-ccc-udt-integration-investigation/03-DECISION.md @@ -18,7 +18,7 @@ **Selected: `infoFrom`** (not `getInputsInfo`/`getOutputsInfo`) Rationale: -- `infoFrom` operates at the per-cell level (`ccc-fork/ccc/packages/udt/src/udt/index.ts:624-641`), providing fine-grained control over how each cell contributes to balance +- `infoFrom` operates at the per-cell level (`forks/ccc/packages/udt/src/udt/index.ts:624-641`), providing fine-grained control over how each cell contributes to balance - `getInputsInfo`/`getOutputsInfo` contain input resolution logic (`input.getCell(client)`) and output iteration (`tx.outputCells`) that would need to be duplicated if overridden - `infoFrom` receives a `client: ccc.Client` parameter (unused in base implementation) that the override needs for header fetches - `infoFrom` is async, allowing network calls within the override @@ -28,7 +28,7 @@ Rationale: **1. 
xUDT cells (standard UDT balance)** -- **Identification:** `this.isUdt(cell)` -- checks `cell.cellOutput.type?.eq(this.script)` with full `Script.eq()` (codeHash + hashType + args) and `outputData.length >= 16` bytes (`ccc-fork/ccc/packages/udt/src/udt/index.ts:1063-1069`) +- **Identification:** `this.isUdt(cell)` -- checks `cell.cellOutput.type?.eq(this.script)` with full `Script.eq()` (codeHash + hashType + args) and `outputData.length >= 16` bytes (`forks/ccc/packages/udt/src/udt/index.ts:1063-1069`) - **Balance:** `udt.Udt.balanceFromUnsafe(cell.outputData)` -- reads first 16 bytes as 128-bit LE integer (`index.ts:590-593`). Replaces the manual `ccc.numFromBytes(ccc.bytesFrom(outputData).slice(0, 16))` pattern in current `IckbUdtManager` - **Applies to:** Both input and output cells - **Sign:** Positive @@ -234,7 +234,7 @@ class IckbUdt extends udt.Udt { ``` This code sketch is derived from: -- The base `infoFrom` implementation (`ccc-fork/ccc/packages/udt/src/udt/index.ts:624-641`) +- The base `infoFrom` implementation (`forks/ccc/packages/udt/src/udt/index.ts:624-641`) - The current `IckbUdtManager.getInputsUdtBalance()` logic (`packages/core/src/udt.ts:66-141`) - The line-by-line migration mapping from `03-01-INVESTIGATION.md` diff --git a/.planning/phases/03-ccc-udt-integration-investigation/03-RESEARCH.md b/.planning/phases/03-ccc-udt-integration-investigation/03-RESEARCH.md index 55359f8..41bb6a0 100644 --- a/.planning/phases/03-ccc-udt-integration-investigation/03-RESEARCH.md +++ b/.planning/phases/03-ccc-udt-integration-investigation/03-RESEARCH.md @@ -11,7 +11,7 @@ - CCC alignment is the primary driver -- iCKB should feel native to CCC users and benefit from upstream improvements - Upstream CCC PRs are explicitly on the table if CCC's Udt class needs small, targeted changes to accommodate iCKB's multi-representation value - No concern about CCC upgrade risk -- if we contribute to CCC's Udt, we co-own the design -- PR #328 (FeePayer abstraction by 
ashuralyk) is the target architecture -- investigation should design around it and identify improvements that would better fit iCKB's needs. Now integrated into `ccc-fork/ccc` (available at `ccc-fork/ccc/packages/core/src/signer/feePayer/`) +- PR #328 (FeePayer abstraction by ashuralyk) is the target architecture -- investigation should design around it and identify improvements that would better fit iCKB's needs. Now integrated into `forks/ccc` (available at `forks/ccc/packages/core/src/signer/feePayer/`) - Investigation should cover both cell discovery and balance calculation, not just balance - Design upstream: if CCC Udt changes are needed, design them generically as a "composite UDT" pattern that benefits other CKB tokens beyond iCKB - Leaning toward `IckbUdt extends udt.Udt` -- iCKB is fundamentally a UDT, just with extra cell types carrying value @@ -301,7 +301,7 @@ override async getInputsUdtBalance( ### CCC Udt.infoFrom Base Implementation (Override Target) ```typescript -// Source: ccc-fork/ccc/packages/udt/src/udt/index.ts lines 624-641 +// Source: forks/ccc/packages/udt/src/udt/index.ts lines 624-641 async infoFrom( _client: ccc.Client, cells: ccc.CellAnyLike | ccc.CellAnyLike[], @@ -323,7 +323,7 @@ async infoFrom( ### CCC getInputsInfo Chain (How Input Cells Reach infoFrom) ```typescript -// Source: ccc-fork/ccc/packages/udt/src/udt/index.ts lines 1099-1108 +// Source: forks/ccc/packages/udt/src/udt/index.ts lines 1099-1108 async getInputsInfo(client: ccc.Client, txLike: ccc.TransactionLike): Promise<UdtInfo> { const tx = ccc.Transaction.from(txLike); const inputCells = await Promise.all( @@ -334,7 +334,7 @@ async getInputsInfo(client: ccc.Client, txLike: ccc.TransactionLike): Promise<UdtInfo> { const tx = ccc.Transaction.from(txLike); return this.infoFrom(client, Array.from(tx.outputCells)); @@ -344,7 +344,7 @@ async getOutputsInfo(client: ccc.Client, txLike: ccc.TransactionLike): Promise( tx: Transaction, filter: ClientCollectableSearchKeyFilterLike, @@ -381,13
+381,13 @@ abstract completeInputs( | `ccc.udtBalanceFrom()` (deprecated) | `udt.Udt.balanceFromUnsafe(outputData)` | Current CCC | Old API deprecated, new one in Udt class | | `tx.completeInputsByUdt()` (deprecated) | `udt.completeInputsByBalance(tx, signer)` | Current CCC | Old on Transaction, new on Udt instance | | `tx.getInputsUdtBalance()` / `tx.getOutputsUdtBalance()` (deprecated) | `udt.getInputsInfo(client, tx)` / `udt.getOutputsInfo(client, tx)` | Current CCC | New methods return UdtInfo (balance + capacity + count) | -| PR #328 FeePayer Udt (uses deprecated APIs) | Current CCC Udt (uses `infoFrom`) | Integrated into ccc-fork/ccc | PR #328's Udt is simpler, still uses old deprecated APIs; current CCC Udt is more complete | +| PR #328 FeePayer Udt (uses deprecated APIs) | Current CCC Udt (uses `infoFrom`) | Integrated into forks/ccc | PR #328's Udt is simpler, still uses old deprecated APIs; current CCC Udt is more complete | **Deprecated/outdated:** - `ccc.udtBalanceFrom()`: Replaced by `udt.Udt.balanceFromUnsafe()` - `tx.completeInputsByUdt()`: Replaced by `udt.Udt.completeInputsByBalance()` - `tx.getInputsUdtBalance()` / `tx.getOutputsUdtBalance()`: Replaced by `udt.Udt.getInputsInfo()` / `udt.Udt.getOutputsInfo()` -- PR #328 FeePayer branch's Udt class: Uses deprecated APIs above; the current CCC Udt class (which we work with via ccc-fork) is more advanced +- PR #328 FeePayer branch's Udt class: Uses deprecated APIs above; the current CCC Udt class (which we work with via forks/ccc) is more advanced ## Open Questions @@ -414,14 +414,14 @@ abstract completeInputs( ## Sources ### Primary (HIGH confidence) -- `ccc-fork/ccc/packages/udt/src/udt/index.ts` -- CCC Udt class source, `infoFrom`, `getInputsInfo`, `getOutputsInfo`, `completeInputsByBalance` full implementation -- `ccc-fork/ccc/packages/core/src/ckb/transaction.ts` -- `CellAny`, `CellAnyLike`, `Cell`, `CellInput.getCell()`, `outputCells` getter -- `ccc-fork/ccc/packages/core/src/client/client.ts` 
-- `getTransactionWithHeader()`, `getCellWithHeader()` implementations +- `forks/ccc/packages/udt/src/udt/index.ts` -- CCC Udt class source, `infoFrom`, `getInputsInfo`, `getOutputsInfo`, `completeInputsByBalance` full implementation +- `forks/ccc/packages/core/src/ckb/transaction.ts` -- `CellAny`, `CellAnyLike`, `Cell`, `CellInput.getCell()`, `outputCells` getter +- `forks/ccc/packages/core/src/client/client.ts` -- `getTransactionWithHeader()`, `getCellWithHeader()` implementations - `packages/core/src/udt.ts` -- Current `IckbUdtManager`, `ickbValue()`, `convert()`, `ickbExchangeRatio()` - `packages/utils/src/udt.ts` -- Current `UdtHandler` interface, `UdtManager` base class - `packages/core/src/logic.ts` -- `LogicManager`, receipt/deposit identification - `packages/core/src/cells.ts` -- `ReceiptCell`, `IckbDepositCell`, `receiptCellFrom` header access pattern -- `ccc-fork/ccc/packages/core/src/signer/feePayer/feePayer.ts` -- PR #328 FeePayer abstract class (now integrated into ccc-fork/ccc) +- `forks/ccc/packages/core/src/signer/feePayer/feePayer.ts` -- PR #328 FeePayer abstract class (now integrated into forks/ccc) ### Secondary (MEDIUM confidence) - None -- all findings verified from source code diff --git a/.planning/research/ARCHITECTURE.md b/.planning/research/ARCHITECTURE.md index 68564af..d1f0a8e 100644 --- a/.planning/research/ARCHITECTURE.md +++ b/.planning/research/ARCHITECTURE.md @@ -532,8 +532,8 @@ The dependency graph still applies to the order of operations within each featur ## Sources -- CCC `@ckb-ccc/udt` source code: `/workspaces/stack/ccc-fork/ccc/packages/udt/src/udt/index.ts` (HIGH confidence -- direct code examination) -- CCC `@ckb-ccc/core` Transaction class: `/workspaces/stack/ccc-fork/ccc/packages/core/src/ckb/transaction.ts` (HIGH confidence -- direct code examination) +- CCC `@ckb-ccc/udt` source code: `/workspaces/stack/forks/ccc/packages/udt/src/udt/index.ts` (HIGH confidence -- direct code examination) +- CCC `@ckb-ccc/core` 
Transaction class: `/workspaces/stack/forks/ccc/packages/core/src/ckb/transaction.ts` (HIGH confidence -- direct code examination) - Current SmartTransaction: `/workspaces/stack/packages/utils/src/transaction.ts` (HIGH confidence -- direct code examination) - Current IckbUdtManager: `/workspaces/stack/packages/core/src/udt.ts` (HIGH confidence -- direct code examination) - Current UdtManager/UdtHandler: `/workspaces/stack/packages/utils/src/udt.ts` (HIGH confidence -- direct code examination) diff --git a/.planning/research/FEATURES.md b/.planning/research/FEATURES.md index 63030f0..383a871 100644 --- a/.planning/research/FEATURES.md +++ b/.planning/research/FEATURES.md @@ -56,7 +56,7 @@ Features that seem good but create problems in this context. Explicitly document | **AF-6: Embedded wallet/signer management** | Some SDKs bundle wallet management (key storage, mnemonic handling) | CCC already provides comprehensive signer abstraction (`ccc.Signer`, `ccc.SignerCkbPrivateKey`, JoyId integration). Duplicating this creates security liability. | Delegate all signing to CCC's signer infrastructure. The SDK accepts `ccc.Signer` or `ccc.Script` -- it never manages keys. | | **AF-7: Database/state persistence layer** | Bot and interface could benefit from persistent state (order history, balance cache) | All state is on-chain. Adding a database creates consistency problems (stale state vs chain state). The current stateless design is a feature, not a limitation. | Continue reading all state from L1 via CCC client. Pool snapshots (D-4) provide efficient state approximation without a database. | | **AF-8: New reference/example apps** | More apps might help adoption | Existing 5 apps (bot, interface, faucet, sampler, tester) already demonstrate all library capabilities. Adding more dilutes maintenance focus. | Polish existing apps. They serve as living documentation. 
| -| **AF-9: CCC framework fork** | Tempting to fork CCC to get features faster | Forking creates maintenance burden and diverges from ecosystem. Upstream PRs are the correct approach. | Submit PRs upstream (already doing this with 12 merged). Track CCC PR #328 (FeePayer). Use `ccc-fork/` local build for testing changes before they land upstream. | +| **AF-9: CCC framework fork** | Tempting to fork CCC to get features faster | Forking creates maintenance burden and diverges from ecosystem. Upstream PRs are the correct approach. | Submit PRs upstream (already doing this with 12 merged). Track CCC PR #328 (FeePayer). Use `forks/ccc/` local build for testing changes before they land upstream. | | **AF-10: On-chain contract changes** | Protocol improvements seem natural alongside library work | All contracts are deployed with zero-args locks (immutable, non-upgradable). Even if desirable, contract changes are impossible. | Library must match existing on-chain contract behavior exactly. All protocol rules are fixed. | ## Feature Dependencies diff --git a/.planning/research/PITFALLS.md b/.planning/research/PITFALLS.md index c91455f..c8be292 100644 --- a/.planning/research/PITFALLS.md +++ b/.planning/research/PITFALLS.md @@ -118,7 +118,7 @@ App migration (deferred to future milestone, not in v1 roadmap). The bot is the ### Pitfall 5: Molecule Codec Byte Layout Mismatch After Refactoring **What goes wrong:** -The TypeScript Molecule codecs (`ReceiptData`, `OwnedOwnerData`, `OrderInfo`, `Ratio`, etc.) use CCC's `@ccc.codec` decorators and `mol.Entity.Base`. These produce byte encodings that must match the Molecule schema at `reference/contracts/schemas/encoding.mol` exactly -- field order, sizes, endianness, padding. A refactoring that reorders fields in a TypeScript class, changes a field type, or inadvertently uses a different encoding for the same semantic value (e.g., `Uint32` vs `Int32` for `owned_distance`) will produce silently different byte encodings. 
The contracts will reject the transaction or, worse, misinterpret the data. +The TypeScript Molecule codecs (`ReceiptData`, `OwnedOwnerData`, `OrderInfo`, `Ratio`, etc.) use CCC's `@ccc.codec` decorators and `mol.Entity.Base`. These produce byte encodings that must match the Molecule schema at `forks/contracts/schemas/encoding.mol` exactly -- field order, sizes, endianness, padding. A refactoring that reorders fields in a TypeScript class, changes a field type, or inadvertently uses a different encoding for the same semantic value (e.g., `Uint32` vs `Int32` for `owned_distance`) will produce silently different byte encodings. The contracts will reject the transaction or, worse, misinterpret the data. Key risk areas: - `ReceiptData { deposit_quantity: Uint32, deposit_amount: Uint64 }` = 12 bytes. TypeScript uses `@ccc.codec` with fields `depositQuantity` (u32 LE) and `depositAmount` (u64 LE). If someone renames or reorders these fields, the encoded bytes change. @@ -176,7 +176,7 @@ Shortcuts that seem reasonable but create long-term problems. 
| Keeping SmartTransaction "just for now" while migrating apps | Apps work immediately without library changes | Two transaction models coexist, every new feature must work with both, CCC upgrades become harder | Never -- library refactor must come before app migration | | Passing `SmartTransaction` type through public API boundaries | Avoids rewriting callers | External consumers inherit a dependency on a non-standard Transaction subclass, blocking npm publication | Never for published packages -- internal-only is acceptable during transition | | Skipping codec roundtrip tests | Faster initial development | Silent byte-level bugs that only manifest on-chain | Never -- these tests are cheap to write and prevent catastrophic failures | -| Duplicating CCC utility functions locally instead of adopting upstream | Avoids dependency on specific CCC version | Drift between local and upstream implementations, double maintenance burden | Only if CCC version is not yet released (use `ccc-fork/` local builds to validate, then switch to published version) | +| Duplicating CCC utility functions locally instead of adopting upstream | Avoids dependency on specific CCC version | Drift between local and upstream implementations, double maintenance burden | Only if CCC version is not yet released (use `forks/ccc/` local builds to validate, then switch to published version) | | Migrating bot without parallel Lumos fallback | Cleaner codebase, single transaction path | If CCC-based bot has subtle bugs, no way to fall back; real funds at risk | Never for mainnet -- always keep Lumos bot runnable until CCC bot is validated on testnet | | Removing `@ickb/lumos-utils` and `@ickb/v1-core` from workspace before all apps are migrated | Simpler dependency tree | Breaks unmigrated apps, blocks incremental migration | Only after ALL apps are migrated and verified | @@ -272,11 +272,11 @@ How roadmap phases should address these pitfalls. 
## Sources - Direct codebase analysis: `packages/utils/src/transaction.ts` (SmartTransaction, 517 lines), `packages/utils/src/udt.ts` (UdtManager, 393 lines), `packages/core/src/udt.ts` (IckbUdtManager, 213 lines) -- CCC `Udt` class source: `ccc-fork/ccc/packages/udt/src/udt/index.ts` (1798 lines) -- On-chain contract source: `reference/contracts/scripts/contracts/ickb_logic/src/entry.rs` (conservation law, exchange rate) -- On-chain contract source: `reference/contracts/scripts/contracts/owned_owner/` (owner/owned pairing) -- On-chain contract source: `reference/contracts/scripts/contracts/limit_order/` (order/master relationship) -- Molecule schema: `reference/contracts/schemas/encoding.mol` (byte layout definitions) +- CCC `Udt` class source: `forks/ccc/packages/udt/src/udt/index.ts` (1798 lines) +- On-chain contract source: `forks/contracts/scripts/contracts/ickb_logic/src/entry.rs` (conservation law, exchange rate) +- On-chain contract source: `forks/contracts/scripts/contracts/owned_owner/` (owner/owned pairing) +- On-chain contract source: `forks/contracts/scripts/contracts/limit_order/` (order/master relationship) +- Molecule schema: `forks/contracts/schemas/encoding.mol` (byte layout definitions) - NervosDAO RFC: https://github.com/nervosnetwork/rfcs/blob/master/rfcs/0023-dao-deposit-withdraw/0023-dao-deposit-withdraw.md (64-output limit) - `.planning/PROJECT.md` -- project requirements and constraints - `.planning/codebase/CONCERNS.md` -- known tech debt and fragile areas diff --git a/.planning/research/STACK.md b/.planning/research/STACK.md index d70e6df..c9bf9dc 100644 --- a/.planning/research/STACK.md +++ b/.planning/research/STACK.md @@ -2,7 +2,7 @@ **Domain:** CCC API adoption for iCKB protocol library migration **Researched:** 2026-02-21 -**Confidence:** HIGH (primary source: local CCC source code in `ccc-fork/ccc/`) +**Confidence:** HIGH (primary source: local CCC source code in `forks/ccc/`) ## Context @@ -67,7 +67,7 @@ This research focuses on 
the CCC APIs and patterns that should be adopted as par ### CCC `@ckb-ccc/udt` Package -**Version:** Local build from `ccc-fork/ccc/packages/udt/` +**Version:** Local build from `forks/ccc/packages/udt/` **Key classes:** `Udt`, `UdtInfo`, `UdtConfig`, `ErrorUdtInsufficientCoin` **Depends on:** `@ckb-ccc/core`, `@ckb-ccc/ssri` @@ -111,7 +111,7 @@ CCC's `Client.cache` handles purpose (1) -- all `getHeaderByHash()` and `getHead CCC's `Transaction.getInputsCapacity()` now includes DAO profit via `CellInput.getExtraCapacity()` -> `Cell.getDaoProfit()`. This means SmartTransaction's override of `getInputsCapacity()` is **no longer needed** -- CCC does this natively. -Verified in CCC source (`ccc-fork/ccc/packages/core/src/ckb/transaction.ts` lines 1860-1883): +Verified in CCC source (`forks/ccc/packages/core/src/ckb/transaction.ts` lines 1860-1883): ```typescript async getInputsCapacity(client: Client): Promise<Num> { return ( @@ -275,7 +275,7 @@ PR #328 proposes a `FeePayer` abstraction for CCC that would allow specifying who pays transaction fees. This is relevant because SmartTransaction's fee completion could designate a specific lock for fee payment. -**Current status (updated):** PR #328 is now integrated into `ccc-fork/ccc` via the pins/record system. FeePayer classes are available at `ccc-fork/ccc/packages/core/src/signer/feePayer/`. The user decided during Phase 3 context that PR #328 is the target architecture -- investigation should design around it. +**Current status (updated):** PR #328 is now integrated into `forks/ccc` via the .pin/record system. FeePayer classes are available at `forks/ccc/packages/core/src/signer/feePayer/`. The user decided during Phase 3 context that PR #328 is the target architecture -- investigation should design around it. **Impact on migration:** The FeePayer abstraction is available to build against directly.
The `infoFrom()` override is compatible with both the current Signer-based completion and the FeePayer-based completion -- cells flow through `getInputsInfo` → `infoFrom` regardless of which completion plumbing is used. @@ -306,17 +306,17 @@ PR #328 proposes a `FeePayer` abstraction for CCC that would allow specifying wh # No other new dependencies needed -- all other changes use existing @ckb-ccc/core APIs ``` -**Note:** With `ccc-fork/` local build active, `.pnpmfile.cjs` automatically rewires all `@ckb-ccc/*` dependencies to local packages, so the `@ckb-ccc/udt` package is already available from the local CCC build. +**Note:** With `forks/ccc/` local build active, `.pnpmfile.cjs` automatically rewires all `@ckb-ccc/*` dependencies to local packages, so the `@ckb-ccc/udt` package is already available from the local CCC build. ## Sources -- `ccc-fork/ccc/packages/udt/src/udt/index.ts` -- CCC Udt class, full source (1798 lines) -- HIGH confidence -- `ccc-fork/ccc/packages/core/src/ckb/transaction.ts` -- CCC Transaction class, full source (2537 lines) -- HIGH confidence -- `ccc-fork/ccc/packages/core/src/client/client.ts` -- CCC Client class with caching, cell finding -- HIGH confidence -- `ccc-fork/ccc/packages/core/src/num/index.ts` -- `numMax`, `numMin`, `numFrom` etc. 
-- HIGH confidence -- `ccc-fork/ccc/packages/core/src/hex/index.ts` -- `isHex`, `hexFrom`, `bytesLen` -- HIGH confidence -- `ccc-fork/ccc/packages/core/src/utils/index.ts` -- `reduce`, `reduceAsync`, `gcd`, `apply`, `sleep` -- HIGH confidence -- `ccc-fork/ccc/packages/core/src/ckb/epoch.ts` -- `Epoch` class (already adopted) -- HIGH confidence +- `forks/ccc/packages/udt/src/udt/index.ts` -- CCC Udt class, full source (1798 lines) -- HIGH confidence +- `forks/ccc/packages/core/src/ckb/transaction.ts` -- CCC Transaction class, full source (2537 lines) -- HIGH confidence +- `forks/ccc/packages/core/src/client/client.ts` -- CCC Client class with caching, cell finding -- HIGH confidence +- `forks/ccc/packages/core/src/num/index.ts` -- `numMax`, `numMin`, `numFrom` etc. -- HIGH confidence +- `forks/ccc/packages/core/src/hex/index.ts` -- `isHex`, `hexFrom`, `bytesLen` -- HIGH confidence +- `forks/ccc/packages/core/src/utils/index.ts` -- `reduce`, `reduceAsync`, `gcd`, `apply`, `sleep` -- HIGH confidence +- `forks/ccc/packages/core/src/ckb/epoch.ts` -- `Epoch` class (already adopted) -- HIGH confidence - `packages/utils/src/transaction.ts` -- Current SmartTransaction implementation (517 lines) -- HIGH confidence - `packages/utils/src/udt.ts` -- Current UdtManager/UdtHandler implementation (393 lines) -- HIGH confidence - `packages/utils/src/capacity.ts` -- Current CapacityManager implementation (221 lines) -- HIGH confidence diff --git a/.planning/research/SUMMARY.md b/.planning/research/SUMMARY.md index 3b76acf..d99ad1a 100644 --- a/.planning/research/SUMMARY.md +++ b/.planning/research/SUMMARY.md @@ -17,11 +17,11 @@ The primary risk is losing implicit behaviors baked into `SmartTransaction` — ### Recommended Stack -The existing TypeScript/pnpm/CCC stack requires no new technology choices. 
The migration is a CCC API adoption exercise: replace 14 local utilities with CCC equivalents (`ccc.numMax`/`numMin`, `ccc.gcd`, `ccc.isHex`, `Udt.balanceFromUnsafe`, etc.), add `@ckb-ccc/udt` as a dependency to `@ickb/core`, and restructure transaction building around CCC's native completion pipeline. The local `ccc-fork/` build system already makes `@ckb-ccc/udt` available via `.pnpmfile.cjs` rewriting — no additional infrastructure work needed. +The existing TypeScript/pnpm/CCC stack requires no new technology choices. The migration is a CCC API adoption exercise: replace 14 local utilities with CCC equivalents (`ccc.numMax`/`numMin`, `ccc.gcd`, `ccc.isHex`, `Udt.balanceFromUnsafe`, etc.), add `@ckb-ccc/udt` as a dependency to `@ickb/core`, and restructure transaction building around CCC's native completion pipeline. The local `forks/ccc/` build system already makes `@ckb-ccc/udt` available via `.pnpmfile.cjs` rewriting — no additional infrastructure work needed. **Core technologies:** - `@ckb-ccc/core` ^1.12.2: Transaction building, cell queries, signer abstraction — already adopted, native replacement for all SmartTransaction behaviors -- `@ckb-ccc/udt` (local ccc-fork build): UDT lifecycle management (cell finding, balance calculation, input completion, change handling) — replaces local UdtManager/UdtHandler; `IckbUdt` subclasses this +- `@ckb-ccc/udt` (local forks/ccc build): UDT lifecycle management (cell finding, balance calculation, input completion, change handling) — replaces local UdtManager/UdtHandler; `IckbUdt` subclasses this - `ccc.Transaction.completeFeeBy` / `completeFeeChangeToLock`: CKB fee completion — direct SmartTransaction.completeFee replacement for the CKB-change portion - `ccc.Transaction.completeInputsByCapacity`: CKB capacity input collection — replaces CapacityManager's cell-finding role - `ccc.Client.cache`: Transparent header caching — replaces SmartTransaction's `headers` map for performance; header deps must still be added 
explicitly @@ -156,7 +156,7 @@ Standard patterns (skip research-phase): | Area | Confidence | Notes | |------|------------|-------| -| Stack | HIGH | Primary source is local CCC source code (`ccc-fork/ccc/`); all APIs verified by direct inspection | +| Stack | HIGH | Primary source is local CCC source code (`forks/ccc/`); all APIs verified by direct inspection | | Features | HIGH | Based on direct codebase analysis + CCC docs + npm ecosystem survey; competitor analysis confirms iCKB has no direct competitors | | Architecture | HIGH | Build order derived from package dependency graph; key patterns verified against CCC Udt source; override point resolved (`infoFrom`, not `getInputsInfo`/`getOutputsInfo` — see Phase 3 research) | | Pitfalls | HIGH | Derived from direct code reading (SmartTransaction 517 lines, IckbUdtManager 213 lines, CCC Udt 1798 lines) and on-chain contract constraints | @@ -167,20 +167,20 @@ Standard patterns (skip research-phase): - **Resolved — CCC Udt override point:** Phase 3 research (03-RESEARCH.md) determined that `infoFrom()` is the optimal override point. The earlier recommendation to override `getInputsInfo()`/`getOutputsInfo()` was based on the incorrect premise that `CellAnyLike` lacks `outPoint` — it actually has `outPoint?: OutPointLike | null`, and input cells from `getInputsInfo()` → `CellInput.getCell()` always have `outPoint` set. `CellAny` also has `capacityFree`. See 03-RESEARCH.md for the corrected design. - **Resolved — DAO profit in CCC `getInputsCapacity`:** Verified from CCC source (transaction.ts lines 1860-1883) that `Transaction.getInputsCapacity()` handles DAO profit natively via `getInputsCapacityExtra()` → `CellInput.getExtraCapacity()` → `Cell.getDaoProfit()`. No standalone utility needed. SmartTransaction's override of `getInputsCapacity()` can be dropped without replacement. -- **Resolved — CCC PR #328 (FeePayer):** PR #328 is now integrated into `ccc-fork/ccc` via pins. 
FeePayer classes are available at `ccc-fork/ccc/packages/core/src/signer/feePayer/`. User decision during Phase 3 context: design around PR #328 as target architecture. +- **Resolved — CCC PR #328 (FeePayer):** PR #328 is now integrated into `forks/ccc` via pins. FeePayer classes are available at `forks/ccc/packages/core/src/signer/feePayer/`. User decision during Phase 3 context: design around PR #328 as target architecture. - **Bot key logging security:** PITFALLS.md notes the faucet already has a private key logging bug. The bot migration must include an explicit security audit of all logging paths. ## Sources ### Primary (HIGH confidence) -- `ccc-fork/ccc/packages/udt/src/udt/index.ts` — CCC Udt class (1798 lines), complete UDT lifecycle API -- `ccc-fork/ccc/packages/core/src/ckb/transaction.ts` — CCC Transaction class (2537 lines), completeFee/completeInputsByCapacity/getInputsCapacity -- `ccc-fork/ccc/packages/core/src/client/client.ts` — CCC Client with cache, findCells, cell/header fetching +- `forks/ccc/packages/udt/src/udt/index.ts` — CCC Udt class (1798 lines), complete UDT lifecycle API +- `forks/ccc/packages/core/src/ckb/transaction.ts` — CCC Transaction class (2537 lines), completeFee/completeInputsByCapacity/getInputsCapacity +- `forks/ccc/packages/core/src/client/client.ts` — CCC Client with cache, findCells, cell/header fetching - `packages/utils/src/transaction.ts` — SmartTransaction (deleted in Phase 1), was source of truth for replacement requirements - `packages/utils/src/udt.ts` — Current UdtManager/UdtHandler (393 lines) - `packages/core/src/udt.ts` — Current IckbUdtManager (213 lines), triple-representation balance logic -- `reference/contracts/schemas/encoding.mol` — Molecule schema, byte layout ground truth -- `reference/contracts/scripts/contracts/ickb_logic/src/entry.rs` — On-chain conservation law and exchange rate +- `forks/contracts/schemas/encoding.mol` — Molecule schema, byte layout ground truth +- 
`forks/contracts/scripts/contracts/ickb_logic/src/entry.rs` — On-chain conservation law and exchange rate - `.planning/PROJECT.md` — Project requirements and constraints ### Secondary (MEDIUM confidence) diff --git a/.pnpmfile.cjs b/.pnpmfile.cjs index c2ae705..716c34b 100644 --- a/.pnpmfile.cjs +++ b/.pnpmfile.cjs @@ -1,10 +1,11 @@ // .pnpmfile.cjs — Two jobs: // -// 1. Auto-replay: clone + patch managed forks on first `pnpm install` (if pins exist). -// replay.sh handles git clone, merge replay, lockfile removal, and source -// patching (jq exports rewrite). It does NOT run pnpm install -// internally — the root workspace install handles fork deps alongside -// everything else. +// 1. Auto-replay: bootstrap forker tool, then clone + patch managed forks on +// first `pnpm install` (if pins exist). Reference-only entries (no pins, +// empty refs) are shallow-cloned. replay.sh handles git clone, merge replay, +// lockfile removal, and source patching (jq exports rewrite). It does NOT +// run pnpm install internally — the root workspace install handles fork deps +// alongside everything else. // // 2. readPackage hook: rewrite fork deps from catalog ranges to workspace:*. // Fork packages live in pnpm-workspace.yaml, so you'd expect pnpm to link @@ -17,56 +18,110 @@ // through to the registry normally. 
const { execFileSync } = require("child_process"); -const { existsSync, readdirSync, readFileSync } = require("fs"); +const { existsSync, readdirSync, readFileSync, rmSync } = require("fs"); const { join } = require("path"); -// Discover all *-fork/ directories with config.json -const forkDirs = []; -for (const entry of readdirSync(__dirname, { withFileTypes: true })) { - if (!entry.isDirectory() || !entry.name.endsWith("-fork")) continue; - const configPath = join(__dirname, entry.name, "config.json"); - if (existsSync(configPath)) { - const config = JSON.parse(readFileSync(configPath, "utf8")); - if (!config.cloneDir) continue; - forkDirs.push({ - name: entry.name, - dir: join(__dirname, entry.name), - config, - }); - } +const forksDir = join(__dirname, "forks"); +const configPath = join(forksDir, "config.json"); + +// Read unified config +let config = {}; +if (existsSync(configPath)) { + config = JSON.parse(readFileSync(configPath, "utf8")); } // 1. Auto-replay fork pins on first pnpm install -// Skip when fork:record is running — it rebuilds pins from scratch. -// Detect via argv since pnpmfile loads before npm_lifecycle_event is set. -const isRecord = process.argv.some((a) => a === "fork:record"); +// Skip when record.sh is running — it rebuilds pins from scratch. 
+const isRecord = process.env.FORKER_RECORDING === "1"; if (!isRecord) { - for (const fork of forkDirs) { - const cloneDir = join(fork.dir, fork.config.cloneDir); - const hasPins = existsSync(join(fork.dir, "pins", "manifest")); - if (!existsSync(cloneDir) && hasPins) { + // Bootstrap forker tool: if forks/forker/ doesn't exist, clone it + const forkerDir = join(forksDir, "forker"); + if (!existsSync(forkerDir) && config.forker) { + const upstream = config.forker.upstream; + if (upstream) { try { - execFileSync("bash", ["fork-scripts/replay.sh", fork.name], { + execFileSync("git", ["clone", "--depth", "1", upstream, forkerDir], { cwd: __dirname, stdio: ["ignore", "pipe", "pipe"], }); + // Apply any local patches for forker + const forkerPinDir = join(forksDir, ".pin", "forker"); + if (existsSync(forkerPinDir)) { + const patches = readdirSync(forkerPinDir) + .filter((f) => f.startsWith("local-") && f.endsWith(".patch")) + .sort(); + for (const patch of patches) { + execFileSync( + "git", + ["apply", join(forkerPinDir, patch)], + { cwd: forkerDir, stdio: ["ignore", "pipe", "pipe"] }, + ); + } + } } catch (err) { - process.stderr.write(`Replaying ${fork.name} pins…\n`); + // Clean up partial state so next install retries from scratch + try { + rmSync(forkerDir, { recursive: true, force: true }); + } catch {} + process.stderr.write("Bootstrapping forker tool…\n"); process.stderr.write(err.stdout?.toString() ?? ""); process.stderr.write(err.stderr?.toString() ?? 
""); throw err; } } } + + // Replay/clone each entry + for (const [name, entry] of Object.entries(config)) { + if (name === "forker") continue; // already handled above + const cloneDir = join(forksDir, name); + const hasPins = existsSync(join(forksDir, ".pin", name, "manifest")); + + if (!existsSync(cloneDir)) { + if (hasPins) { + // Replay from pins using forker + try { + execFileSync( + "bash", + [join(forkerDir, "replay.sh"), name], + { cwd: __dirname, stdio: ["ignore", "pipe", "pipe"] }, + ); + } catch (err) { + process.stderr.write(`Replaying ${name} pins…\n`); + process.stderr.write(err.stdout?.toString() ?? ""); + process.stderr.write(err.stderr?.toString() ?? ""); + throw err; + } + } else if ( + Array.isArray(entry.refs) && + entry.refs.length === 0 && + entry.upstream + ) { + // Reference-only entry: shallow clone + try { + execFileSync( + "git", + ["clone", "--depth", "1", entry.upstream, cloneDir], + { cwd: __dirname, stdio: ["ignore", "pipe", "pipe"] }, + ); + } catch (err) { + process.stderr.write(`Cloning ${name} (reference)…\n`); + process.stderr.write(err.stdout?.toString() ?? ""); + process.stderr.write(err.stderr?.toString() ?? ""); + throw err; + } + } + } + } } // 2. Discover local fork packages and build the override map const localOverrides = {}; -for (const fork of forkDirs) { - const cloneDir = join(fork.dir, fork.config.cloneDir); +for (const [name, entry] of Object.entries(config)) { + const cloneDir = join(forksDir, name); if (!existsSync(cloneDir)) continue; - const includes = fork.config.workspace?.include ?? []; - const excludes = new Set(fork.config.workspace?.exclude ?? []); + const includes = entry.workspace?.include ?? []; + const excludes = new Set(entry.workspace?.exclude ?? []); for (const pattern of includes) { // Simple glob: only supports trailing /* (e.g. 
"packages/*") const base = pattern.replace(/\/\*$/, ""); @@ -78,9 +133,11 @@ for (const fork of forkDirs) { if (excludes.has(relPath)) continue; const pkgJsonPath = join(pkgsRoot, dir.name, "package.json"); if (!existsSync(pkgJsonPath)) continue; - const { name } = JSON.parse(readFileSync(pkgJsonPath, "utf8")); - if (name) { - localOverrides[name] = "workspace:*"; + const { name: pkgName } = JSON.parse( + readFileSync(pkgJsonPath, "utf8"), + ); + if (pkgName) { + localOverrides[pkgName] = "workspace:*"; } } } diff --git a/AGENTS.md b/AGENTS.md index 516dbe7..38a9792 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -2,66 +2,28 @@ ## Meta -- **Learn**: When a non-obvious constraint causes a failure, leave a concise note here and a detailed comment at the relevant location +- **Learn**: When a non-obvious constraint causes a failure or surprises you, leave a concise note here and a detailed comment at the relevant location - `CLAUDE.md` is a symlink to this file, created by `pnpm coworker` - Refer to yourself as "AI Coworker" in docs and comments, not by product or company name - Never add AI tool attribution or branding to PR descriptions, commit messages, or code comments - Do not install or use `gh` CLI - When a post-plan fix changes a documented decision, update the planning docs in the same commit -- **Copy to clipboard**: - ```sh - head -c -1 <<'EOF' | wl-copy - content goes here - EOF - ``` +## Knowledge + +- **Fork Management**: Before working in `forks/`, read `forks/forker/README.md` for directory structure, pin format, and workflows +- Use `git -C ` to run git commands in fork clones or other repos — never `cd` into them +- Always compare CKB scripts using full `Script.eq()` (codeHash + hashType + args), never just `codeHash`. Partial comparison silently matches wrong scripts ## PR Workflow 1. **Routine Pre-PR Validation**: `pnpm check:full`, it wipes derived state and regenerates from scratch. 
If any fork clone has pending work, the wipe is skipped to prevent data loss — re-record or push fork changes first for a clean validation -2. **Open a PR**: Run `pnpm changeset` to generate a changeset entry, then push the branch and present a clickable markdown link `[title](url)` where the URL is a GitHub compare URL (`quick_pull=1`). Base branch is `master`. Prefill "title" (concise, under 70 chars) and "body" (markdown with ## Why and ## Changes sections) -3. **Fetch PR review comments**: Use the GitHub REST API via curl. Fetch all three comment types (issue comments, reviews, and inline comments). Categorize feedback by actionability (action required / informational), not by source (human / bot). Reviewers reply asynchronously — poll every minute until comments arrive - -## Fork Management (fork-scripts/ + *-fork/) - -The `fork-scripts/` system uses a record/replay mechanism for deterministic builds of external repo forks. Each fork lives in a `<name>-fork/` directory with a `config.json` specifying upstream URL, fork URL, merge refs, and workspace config. Scripts in `fork-scripts/` are generic and accept the fork directory as their first argument.
- -### Per-fork directory structure - -Each `<name>-fork/` contains: -- `config.json` — upstream URL, fork URL, refs to merge, cloneDir, workspace include/exclude -- `pins/` — **committed** to git (manifest + counted resolutions + local patches), regenerated by `pnpm fork:record <name>-fork` - - `pins/HEAD` — expected final SHA after full replay - - `pins/manifest` — base SHA + merge refs (TSV, one per line) - - `pins/res-N.resolution` — conflict resolution for merge step N (counted format: `--- path` file headers, `CONFLICT ours=N base=M theirs=K resolution=R` conflict headers followed by R resolution lines; parser is purely positional — reads counts and skips lines, never inspects content) - - `pins/local-*.patch` — local development patches (applied after merges + patch.sh) -- `<cloneDir>/` — **not in git** — rebuilt from pins on `pnpm install` - -### Key behaviors - -- The developer may have **pending work** in a fork clone. Run `pnpm fork:status <name>-fork` (exit 0 = safe to wipe, exit 1 = has custom work) before any operation that would destroy it. `fork:record`, `fork:clean`, and `fork:reset` already guard against this automatically -- `.pnpmfile.cjs` scans all `*-fork/config.json` directories and silently rewrites matching dependencies to `workspace:*` when clones exist. Local fork packages override published ones without any visible change in package.json files -- `pnpm install` has a side effect: if `<name>-fork/pins/manifest` exists but the clone does not, it automatically runs `fork-scripts/replay.sh` to rebuild from pins. This is intentional -- `fork-scripts/patch.sh` rewrites fork package exports to point at `.ts` source instead of `.d.ts`, then creates a deterministic git commit (fixed author/date) so record and replay produce the same HEAD hash. This is why imports from fork packages resolve to TypeScript source files — it is not a bug -- `fork-scripts/tsgo-filter.sh` is a bash wrapper around `tsgo` that filters out diagnostics originating from all `*-fork/` clone paths.
Fork source may not satisfy this repo's strict tsconfig, so the wrapper suppresses those errors while still reporting errors in stack source -- `pnpm fork:save <name>-fork [description]` captures local work as a patch in `pins/`. Patches survive re-records and replays -- `pnpm fork:record` regenerates the fork workspace entries in `pnpm-workspace.yaml` (between `@generated` markers) from all `*-fork/config.json` files — manual edits to that section are overwritten on re-record - -### CCC upstream contributions - -Work locally via `ccc-fork/` first. Only push to the fork (`phroi/ccc`) when changes are validated against the stack. Do not open PRs against `ckb-devrel/ccc` prematurely — keep changes on the fork until they are production-ready and the maintainer decides to upstream. - -1. Develop and test in `ccc-fork/ccc/` on the `wip` branch -2. When ready, use `pnpm fork:push ccc-fork` to cherry-pick commits onto a PR branch -3. Push the PR branch to `phroi/ccc` for review -4. Add the PR number to `refs` in `ccc-fork/config.json` — order PRs by target branch from upstream to downstream, so each group merges cleanly onto its base before the next layer begins -5. Run `pnpm fork:record ccc-fork` and `pnpm check:full` to verify -6. Only open an upstream PR against `ckb-devrel/ccc` when the maintainer explicitly decides to upstream - -## Reference Repos - -`reference/` contains read-only clones (project knowledge, dependency sources, etc.) fetched via `pnpm reference`. To refresh, just re-run `pnpm reference`. If a task requires knowledge of an external repo not yet cloned, add it to `reference/clone.sh` and run `pnpm reference` to fetch it for consultation - -## Knowledge - -- Always compare CKB scripts using full `Script.eq()` (codeHash + hashType + args), never just `codeHash`. Partial comparison silently matches wrong scripts +2. **Open a PR**: If any package needs a version bump, run `pnpm changeset` first.
Push the branch and present a clickable markdown link `[title](url)` where the URL is a GitHub compare URL (`quick_pull=1`). Base branch is `master`. Prefill "title" (concise, under 70 chars) and "body" (markdown with ## Why and ## Changes sections) +3. **Fetch PR review comments**: Use the GitHub REST API via curl. Fetch all three comment types (issue comments, reviews, and inline comments). Categorize feedback by actionability (action required / informational), not by source (human / bot) +4. **Copy to clipboard replies**: + +```sh +head -c -1 <<'EOF' | wl-copy +@account-name content goes here +EOF +``` diff --git a/README.md b/README.md index f8ad029..3830914 100644 --- a/README.md +++ b/README.md @@ -56,41 +56,28 @@ graph TD; ## Develop with Forks -When `<name>-fork/pins/manifest` is committed, `pnpm install` automatically sets up the local fork development environment on first run (by replaying pinned merges via `fork-scripts/replay.sh`). No manual setup step is needed — just clone and install: +When `forks/.pin/<name>/manifest` is committed, `pnpm install` automatically sets up the local fork development environment on first run (by replaying pinned merges via `forks/forker/replay.sh`). No manual setup step is needed — just clone and install: ```bash git clone git@github.com:ickb/stack.git && cd stack && pnpm install ``` -To redo the setup from scratch: `pnpm fork:clean-all && pnpm install`. +To redo the setup from scratch: `bash forks/forker/clean-all.sh && pnpm install`. -See [ccc-fork/README.md](ccc-fork/README.md) for recording new pins, developing CCC PRs, and the full workflow.
- -## Reference - -Clone the on-chain contracts and whitepaper repos locally for AI context: - -```bash -pnpm reference -``` - -This clones two repos into the project root (both are git-ignored and made read-only): - -- **[contracts](https://github.com/ickb/contracts)** — Rust L1 scripts deployed on Nervos CKB -- **[whitepaper](https://github.com/ickb/whitepaper)** — iCKB protocol design and specification +After `pnpm install`, see `forks/forker/README.md` for recording new pins, developing fork PRs, and the full workflow. ## Developer Scripts | Command | Description | | -------------------------------- | --------------------------------------------------------------------------------- | | `pnpm coworker` | Launch an interactive AI Coworker session (full autonomy, opus model). | -| `pnpm coworker:ask` | One-shot AI query for scripting (sonnet model, stateless). Used by fork:record. | -| `pnpm fork:status <name>-fork` | Check if fork clone matches pinned state. Exit 0 = safe to wipe. | -| `pnpm fork:record <name>-fork` | Record fork pins (clone, merge refs, build). Guarded against pending work. | -| `pnpm fork:save <name>-fork` | Capture local fork work as a patch in pins/ (survives re-records and replays). | -| `pnpm fork:push <name>-fork` | Cherry-pick commits from wip branch onto a PR branch for pushing to the fork. | -| `pnpm fork:clean <name>-fork` | Remove fork clone, keep pins (guarded). Re-replay on next `pnpm install`. | -| `pnpm fork:reset <name>-fork` | Remove fork clone and pins (guarded). Restores published packages. | +| `pnpm coworker:ask` | One-shot AI query for scripting (sonnet model, stateless). Used by record.sh. | +| `bash forks/forker/status.sh <name>` | Check if fork clone matches pinned state. Exit 0 = safe to wipe. | +| `bash forks/forker/record.sh <name>` | Record fork pins (clone, merge refs, build). Guarded against pending work. | +| `bash forks/forker/save.sh <name>` | Capture local fork work as a patch in .pin/ (survives re-records and replays).
| +| `bash forks/forker/push.sh <name>` | Cherry-pick commits from wip branch onto a PR branch for pushing to the fork. | +| `bash forks/forker/clean.sh <name>` | Remove fork clone, keep pins (guarded). Re-replay on next `pnpm install`. | +| `bash forks/forker/reset.sh <name>` | Remove fork clone and pins (guarded). Restores published packages. | | `pnpm check:full` | Wipe derived state and validate from scratch. Skips wipe if forks have pending work.| ## Epoch Semantic Versioning diff --git a/apps/faucet/package.json b/apps/faucet/package.json index fc82e4e..3c55a2a 100644 --- a/apps/faucet/package.json +++ b/apps/faucet/package.json @@ -31,7 +31,7 @@ "scripts": { "test": "vitest", "test:ci": "vitest run", - "build": "bash ../../fork-scripts/tsgo-filter.sh", + "build": "bash ../../tsgo-filter.sh", "lint": "eslint ./src", "clean": "rm -fr dist", "clean:deep": "rm -fr dist node_modules", diff --git a/apps/interface/README.md b/apps/interface/README.md index a22dd18..d767026 100644 --- a/apps/interface/README.md +++ b/apps/interface/README.md @@ -11,7 +11,7 @@ git clone https://github.com/ickb/stack.git 2. Enter into the repo folder: ```bash -cd stack/app/interface +cd stack/apps/interface ``` 3.
Install dependencies: diff --git a/apps/interface/package.json b/apps/interface/package.json index 65d5435..11fef37 100644 --- a/apps/interface/package.json +++ b/apps/interface/package.json @@ -13,7 +13,7 @@ "type": "module", "scripts": { "dev": "vite", - "build": "bash ../../fork-scripts/tsgo-filter.sh && vite build", + "build": "bash ../../tsgo-filter.sh && vite build", "preview": "vite preview", "lint": "eslint ./src", "clean": "rm -fr dist", diff --git a/apps/interface/vite.config.ts b/apps/interface/vite.config.ts index 16a43df..60457d6 100644 --- a/apps/interface/vite.config.ts +++ b/apps/interface/vite.config.ts @@ -2,20 +2,20 @@ import { defineConfig } from "vite"; import tailwindcss from "@tailwindcss/vite"; import react from "@vitejs/plugin-react"; import basicSsl from '@vitejs/plugin-basic-ssl' -import { existsSync, readdirSync, readFileSync } from "fs"; +import { existsSync, readFileSync } from "fs"; import { join } from "path"; -// Detect if any managed fork clones are present +// Detect if any managed fork clones are present (forks/ directory layout) const root = join(__dirname, "../.."); const hasForkSource = (() => { try { - for (const entry of readdirSync(root, { withFileTypes: true })) { - if (!entry.isDirectory() || !entry.name.endsWith("-fork")) continue; - const configPath = join(root, entry.name, "config.json"); - if (!existsSync(configPath)) continue; - const { cloneDir } = JSON.parse(readFileSync(configPath, "utf8")); - if (!cloneDir) continue; - if (existsSync(join(root, entry.name, cloneDir))) return true; + const configPath = join(root, "forks", "config.json"); + if (!existsSync(configPath)) return false; + const config = JSON.parse(readFileSync(configPath, "utf8")); + for (const [name, entry] of Object.entries(config)) { + // Managed forks have non-empty refs; reference-only clones have refs: [] + if (!Array.isArray(entry.refs) || entry.refs.length === 0) continue; + if (existsSync(join(root, "forks", name))) return true; } } catch 
(err) { console.error("Failed to detect fork sources:", err); @@ -32,7 +32,7 @@ export default defineConfig({ tailwindcss(), react({ // Fork source uses decorators — skip babel, let esbuild handle them - ...(hasForkSource && { exclude: [/\w+-fork\/\w+\//] }), + ...(hasForkSource && { exclude: [/forks\/\w+\//] }), babel: { plugins: [["babel-plugin-react-compiler"]], }, diff --git a/apps/sampler/package.json b/apps/sampler/package.json index 8d319e1..9e5a7cd 100644 --- a/apps/sampler/package.json +++ b/apps/sampler/package.json @@ -31,7 +31,7 @@ "scripts": { "test": "vitest", "test:ci": "vitest run", - "build": "bash ../../fork-scripts/tsgo-filter.sh", + "build": "bash ../../tsgo-filter.sh", "lint": "eslint ./src", "clean": "rm -fr dist", "clean:deep": "rm -fr dist node_modules", diff --git a/ccc-fork/.gitignore b/ccc-fork/.gitignore deleted file mode 100644 index e555b40..0000000 --- a/ccc-fork/.gitignore +++ /dev/null @@ -1 +0,0 @@ -ccc/ diff --git a/ccc-fork/README.md b/ccc-fork/README.md deleted file mode 100644 index b6b7332..0000000 --- a/ccc-fork/README.md +++ /dev/null @@ -1,156 +0,0 @@ -# CCC Local Development - -## Why - -CCC has unreleased branches (`releases/next`, `releases/udt`) that this project depends on. The fork management system deterministically merges them locally so the monorepo can build against unpublished CCC changes until they're published upstream. - -## How it works - -1. **Auto-replay** — `.pnpmfile.cjs` runs at `pnpm install` time. If `ccc-fork/pins/manifest` exists but `ccc-fork/ccc/` doesn't, it auto-triggers `fork-scripts/replay.sh` to clone and set up CCC. - -2. **Workspace override** — When `ccc-fork/ccc/` is present, `.pnpmfile.cjs` auto-discovers all CCC packages (via `config.json` workspace settings) and rewrites `@ckb-ccc/*` dependencies to `workspace:*` — no manual `pnpm.overrides` needed. 
This is necessary because `catalog:` specifiers resolve to a semver range _before_ pnpm considers workspace linking — even with `link-workspace-packages = true`, pnpm fetches from the registry without this hook. When CCC is not cloned, the hook is a no-op and deps resolve from the registry normally. - -3. **Source-level types** — `fork-scripts/patch.sh` (called by both `record.sh` and `replay.sh`) patches CCC's `package.json` exports to point TypeScript at `.ts` source instead of built `.d.ts`, then creates a deterministic git commit (fixed author/date). This gives real-time type feedback when editing across the CCC/stack boundary — changes in CCC source are immediately visible to stack packages without rebuilding. - -4. **Diagnostic filtering** — `fork-scripts/tsgo-filter.sh` is a bash wrapper around `tsgo` used by stack package builds. Because CCC `.ts` source is type-checked under the stack's stricter tsconfig (`verbatimModuleSyntax`, `noImplicitOverride`, `noUncheckedIndexedAccess`), plain `tsgo` would report hundreds of CCC diagnostics that aren't real integration errors. The wrapper emits output normally and only fails on diagnostics from stack source files. When no forks are cloned, packages fall back to plain `tsgo`. - -## Configuration - -CCC-specific settings live in `ccc-fork/config.json`: - -```json -{ - "upstream": "https://github.com/ckb-devrel/ccc.git", - "fork": "git@github.com:phroi/ccc.git", - "refs": ["359", "328", "releases/next", "releases/udt"], - "cloneDir": "ccc", - "workspace": { - "include": ["packages/*"], - "exclude": ["packages/demo", "packages/docs", ...] 
- } -} -``` - -- **upstream**: Git URL to clone from -- **fork**: SSH URL of developer fork, added as `fork` remote after replay -- **refs**: Merge refs — PR numbers, branch names, or commit SHAs (auto-detected) -- **cloneDir**: Name of the cloned directory inside `ccc-fork/` -- **workspace**: Glob patterns for pnpm workspace inclusion/exclusion - -## `pins/` format - -``` -ccc-fork/pins/ - HEAD # expected SHA after full replay (merges + patch.sh + local patches) - manifest # base SHA + merge refs, TSV, one per line - res-2.resolution # conflict resolution for merge step 2 (if any) - res-4.resolution # conflict resolution for merge step 4 (gaps = no conflicts) - local-001.patch # local development patch (applied after merges + patch.sh) - local-002.patch # local development patch -``` - -- **`HEAD`**: one line, the expected final SHA after everything (merges, `patch.sh`, local patches). Verification happens at the end of replay. -- **`manifest`**: TSV, one line per ref. Line 1 is the base commit (`SHA\tbranchname`); subsequent lines are merge refs applied sequentially onto `wip`. -- **`res-N.resolution`**: counted conflict resolution for merge step N. Only present for merge steps that had conflicts. Uses positional parsing (line counts, not content inspection) for deterministic replay. -- **`local-*.patch`**: standard unified diffs of local work, applied in lexicographic order after merges + `patch.sh`, each as a deterministic commit. - -All files are human-readable and editable. - -## Recording - -Recording captures the current upstream state and any conflict resolutions: - -```bash -pnpm fork:record ccc-fork -``` - -This runs `fork-scripts/record.sh` which reads refs from `config.json`, clones CCC, merges the configured refs, uses AI Coworker to resolve any conflicts, patches for source-level type resolution, and writes `pins/`. Commit the resulting `ccc-fork/pins/` directory so other contributors get the same build. 
- -You can override refs on the command line: - -```bash -pnpm fork:record ccc-fork 359 328 releases/next releases/udt -``` - -### Ref auto-detection - -`record.sh` accepts any number of refs and auto-detects their type: -- `^[0-9a-f]{7,40}$` → commit SHA -- `^[0-9]+$` → GitHub PR number -- everything else → branch name - -### Conflict resolution format - -When merges produce conflicts, `record.sh` resolves them and stores the resolution as a counted resolution file in `pins/res-N.resolution` (where N is the 1-indexed merge step). These use a positional format with `CONFLICT ours=N base=M theirs=K resolution=R` headers, so you can: - -- **Inspect** exactly what was resolved and how -- **Edit by hand** if the AI resolution needs adjustment -- **Diff across re-records** to see what changed - -## Developing CCC changes - -Work directly in `ccc-fork/ccc/` on the `wip` branch. `pnpm fork:status ccc-fork` tracks pending changes (exit 0 = clean, exit 1 = has work). - -### Development loop - -1. **Edit code** on `wip` in `ccc-fork/ccc/`. Commit normally. -2. **Rebuild**: `pnpm build` (builds stack packages with CCC type integration). -3. **Run tests**: `pnpm test` - -### Saving local patches - -When you have local changes that should persist across re-records: - -```bash -pnpm fork:save ccc-fork [description] -``` - -This captures all changes (committed + uncommitted) relative to the pinned HEAD as a patch file in `pins/`. The patch is applied deterministically during replay, so it survives `pnpm fork:clean ccc-fork && pnpm install` cycles. - -Example workflow: -1. Edit files in `ccc-fork/ccc/` -2. `pnpm fork:save ccc-fork my-feature` → creates `pins/local-001-my-feature.patch` -3. Edit more files -4. `pnpm fork:save ccc-fork another-fix` → creates `pins/local-002-another-fix.patch` -5. 
`pnpm fork:clean ccc-fork && pnpm install` → replays merges + patches, HEAD matches - -Local patches are preserved across `pnpm fork:record ccc-fork` — they're backed up before re-recording and restored afterwards. - -### Committing CCC changes to stack - -When ready to commit stack changes that depend on CCC modifications: - -1. Push changes to the fork (`phroi/ccc`) using `pnpm fork:push ccc-fork` -2. Add the PR number to `refs` in `ccc-fork/config.json` -3. Run `pnpm fork:record ccc-fork` — this re-records with the PR as a merge ref -4. Commit the updated `ccc-fork/pins/` to the stack repo - -Only open a PR against `ckb-devrel/ccc` when the maintainer decides to upstream — keep changes on the fork until then. - -### Pushing to a PR branch - -Extract your commits (those after the recording) onto the PR branch: - -```bash -pnpm fork:push ccc-fork -cd ccc-fork/ccc -git push fork pr-666:your-branch-name -git checkout wip # return to development -``` - -## Switching modes - -**Check for pending work:** `pnpm fork:status ccc-fork` — exit 0 if clone matches pinned state (safe to wipe), exit 1 otherwise. - -**Local CCC (default when `pins/` is committed):** `pnpm install` auto-replays pins and overrides deps. - -**Published CCC:** `pnpm fork:reset ccc-fork && pnpm install` — removes clone and pins, restores published packages. - -**Re-record:** `pnpm fork:record ccc-fork` wipes and re-records everything from scratch. Aborts if clone has pending work. Local patches are preserved. - -**Force re-replay:** `pnpm fork:clean ccc-fork && pnpm install` — removes clone but keeps pins, replays on next install. - -## Requirements - -- **Recording** (`pnpm fork:record`): Requires the AI Coworker CLI (installed as a devDependency; invoked via `pnpm coworker:ask`) for automated conflict resolution (only when merging refs). Also requires `jq` for config.json and package.json processing. -- **Replay** (`pnpm install`): Requires `jq`. 
No other extra tools — works for any contributor with just pnpm. diff --git a/ccc-fork/config.json b/ccc-fork/config.json deleted file mode 100644 index 4d59d37..0000000 --- a/ccc-fork/config.json +++ /dev/null @@ -1,17 +0,0 @@ -{ - "upstream": "https://github.com/ckb-devrel/ccc.git", - "fork": "git@github.com:phroi/ccc.git", - "refs": ["359", "328", "releases/next", "releases/udt"], - "cloneDir": "ccc", - "workspace": { - "include": ["packages/*"], - "exclude": [ - "packages/demo", - "packages/docs", - "packages/examples", - "packages/faucet", - "packages/playground", - "packages/tests" - ] - } -} diff --git a/fork-scripts/clean-all.sh b/fork-scripts/clean-all.sh deleted file mode 100644 index f65af35..0000000 --- a/fork-scripts/clean-all.sh +++ /dev/null @@ -1,12 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Clean all managed fork clones (status-check each before removing). -# Usage: fork-scripts/clean-all.sh - -# shellcheck source=lib.sh -source "$(cd "$(dirname "$0")" && pwd)/lib.sh" - -while IFS= read -r dev_dir; do - bash "$FORK_SCRIPTS_DIR/clean.sh" "$dev_dir" || true -done < <(discover_fork_dirs) diff --git a/fork-scripts/clean.sh b/fork-scripts/clean.sh deleted file mode 100644 index 1b75df6..0000000 --- a/fork-scripts/clean.sh +++ /dev/null @@ -1,14 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Remove a fork clone after verifying it has no pending work. 
-# Usage: fork-scripts/clean.sh - -# shellcheck source=lib.sh -source "$(cd "$(dirname "$0")" && pwd)/lib.sh" - -DEV_DIR="${1:?Usage: fork-scripts/clean.sh }" -DEV_DIR=$(cd "$DEV_DIR" && pwd) - -bash "$FORK_SCRIPTS_DIR/status.sh" "$DEV_DIR" -rm -rf "$(repo_dir "$DEV_DIR")" diff --git a/fork-scripts/lib.sh b/fork-scripts/lib.sh deleted file mode 100644 index 7f56371..0000000 --- a/fork-scripts/lib.sh +++ /dev/null @@ -1,241 +0,0 @@ -#!/usr/bin/env bash -# Shared helpers for fork management scripts - -FORK_SCRIPTS_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" -ROOT_DIR="$(cd "$FORK_SCRIPTS_DIR/.." && pwd)" - -# Read a value from a fork directory's config.json -# Usage: config_val -config_val() { - jq -r "$2" "$1/config.json" -} - -# Get the clone directory path for a fork -# Usage: repo_dir -repo_dir() { - local clone_dir - clone_dir=$(config_val "$1" '.cloneDir') - echo "$1/$clone_dir" -} - -# Get the pins directory path for a fork -# Usage: pins_dir -pins_dir() { - echo "$1/pins" -} - -# Get the upstream URL from config -# Usage: upstream_url -upstream_url() { - config_val "$1" '.upstream' -} - -# Get the fork URL from config (may be empty) -# Usage: fork_url -fork_url() { - local url - url=$(config_val "$1" '.fork // empty') - [ -n "$url" ] && echo "$url" -} - -# Get the refs array from config as lines -# Usage: repo_refs -repo_refs() { - config_val "$1" '.refs[]' -} - -# Discover all *-fork/ directories with config.json at repo root -# Usage: discover_fork_dirs -discover_fork_dirs() { - for d in "$ROOT_DIR"/*-fork; do - [ -f "$d/config.json" ] && echo "$d" - done -} - -# Read the expected HEAD SHA from pins/HEAD -# Usage: pinned_head -pinned_head() { - local f="$1/HEAD" - [ -f "$f" ] && cat "$f" || return 1 -} - -# Return path to pins/manifest if it exists -# Usage: manifest_file -manifest_file() { - local f="$1/manifest" - [ -f "$f" ] && echo "$f" || return 1 -} - -# Check whether pins exist (manifest present) -# Usage: has_pins -has_pins() { - [ -f 
"$1/manifest" ] -} - -# Count merge refs in manifest (total lines minus base line) -# Usage: merge_count -merge_count() { - local mf - mf=$(manifest_file "$1") || return 1 - echo $(( $(wc -l < "$mf") - 1 )) -} - -# Export deterministic git identity for reproducible commits -# Usage: deterministic_env -deterministic_env() { - export GIT_AUTHOR_NAME="ci" GIT_AUTHOR_EMAIL="ci@local" - export GIT_COMMITTER_NAME="ci" GIT_COMMITTER_EMAIL="ci@local" - export GIT_AUTHOR_DATE="@$1 +0000" GIT_COMMITTER_DATE="@$1 +0000" -} - -# Count files matching a glob pattern (pipefail-safe alternative to ls|wc -l) -# Usage: count_glob pattern (e.g., count_glob "$dir"/local-*.patch) -count_glob() { - local n=0 - for f in "$@"; do - [ -f "$f" ] && n=$((n + 1)) - done - echo "$n" -} - -# Apply local patches from pins/ as deterministic commits. -# Timestamp sequence continues from patch.sh: merge_count+1 is patch.sh, -# so local patches start at merge_count+2. -# Returns 1 if any patch fails to apply (caller should add remediation advice). -# Usage: apply_local_patches -apply_local_patches() { - local repo_dir="$1" p_dir="$2" - local mc ts patch name - mc=$(merge_count "$p_dir") || mc=0 - ts=$((mc + 2)) - for patch in "$p_dir"/local-*.patch; do - [ -f "$patch" ] || return 0 - name=$(basename "$patch" .patch) - echo "Applying local patch: $name" >&2 - if ! git -C "$repo_dir" apply "$patch"; then - echo "ERROR: Local patch $name failed to apply." >&2 - return 1 - fi - deterministic_env "$ts" - git -C "$repo_dir" add -A - git -C "$repo_dir" commit -m "local: $name" - ts=$((ts + 1)) - done -} - -# Apply counted conflict resolutions to a single conflicted file. -# Reads resolution data (CONFLICT headers + content lines) from $1, -# walks the conflicted file $2 positionally by line counts (never inspects -# content), and outputs the resolved file to stdout. 
-# Exits non-zero if the conflict count in the resolution data doesn't match -# the number of <<<<<<< markers in the file (catches fake markers). -# Usage: apply_counted_resolutions -apply_counted_resolutions() { - awk ' - FNR==NR { - if (/^CONFLICT /) { - n++ - for (i=2; i<=NF; i++) { - split($i, kv, "=") - c[n, kv[1]] = kv[2]+0 - } - rn[n] = 0 - next - } - rn[n]++ - r[n, rn[n]] = $0 - next - } - { - if (substr($0,1,7) == "<<<<<<<") { - cn++ - if (cn > n) { - printf "ERROR: more conflicts in file than in resolution data (%d > %d)\n", cn, n > "/dev/stderr" - err = 1; exit 1 - } - for (i = 0; i < c[cn,"ours"]; i++) getline - getline # ||||||| - for (i = 0; i < c[cn,"base"]; i++) getline - getline # ======= - for (i = 0; i < c[cn,"theirs"]; i++) getline - getline # >>>>>>> - for (i = 1; i <= c[cn,"resolution"]; i++) print r[cn,i] - next - } - print - } - END { - if (!err && cn != n) { - printf "ERROR: expected %d conflicts, found %d\n", n, cn > "/dev/stderr" - exit 1 - } - } - ' "$1" "$2" -} - -# Regenerate fork workspace entries in pnpm-workspace.yaml. -# Reads all *-fork/config.json files and replaces the section between -# @generated markers with computed include/exclude globs. 
-# Usage: sync_workspace_yaml -sync_workspace_yaml() { - local yaml="$ROOT_DIR/pnpm-workspace.yaml" - local entries="" - - for dev_dir in "$ROOT_DIR"/*-fork; do - [ -f "$dev_dir/config.json" ] || continue - local name clone_dir - name=$(basename "$dev_dir") - clone_dir=$(config_val "$dev_dir" '.cloneDir') - - mapfile -t includes < <(config_val "$dev_dir" '.workspace.include // [] | .[]') - for inc in "${includes[@]}"; do - entries+=" - ${name}/${clone_dir}/${inc}"$'\n' - done - - mapfile -t excludes < <(config_val "$dev_dir" '.workspace.exclude // [] | .[]') - for excl in "${excludes[@]}"; do - entries+=" - \"!${name}/${clone_dir}/${excl}\""$'\n' - done - done - - awk -v entries="$entries" ' - /^ # @generated begin fork-workspaces/ { print; printf "%s", entries; skip=1; next } - /^ # @generated end fork-workspaces/ { skip=0; print; next } - !skip { print } - ' "$yaml" > "$yaml.tmp" && mv "$yaml.tmp" "$yaml" -} - -# Apply a multi-file resolution file to a repo directory. -# Splits by "--- path" headers into per-file chunks, then calls -# apply_counted_resolutions for each file, replacing it in-place. 
-# Usage: apply_resolution_file -apply_resolution_file() { - local repo_dir="$1" res_file="$2" - local tmp_dir - tmp_dir=$(mktemp -d) - trap 'rm -rf "$tmp_dir"' RETURN - - # Split by --- headers; write path list and per-file chunks - awk -v dir="$tmp_dir" ' - /^--- / { - if (f) close(f) - n++ - path = substr($0, 5) - print path > (dir "/paths") - f = dir "/chunk-" n - next - } - f { print > f } - END { if (f) close(f) } - ' "$res_file" - - [ -f "$tmp_dir/paths" ] || return 0 - - local i=0 path - while IFS= read -r path; do - i=$((i + 1)) - apply_counted_resolutions "$tmp_dir/chunk-$i" "$repo_dir/$path" \ - > "$repo_dir/${path}.resolved.tmp" - mv "$repo_dir/${path}.resolved.tmp" "$repo_dir/$path" - done < "$tmp_dir/paths" -} diff --git a/fork-scripts/patch.sh b/fork-scripts/patch.sh deleted file mode 100644 index 24347c5..0000000 --- a/fork-scripts/patch.sh +++ /dev/null @@ -1,38 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Patch a cloned repo for use in the stack workspace. -# Usage: fork-scripts/patch.sh - -# shellcheck source=lib.sh -source "$(cd "$(dirname "$0")" && pwd)/lib.sh" - -REPO_DIR="${1:?Usage: fork-scripts/patch.sh }" -MERGE_COUNT="${2:?Missing merge-count argument}" - -# Remove the repo's own lockfile so deps are recorded in the root pnpm-lock.yaml -rm -f "$REPO_DIR/pnpm-lock.yaml" - -# Patch packages so the stack resolves directly to .ts source: -# - "type":"module" → NodeNext treats .ts files as ESM -# - "types" export condition → TypeScript resolves .ts source before .js dist -# - "import" rewritten to .ts source → Vite/esbuild can bundle without building -for pkg_json in "$REPO_DIR"/packages/*/package.json; do - [ -f "$pkg_json" ] || continue - jq '.type = "module" | - if (.exports | type) == "object" then .exports |= with_entries( - if .value | type == "object" and has("import") - then .value |= ( - (.import | sub("/dist/";"/src/") | sub("\\.m?js$";".ts")) as $src | - {types: $src, import: $src} + (. | del(.import, .types)) - ) - else . 
end - ) else . end' "$pkg_json" > "$pkg_json.tmp" && mv "$pkg_json.tmp" "$pkg_json" -done - -# Commit patched files with deterministic identity so record and replay produce the same hash -deterministic_env "$((MERGE_COUNT + 1))" -git -C "$REPO_DIR" add -A -if ! git -C "$REPO_DIR" diff --cached --quiet; then - git -C "$REPO_DIR" commit -m "patch: source-level type resolution" -fi diff --git a/fork-scripts/push.sh b/fork-scripts/push.sh deleted file mode 100644 index 9dd1200..0000000 --- a/fork-scripts/push.sh +++ /dev/null @@ -1,75 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Usage: fork-scripts/push.sh [target-branch] -# Cherry-picks commits made after recording onto the PR branch. -# target-branch: defaults to the last pr-* branch found. - -# shellcheck source=lib.sh -source "$(cd "$(dirname "$0")" && pwd)/lib.sh" - -DEV_DIR="${1:?Usage: fork-scripts/push.sh [target-branch]}" -DEV_DIR=$(cd "$DEV_DIR" && pwd) -shift - -REPO_DIR=$(repo_dir "$DEV_DIR") -PINS_DIR=$(pins_dir "$DEV_DIR") -FORK_NAME=$(basename "$DEV_DIR") -CLONE_DIR=$(config_val "$DEV_DIR" '.cloneDir') - -# Verify prerequisites -if [ ! -d "$REPO_DIR" ]; then - echo "ERROR: $FORK_NAME clone does not exist. Run 'pnpm fork:record $FORK_NAME' first." >&2 - exit 1 -fi - -WIP_HEAD=$(pinned_head "$PINS_DIR" 2>/dev/null) || { - echo "ERROR: No pins found. Run 'pnpm fork:record $FORK_NAME' first." >&2 - exit 1 -} - -# Verify we're on the wip branch -CURRENT_BRANCH=$(git -C "$REPO_DIR" branch --show-current) -if [ "$CURRENT_BRANCH" != "wip" ]; then - echo "ERROR: Expected to be on 'wip' branch, but on '$CURRENT_BRANCH'." >&2 - echo "Switch back with: cd $FORK_NAME/$CLONE_DIR && git checkout wip" >&2 - exit 1 -fi - -# Show commits to push -echo "Commits since recording:" -git -C "$REPO_DIR" log --oneline "$WIP_HEAD..HEAD" -echo "" - -COMMIT_COUNT=$(git -C "$REPO_DIR" rev-list --count "$WIP_HEAD..HEAD") -if [ "$COMMIT_COUNT" -eq 0 ]; then - echo "No new commits to push." 
- exit 0 -fi - -# Determine target branch -if [ $# -gt 0 ]; then - TARGET="$1" -else - TARGET=$(git -C "$REPO_DIR" branch --list 'pr-*' | sed 's/^[* ]*//' | tail -1) - if [ -z "$TARGET" ]; then - echo "ERROR: No target branch. Pass a branch name or record a PR first." >&2 - exit 1 - fi -fi - -echo "Cherry-picking $COMMIT_COUNT commit(s) onto $TARGET..." -git -C "$REPO_DIR" checkout "$TARGET" -if ! git -C "$REPO_DIR" cherry-pick "$WIP_HEAD..wip"; then - echo "" >&2 - echo "ERROR: Cherry-pick failed. To recover:" >&2 - echo " cd $FORK_NAME/$CLONE_DIR" >&2 - echo " # Resolve conflicts, then: git cherry-pick --continue" >&2 - echo " # Or abort with: git cherry-pick --abort && git checkout wip" >&2 - exit 1 -fi - -echo "" -echo "Done. You are now on $TARGET with your commits applied." -echo "Push with: cd $FORK_NAME/$CLONE_DIR && git push $TARGET:" -echo "Return to: cd $FORK_NAME/$CLONE_DIR && git checkout wip" diff --git a/fork-scripts/record.sh b/fork-scripts/record.sh deleted file mode 100644 index 258b50f..0000000 --- a/fork-scripts/record.sh +++ /dev/null @@ -1,348 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail - -# Usage: fork-scripts/record.sh [ref ...] 
-# ref auto-detection: -# ^[0-9a-f]{7,40}$ → commit SHA -# ^[0-9]+$ → GitHub PR number -# everything else → branch name -# No refs on CLI → reads from config.json -# No refs at all → just clone, no merges - -# shellcheck source=lib.sh -source "$(cd "$(dirname "$0")" && pwd)/lib.sh" - -DEV_DIR="${1:?Usage: fork-scripts/record.sh [ref ...]}" -DEV_DIR=$(cd "$DEV_DIR" && pwd) -shift - -REPO_DIR=$(repo_dir "$DEV_DIR") -PINS_DIR=$(pins_dir "$DEV_DIR") -UPSTREAM=$(upstream_url "$DEV_DIR") - -# Collect refs: CLI args override config.json -if [ $# -gt 0 ]; then - REFS=("$@") -else - mapfile -t REFS < <(repo_refs "$DEV_DIR") -fi - -# --------------------------------------------------------------------------- -# resolve_conflict -# Tiered merge conflict resolution (diff3 markers required): -# Tier 0: Deterministic — one side matches base → take the other (0 tokens) -# Tier 1: Strategy classification — LLM picks OURS/THEIRS/BOTH/GENERATE (~5 tokens) -# Tier 2: Code generation — LLM generates merged code for hunks only -# Outputs the resolved file to stdout. -# Writes counted resolution to .resolution (collected into pins/res-N.resolution after merge). 
-# --------------------------------------------------------------------------- -resolve_conflict() { - local FILE="$1" F_REL="$2" - local COUNT WORK i OURS BASE THEIRS - - COUNT=$(awk 'substr($0,1,7)=="<<<<<<<"{n++} END{print n+0}' "$FILE") - [ "$COUNT" -gt 0 ] || { echo "ERROR: no conflict markers in $FILE" >&2; return 1; } - - WORK=$(mktemp -d) - trap 'rm -rf "$WORK"' RETURN - - # Extract ours / base / theirs for each conflict hunk - awk -v dir="$WORK" ' - substr($0,1,7) == "<<<<<<<" { n++; section = "ours"; next } - substr($0,1,7) == "|||||||" { section = "base"; next } - substr($0,1,7) == "=======" { section = "theirs"; next } - substr($0,1,7) == ">>>>>>>" { section = ""; next } - section { print > (dir "/c" n "_" section) } - ' "$FILE" - - # Ensure ours/theirs files exist even for empty hunks (edit/delete conflicts) - for i in $(seq 1 "$COUNT"); do - touch "$WORK/c${i}_ours" "$WORK/c${i}_theirs" - done - - # Tier 0: Deterministic resolution (no LLM needed) - local NEED_LLM=() - for i in $(seq 1 "$COUNT"); do - OURS="$WORK/c${i}_ours"; BASE="$WORK/c${i}_base"; THEIRS="$WORK/c${i}_theirs" - if [ ! 
-f "$BASE" ]; then - NEED_LLM+=("$i"); continue - fi - if diff -q "$OURS" "$BASE" >/dev/null 2>&1; then - cp "$THEIRS" "$WORK/r$i" - echo " conflict $i: deterministic (take theirs)" >&2 - elif diff -q "$THEIRS" "$BASE" >/dev/null 2>&1; then - cp "$OURS" "$WORK/r$i" - echo " conflict $i: deterministic (take ours)" >&2 - elif diff -q "$OURS" "$THEIRS" >/dev/null 2>&1; then - cp "$OURS" "$WORK/r$i" - echo " conflict $i: deterministic (sides identical)" >&2 - else - NEED_LLM+=("$i") - fi - done - - # --- helper: verify, reconstruct resolved file, write resolution sidecar --- - _finish() { - for i in $(seq 1 "$COUNT"); do - [ -f "$WORK/r$i" ] || { echo "ERROR: missing resolution for conflict $i in $FILE" >&2; return 1; } - done - - # Build per-file counted resolution data - local res_data="$WORK/res_data" - : > "$res_data" - for i in $(seq 1 "$COUNT"); do - local ours_n=0 base_n=0 theirs_n=0 res_n=0 - ours_n=$(wc -l < "$WORK/c${i}_ours") - [ -f "$WORK/c${i}_base" ] && base_n=$(wc -l < "$WORK/c${i}_base") - theirs_n=$(wc -l < "$WORK/c${i}_theirs") - res_n=$(wc -l < "$WORK/r$i") - printf 'CONFLICT ours=%d base=%d theirs=%d resolution=%d\n' \ - "$ours_n" "$base_n" "$theirs_n" "$res_n" >> "$res_data" - cat "$WORK/r$i" >> "$res_data" - done - - # Apply counted resolutions to reconstruct resolved file (verifies counts) - apply_counted_resolutions "$res_data" "$FILE" - - # Write resolution sidecar (collected into res-N.resolution by caller) - cp "$res_data" "$FILE.resolution" - } - - [ ${#NEED_LLM[@]} -eq 0 ] && { _finish; return; } - - # Tier 1: Strategy classification (~5 output tokens per conflict) - local CLASSIFY_INPUT="" STRATEGIES NUM STRATEGY REST NEED_GENERATE=() - for i in "${NEED_LLM[@]}"; do - CLASSIFY_INPUT+="=== CONFLICT $i === ---- ours --- -$(cat "$WORK/c${i}_ours") ---- base --- -$(cat "$WORK/c${i}_base" 2>/dev/null || echo "(unavailable)") ---- theirs --- -$(cat "$WORK/c${i}_theirs") - -" - done - - STRATEGIES=$(printf '%s\n' "$CLASSIFY_INPUT" | pnpm --silent 
coworker:ask \ - -p "For each conflict, respond with ONLY the conflict number and one strategy per line: -N OURS — keep ours (theirs is outdated/superseded) -N THEIRS — keep theirs (ours is outdated/superseded) -N BOTH_OT — concatenate ours then theirs -N BOTH_TO — concatenate theirs then ours -N GENERATE — needs custom merge -No explanations.") - - while IFS=' ' read -r NUM STRATEGY REST; do - [[ "${NUM:-}" =~ ^[0-9]+$ ]] || continue - case "$STRATEGY" in - OURS) cp "$WORK/c${NUM}_ours" "$WORK/r$NUM"; echo " conflict $NUM: classified → OURS" >&2 ;; - THEIRS) cp "$WORK/c${NUM}_theirs" "$WORK/r$NUM"; echo " conflict $NUM: classified → THEIRS" >&2 ;; - BOTH_OT) cat "$WORK/c${NUM}_ours" "$WORK/c${NUM}_theirs" > "$WORK/r$NUM"; echo " conflict $NUM: classified → BOTH (ours first)" >&2 ;; - BOTH_TO) cat "$WORK/c${NUM}_theirs" "$WORK/c${NUM}_ours" > "$WORK/r$NUM"; echo " conflict $NUM: classified → BOTH (theirs first)" >&2 ;; - GENERATE) NEED_GENERATE+=("$NUM"); echo " conflict $NUM: classified → GENERATE" >&2 ;; - *) NEED_GENERATE+=("$NUM"); echo " conflict $NUM: unrecognized '$STRATEGY', falling back to GENERATE" >&2 ;; - esac - done <<< "$STRATEGIES" - - [ ${#NEED_GENERATE[@]} -eq 0 ] && { _finish; return; } - - # Tier 2: Code generation (only for GENERATE conflicts — hunks only output) - local GENERATE_INPUT="" GENERATED - for i in "${NEED_GENERATE[@]}"; do - GENERATE_INPUT+="=== CONFLICT $i === ---- ours --- -$(cat "$WORK/c${i}_ours") ---- base --- -$(cat "$WORK/c${i}_base" 2>/dev/null || echo "(unavailable)") ---- theirs --- -$(cat "$WORK/c${i}_theirs") - -" - done - - GENERATED=$(printf '%s\n' "$GENERATE_INPUT" | pnpm --silent coworker:ask \ - -p "Merge each conflict meaningfully. Output '=== RESOLUTION N ===' header followed by ONLY the merged code. 
No explanations, no code fences.")
-
-  printf '%s\n' "$GENERATED" | awk -v dir="$WORK" '
-    /^=== RESOLUTION [0-9]+ ===$/ { if (f) close(f); f = dir "/r" $3; buf = ""; next }
-    f && /^[[:space:]]*$/ { buf = buf $0 "\n"; next }
-    f { if (buf != "") { printf "%s", buf > f; buf = "" }; print > f }
-    END { if (f) close(f) }
-  '
-
-  _finish
-}
-
-# Guard: abort if clone has pending work
-FORK_NAME=$(basename "$DEV_DIR")
-if ! bash "$FORK_SCRIPTS_DIR/status.sh" "$DEV_DIR" >/dev/null 2>&1; then
-  bash "$FORK_SCRIPTS_DIR/status.sh" "$DEV_DIR" >&2
-  echo "" >&2
-  echo "ERROR: $FORK_NAME has pending work that would be lost." >&2
-  echo "Push with 'pnpm fork:push $FORK_NAME', commit, or remove the clone manually." >&2
-  exit 1
-fi
-
-# Preserve local patches before wiping
-LOCAL_PATCHES_TMP=""
-if [ "$(count_glob "$PINS_DIR"/local-*.patch)" -gt 0 ]; then
-  LOCAL_PATCHES_TMP=$(mktemp -d)
-  cp "$PINS_DIR"/local-*.patch "$LOCAL_PATCHES_TMP/"
-  echo "Preserved $(count_glob "$LOCAL_PATCHES_TMP"/local-*.patch) local patch(es)"
-fi
-
-# Always start fresh — wipe previous clone and pins
-rm -rf "$REPO_DIR" "$PINS_DIR"
-mkdir -p "$PINS_DIR"
-
-cleanup_on_error() {
-  rm -rf "$REPO_DIR" "$PINS_DIR"
-  if [ -n "${LOCAL_PATCHES_TMP:-}" ] && [ -d "${LOCAL_PATCHES_TMP:-}" ]; then
-    echo "FAILED — cleaned up clone and pins/" >&2
-    echo "Local patches preserved in: $LOCAL_PATCHES_TMP" >&2
-    echo "Restore manually or re-record without local patches." >&2
-  else
-    echo "FAILED — cleaned up clone and pins/" >&2
-  fi
-}
-trap cleanup_on_error ERR
-
-git clone --filter=blob:none "$UPSTREAM" "$REPO_DIR"
-
-# Enable diff3 conflict markers so conflict resolution can see the base version.
-# Force full 40-char SHAs in |||||| base markers so they're identical across runs
-# (default core.abbrev varies with object count, breaking resolution replay).
-git -C "$REPO_DIR" config merge.conflictStyle diff3
-git -C "$REPO_DIR" config core.abbrev 40
-
-# Capture default branch name and base SHA before any merges
-DEFAULT_BRANCH=$(git -C "$REPO_DIR" branch --show-current)
-BASE_SHA=$(git -C "$REPO_DIR" rev-parse HEAD)
-git -C "$REPO_DIR" checkout -b wip
-
-# Write manifest: base line first
-printf '%s\t%s\n' "$BASE_SHA" "$DEFAULT_BRANCH" > "$PINS_DIR/manifest"
-
-MERGE_IDX=0
-
-for REF in "${REFS[@]}"; do
-  MERGE_IDX=$((MERGE_IDX + 1))
-
-  deterministic_env "$MERGE_IDX"
-
-  # Case A: full (7-40 char) hex commit SHA
-  if [[ $REF =~ ^[0-9a-f]{7,40}$ ]]; then
-    git -C "$REPO_DIR" fetch --depth=1 origin "$REF"
-    MERGE_REF="FETCH_HEAD"
-
-  # Case B: all digits → GitHub pull request number
-  elif [[ $REF =~ ^[0-9]+$ ]]; then
-    git -C "$REPO_DIR" fetch origin "pull/$REF/head:pr-$REF"
-    MERGE_REF="pr-$REF"
-
-  # Case C: branch name
-  else
-    git -C "$REPO_DIR" fetch origin "refs/heads/$REF:$REF"
-    MERGE_REF="$REF"
-  fi
-
-  # Capture the resolved SHA for this ref before merging
-  MERGE_SHA=$(git -C "$REPO_DIR" rev-parse "$MERGE_REF")
-
-  # Append merge ref line to manifest
-  printf '%s\t%s\n' "$MERGE_SHA" "$REF" >> "$PINS_DIR/manifest"
-
-  # Use explicit merge message so record and replay produce identical commits
-  MERGE_MSG="Merge $REF into wip"
-
-  # Merge by SHA (not named ref or FETCH_HEAD) so conflict marker lines
-  # (>>>>>>> ) are identical between record and replay. Both use the
-  # same pinned SHA, so counted resolutions apply with correct line counts.
-  if ! git -C "$REPO_DIR" merge --no-ff -m "$MERGE_MSG" "$MERGE_SHA"; then
-    # Capture conflicted file list BEFORE resolution
-    mapfile -t CONFLICTED < <(git -C "$REPO_DIR" diff --name-only --diff-filter=U)
-
-    # Resolve conflicted files with AI Coworker (parallel, hunks-only)
-    PIDS=()
-    for FILE in "${CONFLICTED[@]}"; do
-      resolve_conflict "$REPO_DIR/$FILE" "$FILE" \
-        > "$REPO_DIR/${FILE}.resolved" &
-      PIDS+=($!)
-    done
-
-    # Wait for all resolutions and check exit codes
-    for i in "${!PIDS[@]}"; do
-      if ! wait "${PIDS[$i]}"; then
-        echo "ERROR: AI Coworker failed for ${CONFLICTED[$i]}" >&2
-        exit 1
-      fi
-    done
-
-    # Validate, apply resolutions, and collect per-file diffs
-    for FILE in "${CONFLICTED[@]}"; do
-      if [ ! -s "$REPO_DIR/${FILE}.resolved" ]; then
-        echo "ERROR: AI Coworker returned empty resolution for $FILE" >&2
-        exit 1
-      fi
-      if grep -q '<<<<<<<' "$REPO_DIR/${FILE}.resolved"; then
-        echo "ERROR: Conflict markers remain in $FILE after resolution" >&2
-        exit 1
-      fi
-
-      mv "$REPO_DIR/${FILE}.resolved" "$REPO_DIR/$FILE"
-      git -C "$REPO_DIR" add "$FILE"
-
-      # Append per-file resolution with path header (written by resolve_conflict)
-      printf -- '--- %s\n' "$FILE" >> "$PINS_DIR/res-${MERGE_IDX}.resolution"
-      cat "$REPO_DIR/${FILE}.resolution" >> "$PINS_DIR/res-${MERGE_IDX}.resolution"
-      rm "$REPO_DIR/${FILE}.resolution"
-    done
-
-    # Overwrite MERGE_MSG so merge --continue uses our deterministic message
-    echo "$MERGE_MSG" > "$REPO_DIR/.git/MERGE_MSG"
-    GIT_EDITOR=true git -C "$REPO_DIR" merge --continue
-  fi
-done
-
-bash "$FORK_SCRIPTS_DIR/patch.sh" "$REPO_DIR" "$MERGE_IDX"
-
-# Restore and apply local patches
-if [ -n "${LOCAL_PATCHES_TMP:-}" ]; then
-  cp "$LOCAL_PATCHES_TMP"/local-*.patch "$PINS_DIR/"
-  rm -rf "$LOCAL_PATCHES_TMP"
-
-  apply_local_patches "$REPO_DIR" "$PINS_DIR" || {
-    echo "Upstream changes may have invalidated it. Edit or remove the patch and re-record." >&2
-    exit 1
-  }
-fi
-
-# Write HEAD file
-HEAD_SHA=$(git -C "$REPO_DIR" rev-parse HEAD)
-printf '%s\n' "$HEAD_SHA" > "$PINS_DIR/HEAD"
-
-# Add fork remote for pushing (SSH for auth), if configured
-FORK_REMOTE=$(fork_url "$DEV_DIR" 2>/dev/null) || true
-if [ -n "${FORK_REMOTE:-}" ]; then
-  git -C "$REPO_DIR" remote add fork "$FORK_REMOTE"
-fi
-
-# Regenerate fork workspace entries in pnpm-workspace.yaml
-sync_workspace_yaml
-
-LOCAL_PATCH_COUNT=$(count_glob "$PINS_DIR"/local-*.patch)
-RESOLUTION_COUNT=$(count_glob "$PINS_DIR"/res-*.resolution)
-
-echo "Pins recorded in $PINS_DIR/"
-echo "  BASE=$BASE_SHA ($DEFAULT_BRANCH)"
-echo "  Merges: $MERGE_IDX ref(s)"
-if [ "$RESOLUTION_COUNT" -gt 0 ]; then
-  echo "  Resolutions: $RESOLUTION_COUNT merge step(s) with conflicts"
-else
-  echo "  Resolutions: none (no conflicts)"
-fi
-if [ "$LOCAL_PATCH_COUNT" -gt 0 ]; then
-  echo "  Local patches: $LOCAL_PATCH_COUNT"
-fi
-echo "  HEAD=$HEAD_SHA"
diff --git a/fork-scripts/replay-all.sh b/fork-scripts/replay-all.sh
deleted file mode 100644
index c656c13..0000000
--- a/fork-scripts/replay-all.sh
+++ /dev/null
@@ -1,12 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Replay all managed fork directories from their pins.
-# Usage: fork-scripts/replay-all.sh
-
-# shellcheck source=lib.sh
-source "$(cd "$(dirname "$0")" && pwd)/lib.sh"
-
-while IFS= read -r dev_dir; do
-  bash "$FORK_SCRIPTS_DIR/replay.sh" "$dev_dir"
-done < <(discover_fork_dirs)
diff --git a/fork-scripts/replay.sh b/fork-scripts/replay.sh
deleted file mode 100644
index af17015..0000000
--- a/fork-scripts/replay.sh
+++ /dev/null
@@ -1,102 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Usage: fork-scripts/replay.sh
-# Deterministic replay from manifest + counted resolutions + local patches
-
-# shellcheck source=lib.sh
-source "$(cd "$(dirname "$0")" && pwd)/lib.sh"
-
-DEV_DIR="${1:?Usage: fork-scripts/replay.sh }"
-DEV_DIR=$(cd "$DEV_DIR" && pwd)
-
-REPO_DIR=$(repo_dir "$DEV_DIR")
-PINS_DIR=$(pins_dir "$DEV_DIR")
-UPSTREAM=$(upstream_url "$DEV_DIR")
-FORK_NAME=$(basename "$DEV_DIR")
-
-# Skip if already cloned
-if [ -d "$REPO_DIR" ]; then
-  echo "$FORK_NAME: clone already exists, skipping (remove it to redo setup)" >&2
-  exit 0
-fi
-
-# Skip if no pins to replay
-MANIFEST=$(manifest_file "$PINS_DIR" 2>/dev/null) || {
-  echo "$FORK_NAME: no pins to replay, skipping" >&2
-  exit 0
-}
-
-trap 'rm -rf "$REPO_DIR"; echo "FAILED — cleaned up $FORK_NAME clone" >&2' ERR
-
-# Read base SHA from first line of manifest
-BASE_SHA=$(head -1 "$MANIFEST" | cut -d$'\t' -f1)
-git clone --filter=blob:none "$UPSTREAM" "$REPO_DIR"
-
-# Match record.sh's conflict marker style and SHA abbreviation for identical markers
-git -C "$REPO_DIR" config merge.conflictStyle diff3
-git -C "$REPO_DIR" config core.abbrev 40
-
-git -C "$REPO_DIR" checkout "$BASE_SHA"
-git -C "$REPO_DIR" checkout -b wip
-
-# Replay merges from manifest (skip line 1 = base)
-MERGE_IDX=0
-while IFS=$'\t' read -r SHA REF_NAME; do
-  MERGE_IDX=$((MERGE_IDX + 1))
-  echo "Replaying merge $MERGE_IDX: $REF_NAME ($SHA)" >&2
-
-  deterministic_env "$MERGE_IDX"
-
-  git -C "$REPO_DIR" fetch origin "$SHA"
-
-  # Use explicit merge message matching record.sh for deterministic commits
-  MERGE_MSG="Merge $REF_NAME into wip"
-
-  # Merge by SHA (matching record.sh) so conflict markers are identical
-  if ! git -C "$REPO_DIR" merge --no-ff -m "$MERGE_MSG" "$SHA"; then
-    RES_FILE="$PINS_DIR/res-${MERGE_IDX}.resolution"
-    if [ ! -f "$RES_FILE" ]; then
-      if [ -f "$PINS_DIR/res-${MERGE_IDX}.diff" ]; then
-        echo "ERROR: Legacy diff format detected (res-${MERGE_IDX}.diff)." >&2
-        echo "Re-record with: pnpm fork:record $FORK_NAME" >&2
-        exit 1
-      fi
-      echo "ERROR: Merge $MERGE_IDX ($REF_NAME) has conflicts but no resolution file." >&2
-      echo "Re-record with: pnpm fork:record $FORK_NAME" >&2
-      exit 1
-    fi
-
-    # Apply counted resolutions (positional — no sed stripping or patch needed)
-    apply_resolution_file "$REPO_DIR" "$RES_FILE"
-
-    # Stage resolved files and complete the merge
-    git -C "$REPO_DIR" add -A
-    echo "$MERGE_MSG" > "$REPO_DIR/.git/MERGE_MSG"
-    GIT_EDITOR=true git -C "$REPO_DIR" merge --continue
-  fi
-done < <(tail -n +2 "$MANIFEST")
-
-bash "$FORK_SCRIPTS_DIR/patch.sh" "$REPO_DIR" "$(merge_count "$PINS_DIR")"
-
-apply_local_patches "$REPO_DIR" "$PINS_DIR" || {
-  echo "Re-record with: pnpm fork:record $FORK_NAME" >&2
-  exit 1
-}
-
-# Verify HEAD SHA matches pins/HEAD
-ACTUAL=$(git -C "$REPO_DIR" rev-parse HEAD)
-EXPECTED=$(pinned_head "$PINS_DIR")
-if [ "$ACTUAL" != "$EXPECTED" ]; then
-  echo "FAIL: replay HEAD ($ACTUAL) != pinned HEAD ($EXPECTED)" >&2
-  echo "Pins are stale or corrupted. Re-record with 'pnpm fork:record $FORK_NAME'." >&2
-  exit 1
-fi
-
-# Add fork remote for pushing (SSH for auth), if configured
-FORK_REMOTE=$(fork_url "$DEV_DIR" 2>/dev/null) || true
-if [ -n "${FORK_REMOTE:-}" ]; then
-  git -C "$REPO_DIR" remote add fork "$FORK_REMOTE"
-fi
-
-echo "OK — replay HEAD matches pinned HEAD ($EXPECTED)"
diff --git a/fork-scripts/reset.sh b/fork-scripts/reset.sh
deleted file mode 100644
index c46f7cd..0000000
--- a/fork-scripts/reset.sh
+++ /dev/null
@@ -1,14 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Remove a fork clone and its pins (full reset).
-# Usage: fork-scripts/reset.sh
-
-# shellcheck source=lib.sh
-source "$(cd "$(dirname "$0")" && pwd)/lib.sh"
-
-DEV_DIR="${1:?Usage: fork-scripts/reset.sh }"
-DEV_DIR=$(cd "$DEV_DIR" && pwd)
-
-bash "$FORK_SCRIPTS_DIR/clean.sh" "$DEV_DIR"
-rm -rf "$(pins_dir "$DEV_DIR")"
diff --git a/fork-scripts/save.sh b/fork-scripts/save.sh
deleted file mode 100644
index b2a654e..0000000
--- a/fork-scripts/save.sh
+++ /dev/null
@@ -1,91 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Usage: fork-scripts/save.sh [description]
-# Captures local work in the fork clone as a patch file in pins/.
-# description: short label for the patch (default: "local")
-
-# shellcheck source=lib.sh
-source "$(cd "$(dirname "$0")" && pwd)/lib.sh"
-
-DEV_DIR="${1:?Usage: fork-scripts/save.sh [description]}"
-DEV_DIR=$(cd "$DEV_DIR" && pwd)
-shift
-
-REPO_DIR=$(repo_dir "$DEV_DIR")
-PINS_DIR=$(pins_dir "$DEV_DIR")
-FORK_NAME=$(basename "$DEV_DIR")
-
-DESCRIPTION="${1:-local}"
-# Sanitize description for use in filename (fallback if nothing alphanumeric remains)
-DESCRIPTION=$(printf '%s' "$DESCRIPTION" | tr -c '[:alnum:]-_' '-' | sed 's/--*/-/g; s/^-//; s/-$//')
-[ -z "$DESCRIPTION" ] && DESCRIPTION="local"
-
-# Check prerequisites
-if [ ! -d "$REPO_DIR" ]; then
-  echo "ERROR: $FORK_NAME clone does not exist. Run 'pnpm fork:record $FORK_NAME' first." >&2
-  exit 1
-fi
-
-PINNED_HEAD=$(pinned_head "$PINS_DIR" 2>/dev/null) || {
-  echo "ERROR: No pins found. Run 'pnpm fork:record $FORK_NAME' first." >&2
-  exit 1
-}
-
-CURRENT_BRANCH=$(git -C "$REPO_DIR" branch --show-current)
-if [ "$CURRENT_BRANCH" != "wip" ]; then
-  echo "ERROR: Expected to be on 'wip' branch, but on '$CURRENT_BRANCH'." >&2
-  exit 1
-fi
-
-# Check for changes (committed + uncommitted + staged + untracked) relative to pinned HEAD
-if git -C "$REPO_DIR" diff "$PINNED_HEAD" --quiet 2>/dev/null \
-  && git -C "$REPO_DIR" diff --cached "$PINNED_HEAD" --quiet 2>/dev/null \
-  && [ -z "$(git -C "$REPO_DIR" ls-files --others --exclude-standard 2>/dev/null)" ]; then
-  echo "No changes to save (working tree matches pinned HEAD)."
-  exit 0
-fi
-
-# Count existing local patches to find the pre-local-patches base state.
-# Local patches are linear commits on top of post-patch.sh, so PINNED_HEAD~N
-# gives us the base before any local patches were applied.
-EXISTING=$(count_glob "$PINS_DIR"/local-*.patch)
-if [ "$EXISTING" -gt 0 ]; then
-  PATCH_BASE=$(git -C "$REPO_DIR" rev-parse "${PINNED_HEAD}~${EXISTING}" 2>/dev/null) || {
-    echo "ERROR: Cannot compute base state. Pins may be corrupted." >&2
-    echo "Re-record with: pnpm fork:record $FORK_NAME" >&2
-    exit 1
-  }
-else
-  PATCH_BASE="$PINNED_HEAD"
-fi
-
-NEXT_NUM=$(printf '%03d' $((EXISTING + 1)))
-PATCH_NAME="local-${NEXT_NUM}-${DESCRIPTION}"
-
-# Stage everything so untracked files are included in the diff
-git -C "$REPO_DIR" add -A
-# Generate patch: incremental changes relative to pinned HEAD (not base)
-git -C "$REPO_DIR" diff --cached "$PINNED_HEAD" > "$PINS_DIR/${PATCH_NAME}.patch"
-
-# Verify patch is non-empty
-if [ ! -s "$PINS_DIR/${PATCH_NAME}.patch" ]; then
-  rm -f "$PINS_DIR/${PATCH_NAME}.patch"
-  echo "No diff to save."
-  exit 0
-fi
-
-# Rebuild deterministic state from base (before any local patches)
-git -C "$REPO_DIR" reset --hard "$PATCH_BASE"
-
-apply_local_patches "$REPO_DIR" "$PINS_DIR" || {
-  # Remove the newly-written patch so a retry doesn't hit the same failure
-  rm -f "$PINS_DIR/${PATCH_NAME}.patch"
-  echo "Earlier patches may have changed the base. Edit or reorder patches." >&2
-  exit 1
-}
-
-# Update HEAD
-git -C "$REPO_DIR" rev-parse HEAD > "$PINS_DIR/HEAD"
-
-echo "Saved ${PATCH_NAME}.patch. Commit pins/ to share."
diff --git a/fork-scripts/status-all.sh b/fork-scripts/status-all.sh
deleted file mode 100644
index ecaffb5..0000000
--- a/fork-scripts/status-all.sh
+++ /dev/null
@@ -1,15 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Check status of all managed fork directories.
-# Exits non-zero if any fork has pending work.
-# Usage: fork-scripts/status-all.sh
-
-# shellcheck source=lib.sh
-source "$(cd "$(dirname "$0")" && pwd)/lib.sh"
-
-EXIT=0
-while IFS= read -r dev_dir; do
-  bash "$FORK_SCRIPTS_DIR/status.sh" "$dev_dir" || EXIT=1
-done < <(discover_fork_dirs)
-exit $EXIT
diff --git a/fork-scripts/status.sh b/fork-scripts/status.sh
deleted file mode 100644
index a45f573..0000000
--- a/fork-scripts/status.sh
+++ /dev/null
@@ -1,52 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-# Check whether a fork clone is safe to wipe.
-# Exit 0 → safe (not cloned, or matches pins exactly)
-# Exit 1 → has custom work (any changes vs pinned commit, diverged HEAD, or no pins to compare)
-# Usage: fork-scripts/status.sh
-
-# shellcheck source=lib.sh
-source "$(cd "$(dirname "$0")" && pwd)/lib.sh"
-
-DEV_DIR="${1:?Usage: fork-scripts/status.sh }"
-DEV_DIR=$(cd "$DEV_DIR" && pwd)
-
-REPO_DIR=$(repo_dir "$DEV_DIR")
-PINS_DIR=$(pins_dir "$DEV_DIR")
-FORK_NAME=$(basename "$DEV_DIR")
-
-if [ ! -d "$REPO_DIR" ]; then
-  echo "$FORK_NAME: clone is not present"
-  exit 0
-fi
-
-PINNED=$(pinned_head "$PINS_DIR" 2>/dev/null) || {
-  echo "$FORK_NAME: clone exists but no pins — custom clone"
-  exit 1
-}
-
-ACTUAL=$(git -C "$REPO_DIR" rev-parse HEAD)
-
-if [ "$ACTUAL" != "$PINNED" ]; then
-  echo "$FORK_NAME: HEAD diverged from pinned HEAD:"
-  echo "  pinned $PINNED"
-  echo "  actual $ACTUAL"
-  git -C "$REPO_DIR" log --oneline "$PINNED..$ACTUAL" 2>/dev/null || true
-  exit 1
-fi
-
-# Compare pinned commit against working tree AND index.
-# git diff catches unstaged changes; --cached catches staged-only changes
-# (e.g. staged edits where the working tree was reverted).
-if ! git -C "$REPO_DIR" diff "$PINNED" --quiet 2>/dev/null \
-  || ! git -C "$REPO_DIR" diff --cached "$PINNED" --quiet 2>/dev/null \
-  || [ -n "$(git -C "$REPO_DIR" ls-files --others --exclude-standard 2>/dev/null)" ]; then
-  echo "$FORK_NAME: clone has changes relative to pins:"
-  git -C "$REPO_DIR" diff "$PINNED" --stat 2>/dev/null || true
-  git -C "$REPO_DIR" diff --cached "$PINNED" --stat 2>/dev/null || true
-  git -C "$REPO_DIR" ls-files --others --exclude-standard 2>/dev/null || true
-  exit 1
-fi
-
-echo "$FORK_NAME: clone is clean (matches pins)"
diff --git a/forks/.gitignore b/forks/.gitignore
new file mode 100644
index 0000000..e5bf8b4
--- /dev/null
+++ b/forks/.gitignore
@@ -0,0 +1,5 @@
+*
+!.gitignore
+!config.json
+!.pin/
+!.pin/**
diff --git a/ccc-fork/pins/HEAD b/forks/.pin/ccc/HEAD
similarity index 100%
rename from ccc-fork/pins/HEAD
rename to forks/.pin/ccc/HEAD
diff --git a/ccc-fork/pins/manifest b/forks/.pin/ccc/manifest
similarity index 100%
rename from ccc-fork/pins/manifest
rename to forks/.pin/ccc/manifest
diff --git a/ccc-fork/pins/res-2.resolution b/forks/.pin/ccc/res-2.resolution
similarity index 61%
rename from ccc-fork/pins/res-2.resolution
rename to forks/.pin/ccc/res-2.resolution
index c264e5d..711d04e 100644
--- a/ccc-fork/pins/res-2.resolution
+++ b/forks/.pin/ccc/res-2.resolution
@@ -1,10 +1,10 @@
 --- packages/core/src/ckb/transaction.ts
-CONFLICT ours=5 base=4 theirs=1 resolution=4
+CONFLICT ours=5 base=4 theirs=1 resolution=4 sha=9b7e58e11a803c51911391e089e70989663efb79fe8775ddc4335e5c2202248d
 import {
   ErrorNervosDaoOutputLimit,
   ErrorTransactionInsufficientCoin,
 } from "./transactionErrors.js";
-CONFLICT ours=97 base=95 theirs=7 resolution=8
+CONFLICT ours=97 base=95 theirs=7 resolution=8 sha=35fcc18a78d1a7d169108f94fda08c9a4d720da6a4c8be68130da8a3a8257de3
 const result = await from.completeFee(this, {
   changeFn: change,
   feeRate: expectedFeeRate,
diff --git a/ccc-fork/pins/res-4.resolution b/forks/.pin/ccc/res-4.resolution
similarity index 55%
rename from ccc-fork/pins/res-4.resolution
rename to forks/.pin/ccc/res-4.resolution
index bcb456c..5ecd1e0 100644
--- a/ccc-fork/pins/res-4.resolution
+++ b/forks/.pin/ccc/res-4.resolution
@@ -1,5 +1,5 @@
 --- packages/core/src/ckb/transactionErrors.ts
-CONFLICT ours=13 base=0 theirs=3 resolution=13
+CONFLICT ours=13 base=0 theirs=3 resolution=13 sha=fb78be5c37c81b8ddc5817c91bc2399cb8a22a6a36b4615041c16f3ee8baa4a3
 export class ErrorNervosDaoOutputLimit extends Error {
   public readonly count: number;
   public readonly limit: number;
@@ -14,7 +14,7 @@ export class ErrorNervosDaoOutputLimit extends Error {
 }
 --- vitest.config.mts
-CONFLICT ours=1 base=1 theirs=1 resolution=1
+CONFLICT ours=1 base=1 theirs=1 resolution=1 sha=944a4a3ae09aceaa620bd0783e575d8519671a30ea7db8e975d6cff49d8156d2
 projects: packages,
-CONFLICT ours=1 base=1 theirs=1 resolution=1
+CONFLICT ours=1 base=1 theirs=1 resolution=1 sha=ac0f08d951dbb8bc8f2fe686d0a5ab8e6d8bb47b2e6bc0ce8d565fa4e44d5b5a
 include: packages,
diff --git a/forks/.pin/forker/HEAD b/forks/.pin/forker/HEAD
new file mode 100644
index 0000000..89a77f6
--- /dev/null
+++ b/forks/.pin/forker/HEAD
@@ -0,0 +1 @@
+981bcbddc25b70cb4a0f72c459d067f89ceb252d
diff --git a/forks/.pin/forker/manifest b/forks/.pin/forker/manifest
new file mode 100644
index 0000000..2faa0c9
--- /dev/null
+++ b/forks/.pin/forker/manifest
@@ -0,0 +1 @@
+981bcbddc25b70cb4a0f72c459d067f89ceb252d master
diff --git a/forks/config.json b/forks/config.json
new file mode 100644
index 0000000..fd99a2a
--- /dev/null
+++ b/forks/config.json
@@ -0,0 +1,38 @@
+{
+  "ccc": {
+    "upstream": "https://github.com/ckb-devrel/ccc.git",
+    "fork": "git@github.com:phroi/ccc.git",
+    "refs": [
+      "359",
+      "328",
+      "releases/next",
+      "releases/udt"
+    ],
+    "workspace": {
+      "include": [
+        "packages/*"
+      ],
+      "exclude": [
+        "packages/demo",
+        "packages/docs",
+        "packages/examples",
+        "packages/faucet",
+        "packages/playground",
+        "packages/tests"
+      ]
+    }
+  },
+  "forker": {
+    "upstream": "https://github.com/phroi/forker.git",
+    "fork": "git@github.com:phroi/forker.git",
+    "refs": []
+  },
+  "contracts": {
+    "upstream": "https://github.com/ickb/contracts.git",
+    "refs": []
+  },
+  "whitepaper": {
+    "upstream": "https://github.com/ickb/whitepaper.git",
+    "refs": []
+  }
+}
\ No newline at end of file
diff --git a/package.json b/package.json
index 7d0968d..7671183 100644
--- a/package.json
+++ b/package.json
@@ -1,34 +1,25 @@
 {
   "private": true,
   "scripts": {
-    "fork:record": "bash fork-scripts/record.sh",
-    "fork:status": "bash fork-scripts/status.sh",
-    "fork:status-all": "bash fork-scripts/status-all.sh",
-    "fork:push": "bash fork-scripts/push.sh",
-    "fork:save": "bash fork-scripts/save.sh",
-    "fork:clean": "bash fork-scripts/clean.sh",
-    "fork:clean-all": "bash fork-scripts/clean-all.sh",
-    "fork:reset": "bash fork-scripts/reset.sh",
-    "build": "pnpm -r --filter !./apps/** --filter '!./*-fork/**' build",
-    "build:all": "pnpm -r --filter '!./*-fork/**' build",
+    "build": "pnpm -r --filter !./apps/** --filter '!./forks/**' build",
+    "build:all": "pnpm -r --filter '!./forks/**' build",
     "check": "pnpm clean:deep && pnpm install && pnpm lint && pnpm build:all && pnpm test:ci",
     "check:fresh": "rm pnpm-lock.yaml && pnpm run check",
     "check:ci": "CI=true pnpm run check",
-    "check:full": "pnpm fork:clean-all; pnpm check:fresh && pnpm check:ci",
+    "check:full": "[ -f forks/forker/clean-all.sh ] && bash forks/forker/clean-all.sh; pnpm check:fresh && pnpm check:ci",
     "test": "vitest",
     "test:ci": "vitest run",
     "test:cov": "vitest run --coverage",
-    "lint": "pnpm -r --filter '!./*-fork/**' lint",
-    "clean": "rm -fr dist packages/*/dist apps/*/dist *-fork/*/packages/*/dist",
-    "clean:deep": "pnpm clean && rm -fr node_modules packages/*/node_modules apps/*/node_modules *-fork/*/packages/*/node_modules",
+    "lint": "pnpm -r --filter '!./forks/**' lint",
+    "clean": "rm -fr dist packages/*/dist apps/*/dist forks/*/packages/*/dist",
+    "clean:deep": "pnpm clean && rm -fr node_modules packages/*/node_modules apps/*/node_modules forks/*/packages/*/node_modules",
     "sync:template": "pnpm -r --filter !./apps/interface --filter !./packages/utils --filter !. -c exec 'for f in .npmignore tsconfig.json typedoc.json vitest.config.mts; do cp ../../packages/utils/$f .; done'",
     "change": "pnpm changeset",
     "version": "pnpm changeset version",
     "publish": "pnpm publish -r",
     "docs": "typedoc",
     "coworker": "ln -sf AGENTS.md CLAUDE.md && claude --model opus --dangerously-skip-permissions",
-    "coworker:ask": "env -u CLAUDECODE claude --print --model sonnet --no-session-persistence",
-    "reference": "bash reference/clone.sh"
+    "coworker:ask": "env -u CLAUDECODE claude --print --model sonnet --no-session-persistence"
   },
   "engines": {
     "node": ">=24"
@@ -49,4 +40,4 @@
     "vitest": "^3.2.4"
   },
   "packageManager": "pnpm@10.30.2+sha512.36cdc707e7b7940a988c9c1ecf88d084f8514b5c3f085f53a2e244c2921d3b2545bc20dd4ebe1fc245feec463bb298aecea7a63ed1f7680b877dc6379d8d0cb4"
-}
\ No newline at end of file
+}
diff --git a/packages/core/package.json b/packages/core/package.json
index 28f6039..4ccd4a3 100644
--- a/packages/core/package.json
+++ b/packages/core/package.json
@@ -31,7 +31,7 @@
   "scripts": {
     "test": "vitest",
     "test:ci": "vitest run",
-    "build": "bash ../../fork-scripts/tsgo-filter.sh",
+    "build": "bash ../../tsgo-filter.sh",
     "lint": "eslint ./src",
     "clean": "rm -fr dist",
     "clean:deep": "rm -fr dist node_modules"
diff --git a/packages/dao/package.json b/packages/dao/package.json
index 2d036bb..8453f59 100644
--- a/packages/dao/package.json
+++ b/packages/dao/package.json
@@ -31,7 +31,7 @@
   "scripts": {
     "test": "vitest",
     "test:ci": "vitest run",
-    "build": "bash ../../fork-scripts/tsgo-filter.sh",
+    "build": "bash ../../tsgo-filter.sh",
     "lint": "eslint ./src",
     "clean": "rm -fr dist",
     "clean:deep": "rm -fr dist node_modules"
diff --git a/packages/dao/src/dao.ts b/packages/dao/src/dao.ts
index 6cd2948..23382f2 100644
--- a/packages/dao/src/dao.ts
+++ b/packages/dao/src/dao.ts
@@ -139,7 +139,7 @@ export class DaoManager implements ScriptDeps {
       tx.inputs.length !== tx.outputs.length ||
       tx.outputs.length !== tx.outputsData.length
     ) {
-      throw new Error("Transaction have different inputs and outputs lengths");
+      throw new Error("Transaction has different inputs and outputs lengths");
     }
 
     for (const deposit of deposits) {
diff --git a/packages/order/package.json b/packages/order/package.json
index bdaa345..4140405 100644
--- a/packages/order/package.json
+++ b/packages/order/package.json
@@ -31,7 +31,7 @@
   "scripts": {
     "test": "vitest",
     "test:ci": "vitest run",
-    "build": "bash ../../fork-scripts/tsgo-filter.sh",
+    "build": "bash ../../tsgo-filter.sh",
     "lint": "eslint ./src",
     "clean": "rm -fr dist",
     "clean:deep": "rm -fr dist node_modules"
diff --git a/packages/sdk/package.json b/packages/sdk/package.json
index dfc15ea..8d89fbd 100644
--- a/packages/sdk/package.json
+++ b/packages/sdk/package.json
@@ -31,7 +31,7 @@
   "scripts": {
     "test": "vitest",
     "test:ci": "vitest run",
-    "build": "bash ../../fork-scripts/tsgo-filter.sh",
+    "build": "bash ../../tsgo-filter.sh",
     "lint": "eslint ./src",
     "clean": "rm -fr dist",
     "clean:deep": "rm -fr dist node_modules"
diff --git a/packages/sdk/src/sdk.ts b/packages/sdk/src/sdk.ts
index b63237b..87d1c09 100644
--- a/packages/sdk/src/sdk.ts
+++ b/packages/sdk/src/sdk.ts
@@ -188,7 +188,7 @@ export class IckbSdk {
 
     // For UDT to CKB orders, add available CKB.
     ckb += ckbAvailable;
-    if (ckb >= 0) {
+    if (ckb >= 0n) {
       return maturity + ("info" in o ? BigInt(Date.now()) : tip.timestamp);
     }
 
diff --git a/packages/utils/package.json b/packages/utils/package.json
index 07f2a52..b36c6f6 100644
--- a/packages/utils/package.json
+++ b/packages/utils/package.json
@@ -31,7 +31,7 @@
   "scripts": {
     "test": "vitest",
     "test:ci": "vitest run",
-    "build": "bash ../../fork-scripts/tsgo-filter.sh",
+    "build": "bash ../../tsgo-filter.sh",
     "lint": "eslint ./src",
     "clean": "rm -fr dist",
     "clean:deep": "rm -fr dist node_modules"
diff --git a/packages/utils/src/udt.ts b/packages/utils/src/udt.ts
index 72375c8..e799067 100644
--- a/packages/utils/src/udt.ts
+++ b/packages/utils/src/udt.ts
@@ -77,7 +77,6 @@ export interface UdtHandler extends ScriptDeps {
  *
  * UDT Handler implementer should use this error class where appropriate.
  */
-// eslint-disable-next-line @typescript-eslint/no-deprecated
 export class ErrorTransactionInsufficientCoin extends ccc.ErrorTransactionInsufficientCoin {
   /**
    * @param amount - The additional amount required (in fixed-point).
@@ -91,7 +90,6 @@ export class ErrorTransactionInsufficientCoin extends ccc.ErrorTransactionInsuff
     public readonly symbol: string,
     public readonly decimals: number,
   ) {
-    // eslint-disable-next-line @typescript-eslint/no-deprecated
     super(amount, type);
     this.message = `Insufficient coin, need ${ccc.fixedPointToString(
       amount,
diff --git a/packages/utils/src/utils.ts b/packages/utils/src/utils.ts
index ee62712..00193f9 100644
--- a/packages/utils/src/utils.ts
+++ b/packages/utils/src/utils.ts
@@ -145,7 +145,7 @@ export function binarySearch(n: number, f: (i: number) => boolean): number {
 * @param f - An async function that takes an index `i` and returns a boolean value.
 * @returns The smallest index `i` such that `f(i)` is true, or `n` if no such index exists.
 *
- * @credits go standard library authors, this implementation is just a translation or that code:
+ * @credits go standard library authors, this implementation is just a translation of that code:
 * https://go.dev/src/sort/search.go
 *
 */
 export async function asyncBinarySearch(
diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml
index 95513d4..1ba306a 100644
--- a/pnpm-lock.yaml
+++ b/pnpm-lock.yaml
@@ -10,7 +10,7 @@ catalogs:
       specifier: ^24.8.1
       version: 24.10.13
 
-pnpmfileChecksum: sha256-SZA+voq6zh4ZSCAUW2BMUyVEx+37xe76r5k7kD+Z3v4=
+pnpmfileChecksum: sha256-ii1GW6FVR8S6G6xZNLWscHPcjUwHIzgpN53HWAjURI4=
 
 importers:
 
@@ -18,7 +18,7 @@ importers:
     devDependencies:
       '@anthropic-ai/claude-code':
         specifier: latest
-        version: 2.1.51
+        version: 2.1.53
       '@changesets/changelog-github':
         specifier: ^0.5.2
         version: 0.5.2
@@ -30,7 +30,7 @@ importers:
         version: 9.39.3
       '@typescript/native-preview':
         specifier: latest
-        version: 7.0.0-dev.20260223.1
+        version: 7.0.0-dev.20260224.1
       '@vitest/coverage-v8':
         specifier: 3.2.4
        version: 3.2.4(vitest@3.2.4(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(yaml@2.8.2))
@@ -94,7 +94,7 @@ importers:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
       '@ickb/core':
         specifier: workspace:*
         version: link:../../packages/core
@@ -116,7 +116,7 @@ importers:
     dependencies:
       '@ckb-ccc/ccc':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/ccc
+        version: link:../../forks/ccc/packages/ccc
       '@ckb-lumos/base':
         specifier: ^0.23.0
         version: 0.23.0
@@ -216,7 +216,7 @@ importers:
    dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
       '@ickb/core':
         specifier: workspace:*
         version: link:../../packages/core
@@ -256,7 +256,7 @@ importers:
       specifier: 'catalog:'
       version: 24.10.13
 
-  ccc-fork/ccc/packages/ccc:
+  forks/ccc/packages/ccc:
     dependencies:
       '@ckb-ccc/eip6963':
         specifier: workspace:*
@@ -317,7 +317,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/ckb-ccc:
+  forks/ccc/packages/ckb-ccc:
     dependencies:
       '@ckb-ccc/ccc':
         specifier: workspace:*
@@ -354,7 +354,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/connector:
+  forks/ccc/packages/connector:
     dependencies:
       '@ckb-ccc/ccc':
         specifier: workspace:*
@@ -391,7 +391,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/connector-react:
+  forks/ccc/packages/connector-react:
     dependencies:
       '@ckb-ccc/connector':
         specifier: workspace:*
@@ -434,7 +434,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/core:
+  forks/ccc/packages/core:
     dependencies:
       '@joyid/ckb':
         specifier: ^1.1.2
@@ -504,7 +504,7 @@ importers:
         specifier: ^3.2.4
         version: 3.2.4(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(yaml@2.8.2)
 
-  ccc-fork/ccc/packages/did-ckb:
+  forks/ccc/packages/did-ckb:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -539,7 +539,7 @@ importers:
         version: 4.3.0(prettier@3.8.1)(typescript@5.9.3)
       tsdown:
         specifier: 0.19.0-beta.3
-        version: 0.19.0-beta.3(@typescript/native-preview@7.0.0-dev.20260223.1)(synckit@0.11.12)(typescript@5.9.3)
+        version: 0.19.0-beta.3(@typescript/native-preview@7.0.0-dev.20260224.1)(synckit@0.11.12)(typescript@5.9.3)
       typescript:
         specifier: ^5.9.2
         version: 5.9.3
@@ -550,7 +550,7 @@ importers:
         specifier: ^3.2.4
         version: 3.2.4(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(yaml@2.8.2)
 
-  ccc-fork/ccc/packages/eip6963:
+  forks/ccc/packages/eip6963:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -587,7 +587,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/joy-id:
+  forks/ccc/packages/joy-id:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -630,7 +630,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/lumos-patches:
+  forks/ccc/packages/lumos-patches:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -685,7 +685,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/nip07:
+  forks/ccc/packages/nip07:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -722,7 +722,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/okx:
+  forks/ccc/packages/okx:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -765,7 +765,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/rei:
+  forks/ccc/packages/rei:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -802,7 +802,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/shell:
+  forks/ccc/packages/shell:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -854,7 +854,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/spore:
+  forks/ccc/packages/spore:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -903,7 +903,7 @@ importers:
         specifier: ^3.2.4
         version: 3.2.4(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(yaml@2.8.2)
 
-  ccc-fork/ccc/packages/ssri:
+  forks/ccc/packages/ssri:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -943,7 +943,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/type-id:
+  forks/ccc/packages/type-id:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -972,7 +972,7 @@ importers:
         version: 4.3.0(prettier@3.8.1)(typescript@5.9.3)
       tsdown:
         specifier: 0.19.0-beta.3
-        version: 0.19.0-beta.3(@typescript/native-preview@7.0.0-dev.20260223.1)(synckit@0.11.12)(typescript@5.9.3)
+        version: 0.19.0-beta.3(@typescript/native-preview@7.0.0-dev.20260224.1)(synckit@0.11.12)(typescript@5.9.3)
       typescript:
         specifier: ^5.9.2
         version: 5.9.3
@@ -983,7 +983,7 @@ importers:
         specifier: ^3.2.4
         version: 3.2.4(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(yaml@2.8.2)
 
-  ccc-fork/ccc/packages/udt:
+  forks/ccc/packages/udt:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -1026,7 +1026,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/uni-sat:
+  forks/ccc/packages/uni-sat:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -1063,7 +1063,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/utxo-global:
+  forks/ccc/packages/utxo-global:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -1100,7 +1100,7 @@ importers:
         specifier: ^8.41.0
         version: 8.56.1(eslint@9.39.3(jiti@2.6.1))(typescript@5.9.3)
 
-  ccc-fork/ccc/packages/xverse:
+  forks/ccc/packages/xverse:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
@@ -1144,7 +1144,7 @@ importers:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
       '@ickb/dao':
         specifier: workspace:*
         version: link:../dao
@@ -1156,7 +1156,7 @@ importers:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
       '@ickb/utils':
         specifier: workspace:*
         version: link:../utils
@@ -1165,7 +1165,7 @@ importers:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
       '@ickb/utils':
         specifier: workspace:*
         version: link:../utils
@@ -1174,7 +1174,7 @@ importers:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
       '@ickb/core':
         specifier: workspace:*
         version: link:../core
@@ -1192,7 +1192,7 @@ importers:
     dependencies:
       '@ckb-ccc/core':
         specifier: workspace:*
-        version: link:../../ccc-fork/ccc/packages/core
+        version: link:../../forks/ccc/packages/core
 
 packages:
 
@@ -1203,8 +1203,8 @@ packages:
     resolution: {integrity: sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==}
     engines: {node: '>=6.0.0'}
 
-  '@anthropic-ai/claude-code@2.1.51':
-    resolution: {integrity: sha512-9mNl3C+6xyj3QmUGzj9TDVntVogtOOKyps/d14k1SYLsyM5S/lJlMthlapDZ1E2EHipXSxDMN6IspSsdtPHVDA==}
+  '@anthropic-ai/claude-code@2.1.53':
+    resolution: {integrity: sha512-iKzS7+ktmxKoGXMyrahsMHpwwI36mIv674exgN7MdUySOFmo0MY4ahVxTMuZQkSXJVgKCcU2OpeSeH8Rz6WDzw==}
     engines: {node: '>=18.0.0'}
    hasBin: true
 
@@ -2512,43 +2512,43 @@ packages:
     resolution: {integrity: sha512-KiROIzYdEV85YygXw6BI/Dx4fnBlFQu6Mq4QE4MOH9fFnhohw6wX/OAvDY2/C+ut0I3RSPKenvZJIVYqJNkhEw==}
     engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0}
 
-  '@typescript/native-preview-darwin-arm64@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-uDvCfIGr3PR8iKBA6OCNq6w0b2WMvmtkS8KUZVy04CH8ieFsxChYStLiyFTDX4GZs9BtWKeth/7qGDZewY20sQ==}
+  '@typescript/native-preview-darwin-arm64@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-9VHXRhB7sM5DFqdlKaeDww8vuklgfzhYCjBazLCEnuFvb4J+rJ1DodLykc2bL+6kE8k6sdhYi3x8ipfbjtO44g==}
     cpu: [arm64]
     os: [darwin]
 
-  '@typescript/native-preview-darwin-x64@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-hOKQicSgd1DhFbsqdpC5fMgg0R46sYbbtVjfXgYTAHg/WO6whfZ2SfPy9IIzsQ/CXYUZuwoJElCnc9DTcd66+w==}
+  '@typescript/native-preview-darwin-x64@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-uCHipPRcIhHnvb7lAM29MQ1QT9pZ+uirqtH630aOMFm8VG3j8mkxVM9iGRLx829n38DMSDLjc3joCrQO3+sDcQ==}
     cpu: [x64]
     os: [darwin]
 
-  '@typescript/native-preview-linux-arm64@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-oRt0l3O/itqBEwd5rhfDAyziEzbSgWar1NShduK4n2mHWTHCI1I7mFsbSPbox2pdrqOwOr0QW8xu7xEgDWWRXA==}
+  '@typescript/native-preview-linux-arm64@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-yFEEq6hD2R70+lTogb211sPdCwz3H5hpYh0+YuKVMPsKo0oM8/jMvgjj2pyutmj/uCKLdbcJ9HP2vJ/13Szbcg==}
     cpu: [arm64]
     os: [linux]
 
-  '@typescript/native-preview-linux-arm@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-FVq6XjzqtLC1MVgQiumwpuW7Ug+S+WVEbvCUJQhrs8Szbf6fIFU/6+D6fOGCKzzo9SAD6zq2RNHtejBw74JSFA==}
+  '@typescript/native-preview-linux-arm@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-cEWSRQ8b+CXdMJvoG18IjNTvBo+qT22B5imqm6nAssMpyHHQb62PvZGnrA8mPRQNPzLpa5F956j8GwAjyP8hBQ==}
     cpu: [arm]
     os: [linux]
 
-  '@typescript/native-preview-linux-x64@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-qpFTW7q8Vvq1v/0bzfT8+D0wLjqydIP0qKlomrEGLlMnCCAnPodo2oLc2JCtacc40TSMZZARvhctTszCn1gWBA==}
+  '@typescript/native-preview-linux-x64@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-zGz5kVcCeBRheQwA4jVTAxtbLsBsTkp9AEvWK5AlyCs1rQCUQobBhtx37X4VEmxn4ekIDMxYgaZdlZb7/PGp8w==}
     cpu: [x64]
     os: [linux]
 
-  '@typescript/native-preview-win32-arm64@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-HHu63F8cDhgIlqFGBnqBVQn7HSiORxyT0M6yPzG4tG4gdzx+aFUdogbYily0nzN5b6NolQTrFfh3Q85UfHCHqg==}
+  '@typescript/native-preview-win32-arm64@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-A0f9ZDQqKvGk/an59HuAJuzoI/wMyrgTd69oX9gFCx7+5E/ajSdgv0Eg1Fco+nyLfT/UVM0CV3ERyWrKzx277w==}
     cpu: [arm64]
     os: [win32]
 
-  '@typescript/native-preview-win32-x64@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-vSis36O5qT+vOYfei7GtfWWzvIoaNdmxa1zDypBKkGGCCHt/c5vp0pXls85+8jBVS11Ep6p7ECcHlt+R5CBaug==}
+  '@typescript/native-preview-win32-x64@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-Se9JrcMdVLeDYMLn+CKEV3qy1yiildb5N23USGvnC9siNFalz8tVgd589dhRP+ywDhXnbIsZiFKDrZF/7B4wSQ==}
     cpu: [x64]
     os: [win32]
 
-  '@typescript/native-preview@7.0.0-dev.20260223.1':
-    resolution: {integrity: sha512-NEifR9F/0khbTQRztM4Yuxcj9dFuK9ubWIXJwLSmKMlncSp4u1fzRnlfv1vlNKKrXB7BUXoANFHpsM5BEXJ06w==}
+  '@typescript/native-preview@7.0.0-dev.20260224.1':
+    resolution: {integrity: sha512-PU0zBXLvz6RKxbIubT66RCnJXgScdDIhfmNMkvRhOnX/C4SZom5TFSn7BEHC3w8JPj7OSz5OYoubtV1Haty2GA==}
     hasBin: true
 
 '@vitejs/plugin-basic-ssl@1.2.0':
@@ -4385,7 +4385,7 @@ snapshots:
     '@jridgewell/gen-mapping': 0.3.13
     '@jridgewell/trace-mapping': 0.3.31
 
-  '@anthropic-ai/claude-code@2.1.51':
+  '@anthropic-ai/claude-code@2.1.53':
     optionalDependencies:
       '@img/sharp-darwin-arm64': 0.34.5
       '@img/sharp-darwin-x64': 0.34.5
@@ -5786,36 +5786,36 @@ snapshots:
     '@typescript-eslint/types': 8.56.1
     eslint-visitor-keys: 5.0.1
 
-  '@typescript/native-preview-darwin-arm64@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-darwin-arm64@7.0.0-dev.20260224.1':
     optional: true
 
-  '@typescript/native-preview-darwin-x64@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-darwin-x64@7.0.0-dev.20260224.1':
    optional: true
 
-  '@typescript/native-preview-linux-arm64@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-linux-arm64@7.0.0-dev.20260224.1':
    optional: true
 
-  '@typescript/native-preview-linux-arm@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-linux-arm@7.0.0-dev.20260224.1':
    optional: true
 
-  '@typescript/native-preview-linux-x64@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-linux-x64@7.0.0-dev.20260224.1':
    optional: true
 
-  '@typescript/native-preview-win32-arm64@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-win32-arm64@7.0.0-dev.20260224.1':
    optional: true
 
-  '@typescript/native-preview-win32-x64@7.0.0-dev.20260223.1':
+  '@typescript/native-preview-win32-x64@7.0.0-dev.20260224.1':
    optional: true
 
-  '@typescript/native-preview@7.0.0-dev.20260223.1':
+  '@typescript/native-preview@7.0.0-dev.20260224.1':
     optionalDependencies:
-      '@typescript/native-preview-darwin-arm64': 7.0.0-dev.20260223.1
-      '@typescript/native-preview-darwin-x64': 7.0.0-dev.20260223.1
-      '@typescript/native-preview-linux-arm':
7.0.0-dev.20260223.1 - '@typescript/native-preview-linux-arm64': 7.0.0-dev.20260223.1 - '@typescript/native-preview-linux-x64': 7.0.0-dev.20260223.1 - '@typescript/native-preview-win32-arm64': 7.0.0-dev.20260223.1 - '@typescript/native-preview-win32-x64': 7.0.0-dev.20260223.1 + '@typescript/native-preview-darwin-arm64': 7.0.0-dev.20260224.1 + '@typescript/native-preview-darwin-x64': 7.0.0-dev.20260224.1 + '@typescript/native-preview-linux-arm': 7.0.0-dev.20260224.1 + '@typescript/native-preview-linux-arm64': 7.0.0-dev.20260224.1 + '@typescript/native-preview-linux-x64': 7.0.0-dev.20260224.1 + '@typescript/native-preview-win32-arm64': 7.0.0-dev.20260224.1 + '@typescript/native-preview-win32-x64': 7.0.0-dev.20260224.1 '@vitejs/plugin-basic-ssl@1.2.0(vite@6.4.1(@types/node@22.19.11)(jiti@2.6.1)(lightningcss@1.31.1)(yaml@2.8.2))': dependencies: @@ -7047,7 +7047,7 @@ snapshots: glob: 13.0.6 package-json-from-dist: 1.0.1 - rolldown-plugin-dts@0.20.0(@typescript/native-preview@7.0.0-dev.20260223.1)(rolldown@1.0.0-beta.58)(typescript@5.9.3): + rolldown-plugin-dts@0.20.0(@typescript/native-preview@7.0.0-dev.20260224.1)(rolldown@1.0.0-beta.58)(typescript@5.9.3): dependencies: '@babel/generator': 7.29.1 '@babel/parser': 7.29.0 @@ -7059,7 +7059,7 @@ snapshots: obug: 2.1.1 rolldown: 1.0.0-beta.58 optionalDependencies: - '@typescript/native-preview': 7.0.0-dev.20260223.1 + '@typescript/native-preview': 7.0.0-dev.20260224.1 typescript: 5.9.3 transitivePeerDependencies: - oxc-resolver @@ -7266,7 +7266,7 @@ snapshots: dependencies: typescript: 5.9.3 - tsdown@0.19.0-beta.3(@typescript/native-preview@7.0.0-dev.20260223.1)(synckit@0.11.12)(typescript@5.9.3): + tsdown@0.19.0-beta.3(@typescript/native-preview@7.0.0-dev.20260224.1)(synckit@0.11.12)(typescript@5.9.3): dependencies: ansis: 4.2.0 cac: 6.7.14 @@ -7277,7 +7277,7 @@ snapshots: obug: 2.1.1 picomatch: 4.0.3 rolldown: 1.0.0-beta.58 - rolldown-plugin-dts: 
0.20.0(@typescript/native-preview@7.0.0-dev.20260223.1)(rolldown@1.0.0-beta.58)(typescript@5.9.3) + rolldown-plugin-dts: 0.20.0(@typescript/native-preview@7.0.0-dev.20260224.1)(rolldown@1.0.0-beta.58)(typescript@5.9.3) semver: 7.7.4 tinyexec: 1.0.2 tinyglobby: 0.2.15 diff --git a/pnpm-workspace.yaml b/pnpm-workspace.yaml index 77c1b54..d8a154b 100644 --- a/pnpm-workspace.yaml +++ b/pnpm-workspace.yaml @@ -1,15 +1,15 @@ packages: - packages/* - apps/* - # @generated begin fork-workspaces - - ccc-fork/ccc/packages/* - - "!ccc-fork/ccc/packages/demo" - - "!ccc-fork/ccc/packages/docs" - - "!ccc-fork/ccc/packages/examples" - - "!ccc-fork/ccc/packages/faucet" - - "!ccc-fork/ccc/packages/playground" - - "!ccc-fork/ccc/packages/tests" - # @generated end fork-workspaces + # @generated begin forker-workspaces + - forks/ccc/packages/* + - "!forks/ccc/packages/demo" + - "!forks/ccc/packages/docs" + - "!forks/ccc/packages/examples" + - "!forks/ccc/packages/faucet" + - "!forks/ccc/packages/playground" + - "!forks/ccc/packages/tests" + # @generated end forker-workspaces catalog: "@ckb-ccc/core": ^1.12.2 diff --git a/reference/.gitignore b/reference/.gitignore deleted file mode 100644 index f00f623..0000000 --- a/reference/.gitignore +++ /dev/null @@ -1,4 +0,0 @@ -* -!.gitignore -!clone.sh -!README.md diff --git a/reference/README.md b/reference/README.md deleted file mode 100644 index c76161f..0000000 --- a/reference/README.md +++ /dev/null @@ -1,21 +0,0 @@ -# Reference Repos - -Read-only shallow clones of repos useful as context for the AI Coworker — project knowledge, dependency sources, usage examples, etc. - -## Usage - -```bash -pnpm reference # clone missing repos, update stale ones -``` - -## Adding a repo - -Append a line to the `repos` array in `clone.sh`: - -```bash -"name https://github.com/org/repo.git" -``` - -Extra git-clone flags can follow the URL (e.g. `--branch v2`). - -All clones are `--depth 1` (shallow) and made read-only (`chmod -R a-w`). 
On each run, the script checks if the local HEAD matches the remote — stale repos are automatically re-cloned. diff --git a/reference/clone.sh b/reference/clone.sh deleted file mode 100755 index 6d0aec4..0000000 --- a/reference/clone.sh +++ /dev/null @@ -1,28 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail -cd "$(dirname "$0")" - -# Each line: <dir> <url> [clone-flags...] -repos=( - "contracts https://github.com/ickb/contracts.git" - "whitepaper https://github.com/ickb/whitepaper.git" -) - -for entry in "${repos[@]}"; do - read -r dir url flags <<< "$entry" - if [ -d "$dir" ]; then - local_head=$(git -C "$dir" rev-parse HEAD 2>/dev/null || echo "unknown") - remote_head=$(git ls-remote "$url" HEAD 2>/dev/null | cut -f1) - if [[ "$local_head" == "$remote_head" ]]; then - echo "reference/$dir: up to date" - continue - fi - echo "reference/$dir: outdated, re-cloning..." - chmod -R u+w "$dir" - rm -rf "$dir" - else - echo "reference/$dir: cloning..." - fi - git clone --depth 1 $flags "$url" "$dir" - chmod -R a-w "$dir" -done diff --git a/fork-scripts/tsgo-filter.sh b/tsgo-filter.sh similarity index 80% rename from fork-scripts/tsgo-filter.sh rename to tsgo-filter.sh index aafd873..2734887 100644 --- a/fork-scripts/tsgo-filter.sh +++ b/tsgo-filter.sh @@ -9,7 +9,7 @@ # integration errors, just tsconfig-strictness mismatches. # # This wrapper: -# 1. Detects all *-fork/ clone directories at repo root +# 1. Reads fork entry names from forks/config.json # 2. If none are cloned, runs plain tsgo (no filtering needed) # 3. Otherwise runs tsgo with noEmitOnError=false so fork diagnostics don't block emit # 4. Reports only diagnostics from stack source files @@ -17,14 +17,12 @@ set -euo pipefail -ROOT="$(cd "$(dirname "$0")/.." && pwd)"
&& pwd)" +ROOT="$(cd "$(dirname "$0")" && pwd)" -# Build filter pattern from all cloned fork directories +# Build filter pattern from cloned fork entries FILTER_PARTS=() -for d in "$ROOT"/*-fork; do - [ -f "$d/config.json" ] || continue - clone_dir=$(jq -r '.cloneDir' "$d/config.json") - [ -d "$d/$clone_dir" ] && FILTER_PARTS+=("$(basename "$d")/$clone_dir/") +for name in $(jq -r 'keys[]' "$ROOT/forks/config.json" 2>/dev/null); do + [ -d "$ROOT/forks/$name" ] && FILTER_PARTS+=("forks/$name/") done # No managed repos cloned — run plain tsgo