Gip 25 deploy reserve xdc #295

Open

sirpy wants to merge 17 commits into master from gip-25-deploy-reserve-xdc
Conversation

@sirpy (Contributor) commented Mar 9, 2026

Summary by Sourcery

Deploy and wire up an updated multi-chain reserve and Mento integration (including XDC) with safer trading limits, distribution helper improvements, and bridge/identity upgrades across networks.

New Features:

  • Expose additional IBancorExchangeProvider and IBroker query/update interfaces needed for Mento reserve management and trading limits inspection.
  • Introduce an UpdateReserveRatio helper contract to programmatically adjust reserve ratios and create exchanges for new reserves.
  • Add a proposal script to orchestrate step‑2 deployment of the XDC reserve and related upgrades across Fuse, Celo, Mainnet, and XDC.

Bug Fixes:

  • Fix UBI claim eligibility so that claims are blocked when a member’s whitelisted root has already claimed.
  • Correct bulk whitelist reporting to account for more recent farming cohorts and accurate totals.

Enhancements:

  • Refine trading limits configuration to separate in/out limits and expose broker trading limit state/config getters.
  • Improve GenericDistributionHelper swap routing by selecting Uniswap V3 pools based on highest liquidity, handling zero-amount and missing-pool cases, and emitting clearer failure reasons.
  • Extend multichain deployment helpers with network-parameterization, gas estimation/simulation controls, and Safe API compatibility for Fuse’s custom transaction service.
  • Adjust Uniswap V3 test pool creation to support separate fee tiers for gas vs G$ pools and correct price/decimals handling.
  • Enhance IdentityV3 with events for account connect/disconnect actions and UBISchemeV2 to respect whitelisted root account claim state.
  • Update deployment helper script to be network-agnostic, tune distribution parameters, and reuse configured ReserveToken addresses.
  • Tighten Hardhat configuration and key handling by switching to a generic PRIVATE_KEY env var and cleaning minor config issues.
  • Bump protocol version from 2.2.0-beta.1 to 2.2.0.

Deployment:

  • Add Fuse-specific Safe transaction proposal flow and options, including custom transaction service endpoint and origin handling for chain 122.

@sourcery-ai (bot) left a comment

Hey - I've found 6 issues and left some high-level feedback:

  • In UpdateReserveRatio.sol you still import hardhat/console.sol; this will bloat deployed bytecode and should be removed or guarded so it is not included in production builds.
  • In executeViaGuardian you accumulate gasUsed by converting each estimateGas result with .toNumber(), which risks overflow or precision loss for larger batches; consider keeping this as a BigNumber and only converting/formatting at the end for logging.
  • In createUniswapTestPools.ts, GAS_POOL_FEE is set to 3000 but the comment still says 0.01% (3000 is 0.3%), which can be misleading when configuring or debugging test pools; it would be safer to update the comment or the value to match.
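The BigNumber accumulation point can be sketched as follows. This is an illustrative TypeScript snippet using native `bigint` in place of ethers' `BigNumber` (the variable and array names are hypothetical, not from `executeViaGuardian` itself):

```typescript
// Sum per-call gas estimates while staying in bigint, so large batches
// never overflow or lose precision the way Number conversion can.
const gasEstimates: string[] = ["21000", "500000", "9007199254740993"]; // last exceeds Number.MAX_SAFE_INTEGER

let gasUsed = 0n; // bigint accumulator; never rounds like Number does
for (const estimate of gasEstimates) {
  gasUsed += BigInt(estimate);
}

// Convert to a string only once, at the end, for logging.
console.log(`total estimated gas: ${gasUsed.toString()}`);
```

With ethers, the equivalent pattern is `gasUsed = gasUsed.add(est)` on a `BigNumber`, formatting with `.toString()` only in the final log line.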
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- In `UpdateReserveRatio.sol` you still import `hardhat/console.sol`; this will bloat deployed bytecode and should be removed or guarded so it is not included in production builds.
- In `executeViaGuardian` you accumulate `gasUsed` by converting each `estimateGas` result with `.toNumber()`, which risks overflow or precision loss for larger batches; consider keeping this as a `BigNumber` and only converting/formatting at the end for logging.
- In `createUniswapTestPools.ts`, `GAS_POOL_FEE` is set to `3000` but the comment still says `0.01%` (3000 is 0.3%), which can be misleading when configuring or debugging test pools; it would be safer to update the comment or the value to match.

## Individual Comments

### Comment 1
<location path="scripts/multichain-deploy/createUniswapTestPools.ts" line_range="12-13" />
<code_context>
 const GASPRICE_STABLE = 0.08;
-const POOL_FEE = 100; // 0.01%
+const GDPRICE_STABLE = 0.0001
+const GAS_POOL_FEE = 3000; // 0.01%
+const GD_POOL_FEE = 500; // 0.05%
 const main = async () => {
   let protocolSettings = defaultsDeep({}, ProtocolSettings[network.name], ProtocolSettings["default"]);
</code_context>
<issue_to_address>
**issue:** Uniswap v3 fee comments are inconsistent with the actual fee tier values.

`GAS_POOL_FEE = 3000` corresponds to a 0.3% fee tier, not 0.01%. Please correct the comments for these fee tiers (e.g., `3000 // 0.3%`, `500 // 0.05%`, `100 // 0.01%`) to avoid confusion when configuring or tuning pools.
</issue_to_address>
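For reference, Uniswap v3 fee tiers are expressed in hundredths of a basis point, so the percentage is `fee / 10_000`. A quick hypothetical helper for sanity-checking the comments:

```typescript
// Uniswap v3 fees are in hundredths of a basis point (1e-6 of notional),
// so a tier's percentage is fee / 10_000.
function feeTierToPercent(fee: number): number {
  return fee / 10_000;
}

console.log(feeTierToPercent(3000)); // 0.3  (not 0.01)
console.log(feeTierToPercent(500));  // 0.05
console.log(feeTierToPercent(100));  // 0.01
```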

### Comment 2
<location path="scripts/multichain-deploy/helpers.ts" line_range="433-436" />
<code_context>
   const safeService = new SafeApiKit({
     chainId: BigInt(chainId),
     txServiceUrl,
-    apiKey: process.env.SAFE_TX_SERVICE_API_KEY || ""
+    apiKey: txServiceUrl ? undefined : process.env.SAFE_TX_SERVICE_API_KEY
   });

</code_context>
<issue_to_address>
**question (bug_risk):** SafeApiKit apiKey selection logic is counterintuitive and may disable API key usage unintentionally.

With `apiKey: txServiceUrl ? undefined : process.env.SAFE_TX_SERVICE_API_KEY`, any time `txServiceUrl` is set (likely the default), the API key is suppressed. If you want to allow custom URLs while still using the API key when present, decouple these conditions so the API key is passed whenever `SAFE_TX_SERVICE_API_KEY` is set, independent of `txServiceUrl`.
</issue_to_address>
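One way to decouple the two concerns is to build the options object so the API key is included whenever it is set, regardless of `txServiceUrl`. This is a hypothetical builder function; the exact option names accepted by the installed `SafeApiKit` version should be verified:

```typescript
// Build SafeApiKit-style options so a custom txServiceUrl no longer
// suppresses the API key: each field is included iff it has a value.
function buildSafeServiceOptions(
  chainId: bigint,
  txServiceUrl?: string,
  apiKey?: string
): { chainId: bigint; txServiceUrl?: string; apiKey?: string } {
  return {
    chainId,
    ...(txServiceUrl ? { txServiceUrl } : {}),
    ...(apiKey ? { apiKey } : {})
  };
}

// Custom service URL and API key can now coexist:
const opts = buildSafeServiceOptions(122n, "https://custom.example", "my-key");
console.log(opts.apiKey); // "my-key"
```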

### Comment 3
<location path="hardhat.config.ts" line_range="31-33" />
<code_context>
 config();

 const mnemonic = process.env.MNEMONIC || "test test test test test test test test test test test junk";
-const deployerPrivateKey = process.env.ADMIN_KEY || ethers.utils.hexZeroPad("0x11", 32);
+const deployerPrivateKey = process.env.PRIVATE_KEY || ethers.utils.hexZeroPad("0x11", 32);
 const infura_api = process.env.INFURA_API;
 const alchemy_key = process.env.ALCHEMY_KEY;
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Changing the environment variable name for the deployer key may silently break existing CI or deploy setups.

The code now reads `process.env.PRIVATE_KEY` instead of `ADMIN_KEY`, so any existing pipelines or scripts using `ADMIN_KEY` will quietly fall back to the default `0x11...` key. If the rename is intentional, please add a safeguard (e.g., temporary support for both vars or a warning when neither is set) to prevent accidental deployments with the default key.

```suggestion
const mnemonic = process.env.MNEMONIC || "test test test test test test test test test test test junk";

// Prefer PRIVATE_KEY, but fall back to legacy ADMIN_KEY with a warning.
const rawDeployerKey =
  process.env.PRIVATE_KEY || process.env.ADMIN_KEY || undefined;

if (!process.env.PRIVATE_KEY && process.env.ADMIN_KEY) {
  console.warn(
    "[hardhat.config] Warning: Using legacy ADMIN_KEY env var. " +
      "Please migrate to PRIVATE_KEY; support for ADMIN_KEY may be removed in a future release."
  );
}

if (!rawDeployerKey) {
  console.warn(
    "[hardhat.config] Warning: No PRIVATE_KEY or ADMIN_KEY set. " +
      "Falling back to the default hexZeroPad('0x11', 32) deployer key. " +
      "This is unsafe for real deployments."
  );
}

const deployerPrivateKey =
  rawDeployerKey ?? ethers.utils.hexZeroPad("0x11", 32);

const infura_api = process.env.INFURA_API;
```
</issue_to_address>

### Comment 4
<location path="scripts/proposals/gip-25-xdc-deploy-reserve.ts" line_range="59" />
<code_context>
+  "production-xdc": "0x00F7f61080EF40d3832C8C06e8A2af757839e1F7"
+};
+
+const circuitBreaker = "";
+export const upgradeCeloStep2 = async (network, checksOnly) => {
+  const ExchangeProviderV2Impl = "0xe930CDE20f60d0A4fc9487874861AE259F5Bed48";
</code_context>
<issue_to_address>
**issue (bug_risk):** Using an empty string for `circuitBreaker` will cause `registerScheme` to operate on the zero address.

In the proposal action:
```ts
[release.Controller,
 "registerScheme(address,bytes32,bytes4,address)",
 ethers.utils.defaultAbiCoder.encode(
   ["address", "bytes32", "bytes4", "address"],
   [circuitBreaker, ethers.constants.HashZero, "0x00000010", release.Avatar]
 ),
 "0"
]
```
`circuitBreaker = ""` is ABI-encoded as `address(0)`, so you end up registering a scheme at the zero address, which is almost certainly wrong and difficult to recover from. `circuitBreaker` should be a required, non-empty address, validated before constructing these actions.
</issue_to_address>
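A minimal fail-fast guard could run before the proposal actions are constructed. The helper name and the example address below are illustrative, not from the PR:

```typescript
// Reject empty strings and the zero address before they reach ABI encoding,
// where "" would otherwise silently become address(0).
function assertNonZeroAddress(name: string, value: string): string {
  if (!/^0x[0-9a-fA-F]{40}$/.test(value)) {
    throw new Error(`${name} must be a 20-byte hex address, got "${value}"`);
  }
  if (/^0x0{40}$/.test(value)) {
    throw new Error(`${name} must not be the zero address`);
  }
  return value;
}

// Example: validate before building proposal actions.
// (Placeholder address for illustration only.)
const circuitBreaker = assertNonZeroAddress(
  "circuitBreaker",
  "0x1111111111111111111111111111111111111111"
);
```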

### Comment 5
<location path="contracts/utils/UpdateReserveRatio.sol" line_range="9" />
<code_context>
+import "../DAOStackInterfaces.sol";
+import "../MentoInterfaces.sol";
+
+import "hardhat/console.sol";
+
+interface MentoExchange {
</code_context>
<issue_to_address>
**issue:** Including `hardhat/console.sol` in a production contract can increase bytecode size and is generally not suitable for mainnet.

Please remove this import and any `console.log` usages from the deployable contract so the compiled bytecode matches production expectations and avoids unnecessary size increase.
</issue_to_address>

### Comment 6
<location path="scripts/proposals/gip-25-xdc-deploy-reserve.ts" line_range="60" />
<code_context>
+};
+
+const circuitBreaker = "";
+export const upgradeCeloStep2 = async (network, checksOnly) => {
+  const ExchangeProviderV2Impl = "0xe930CDE20f60d0A4fc9487874861AE259F5Bed48";
+  const MentoBrokerV2Impl = "0xc69ae3550E25C7AB28301B9Bf75F20f5AF47B7d2";
</code_context>
<issue_to_address>
**issue (complexity):** Consider refactoring the upgrade scripts to centralize shared setup, proposal execution, and simulation/dev utilities into reusable helpers and separate modules to reduce duplication and cognitive load.

You can trim quite a bit of complexity/duplication here without changing behavior by extracting a few small helpers and separating simulation/dev utilities.

### 1. Factor out shared guardian/setup logic

`upgradeCeloStep2`, `upgradeFuseStep2`, `upgradeEthStep2`, and `upgradeXdcStep2` all repeat:

- `isProduction` / `verifyProductionSigner`
- `networkEnv` / `guardian` resolution
- simulation impersonation & funding
- loading `release` from `dao`

You can put that into a single helper:

```ts
// near top of the file (or in a shared module)
type GuardianContext = {
  isProduction: boolean;
  isSimulation: boolean;
  networkEnv: string;
  guardian: ethers.Signer;
  root: ethers.Signer;
  release: { [key: string]: any };
};

async function getGuardianContext(networkArg: string): Promise<GuardianContext> {
  const [root] = await ethers.getSigners();
  const isProduction = networkName.includes("production");
  if (isProduction) verifyProductionSigner(root);

  let networkEnv = networkName;
  let guardian: ethers.Signer = root;

  if (isSimulation) {
    networkEnv = networkArg;
  }

  let release: { [key: string]: any } = dao[networkEnv];

  if (isSimulation) {
    guardian = await ethers.getImpersonatedSigner(release.GuardiansSafe);
    await root.sendTransaction({
      value: ethers.utils.parseEther("1"),
      to: release.GuardiansSafe
    });
  }

  return { isProduction, isSimulation, networkEnv, guardian, root, release };
}
```

Then `upgradeCeloStep2` shrinks to:

```ts
export const upgradeCeloStep2 = async (network: string, checksOnly: boolean) => {
  const { isProduction, isSimulation, networkEnv, guardian, root, release } =
    await getGuardianContext(network);

  const ExchangeProviderV2Impl = "0xe930CDE20f60d0A4fc9487874861AE259F5Bed48";
  const MentoBrokerV2Impl = "0xc69ae3550E25C7AB28301B9Bf75F20f5AF47B7d2";

  const identityImpl = await ethers.deployContract("IdentityV4");
  const upgradeCall = identityImpl.interface.encodeFunctionData("setReverifyDaysOptions", [[180]]);
  const reserveUpdate = (await deployDeterministic(
    { name: "UpdateReserveRatio" },
    [root.address],
    {},
    false,
    networkEnv
  )) as UpdateReserveRatio;

  const bridgeImpl = bridgeUpgradeImpl[networkEnv];

  const proposalActions = [
    // ...unchanged proposal entries...
  ];

  await runProposal({
    proposalActions,
    release,
    guardian,
    isProduction,
    checksOnly,
    safeChainName: "celo",
    networkEnv
  });

  if (!isProduction && isSimulation) {
    // keep the simulation block as-is
  }
};
```

The same helper can be reused in the Fuse/Eth/Xdc functions.

### 2. Factor out proposal execution boilerplate

The `[target, signature, encodedArgs, value]` → arrays → `executeViaSafe`/`executeViaGuardian` pattern is repeated in all `upgrade*Step2`. Wrap it once:

```ts
type ProposalAction = [string, string, string, string | number];

async function runProposal(opts: {
  proposalActions: ProposalAction[];
  release: { [key: string]: any };
  guardian: ethers.Signer;
  isProduction: boolean;
  checksOnly: boolean;
  safeChainName: string;
  networkEnv: string;
  guardianOptions?: { safeTxGas?: string };
}) {
  const {
    proposalActions,
    release,
    guardian,
    isProduction,
    checksOnly,
    safeChainName,
    networkEnv,
    guardianOptions
  } = opts;

  const proposalContracts = proposalActions.map(a => a[0]);
  const proposalFunctionSignatures = proposalActions.map(a => a[1]);
  const proposalFunctionInputs = proposalActions.map(a => a[2]);
  const proposalEthValues = proposalActions.map(a => a[3]);

  if (isProduction && !checksOnly) {
    await executeViaSafe(
      proposalContracts,
      proposalEthValues,
      proposalFunctionSignatures,
      proposalFunctionInputs,
      release.GuardiansSafe,
      safeChainName,
      { safeTxGas: "2000000", ...guardianOptions }
    );
  } else if (!checksOnly) {
    await executeViaGuardian(
      proposalContracts,
      proposalEthValues,
      proposalFunctionSignatures,
      proposalFunctionInputs,
      guardian,
      networkEnv
    );
  }
}
```

Then in each `upgrade*Step2` you only build `proposalActions` and call `runProposal`, which reduces repeated control-flow and makes the intent (“these are the actions”) clearer.

Example in `upgradeFuseStep2`:

```ts
export const upgradeFuseStep2 = async (network: string, checksOnly: boolean) => {
  const { isProduction, isSimulation, networkEnv, guardian, root, release } =
    await getGuardianContext(network);

  const identityImpl = await ethers.deployContract("IdentityV4");
  const upgradeCall = identityImpl.interface.encodeFunctionData("setReverifyDaysOptions", [[180]]);
  const bridgeImpl = bridgeUpgradeImpl[networkEnv];

  const proposalActions: ProposalAction[] = [
    // ...current entries...
  ];

  await runProposal({
    proposalActions,
    release,
    guardian,
    isProduction,
    checksOnly,
    safeChainName: "fuse",
    networkEnv
  });
};
```

### 3. Extract simulation / test logic into separate functions

`upgradeXdcStep2` in particular mixes proposal construction and a long simulation block. You can keep the behavior but move the simulation into its own function in the same file to reduce the cognitive load of the main flow:

```ts
async function simulateXdcUpgrade({
  reserveUpdate,
  release,
  guardian,
  root,
  reserveParams,
  networkEnv
}: {
  reserveUpdate: UpdateReserveRatio;
  release: { [k: string]: any };
  guardian: ethers.Signer;
  root: ethers.Signer;
  reserveParams: { reserveRatioXdc: number; xdcGdSupplyEquivalent: string };
  networkEnv: string;
}) {
  const gd = (await ethers.getContractAt("IGoodDollar", release.GoodDollar)) as IGoodDollar;
  // ...move the entire `if (!isProduction && isSimulation)` body here unchanged...
}
```

And then in `upgradeXdcStep2`:

```ts
  await runProposal({
    proposalActions,
    release,
    guardian,
    isProduction,
    checksOnly,
    safeChainName: "xdc",
    networkEnv
  });

  if (!isProduction && isSimulation) {
    await simulateXdcUpgrade({
      reserveUpdate,
      release,
      guardian,
      root,
      reserveParams: reserveParams!,
      networkEnv
    });
  }
```

This keeps `upgradeXdcStep2` focused on building the proposal and delegating simulation/testing.

### 4. Move dev / calculation utilities into a separate module

`calculateReserveParams` and `testSwap` are dev/test tools with their own RPC providers and impersonation. To keep the proposal orchestration lean, consider moving them into a dedicated script (e.g. `reserve-tools.ts`) and importing where needed:

```ts
// scripts/tools/reserve-tools.ts
export const calculateReserveParams = async () => {
  // ...existing implementation...
};

export const testSwap = async () => {
  // ...existing implementation...
};
```

Then in this file:

```ts
import { calculateReserveParams, testSwap } from "./tools/reserve-tools";
```

This keeps the step-2 upgrade script mostly about governance proposals, while preserving the existing functionality and behavior of the tools.
</issue_to_address>





"production-xdc": "0x00F7f61080EF40d3832C8C06e8A2af757839e1F7"
};

const circuitBreaker = "";

@blueogin (Collaborator) commented Mar 10, 2026
I think inserting a blank string into circuitBreaker is not ideal.


Agree, using an empty string here is risky.

Two concrete suggestions:

  1. Make it explicit and required
    Instead of hard‑coding const circuitBreaker = "", read it from config or deployment.json, and fail fast if it’s missing:

    const circuitBreaker = dao[networkName]?.CircuitBreaker;
    if (!circuitBreaker || circuitBreaker === ethers.constants.AddressZero) {
      throw new Error(`CircuitBreaker address is required for ${networkName}`);
    }

    That way the proposal script can’t accidentally register the zero address.

  2. Guard before building the actions
    Even if you keep the constant, add a check before constructing any registerScheme call that uses it:

    if (!circuitBreaker || circuitBreaker === ethers.constants.AddressZero) {
      throw new Error('circuitBreaker must be a non-zero address');
    }
    
    const actions = [
      [
        release.Controller,
        "registerScheme(address,bytes32,bytes4,address)",
        ethers.utils.defaultAbiCoder.encode(
          ["address", "bytes32", "bytes4", "address"],
          [circuitBreaker, ethers.constants.HashZero, "0x00000010", release.Avatar]
        ),
        "0"
      ],
      // ...
    ];

If you’d like, I can suggest a small helper to fetch and validate circuitBreaker similar to how other deployment addresses are handled in the script, so it’s consistent with the rest of the proposal tooling.



@openzeppelin-code

Gip 25 deploy reserve xdc

Generated at commit: d886793888241c9357a2bac40fa83db405037c10

🚨 Report Summary

Severity       Contracts  Dependencies
Critical               3             0
High                   5             0
Medium                 0             1
Low                   17             0
Note                  44           135
Total                 69           136

For more details view the full report in OpenZeppelin Code Inspector

@sirpy sirpy requested review from L03TJ3 and blueogin March 9, 2026 12:44

console.log("update done");
bool updated = abi.decode(result, (bool));
require(updated, "not updated");
} else {
A collaborator commented:
Would it be better to add an XDC chainId check here?
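The suggestion above targets the Solidity branch, but a complementary guard can also live on the script side before anything is proposed. A sketch, where the chain ID constant is an assumption (XDC mainnet is commonly listed as chain 50; verify before use):

```typescript
// Fail fast if the connected network is not the expected XDC chain.
const XDC_MAINNET_CHAIN_ID = 50; // assumption: confirm against the chain registry

function assertExpectedChain(actualChainId: number, expectedChainId: number): void {
  if (actualChainId !== expectedChainId) {
    throw new Error(
      `wrong network: expected chainId ${expectedChainId}, got ${actualChainId}`
    );
  }
}

assertExpectedChain(50, XDC_MAINNET_CHAIN_ID); // passes when connected to XDC mainnet
```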

*/
function connectAccount(address account) external onlyWhitelisted {
require(
!isWhitelisted(account) && !isBlacklisted(account),
A collaborator commented:
Please confirm that “no connect if already whitelisted” is intentional.
