
Conversation

@ana-pantilie
Contributor

Pre-submit checklist:

  • Branch
    • Tests are provided (if possible)
    • Commit sequence broadly makes sense
    • Key commits have useful messages
    • Changelog fragments have been written (if appropriate)
    • Relevant tickets are mentioned in commit messages
    • Formatting, PNG optimization, etc. are updated
  • PR
    • (For external contributions) Corresponding issue exists and is linked in the description
    • Targeting master unless this is a cherry-pick backport
    • Self-reviewed the diff
    • Useful pull request description
    • Reviewer requested

kwxm and others added 14 commits December 19, 2025 18:12
Add a new memory-analysis executable with modules for analyzing the memory behavior of Plutus builtins. Includes plotting utilities, regression analysis, and an experiment framework for deriving accurate memory models from empirical measurements.
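As a rough illustration of the kind of fit such an experiment framework produces, here is a minimal ordinary-least-squares sketch; fitLinear is a hypothetical name and this is not code from the executable itself:

```haskell
-- Minimal sketch (hypothetical, not the executable's code): fit
-- memory = slope * size + intercept to empirical (size, memory) pairs
-- by ordinary least squares.
fitLinear :: [(Double, Double)] -> (Double, Double)
fitLinear points = (slope, intercept)
  where
    n         = fromIntegral (length points)
    xs        = map fst points
    ys        = map snd points
    sumX      = sum xs
    sumY      = sum ys
    sumXY     = sum (zipWith (*) xs ys)
    sumXX     = sum (map (^ (2 :: Int)) xs)
    slope     = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX)
    intercept = (sumY - slope * sumX) / n
```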
Introduce a DataNodeCount newtype that measures Data memory via a lazy node traversal rather than serialization size. This provides more accurate memory accounting for the UnValueData builtin, which operates on the Data structure directly without serializing it.

The wrapper separates concerns: the node-counting logic lives here, the cost coefficients in the JSON cost models.
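As a rough sketch of what node counting over Data looks like (the constructors mirror PlutusCore.Data; countDataNodes is illustrative and not the actual DataNodeCount implementation):

```haskell
import Data.ByteString (ByteString)

-- Mirror of the Data type from PlutusCore.Data, repeated here so the
-- sketch is self-contained.
data Data
  = Constr Integer [Data]
  | Map [(Data, Data)]
  | List [Data]
  | I Integer
  | B ByteString

-- Count every node in a Data value; each constructor contributes one node.
-- Illustration only, not the actual DataNodeCount implementation.
countDataNodes :: Data -> Integer
countDataNodes d = case d of
  Constr _ args -> 1 + sum (map countDataNodes args)
  Map kvs       -> 1 + sum [countDataNodes k + countDataNodes v | (k, v) <- kvs]
  List xs       -> 1 + sum (map countDataNodes xs)
  I _           -> 1
  B _           -> 1
```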
Add KnownTypeAst and builtin marshalling instances for DataNodeCount. This enables using the new memory model in builtin definitions while maintaining type safety through the universe system.

Also includes a minor refactoring (void instead of (() <$)) for clarity.
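For reference, the refactoring relies on the standard equivalence between void and (() <$) from Data.Functor; a tiny example (discardResult is made up for illustration):

```haskell
import Data.Functor (void)

-- void m discards m's result; it is defined as () <$ m, so the two
-- spellings below behave identically.
discardResult :: IO Int -> IO ()
discardResult action = void action      -- equivalent to: () <$ action
```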
Apply ValueTotalSize to ValueData and DataNodeCount to UnValueData, replacing the plain Value/Data types. This enables accurate memory accounting: ValueData uses the total serialized size, while UnValueData uses the node count to measure the complexity of its input Data.
Update the ValueData and UnValueData benchmarks to use createOneTermBuiltinBenchWithWrapper with the appropriate memory-measurement wrappers (ValueTotalSize and DataNodeCount).

This ensures the benchmarks measure the same memory behavior as the production builtins.
Replace constant memory costs with linear models derived from empirical measurements:

- ValueData: memory = 38×size + 6 (was constant 1)
- UnValueData: memory = 8×nodes + 0 (was constant 1); CPU = 290658×nodes + 1000 (was 43200×arg + 1000)

The linear models better reflect actual memory behavior: ValueData scales with the serialized size, UnValueData with the node count.

Benchmark data regenerated with the new memory-measurement approach.
Co-authored-by: Nikolaos Bezirgiannis <bezirg@users.noreply.github.com>
Co-authored-by: Nikolaos Bezirgiannis <bezirg@users.noreply.github.com>
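To make the new models concrete, a small sketch that evaluates the coefficients listed above (the function names are illustrative; the real coefficients live in the JSON cost-model files, not in Haskell code like this):

```haskell
-- Illustrative evaluation of the linear models quoted in the commit
-- message above; these helpers are not part of the cost-model API.
valueDataMemory :: Integer -> Integer
valueDataMemory size = 38 * size + 6

unValueDataMemory :: Integer -> Integer
unValueDataMemory nodes = 8 * nodes + 0

unValueDataCpu :: Integer -> Integer
unValueDataCpu nodes = 290658 * nodes + 1000

-- For example, an input Data value with 100 nodes would be charged
-- unValueDataMemory 100 == 800 memory units and
-- unValueDataCpu 100 == 29066800 CPU units.
```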
@github-actions
Contributor

PR Preview Action v1.6.3

🚀 View preview at
https://IntersectMBO.github.io/plutus/pr-preview/docs/pr-7504/

Built to branch gh-pages at 2025-12-24 11:23 UTC.
Preview will be ready when the GitHub Pages deployment is complete.

