Trying to add some custom types for functions, but devalue seems to throw before calling the replacer. I understand that serializing functions is a non-goal, but I'm not sure it makes sense to prevent the following use cases from being implemented with custom types:
- Reviving a pure function using a custom type. The function itself is never transferred; we just reference it on both sides of the serialization boundary:

  ```js
  const add = (a, b) => a + b;
  const stringified = devalue.stringify({ myOperation: add }, { Add: (value) => value === add });
  const result = devalue.parse(stringified, { Add: () => add });
  console.log(result.myOperation(1, 2));
  ```
- Ignoring non-serializable values. When transferring user-generated values, make a best effort while surfacing failures:

  ```js
  const NONTRANSFERRABLE = Symbol('nontransferrable');
  const stringified = devalue.stringify(
    { foo: () => 1 },
    { Ignore: (value) => typeof value === 'function' || value === NONTRANSFERRABLE },
  );
  const result = devalue.parse(stringified, { Ignore: () => NONTRANSFERRABLE });
  console.log(result.foo); // would log the NONTRANSFERRABLE symbol
  ```
Both of these currently result in:

```
Uncaught DevalueError: Cannot stringify a function
```
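For what it's worth, the first pattern is already expressible with plain `JSON.stringify`/`JSON.parse` replacer and reviver hooks, which is roughly the behaviour I'd hope custom types could allow. A minimal plain-JSON sketch (the `'$Add'` marker string is just an illustrative tag, not devalue's wire format):

```js
// Reference the same function on both sides; only a tag travels over the wire.
const add = (a, b) => a + b;

// Replacer: swap the known function for a marker string before serialization.
const wire = JSON.stringify({ myOperation: add }, (key, value) =>
  value === add ? '$Add' : value
);

// Reviver: swap the marker back for the local reference.
const result = JSON.parse(wire, (key, value) =>
  value === '$Add' ? add : value
);

console.log(result.myOperation(1, 2)); // 3
```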
Would you consider relaxing the acceptable values that can be serialized with custom types?
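In the meantime, the second use case can be approximated by pre-walking the value and swapping functions for a sentinel before serializing. A self-contained sketch with a hypothetical `sanitize` helper (a string sentinel stands in for the symbol so the output stays serializable):

```js
const NONTRANSFERRABLE = '$nontransferrable'; // string stand-in for the symbol

// Recursively replace functions with a sentinel so the rest still serializes.
function sanitize(value) {
  if (typeof value === 'function') return NONTRANSFERRABLE;
  if (Array.isArray(value)) return value.map(sanitize);
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [k, sanitize(v)])
    );
  }
  return value;
}

console.log(JSON.stringify(sanitize({ foo: () => 1, bar: [2, () => 3] })));
// {"foo":"$nontransferrable","bar":[2,"$nontransferrable"]}
```

Having this built into the custom-type hooks would avoid walking the value twice.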