Feature Request: support BigInt serialization in segment metadata #1713
Comments
Hi @misterjoshua, thanks for taking the time to open the issue. As you have correctly pointed out, the […]

With that said, I do agree that we should do something about this, so I propose a two-step approach:

Step 1
Add an internal […] This is a change that we can do right away and that we should do regardless of the serialization discussion.

Step 2
Add support for BigInt objects. The change is relatively simple if made on the X-Ray SDK side; however, if made on the Powertools side it requires a more roundabout way, given that the SDK implementation doesn't extract the serialization into its own method. For this reason we would have to essentially provide a new implementation for two methods of the […]

I'd like to avoid this option if possible, since it would mean we'd have to make sure our implementation doesn't go out of sync with theirs. For this reason I have opened an issue on the X-Ray SDK repo and offered to contribute a fix: aws/aws-xray-sdk-node#616.

To PoC the fix, I was able to write this passing test:

it('handles BigInt types when adding metadata to a segment', () => {
  const segment = new Subsegment('## foo.bar');
  segment.addMetadata('foo', {
    bar: BigInt(123),
  });
  // Override format() with a BigInt-aware replacer, mirroring the proposed SDK fix
  (
    segment as Subsegment & {
      type: string;
      parent_id: string;
      trace_id: string;
    }
  ).format = function format() {
    this.type = 'subsegment';
    if (this.parent) {
      this.parent_id = this.parent.id;
    }
    if (this.segment) {
      this.trace_id = this.segment.trace_id;
    }
    return JSON.stringify(this, (_, value) => {
      if (typeof value === 'bigint') {
        return value.toString();
      }
      return value;
    });
  };
  // toString() also serializes the subsegment, so it needs the same replacer
  segment.toString = function toString() {
    return JSON.stringify(this, (_, value) => {
      if (typeof value === 'bigint') {
        return value.toString();
      }
      return value;
    });
  };
  // The BigInt is stored untouched in the metadata until serialization happens
  expect(segment).toEqual(
    expect.objectContaining({
      metadata: {
        default: {
          foo: {
            bar: 123n,
          },
        },
      },
    })
  );
});

As mentioned, I'll open a PR for the first step and try to include it in the next release, which should go out today/tomorrow at the latest. In the meanwhile, we'll give the X-Ray SDK folks some time to review my issue and reply. We'll keep monitoring the progress of that issue and hopefully contribute a PR; however, if things take too long we'll reconsider implementing the change on our side, either temporarily or permanently.
The PR (#1716) that introduces the […]

The reason why we went this way is that, after debugging the error and following the stack trace, we noticed that the error described in your issue is thrown only when a segment is emitted, specifically when it's serialized, and not when the data is added to the segment. Adding the […] So we have added […]

Unfortunately, until X-Ray merges the fix we are proposing, calling […]

At the time of writing, we are still waiting on the X-Ray team for the issue we've opened, so I'm leaving the issue open and changing its state to […]
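To make the failure mode concrete, here is a minimal sketch using nothing but plain JSON.stringify; the object and variable names are illustrative and this is not the Tracer's actual code path:

// Storing a BigInt in an object is fine; nothing is serialized at this point.
const metadata = { bar: BigInt(123) };

// The error only surfaces once the data is serialized, i.e. when the segment is emitted.
try {
  JSON.stringify(metadata);
} catch (error) {
  console.error(error); // TypeError: Do not know how to serialize a BigInt
}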
I just wanted to report that I have managed to get through to the X-Ray team and we have agreed on a fix. I'll be opening a PR in their repo in the next few days.
The PR is up: aws/aws-xray-sdk-node#619
Thanks, Andrea. I pulled down a local copy, tested it, and left a small review.
@dreamorosi Looks like the X-Ray SDK for Node released the BigInt serialization change in v3.5.3. I updated my personal reproduction to: @aws-lambda-powertools/[email protected], [email protected]. After re-running the lambda, my trace now contains the BigInt. So, I think we're done here unless you can think of anything else?
Hi @misterjoshua, thanks for reporting back on the topic. I'll bump the version of the X-Ray SDK in the Tracer dependencies to the latest and close the issue.
This is now released in version v1.14.1!
Expected Behaviour
When I add metadata to my trace, I expect the tracer to handle any data I send to it without causing the lambda to crash.
Current Behaviour
It crashes the lambda when I use putMetadata to add a data structure that contains a BigInt or any other format that JSON.stringify cannot encode.
Code snippet
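A sketch of the kind of handler that triggers the crash; this is illustrative only (the service name, metadata key, and value are placeholders) and assumes the Tracer from @aws-lambda-powertools/tracer:

import { Tracer } from '@aws-lambda-powertools/tracer';

const tracer = new Tracer({ serviceName: 'bigintRepro' });

export const handler = async (): Promise<void> => {
  // This call succeeds; the lambda crashes later, when the segment holding
  // this metadata is serialized and emitted by the X-Ray SDK.
  tracer.putMetadata('payload', { id: BigInt(123) });
};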
Steps to Reproduce
Possible Solution
I think there are two factors to consider: […], and what to do with the values that putMetadata receives when JSON.stringify cannot handle them; perhaps we can use a replacer function to encode exotic types.
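A rough sketch of what such a replacer could look like; the function name is hypothetical and not an existing Powertools or X-Ray SDK API:

// Converts BigInt values to strings so that JSON.stringify can encode them.
const bigIntReplacer = (_key: string, value: unknown): unknown =>
  typeof value === 'bigint' ? value.toString() : value;

JSON.stringify({ bar: BigInt(123) }, bigIntReplacer); // '{"bar":"123"}'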
Powertools for AWS Lambda (TypeScript) version
latest
AWS Lambda function runtime
16.x
Packaging format used
npm
Execution logs
This issue came from Twitter. https://twitter.com/heitor_lessa/status/1706563622694981916