WIP: Introduce experimental JSON logging #10822
Conversation
| f"Value {request.get('Runtime')} at 'runtime' failed to satisfy constraint: Member must satisfy enum value set: {VALID_RUNTIMES} or be a valid ARN", | ||
| Type="User", | ||
| ) | ||
|
|
handlers.add_region_from_header,
handlers.add_account_id,
handlers.parse_service_request,
# TODO: add logger that initializes a request "trace"
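The chain above ends with a TODO to add a logger that initializes a request "trace". A minimal sketch of what such a handler could do — the `RequestContext` class and field names here are simplified stand-ins for illustration, not the actual classes or code from this PR:

```python
# Hypothetical sketch: a handler that opens a request "trace" before any
# service-specific handler runs, so later log records can be grouped by it.
import logging
import time
import uuid

LOG = logging.getLogger("localstack.request")


class RequestContext:
    """Simplified stand-in for a request context carrying trace metadata."""

    def __init__(self):
        self.request_id: str = ""
        self.trace_start: float = 0.0


def init_request_trace(context: RequestContext) -> None:
    """Assign a request_id and start timestamp so downstream handlers and
    service-specific loggers can attach their fields to the same trace."""
    context.request_id = str(uuid.uuid4())
    context.trace_start = time.monotonic()
    LOG.debug("request trace started", extra={"request_id": context.request_id})
```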
LocalStack Community integration with Pro: 2 files ±0, 2 suites ±0, 1h 38m 49s ⏱️ +2m 16s. For more details on these failures, see this check. Results for commit f19f45c. ± Comparison against base commit 3c4c463. This pull request skips 2 tests.
response.status_code,
context.service_exception.code,
extra={
    "request_id": context.request_id,
request_id propagation is going to be crucial for proper grouping and will require quite some manual work within service-specific loggers. Other fields can then be inferred from the request_id, because some fields won't be available at every stage (e.g., service, region, and operation won't be available before an HTTP request is parsed into an AWS request; response-related fields such as the status code obviously won't be available before the request has been handled).
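One possible way to cut down on that manual work — purely a sketch, not something this PR implements — is to stash the request_id in a `contextvars.ContextVar` when the trace is initialized and inject it into every record via a `logging.Filter`, so service-specific loggers don't have to pass it through `extra` explicitly; fields that aren't known yet at a given stage simply stay absent:

```python
# Sketch only: propagate request_id to all log records via a context variable.
import contextvars
import json
import logging

request_id_var: contextvars.ContextVar[str] = contextvars.ContextVar("request_id", default="-")


class RequestIdFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        # Attach the current request_id (or "-" outside of a request) to every record.
        record.request_id = request_id_var.get()
        return True


class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps(
            {
                "level": record.levelname,
                "logger": record.name,
                "message": record.getMessage(),
                "request_id": getattr(record, "request_id", "-"),
            }
        )


handler = logging.StreamHandler()
handler.addFilter(RequestIdFilter())
handler.setFormatter(JsonFormatter())
logging.getLogger().addHandler(handler)

# At the start of a request, e.g. in the trace-initializing handler:
# request_id_var.set(context.request_id)
```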
Motivation
We've had a few requests in the past for outputting our logs in a structured format. We've experimented with this before, but would like to push the idea forward now. It will stay in an experimental stage for a while and will mostly be used internally before going into public preview and, ideally, GA with the 4.0 release 🤞
Changes
What's left to do:
There will also be some follow-ups, such as: