The default log listing will contain the following fields: eventId, userName, eventTime, eventName, eventSource, errorCode, errorMessage.
Options should allow for different formats and data to be included in the log listing. For example, a "--raw" option could output just the raw JSON.
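For reference, a minimal sketch of how the default columns could be pulled out of a single CloudTrail record (field names follow the CloudTrail record schema; `format_record` and the `raw` flag are hypothetical names for illustration):

```python
import json


def format_record(record: dict, raw: bool = False) -> str:
    """Render one CloudTrail record as raw JSON or as the default listing columns."""
    if raw:
        # Hypothetical --raw behaviour: emit the record exactly as stored.
        return json.dumps(record)

    # userName is nested under userIdentity and may be absent (e.g. role sessions);
    # errorCode/errorMessage only appear when the API call failed.
    fields = [
        record.get("eventID", ""),
        record.get("userIdentity", {}).get("userName", ""),
        record.get("eventTime", ""),
        record.get("eventName", ""),
        record.get("eventSource", ""),
        record.get("errorCode", ""),
        record.get("errorMessage", ""),
    ]
    return "\t".join(fields)
```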
CloudTrail stores its logs in S3, so retrieving them for parsing requires knowing how to query S3, the storage path CloudTrail uses, and how to parse the S3 listing format. The logs also have to be downloaded in their json.gz format, after which they can be decompressed and parsed for the fields above (a rough retrieval sketch follows the path notes below).
--- CloudTrail S3 path ---
s3://[bucket for logs]/AWSLogs/[Account number]/CloudTrail/[region]/[year]/[month]/[day]
--- Output format of "aws s3 ls" ---
[year]-[month]-[day] [hour]:[minute]:[second] [byte count] [file], or "PRE [folder]" (no date column) for a shared prefix
Note: the illusion of folders is for human convenience; the buckets are actually flat structures.
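A rough sketch of the retrieval side under those assumptions, using boto3 to list the day's prefix, download each json.gz object, and decompress it into records (`fetch_cloudtrail_records` is a hypothetical helper; bucket, account, and region values are placeholders):

```python
import gzip
import json

import boto3


def fetch_cloudtrail_records(bucket: str, account: str, region: str,
                             year: str, month: str, day: str):
    """Yield CloudTrail records for one day by walking the standard log prefix."""
    s3 = boto3.client("s3")
    # Matches the path above; month and day are zero-padded in the actual keys.
    prefix = f"AWSLogs/{account}/CloudTrail/{region}/{year}/{month}/{day}/"

    # list_objects_v2 returns the flat keys under the prefix; the "folders" shown
    # by `aws s3 ls` are just common prefixes derived from these keys.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            # Each .json.gz file holds one JSON document with a "Records" array.
            for record in json.loads(gzip.decompress(body)).get("Records", []):
                yield record
```

Each record yielded here could then be fed through the listing formatter above, or dumped verbatim for the raw option.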
Reference for the CloudTrail log format here.