New Training: Write a Custom Data Processor for AWS CloudTrail Audit Logs
In this 6-video skill, CBT Nuggets trainer Trevor Sullivan demonstrates how to write a custom data processor for AWS CloudTrail audit data, which CloudTrail delivers directly to an S3 bucket. Watch this new AWS training.
Watch the full course: AWS Cloud Automation
This training includes:
1.2 hours of training
You’ll learn these topics in this skill:
Introduction to Interpreting AWS CloudTrail Logs with PowerShell
Setting up AWS CloudTrail Development Tools
Download AWS CloudTrail Logs with Amazon S3 PowerShell Commands
Inspecting the AWS CloudTrail JSON Data Structure
Examine an Individual AWS CloudTrail Microbatch with PowerShell
Aggregate Microbatch Analysis of AWS CloudTrail Logs with PowerShell
How to Pull CloudTrail Summaries with PowerShell
One of the tools that AWS offers to ensure AWS infrastructure governance and compliance is CloudTrail. CloudTrail collects and manages a lot of data, though. So, IT pros may want to create custom scripts to help ingest CloudTrail data and parse it. Scripts to handle these functions can be created within PowerShell using the AWS Tools for PowerShell cmdlets provided by AWS.
One of the cmdlets that can be used to automate pulling and ingesting log data is the Get-CTTrailSummary cmdlet. This command calls the AWS CloudTrail ListTrails API to pull summaries of the trails in an account. The API returns those summaries as JSON, along with a pagination token, and that data can then be parsed and used as needed.
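Assuming the AWS.Tools.CloudTrail module is installed and default credentials and a region are already configured, listing trail summaries might look like this (the cmdlet and property names are real; the formatting is illustrative):

```powershell
# Sketch: list CloudTrail trail summaries in the current account.
# Assumes AWS.Tools.CloudTrail is installed and credentials are configured.
Import-Module AWS.Tools.CloudTrail

# Get-CTTrailSummary wraps the ListTrails API, returning one object per trail.
$trails = Get-CTTrailSummary -Region us-east-1

# Each summary exposes the trail's name, home region, and ARN.
$trails | Select-Object Name, HomeRegion, TrailARN
```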
The Get-CTTrailSummary cmdlet also accepts optional pagination flags in addition to the typical regional and authentication flags. By default, this cmdlet automatically iterates through every page of results. DevOps engineers can use the 'NextToken' flag to resume a listing from a specific page. Likewise, DevOps engineers can use the 'NoAutoIteration' flag to pull only a single page of records instead of all of them. These flags can help limit the amount of data being pulled from CloudTrail at a given time.
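For accounts with more trails than fit in one page, manual pagination can be sketched roughly as follows. The flags are real; the technique of reading the next token from the $AWSHistory session variable is a common AWS Tools for PowerShell pattern, but treat it as an assumption to verify in your environment:

```powershell
# Sketch: pull trail summaries one page at a time instead of auto-iterating.
# -NoAutoIteration disables the cmdlet's built-in paging; -NextToken resumes
# from where the previous call left off.
Import-Module AWS.Tools.CloudTrail

$token = $null
do {
    $page = Get-CTTrailSummary -NoAutoIteration -NextToken $token
    $page | ForEach-Object { $_.Name }

    # Assumption: the raw service response (including NextToken) is
    # available via the $AWSHistory session variable.
    $token = $AWSHistory.LastServiceResponse.NextToken
} while ($token)
```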
It should be noted that this is simply one command for working with CloudTrail logs in PowerShell. Since CloudTrail stores its logs in S3, developers can use a range of S3 cmdlets from the AWS Tools for PowerShell to work with those logs.
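For instance, downloading the compressed log files could be sketched like this. The bucket name and key prefix are hypothetical placeholders; Get-S3Object and Read-S3Object are real cmdlets from the AWS.Tools.S3 module:

```powershell
# Sketch: download CloudTrail log files from S3.
# The bucket name and prefix below are hypothetical placeholders.
Import-Module AWS.Tools.S3

$bucket = 'example-cloudtrail-bucket'                    # hypothetical
$prefix = 'AWSLogs/123456789012/CloudTrail/us-east-1/'   # hypothetical

# List the compressed log objects under the prefix, then pull each one down.
Get-S3Object -BucketName $bucket -KeyPrefix $prefix |
    ForEach-Object {
        Read-S3Object -BucketName $bucket -Key $_.Key `
            -File (Join-Path $env:TEMP (Split-Path $_.Key -Leaf))
    }
```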
For example, logs can be pulled from S3 after finding the log trail with the cmdlet mentioned above. CloudTrail typically saves each log file as a Gzip-compressed archive, though. PowerShell has no built-in cmdlet for Gzip, so a third-party module or the .NET GZipStream class may be needed to decompress those files.
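As one option, the .NET GZipStream class, which PowerShell can use without any extra modules, can do the decompression. The function name here is our own, not a built-in cmdlet:

```powershell
# Sketch: decompress a .json.gz CloudTrail log using .NET's GZipStream.
# Expand-GzipFile is a hypothetical helper name, not a built-in cmdlet.
function Expand-GzipFile {
    param(
        [string]$Path,        # path to the .gz file
        [string]$Destination  # path for the decompressed output
    )
    $inStream  = [System.IO.File]::OpenRead($Path)
    $outStream = [System.IO.File]::Create($Destination)
    $gzip = New-Object System.IO.Compression.GZipStream(
        $inStream, [System.IO.Compression.CompressionMode]::Decompress)
    try {
        # Stream the decompressed bytes straight to the destination file.
        $gzip.CopyTo($outStream)
    }
    finally {
        $gzip.Dispose(); $outStream.Dispose(); $inStream.Dispose()
    }
}
```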
After log files are decompressed, they are structured as JSON files. That means that an interpreter can easily be created to handle that JSON data as needed, whether it's searching for specific keys or parsing that data and storing it into a database.
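A minimal interpreter sketch, using PowerShell's built-in ConvertFrom-Json against a synthetic, hand-made record (real log files have many more fields per record):

```powershell
# Sketch: parse a decompressed CloudTrail log and pull out selected keys.
# The JSON here is a synthetic, minimal stand-in for a real log file.
$json = @'
{"Records":[{"eventName":"ConsoleLogin","sourceIPAddress":"203.0.113.10","userAgent":"Mozilla/5.0","awsRegion":"us-east-1"}]}
'@

# Each CloudTrail log file is a JSON object with a top-level Records array.
$log = $json | ConvertFrom-Json
foreach ($record in $log.Records) {
    [pscustomobject]@{
        Event     = $record.eventName
        SourceIP  = $record.sourceIPAddress
        UserAgent = $record.userAgent
    }
}
```

From here, the same loop could instead write each record to a database or filter on specific keys.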
These JSON files include multiple records. Two useful fields that may help reveal signs of a cyberattack are sourceIPAddress and userAgent. For most organizations, the values in these fields remain mostly static, so finding anomalies in either one could help identify possible issues.
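One simple way to surface such anomalies is to aggregate records by those fields and look for low-count outliers. The records below are synthetic stand-ins for parsed CloudTrail data:

```powershell
# Sketch: count occurrences of each source IP across records so that a
# rarely seen IP or user agent stands out as a low-count group.
# The records are synthetic stand-ins for parsed CloudTrail data.
$records = @(
    [pscustomobject]@{ sourceIPAddress = '203.0.113.10';  userAgent = 'aws-cli/2.x' }
    [pscustomobject]@{ sourceIPAddress = '203.0.113.10';  userAgent = 'aws-cli/2.x' }
    [pscustomobject]@{ sourceIPAddress = '198.51.100.99'; userAgent = 'curl/8.0' }
)

# Group by source IP, then sort so the rarest values appear first.
$records |
    Group-Object -Property sourceIPAddress |
    Sort-Object -Property Count |
    Select-Object Count, Name
```

The same Group-Object pattern works for userAgent, or for any other field that should stay stable over time.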