New Training: Managing Amazon S3 Storage with Python and Boto3
In this 8-video skill, CBT Nuggets trainer Trevor Sullivan demonstrates how to use Amazon Simple Storage Service (S3) and the Boto3 module for Python to easily interact with data stored in Amazon S3. Watch this new DevOps training.
This training includes:
- 8 videos
- 1.3 hours of training
You’ll learn these topics in this skill:
- Introduction to Amazon S3 and Python Boto3 SDK
- Configure Remote Development Environment for Amazon S3 Programming
- Understanding Python Boto3 Module Structure
- Create Amazon S3 Bucket with Python and Boto3
- Storing Data in Amazon S3 Buckets with Python and Boto3
- Listing Objects in an Amazon S3 Bucket with Python and Boto3
- Paginate Object Listings in Amazon S3 Buckets with Python
- Delete Amazon S3 Objects and Bucket with Python Boto3
How to Transfer Large Files to S3 with Python
Amazon S3 is a great tool for storing and managing large amounts of data in a scalable way at a relatively low cost. Uploading very large files through the S3 web console isn't practical, though: the console caps uploads at 160 GB per file, and a single PUT request through the API is limited to 5 GB. For anything bigger, Amazon recommends multipart upload, which the AWS CLI and SDKs support and which can transfer objects up to 5 TB in pieces. For developers using Python, this is easily done with the Boto3 library.
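The idea behind multipart upload can be sketched with simple arithmetic: the file is split into fixed-size parts that are uploaded independently (S3 allows parts from 5 MiB to 5 GiB, up to 10,000 parts per object). The part size below is an illustrative choice, not an S3 default:

```python
import math

# Illustrative part size: 100 MiB per part.
PART_SIZE = 100 * 1024 * 1024

def count_parts(file_size: int, part_size: int = PART_SIZE) -> int:
    """Number of pieces a multipart upload splits a file into."""
    return max(1, math.ceil(file_size / part_size))

# A 25 GiB file at 100 MiB per part uploads as 256 parts.
print(count_parts(25 * 1024**3))  # → 256
```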
The Boto3 library was created to integrate the AWS SDK into Python in a native way. Getting started with it is easy: install it with Python's built-in package manager (pip install boto3). Once Boto3 is installed, it needs AWS credentials, that is, an access key ID and secret access key belonging to an IAM user or role whose policies grant the necessary S3 permissions. Boto3 reads these from environment variables, the shared credentials file (~/.aws/credentials), or an attached instance role.
After the Boto3 library is installed, instantiate an S3 client (or resource) with it. Then call its upload_file method to send a large file to a designated S3 bucket. Under the hood, upload_file uses the S3 transfer manager, which automatically switches to multipart upload once the file exceeds a configurable size threshold and takes care of the brunt work of transferring that large file in pieces to Amazon S3.