PowerShell is a useful tool for automating tasks on Windows. AWS provides CLI tools for PowerShell, including a full interface to its Simple Storage Service (S3). If you want to automate sending files to S3 buckets, the process is fairly simple.
Setting Up the PowerShell CLI
First, you'll need to install AWS.Tools.Installer, which manages the individual modules for AWS's various services. Say yes to the prompts if you get an untrusted-repository warning:
Install-Module -Name AWS.Tools.Installer
You can then install the S3 specific module:
Install-AWSToolsModule AWS.Tools.S3 -CleanUp
You’ll need to link your account to the tools. There are a few methods to handle credentials—you can specify them per command, per session, or for all sessions. If this is a script running on your own machine, you’ll probably want to just set your account credentials once using the default credential store:
Set-AWSCredential `
    -AccessKey AKIA0123456787EXAMPLE `
    -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY `
    -StoreAs default
Note that you should create an IAM user for this, rather than using your root account.
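If you'd rather not overwrite the default profile, you can store the credentials under a name and pass that name per command instead. This is a sketch of that approach; the profile name s3-uploader is just a placeholder, and the keys are the same example values as above:

```powershell
# Store credentials under a named profile (example keys, not real values)
Set-AWSCredential `
    -AccessKey AKIA0123456787EXAMPLE `
    -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY `
    -StoreAs s3-uploader

# Reference the profile explicitly on any AWS cmdlet
Write-S3Object -BucketName bucket -File file.txt -ProfileName s3-uploader
```

This keeps the default profile free for your everyday credentials while scripts use a locked-down IAM user.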
Once linked, uploading files is very easy. To upload to the root of a bucket, give the Write-S3Object cmdlet a bucket name and the path to the file:
Write-S3Object -BucketName bucket -File file.txt
To upload to a specific location, you'll need to give it a string -Key, which sets the object's full path within the bucket, so be sure to include the filename at the end:
Write-S3Object -BucketName bucket -Key "subfolder/File.txt" -File file.txt
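If you want to confirm the upload actually landed where you expect, you can list objects under that prefix with Get-S3Object. A minimal check, using the same bucket and key as the example above:

```powershell
# List objects whose keys start with the given prefix
Get-S3Object -BucketName bucket -KeyPrefix "subfolder/" |
    Select-Object Key, Size, LastModified
```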
And, to sync a whole folder, use the -Folder parameter. Optionally, you can upload the folder's contents under a subdirectory by specifying a -KeyPrefix applied to each item:
Write-S3Object -BucketName bucket -Folder .\Scripts -KeyPrefix Scripts/
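Putting the pieces together, a small automation script might upload only certain file types under a date-stamped prefix. This is a sketch under assumed names (the bucket, the .\Logs folder, and the logs/ prefix are placeholders):

```powershell
# Build a dated prefix such as "logs/2024-01-31/"
$prefix = "logs/$(Get-Date -Format 'yyyy-MM-dd')/"

# Upload only the .log files from a local folder under that prefix
Write-S3Object -BucketName bucket `
    -Folder .\Logs `
    -SearchPattern "*.log" `
    -KeyPrefix $prefix
```

Run on a schedule (for example, via Task Scheduler), this gives you a simple dated backup of a log directory.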
For documentation on the other S3-related cmdlets, you can read the reference for the AWS.Tools.S3 module.