Duplicating files from one AWS S3 bucket to another is a common task when migrating data, creating backups, or deploying content across environments. This guide walks you through methods for copying objects from one bucket to another efficiently.
Prerequisites
Before you begin, ensure that you have:
An AWS account
AWS CLI installed on your system
IAM credentials with `s3:GetObject` access on the source bucket and `s3:PutObject` access on the destination bucket
Both buckets created in the same or compatible AWS regions
Method 1: Using AWS CLI
Step 1: Install and Configure AWS CLI
If not already installed:
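One common way to install it is the AWS CLI v2 installer for 64-bit Linux (see the AWS documentation for macOS and Windows installers):

```shell
# Download and unpack the official AWS CLI v2 installer for 64-bit Linux
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
# Install system-wide (requires sudo)
sudo ./aws/install
# Verify the installation
aws --version
```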
Then configure:
aws configure
Enter your Access Key, Secret Key, region (e.g. `af-south-1`), and output format (`json` recommended).
Step 2: Use the `sync` Command
To copy all files and folders from one bucket to another:
aws s3 sync s3://source-bucket-name s3://destination-bucket-name --region your-region
📌 Example:
aws s3 sync s3://my-old-bucket s3://my-new-bucket --region af-south-1
Optional Flags:
`--delete`: Remove files in the destination that don’t exist in the source
`--exclude "*.tmp"`: Skip specific files or patterns
`--dryrun`: Test the command before executing it
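For example, to preview a sync that skips temporary files and removes stale objects from the destination (the bucket names below are placeholders), combine the flags:

```shell
# Preview only: --dryrun lists what would be copied or deleted without doing it
aws s3 sync s3://my-old-bucket s3://my-new-bucket \
  --exclude "*.tmp" \
  --delete \
  --dryrun
```

Once the preview looks right, rerun the command without `--dryrun` to perform the actual transfer.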
Method 2: Copying a Specific File
To copy a single file:
aws s3 cp s3://source-bucket/path/to/file.pdf s3://destination-bucket/path/to/file.pdf
You can also use `--recursive` to copy entire folders:
aws s3 cp s3://source-bucket/folder/ s3://destination-bucket/folder/ --recursive
Common Issues
| Issue | Solution |
|---|---|
| Access denied errors | Ensure your IAM user has `s3:GetObject` on the source bucket and `s3:PutObject` on the destination bucket |
| Slow transfers | Raise the CLI's parallelism with `aws configure set default.s3.max_concurrent_requests 20` |
| Bucket in different region | Add the `--region` flag to your command |
IAM Policy Example
Here’s an example IAM policy for copying between buckets:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::source-bucket/*", "arn:aws:s3:::source-bucket"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": ["arn:aws:s3:::destination-bucket/*"]
    }
  ]
}
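Once saved to a file, a policy like this can be attached to an IAM user from the CLI (the user name, policy name, and file name below are placeholders):

```shell
# Attach the JSON policy as an inline policy on an IAM user
aws iam put-user-policy \
  --user-name s3-copy-user \
  --policy-name s3-copy-policy \
  --policy-document file://s3-copy-policy.json
```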
If you need to automate this process or integrate it into a deployment pipeline, feel free to contact our DevOps team for advanced scripting assistance.
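As a starting point for such a pipeline, here is a minimal sketch of a wrapper that assembles the sync command; the function name and bucket names are hypothetical, and for safety it only prints the command with `--dryrun` rather than executing it:

```shell
#!/usr/bin/env bash
set -euo pipefail

# build_sync_cmd: assemble an `aws s3 sync` command from bucket names and a region.
# Hypothetical helper; it prints the command instead of executing it.
build_sync_cmd() {
  local src="$1" dst="$2" region="$3"
  echo "aws s3 sync s3://${src} s3://${dst} --region ${region} --dryrun"
}

# Print the command for review; pipe the output to `bash` to actually run it.
build_sync_cmd my-old-bucket my-new-bucket af-south-1
```

Dropping `--dryrun` from the builder (or post-processing its output) turns the preview into a real copy, which makes the same script usable for both validation and deployment stages.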
Still need help?
Contact us