Claude Code AWS S3 Multipart Upload Workflow Guide
Large file uploads to Amazon S3 can be challenging when dealing with files exceeding 5GB or unstable network connections. AWS S3 multipart upload breaks large files into parts, enabling parallel uploads and resumable transfers. This guide shows you how to leverage Claude Code to create efficient multipart upload workflows that automate the entire process.
Understanding Multipart Upload Basics
Before diving into Claude Code workflows, let’s understand how multipart uploads work in AWS S3. When you upload a large object, S3 allows you to break it into parts (typically 5MB to 5GB each). Each part uploads independently, and you can even upload parts in parallel for faster throughput.
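The 5MB-5GB window matters because S3 also caps a single upload at 10,000 parts, so very large files force a larger part size. A small SDK-independent sketch (the 100MB default mirrors the scripts later in this guide) picks a part size that satisfies both limits:

```python
import math

MIN_PART = 5 * 1024 ** 2   # 5 MB minimum (except the final part)
MAX_PART = 5 * 1024 ** 3   # 5 GB maximum per part
MAX_PARTS = 10_000         # S3's hard cap on parts per upload

def choose_part_size(file_size, preferred=100 * 1024 ** 2):
    """Return a part size in bytes honoring S3's limits, preferring `preferred`."""
    size = max(preferred, MIN_PART)
    # Grow the part size if the preferred size would exceed 10,000 parts
    size = max(size, math.ceil(file_size / MAX_PARTS))
    if size > MAX_PART:
        raise ValueError("file too large for a single multipart upload")
    return size

# A 2 TB file cannot use 100 MB parts (that would need ~20,972 parts)
print(choose_part_size(2 * 1024 ** 4) // 1024 ** 2)  # part size in MB
```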
The multipart upload process follows three stages:
- Initiate Multipart Upload - Create an upload ID for tracking
- Upload Parts - Send each part with its sequence number
- Complete or Abort - Either combine all parts or clean up failed uploads
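The three stages map directly onto three S3 API calls; in boto3 they are `create_multipart_upload`, `upload_part`, and `complete_multipart_upload`. The sketch below runs the lifecycle against a minimal in-memory stub standing in for the real client, so the control flow is visible without touching AWS:

```python
class StubS3:
    """In-memory stand-in for an S3 client, just to show the call sequence."""
    def __init__(self):
        self.parts = {}
    def create_multipart_upload(self, Bucket, Key):
        return {"UploadId": "demo-upload-id"}
    def upload_part(self, Bucket, Key, UploadId, PartNumber, Body):
        self.parts[PartNumber] = Body
        return {"ETag": f'"etag-{PartNumber}"'}
    def complete_multipart_upload(self, Bucket, Key, UploadId, MultipartUpload):
        return {"Location": f"s3://{Bucket}/{Key}",
                "parts": len(MultipartUpload["Parts"])}

def multipart_upload(s3, bucket, key, chunks):
    # Stage 1: initiate and get an upload ID
    upload_id = s3.create_multipart_upload(Bucket=bucket, Key=key)["UploadId"]
    parts = []
    # Stage 2: upload each part with its 1-based sequence number
    for number, chunk in enumerate(chunks, start=1):
        etag = s3.upload_part(Bucket=bucket, Key=key, UploadId=upload_id,
                              PartNumber=number, Body=chunk)["ETag"]
        parts.append({"PartNumber": number, "ETag": etag})
    # Stage 3: complete with the collected part manifest
    return s3.complete_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id,
                                        MultipartUpload={"Parts": parts})

result = multipart_upload(StubS3(), "my-bucket", "big-file.bin", [b"aa", b"bb", b"cc"])
print(result["Location"])
```

Swapping the stub for `boto3.client('s3')` gives a working uploader, since the keyword arguments match boto3's real signatures.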
This is where Claude Code shines—it can manage the entire lifecycle, handle errors, and resume interrupted uploads automatically.
Setting Up AWS Credentials for Claude Code
Before creating multipart upload workflows, ensure your AWS credentials are configured properly. You can set up credentials using environment variables or AWS profiles:
```shell
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_DEFAULT_REGION="us-east-1"
```
Alternatively, use AWS profiles with the AWS_PROFILE environment variable. Claude Code can read these automatically when executing AWS CLI commands or calling the SDK.
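Both the AWS CLI and boto3 check explicit environment keys before falling back to profiles. A pure-stdlib sketch of that resolution order (the function name is hypothetical, for illustration only):

```python
import os

def credential_source(env=os.environ):
    """Mirror the AWS lookup order: explicit env keys win,
    then AWS_PROFILE, then the default profile."""
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return "environment"
    if env.get("AWS_PROFILE"):
        return f"profile:{env['AWS_PROFILE']}"
    return "profile:default"

print(credential_source({"AWS_PROFILE": "dev"}))  # profile:dev
```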
Creating a Claude Code Skill for Multipart Upload
Here’s a practical skill that handles multipart uploads efficiently:
```markdown
---
name: s3-multipart-upload
description: Upload large files to AWS S3 using multipart upload with automatic retry and resume capability
---

# S3 Multipart Upload Skill

This skill handles large file uploads to S3 using multipart upload for reliability and performance.

## Usage

When I need to upload a large file to S3, I'll use the following process:

1. First, I'll check the file size and determine the optimal part size
2. Initiate the multipart upload with the appropriate configuration
3. Upload parts in parallel where possible
4. Track progress and handle any failures gracefully
5. Complete or abort the upload based on success

## Key Commands

### Initiate Upload

    aws s3api create-multipart-upload --bucket BUCKET --key KEY --region REGION

### Upload Part

    aws s3api upload-part --bucket BUCKET --key KEY --upload-id ID --part-number N --body part-file

### Complete Upload

    aws s3api complete-multipart-upload --bucket BUCKET --key KEY --upload-id ID --multipart-upload file://parts.json
```
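The JSON file passed to `complete-multipart-upload` lists every part's number and the ETag returned by the corresponding `upload-part` call. Note that S3 ETags include literal double quotes, so they must be escaped in the JSON (the hex values below are illustrative placeholders):

```json
{
  "Parts": [
    { "PartNumber": 1, "ETag": "\"d41d8cd98f00b204e9800998ecf8427e\"" },
    { "PartNumber": 2, "ETag": "\"9e107d9d372bb6826bd81d3542a419d6\"" }
  ]
}
```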
Practical Example: Automated Upload Script
Here’s a practical bash script that Claude Code can use to perform multipart uploads:
```shell
#!/bin/bash
# S3 Multipart Upload Script
set -euo pipefail

BUCKET=$1
FILE_PATH=$2
KEY=$3
REGION=${4:-us-east-1}
PART_SIZE=${5:-100} # Part size in MB

# Calculate number of parts (GNU stat uses -c%s, BSD/macOS stat uses -f%z)
FILE_SIZE=$(stat -c%s "$FILE_PATH" 2>/dev/null || stat -f%z "$FILE_PATH")
PART_SIZE_BYTES=$((PART_SIZE * 1024 * 1024))
NUM_PARTS=$(( (FILE_SIZE + PART_SIZE_BYTES - 1) / PART_SIZE_BYTES ))

echo "File size: $FILE_SIZE bytes"
echo "Part size: $PART_SIZE MB"
echo "Number of parts: $NUM_PARTS"

# Initiate multipart upload
UPLOAD_ID=$(aws s3api create-multipart-upload \
  --bucket "$BUCKET" \
  --key "$KEY" \
  --region "$REGION" \
  --query UploadId --output text)
echo "Upload ID: $UPLOAD_ID"

# Upload each part, collecting ETags for the completion manifest
PARTS_JSON=""
for i in $(seq 1 "$NUM_PARTS"); do
  # Extract this part to a temp file (upload-part needs a seekable file, not a pipe)
  dd if="$FILE_PATH" of=/tmp/part.bin bs=1M \
    skip=$(( (i - 1) * PART_SIZE )) count="$PART_SIZE" 2>/dev/null
  ETAG=$(aws s3api upload-part \
    --bucket "$BUCKET" \
    --key "$KEY" \
    --upload-id "$UPLOAD_ID" \
    --part-number "$i" \
    --body /tmp/part.bin \
    --region "$REGION" \
    --query ETag --output text)
  PARTS_JSON+="{\"PartNumber\": $i, \"ETag\": $ETAG},"
  echo "Uploaded part $i of $NUM_PARTS"
done
rm -f /tmp/part.bin
# Trim the trailing comma and wrap the part list into the completion manifest
echo "{\"Parts\": [${PARTS_JSON%,}]}" > /tmp/parts.json

# Complete multipart upload
aws s3api complete-multipart-upload \
  --bucket "$BUCKET" \
  --key "$KEY" \
  --upload-id "$UPLOAD_ID" \
  --multipart-upload file:///tmp/parts.json \
  --region "$REGION"
```
Handling Large Files with Resume Capability
One of the most valuable features you can add to multipart upload workflows is resume capability. Here’s how Claude Code can help manage this:
```python
import json
import os

import boto3

class MultipartUploader:
    def __init__(self, bucket, key, part_size_mb=100):
        self.bucket = bucket
        self.key = key
        self.part_size = part_size_mb * 1024 * 1024
        self.s3 = boto3.client('s3')
        self.upload_id = None
        self.parts = []
        # Flatten the key so nested keys like "backups/db.tar" don't break the path
        safe_key = key.replace('/', '_')
        self.state_file = f"/tmp/{safe_key}.upload_state"

    def load_state(self):
        if os.path.exists(self.state_file):
            with open(self.state_file, 'r') as f:
                state = json.load(f)
            self.upload_id = state['upload_id']
            self.parts = state['parts']
            return True
        return False

    def save_state(self):
        state = {
            'upload_id': self.upload_id,
            'parts': self.parts
        }
        with open(self.state_file, 'w') as f:
            json.dump(state, f)

    def upload_part(self, part_number, data):
        # Skip parts already recorded in the state file
        for part in self.parts:
            if part['PartNumber'] == part_number:
                print(f"Part {part_number} already uploaded")
                return part['ETag']
        response = self.s3.upload_part(
            Bucket=self.bucket,
            Key=self.key,
            PartNumber=part_number,
            UploadId=self.upload_id,
            Body=data
        )
        etag = response['ETag']
        self.parts.append({'PartNumber': part_number, 'ETag': etag})
        self.save_state()  # persist after every part so the upload can resume
        return etag
```
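The resume logic rests on one invariant: the state file always reflects every part S3 has acknowledged. A stripped-down, filesystem-only version of that round trip (no S3 calls; the helper names are hypothetical) shows how a restart discovers what remains:

```python
import json
import os
import tempfile

def save_state(path, upload_id, parts):
    # Persist after every acknowledged part so a crash loses no progress
    with open(path, "w") as f:
        json.dump({"upload_id": upload_id, "parts": parts}, f)

def load_state(path):
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return json.load(f)

def pending_parts(total, uploaded):
    # Parts to (re)upload after a restart: everything not yet acknowledged
    done = {p["PartNumber"] for p in uploaded}
    return [n for n in range(1, total + 1) if n not in done]

state_file = os.path.join(tempfile.mkdtemp(), "demo.upload_state")
save_state(state_file, "upload-123", [{"PartNumber": 1, "ETag": '"e1"'}])
state = load_state(state_file)
print(pending_parts(4, state["parts"]))  # parts 2-4 remain
```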
Best Practices for Claude Code S3 Workflows
When implementing multipart upload workflows with Claude Code, consider these best practices:
- Choose the right part size - For files under 5GB, a single PUT is simpler. For larger files, 100-500MB parts give good throughput while keeping the retry cost per part low and staying well under S3's 10,000-part limit.

- Enable parallel uploads - Claude Code can orchestrate multiple part uploads simultaneously using background processes:

```shell
# Upload parts in parallel (upload_part is a shell function wrapping aws s3api upload-part)
for i in {1..10}; do
  upload_part $i &
done
wait
```

- Implement proper cleanup - Always abort multipart uploads that don't complete; abandoned parts keep accruing storage charges until aborted. A bucket lifecycle rule with AbortIncompleteMultipartUpload can automate this:

```shell
aws s3api abort-multipart-upload \
  --bucket BUCKET \
  --key KEY \
  --upload-id UPLOAD_ID
```

- Use S3 Transfer Acceleration - For geographically distributed uploads, enable acceleration on the bucket, then point subsequent commands at the accelerate endpoint:

```shell
aws s3api put-bucket-accelerate-configuration \
  --bucket BUCKET \
  --accelerate-configuration Status=Enabled

aws s3api create-multipart-upload \
  --bucket BUCKET \
  --key KEY \
  --acl bucket-owner-full-control \
  --endpoint-url https://BUCKET.s3-accelerate.amazonaws.com
```
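On the SDK side, the parallel-upload practice maps naturally onto a thread pool: part uploads are independent, so completion order doesn't matter as long as the manifest is sorted by part number before completing (S3 rejects an out-of-order part list). A sketch with a stand-in upload function in place of a real `s3.upload_part` call:

```python
from concurrent.futures import ThreadPoolExecutor

def upload_part(part_number, data):
    # Stand-in for s3.upload_part; returns this part's manifest entry
    return {"PartNumber": part_number, "ETag": f'"etag-{part_number}"'}

chunks = {n: bytes(2) for n in range(1, 11)}  # ten fake 2-byte parts

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(upload_part, n, data) for n, data in chunks.items()]
    parts = [f.result() for f in futures]

# complete_multipart_upload requires parts in ascending PartNumber order
parts.sort(key=lambda p: p["PartNumber"])
print(len(parts))
```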
Conclusion
Claude Code transforms AWS S3 multipart uploads from complex manual processes into automated, reliable workflows. By combining Claude Code’s skill system with AWS CLI or SDK capabilities, you can build robust upload handlers that handle large files efficiently, recover from failures automatically, and scale to meet production demands.
Start by creating a basic skill following the examples above, then extend it with error handling, progress tracking, and resume capability as your requirements grow. With Claude Code managing your S3 workflows, you can focus on your application logic while it handles the intricacies of large file transfers.
Related Reading
- Claude Code for Beginners: Complete Getting Started Guide
- Best Claude Skills for Developers in 2026
- Claude Skills Guides Hub
Built by theluckystrike — More at zovo.one