# Testing S3 Upload with Sample Files — Presigned URLs & Multipart
Testing S3 uploads requires real files of various sizes. This guide shows how to test presigned URL uploads, multipart uploads, and error handling using sample files from TrueFileSize.
## Prerequisites
- AWS SDK v3 (`@aws-sdk/client-s3`, `@aws-sdk/s3-request-presigner`)
- Node.js 18+ or Python 3.10+
- LocalStack (optional, for local testing)
## Step 1: Download Test Files

```bash
mkdir -p test-files

# Small file for basic upload
curl -sL -o test-files/doc.pdf https://cdn.truefilesize.com/pdf/sample-1mb.pdf

# Medium file for presigned URL
curl -sL -o test-files/photo.jpg https://cdn.truefilesize.com/jpg/sample-3mb.jpg

# Large file for multipart upload
curl -sL -o test-files/video.mp4 https://cdn.truefilesize.com/mp4/sample-50mb.mp4
```
## Step 2: Test Basic S3 Upload (Node.js)

```typescript
// tests/s3-upload.test.ts
import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
import { readFileSync } from 'fs';

const s3 = new S3Client({
  region: 'us-east-1',
  // For LocalStack testing (Step 5), point at the local endpoint:
  ...(process.env.API_ENDPOINT && {
    endpoint: process.env.API_ENDPOINT,
    forcePathStyle: true,
  }),
});

const BUCKET = 'test-uploads';

describe('S3 Upload', () => {
  it('uploads a PDF file', async () => {
    const file = readFileSync('test-files/doc.pdf');

    await s3.send(new PutObjectCommand({
      Bucket: BUCKET,
      Key: 'uploads/test-doc.pdf',
      Body: file,
      ContentType: 'application/pdf',
    }));

    // Verify the upload round-trips with the same size and type
    const result = await s3.send(new GetObjectCommand({
      Bucket: BUCKET,
      Key: 'uploads/test-doc.pdf',
    }));
    expect(result.ContentLength).toBe(file.length);
    expect(result.ContentType).toBe('application/pdf');
  });
});
```
## Step 3: Test Presigned URL Upload

```typescript
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

it('uploads via presigned URL', async () => {
  // 1. Generate a presigned URL (server-side)
  const command = new PutObjectCommand({
    Bucket: BUCKET,
    Key: 'uploads/photo.jpg',
    ContentType: 'image/jpeg',
  });
  const presignedUrl = await getSignedUrl(s3, command, { expiresIn: 300 });

  // 2. Upload using the presigned URL (client-side simulation)
  const file = readFileSync('test-files/photo.jpg');
  const res = await fetch(presignedUrl, {
    method: 'PUT',
    body: file,
    headers: { 'Content-Type': 'image/jpeg' },
  });
  expect(res.status).toBe(200);

  // 3. Verify the file exists
  const obj = await s3.send(new GetObjectCommand({
    Bucket: BUCKET,
    Key: 'uploads/photo.jpg',
  }));
  expect(obj.ContentLength).toBe(file.length);
});
```
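A SigV4 presigned URL carries its lifetime in the query string (`X-Amz-Date` plus `X-Amz-Expires`), which lets a test assert that a URL is still valid, or already expired, without calling S3. A minimal sketch; the helper name is ours and the example URL below is fabricated, with the signature parameters omitted:

```typescript
// presigned-expiry.ts — read the expiry time out of a SigV4 presigned URL
function presignedExpiresAt(url: string): Date {
  const params = new URL(url).searchParams;
  const date = params.get('X-Amz-Date');    // e.g. 20240101T120000Z
  const ttl = params.get('X-Amz-Expires');  // lifetime in seconds
  if (!date || !ttl) throw new Error('not a SigV4 presigned URL');
  // Expand the compact timestamp into ISO 8601 so Date.parse accepts it
  const iso = date.replace(
    /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
    '$1-$2-$3T$4:$5:$6Z',
  );
  return new Date(Date.parse(iso) + Number(ttl) * 1000);
}

// Fabricated example URL (signature parameters omitted for brevity):
const url =
  'https://test-uploads.s3.amazonaws.com/uploads/photo.jpg' +
  '?X-Amz-Date=20240101T120000Z&X-Amz-Expires=300';
console.log(presignedExpiresAt(url).toISOString()); // 2024-01-01T12:05:00.000Z
```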
## Step 4: Test Multipart Upload (Large Files)

```typescript
import { Upload } from '@aws-sdk/lib-storage';
import { createReadStream } from 'fs';

it('uploads large file via multipart', async () => {
  const stream = createReadStream('test-files/video.mp4');

  const upload = new Upload({
    client: s3,
    params: {
      Bucket: BUCKET,
      Key: 'uploads/large-video.mp4',
      Body: stream,
      ContentType: 'video/mp4',
    },
    partSize: 5 * 1024 * 1024, // 5 MiB parts (S3's minimum part size)
    leavePartsOnError: false,  // abort and clean up parts on failure
  });

  // Track progress
  upload.on('httpUploadProgress', (progress) => {
    console.log(`Uploaded: ${progress.loaded}/${progress.total} bytes`);
  });

  await upload.done();

  // Verify the full file arrived
  const obj = await s3.send(new GetObjectCommand({
    Bucket: BUCKET,
    Key: 'uploads/large-video.mp4',
  }));
  expect(obj.ContentLength).toBeGreaterThanOrEqual(50_000_000);
}, 120_000); // 2-minute timeout for the large file
```
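When choosing `partSize`, keep S3's multipart limits in mind: every part except the last must be at least 5 MiB, and a single upload may have at most 10,000 parts. Those limits are S3's documented constraints; the helper below is just an illustrative sketch for sanity-checking a configuration before a test run:

```typescript
// multipart-math.ts — sanity-check a partSize against S3 multipart limits
const MIN_PART = 5 * 1024 * 1024; // 5 MiB minimum (except the final part)
const MAX_PARTS = 10_000;         // maximum parts per multipart upload

function partCount(fileSize: number, partSize: number): number {
  if (partSize < MIN_PART) throw new Error('partSize below the 5 MiB minimum');
  const parts = Math.ceil(fileSize / partSize);
  if (parts > MAX_PARTS) throw new Error('too many parts; raise partSize');
  return parts;
}

// The 50 MB video with 5 MiB parts, as configured in the test above:
console.log(partCount(50_000_000, 5 * 1024 * 1024)); // 10
```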
## Step 5: Test with LocalStack (No AWS Costs)

```yaml
# docker-compose.yml
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"
    environment:
      SERVICES: s3
```

```bash
# Create the test bucket
aws --endpoint-url=http://localhost:4566 s3 mb s3://test-uploads

# Run the tests against LocalStack
API_ENDPOINT=http://localhost:4566 npm test
```
## Test Matrix

| Test | File | Size | Approach |
|------|------|------|----------|
| Basic PUT | sample-1mb.pdf | 1 MB | `PutObjectCommand` |
| Presigned URL | sample-3mb.jpg | 3 MB | `getSignedUrl` + `fetch` |
| Multipart | sample-50mb.mp4 | 50 MB | `@aws-sdk/lib-storage` |
| Size limit | sample-100mb.bin | 100 MB | Verify rejection or success |
| Wrong content type | Rename .jpg to .pdf | 500 KB | Content validation |
| Corrupted file | sample-corrupt.pdf | 10 KB | Error handling |
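For the "wrong content type" and "corrupted file" rows, checking a file's magic bytes is more reliable than trusting its extension. A minimal sketch covering just the two formats used in this guide (the signatures are the standard PDF and JPEG ones; the helper name is ours):

```typescript
// magic-bytes.ts — detect file type from leading bytes, not the extension
const SIGNATURES: Record<string, number[]> = {
  'application/pdf': [0x25, 0x50, 0x44, 0x46], // "%PDF"
  'image/jpeg': [0xff, 0xd8, 0xff],
};

function sniffContentType(head: Buffer): string | null {
  for (const [type, sig] of Object.entries(SIGNATURES)) {
    if (sig.every((byte, i) => head[i] === byte)) return type;
  }
  return null; // unknown or corrupted header
}

console.log(sniffContentType(Buffer.from('%PDF-1.7 ...')));       // application/pdf
console.log(sniffContentType(Buffer.from([0xff, 0xd8, 0xff])));   // image/jpeg
console.log(sniffContentType(Buffer.from('not a real file')));    // null
```

In a test, read only the first few bytes of the uploaded object and assert that the sniffed type matches the declared `ContentType`.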
## Python (boto3)

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client('s3')

# Basic upload
s3.upload_file('test-files/doc.pdf', 'test-uploads', 'uploads/doc.pdf')

# Multipart upload (upload_file switches to multipart automatically
# once the file size crosses multipart_threshold)
s3.upload_file(
    'test-files/video.mp4',
    'test-uploads',
    'uploads/video.mp4',
    Config=TransferConfig(
        multipart_threshold=5 * 1024 * 1024,  # 5 MiB
        multipart_chunksize=5 * 1024 * 1024,
    ),
)
```
Download all sample files: TrueFileSize · Large test files