Overview
Email attachments can range from a few kilobytes (a text file) to 25 MB (the maximum email size). Large attachments — PDFs, images, spreadsheets, CAD files — require different handling from small attachments to avoid memory exhaustion, webhook timeouts, and storage inefficiencies.
JsonHook's architecture is designed for this: attachment metadata (filename, size, content type) is included in the webhook payload without the binary content, keeping webhook payloads small and fast. Binary content is available via a separate download API call, which you can make asynchronously in a background job after acknowledging the webhook.
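As a sketch, a metadata-only payload of this kind might look like the following in TypeScript. The field names here are assumptions for illustration; check your actual payloads against the API reference:

```typescript
// Hypothetical shape of attachment metadata in a webhook payload.
// The binary content is NOT included -- only a few descriptive fields.
interface AttachmentMeta {
  id: string;
  filename: string;
  contentType: string;
  size: number; // bytes
}

interface EmailWebhookPayload {
  deliveryId: string;
  subject: string;
  attachments: AttachmentMeta[];
}

// Even for a 25 MB attachment, only a small metadata record
// crosses the wire with the webhook itself.
const example: EmailWebhookPayload = {
  deliveryId: "dlv_123",
  subject: "Invoice attached",
  attachments: [
    {
      id: "att_1",
      filename: "invoice.pdf",
      contentType: "application/pdf",
      size: 18_000_000,
    },
  ],
};
```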
Best practices for large attachment handling:
- Acknowledge the webhook immediately and download attachments asynchronously
- Stream large files directly to cloud storage (S3, GCS) rather than buffering in memory
- Validate file size and content type before downloading
- Use signed URLs for downstream access rather than serving files yourself
Prerequisites
Requirements for large attachment handling:
- A JsonHook Pro plan (attachment content download requires Pro)
- A cloud storage service: AWS S3, Google Cloud Storage, or Cloudflare R2
- The AWS SDK, GCS client, or equivalent for your language
- A queue for async processing (BullMQ, SQS, etc.) to download attachments outside the webhook handler
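The acknowledge-then-download flow can be sketched as follows. Here `enqueueDownload` is a stand-in for your real queue client (BullMQ's `queue.add`, SQS `SendMessage`, or equivalent), and the payload shape is assumed:

```typescript
type AttachmentMeta = { id: string; filename: string; size: number };
type WebhookPayload = { deliveryId: string; attachments: AttachmentMeta[] };

// Stand-in for a real queue client; in production this would be
// something like BullMQ's queue.add() or an SQS SendMessage call.
const pending: { deliveryId: string; attachmentId: string }[] = [];
async function enqueueDownload(deliveryId: string, attachmentId: string) {
  pending.push({ deliveryId, attachmentId });
}

// Webhook handler body: enqueue a job per attachment, then acknowledge.
// The actual download happens later in a background worker.
async function handleWebhook(payload: WebhookPayload): Promise<number> {
  for (const attachment of payload.attachments) {
    await enqueueDownload(payload.deliveryId, attachment.id);
  }
  return 200; // respond immediately; handler time stays in milliseconds
}
```

The key design point is that the handler does no I/O beyond the enqueue itself, so response time is independent of attachment size.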
Step-by-Step Instructions
Handle large attachments safely:
- In the webhook handler, filter large attachments by size:

const largeAttachments = email.attachments.filter(a => a.size > 1_000_000); // > 1 MB
const smallAttachments = email.attachments.filter(a => a.size <= 1_000_000);

- Enqueue download jobs for large attachments and return 200 immediately.
- In your background worker, download and stream to S3:

// Download using Node.js fetch streaming
const downloadRes = await fetch(attachmentDownloadUrl, { headers: { Authorization: ... } });
// Pipe directly to S3 using Upload from @aws-sdk/lib-storage

- Validate file content after download using a magic-byte check.
- Generate a signed URL for downstream access and store the URL in your database.
- Handle download failures with retry logic; attachment content expires after the retention period.
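The magic-byte validation mentioned above can be sketched like this. The signature table covers only a few common types and is one reasonable starting point, not an exhaustive check:

```typescript
// Minimal magic-byte sniffer for a few common attachment types.
// This validates actual content: both the Content-Type header and the
// file extension can be spoofed by the sender, but the leading bytes
// of the file are much harder to fake convincingly.
const SIGNATURES: { type: string; bytes: number[] }[] = [
  { type: "application/pdf", bytes: [0x25, 0x50, 0x44, 0x46] }, // "%PDF"
  { type: "image/png", bytes: [0x89, 0x50, 0x4e, 0x47] },       // ".PNG"
  { type: "image/jpeg", bytes: [0xff, 0xd8, 0xff] },
];

function sniffContentType(head: Uint8Array): string | null {
  for (const sig of SIGNATURES) {
    if (sig.bytes.every((b, i) => head[i] === b)) return sig.type;
  }
  return null; // unknown or unsupported type
}
```

After download, compare the sniffed type against the `contentType` declared in the webhook payload and reject (or quarantine) the file on mismatch.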
Code Example
Streaming attachment download to S3 using AWS SDK v3 multipart upload:
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
import fetch from "node-fetch";
const s3 = new S3Client({ region: "us-east-1" });
async function streamAttachmentToS3(
deliveryId: string,
attachment: { id: string; filename: string; contentType: string; size: number }
): Promise<string> {
// Enforce size limit
const MAX_SIZE = 25 * 1024 * 1024; // 25 MB
if (attachment.size > MAX_SIZE) {
throw new Error(`Attachment too large: ${attachment.size} bytes`);
}
// Download from JsonHook
const res = await fetch(
`https://api.jsonhook.com/v1/deliveries/${deliveryId}/attachments/${attachment.id}`,
{ headers: { Authorization: `Bearer ${process.env.JSONHOOK_API_KEY}` } }
);
if (!res.ok) throw new Error(`Download failed: ${res.status}`);
const s3Key = `attachments/${deliveryId}/${attachment.filename}`;
// Stream directly to S3 without buffering entire file in memory
const upload = new Upload({
client: s3,
params: {
Bucket: "my-email-attachments",
Key: s3Key,
Body: res.body, // Node.js readable stream
ContentType: attachment.contentType,
ContentDisposition: `attachment; filename="${attachment.filename}"`,
},
});
await upload.done();
return `https://my-email-attachments.s3.amazonaws.com/${s3Key}`;
}
Common Pitfalls
Large attachment handling pitfalls:
- Buffering the entire attachment in memory. A 25 MB attachment base64-encoded becomes ~34 MB in memory. Stream from JsonHook's download API directly to S3 using pipe/stream APIs rather than reading the full response into a Buffer.
- Downloading in the webhook handler synchronously. A 25 MB download takes several seconds — far too long for a webhook handler. Always enqueue download jobs and acknowledge the webhook immediately.
- Not checking attachment size before downloading. Validate the size field from the webhook payload before initiating a download to enforce your application's size limits without making an HTTP request.
- Trusting the filename from the email. Sanitize filenames before using them as S3 keys: remove path traversal sequences, replace spaces with underscores, and limit to a safe character set.
- Not handling download expiry. Attachment content is only retained for 7 or 30 days depending on your plan. If your background job is delayed and the retention period expires, the download will fail. Download attachments within 24 hours of receipt to provide a safe margin.
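A filename sanitizer along the lines described in the pitfalls above might look like the following. The specific rules here are one reasonable choice, not a JsonHook requirement:

```typescript
// Sanitize an email-supplied filename before using it as an S3 key segment.
function sanitizeFilename(name: string): string {
  // Drop any path components to defeat "../" traversal attempts.
  const base = name.split(/[/\\]/).pop() ?? "attachment";
  return (
    base
      .replace(/\s+/g, "_")            // spaces -> underscores
      .replace(/[^A-Za-z0-9._-]/g, "") // conservative character allowlist
      .replace(/^\.+/, "")             // strip leading dots (hidden files)
      .slice(0, 255)                   // cap length for key safety
    || "attachment"                    // fallback if nothing survives
  );
}
```

For example, `sanitizeFilename("../../etc/passwd")` reduces to the bare basename, and parenthesized, space-laden names collapse to a safe underscore-separated form.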