AWS S3
Connect Webhook Relay to Amazon S3 through an AWS service connection, either to receive S3 object notifications as webhooks (input) or to store incoming webhook data as S3 objects (output).
Prerequisites
- An AWS service connection with credentials that have S3 permissions
- An S3 bucket in your AWS account
IAM Permissions
For S3 Input (receive notifications):
- s3:GetBucketNotificationConfiguration
- s3:PutBucketNotificationConfiguration
- s3:GetObject
- s3:ListBucket
For S3 Output (upload objects):
s3:PutObject
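The permissions above can be granted with a standard IAM policy. The sketch below combines both statements; the bucket name `my-bucket` is a placeholder for your own bucket, and you can drop either statement if you only use inputs or only outputs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "WebhookRelayS3Input",
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketNotificationConfiguration",
        "s3:PutBucketNotificationConfiguration",
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ]
    },
    {
      "Sid": "WebhookRelayS3Output",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while the object-level actions apply to `my-bucket/*`.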
S3 Input — Receive Object Notifications
S3 inputs relay notifications when objects are created or modified in your S3 bucket. Each notification is delivered as a webhook event into your Webhook Relay bucket.
Configuration
| Field | Required | Description |
|---|---|---|
| bucket_name | Yes | S3 bucket name |
| region | Yes | AWS region (e.g. us-east-1) |
| prefix | No | Filter objects by key prefix (e.g. uploads/) |
| file_format | No | How to read files: json (default), body_only, har |
File Formats
| Format | Description |
|---|---|
| json | Full webhook data including HTTP method, headers, query params, and body (default) |
| body_only | Only the raw file content — no headers or metadata |
| har | HTTP Archive format (HAR 1.2 spec) |
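To make the json format concrete, a stored object in that format would look roughly like the sketch below. The exact field names are illustrative, not a guaranteed schema — the point is that the full request context travels with the body:

```json
{
  "method": "POST",
  "headers": { "Content-Type": ["application/json"] },
  "query": "source=ci",
  "body": "{\"hello\": \"world\"}"
}
```

With body_only, only the string in the body field above would be stored.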
S3 Output — Upload Webhook Data to S3
S3 outputs store incoming webhook data as objects in your S3 bucket. Each webhook is saved as a separate file.
Configuration
| Field | Required | Description |
|---|---|---|
| bucket_name | Yes | S3 bucket name |
| region | Yes | AWS region |
| prefix | No | Key prefix for uploaded objects (e.g. webhooks/) |
| file_format | No | Storage format: json (default), body_only, har |
Object Path
Objects are stored with a date-based path:
{prefix}/{year}/{month}/{day}/{log_id}.json
For example: webhooks/2026/02/24/whl_abc123.json
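The path scheme above can be sketched as a small helper. This is illustrative only — the service builds the key internally — and the `prefix` and `logId` names are assumptions for the example; dates are taken in UTC:

```javascript
// Build a date-based S3 object key: {prefix}/{year}/{month}/{day}/{log_id}.json
function objectKey(prefix, logId, date) {
  const pad = (n) => String(n).padStart(2, "0");
  const year = date.getUTCFullYear();
  const month = pad(date.getUTCMonth() + 1); // getUTCMonth() is zero-based
  const day = pad(date.getUTCDate());
  return `${prefix}/${year}/${month}/${day}/${logId}.json`;
}

console.log(objectKey("webhooks", "whl_abc123", new Date(Date.UTC(2026, 1, 24))));
// → webhooks/2026/02/24/whl_abc123.json
```

The zero-padded month and day keep keys lexicographically sortable, so listing a prefix returns objects in chronological order.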
Example: Bridge GCP Pub/Sub to AWS S3
You can receive messages from a GCP Pub/Sub subscription and archive them as objects in an S3 bucket. This is useful for cross-cloud data archival:
- Create a GCP service connection with Pub/Sub subscriber permissions
- Create an AWS service connection with S3 write permissions
- Create a bucket in Webhook Relay
- Add a GCP Pub/Sub input on the bucket (messages flow in)
- Add an AWS S3 output on the bucket (messages get stored as objects)
Every message published to your Pub/Sub topic will automatically be archived as an S3 object.
Transform Before Storing
Attach a Function to the bucket to transform the payload before it reaches S3. For example, extract only the relevant fields from a Pub/Sub message:
```javascript
// Parse the incoming Pub/Sub message from the webhook body
const message = JSON.parse(r.body)

// Extract just the data you need
const simplified = {
  event_type: message.attributes.event_type,
  data: message.data,
  timestamp: message.publish_time
}

// Replace the body so only the simplified payload reaches S3
r.setBody(JSON.stringify(simplified))
```
See the JSON encoding guide for more transformation examples.
Example: S3 Event Notifications to Any HTTPS API
Forward S3 object notifications to any API endpoint. This eliminates the need to set up Lambda functions or SNS topics for simple webhook delivery:
- Create an AWS service connection
- Create a bucket with an S3 input pointing to your AWS bucket
- Add a public destination (any HTTPS URL) as an output
When new objects are uploaded to S3, Webhook Relay delivers the notification to your API. You can also use Functions to filter events by key prefix, transform the payload format, or add custom authentication headers before forwarding.
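As a sketch of prefix filtering, the predicate below inspects a standard S3 event notification (the `Records[].s3.object.key` shape is AWS's documented format) and reports whether any record matches an allowed prefix. How you wire the result into a Function's stop/forward decision depends on the Functions runtime, so only the pure check is shown:

```javascript
// Return true if any S3 record in the notification matches the allowed key prefix
function shouldForward(eventBody, allowedPrefix) {
  const event = JSON.parse(eventBody);
  return (event.Records || []).some(
    (rec) => rec.s3 && rec.s3.object.key.startsWith(allowedPrefix)
  );
}

// Minimal S3 notification with a single record
const body = JSON.stringify({
  Records: [{ s3: { object: { key: "uploads/report.csv" } } }]
});

console.log(shouldForward(body, "uploads/")); // → true
console.log(shouldForward(body, "images/")); // → false
```

A payload with no Records array (for example the s3:TestEvent sent when notifications are first configured) is treated as non-matching rather than throwing.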
