AWS S3
Receive S3 object notifications as webhooks, or upload incoming webhook data to S3 buckets, using a Webhook Relay AWS service connection. This page covers the S3 output: connecting Webhook Relay to Amazon S3 so that incoming webhook data is stored as S3 objects.
Prerequisites
- An AWS service connection with credentials that have S3 permissions
- An S3 bucket in your AWS account
IAM Permissions
For the S3 output (uploading objects), the connection's credentials need:
- `s3:PutObject`
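As a sketch, a minimal IAM policy granting only this permission could look like the following (the bucket name is a placeholder; scope the `Resource` to your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-archive-bucket/*"
    }
  ]
}
```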
S3 Output — Upload Webhook Data to S3
S3 outputs store incoming webhook data as objects in your S3 bucket. Each webhook is saved as a separate file.
Configuration
| Field | Required | Description |
|---|---|---|
| `bucket_name` | Yes | S3 bucket name |
| `region` | Yes | AWS region (e.g. `us-east-1`) |
| `prefix` | No | Key prefix for uploaded objects (e.g. `webhooks/`) |
| `file_format` | No | Storage format: `json` (default), `body_only`, or `har` |
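As a hedged illustration only (the exact schema depends on whether you configure the output via the web UI, CLI, or API, and may differ), an S3 output definition with these fields could look like:

```json
{
  "bucket_name": "my-archive-bucket",
  "region": "us-east-1",
  "prefix": "webhooks/",
  "file_format": "json"
}
```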
Object Path
Objects are stored under a date-based key (note that the prefix, e.g. `webhooks/`, includes its own trailing slash):

`{prefix}{year}/{month}/{day}/{log_id}.json`

For example: `webhooks/2026/02/24/whl_abc123.json`
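To make the key layout concrete, here is a small sketch of how such a date-based key could be derived. This is illustrative only, not Webhook Relay's actual implementation; the `objectKey` helper and its parameters are hypothetical:

```javascript
// Hypothetical helper: build a date-based S3 object key.
// prefix is assumed to include its trailing slash (e.g. "webhooks/").
function objectKey(prefix, date, logId, ext) {
  const pad = (n) => String(n).padStart(2, "0");
  const year = date.getUTCFullYear();
  const month = pad(date.getUTCMonth() + 1);
  const day = pad(date.getUTCDate());
  return `${prefix}${year}/${month}/${day}/${logId}.${ext}`;
}

// Example: webhooks/2026/02/24/whl_abc123.json
const key = objectKey("webhooks/", new Date(Date.UTC(2026, 1, 24)), "whl_abc123", "json");
```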
Example: Bridge GCP Pub/Sub to AWS S3
You can receive messages from a GCP Pub/Sub subscription and archive them as objects in an S3 bucket. This is useful for cross-cloud data archival:
- Create a GCP service connection with Pub/Sub subscriber permissions
- Create an AWS service connection with S3 write permissions
- Create a bucket in Webhook Relay
- Add a GCP Pub/Sub input on the bucket (messages flow in)
- Add an AWS S3 output on the bucket (messages get stored as objects)
Every message published to your Pub/Sub topic will automatically be archived as an S3 object.
Transform Before Storing
Attach a Function to the bucket to transform the payload before it reaches S3. For example, extract only the relevant fields from a Pub/Sub message:
```javascript
// Parse the incoming Pub/Sub message
const message = JSON.parse(r.body)

// Extract just the fields you need
const simplified = {
  event_type: message.attributes.event_type,
  data: message.data,
  timestamp: message.publish_time
}

// Replace the request body with the simplified payload
r.setBody(JSON.stringify(simplified))
```
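One detail worth checking in your own setup: Pub/Sub delivers the message payload base64-encoded in the `data` field, so you may want to decode it before archiving plain text in S3. A minimal sketch, assuming a Node-style runtime with `Buffer` (the example message value is made up; if your Function runtime only offers `atob()`, use that instead):

```javascript
// Example Pub/Sub-style field with base64-encoded data (made-up value)
const message = { data: "aGVsbG8gd29ybGQ=" }

// Decode the payload before storing it as a readable S3 object
const decoded = Buffer.from(message.data, "base64").toString("utf8")
// decoded === "hello world"
```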
See the JSON encoding guide for more transformation examples.
