
AWS S3

Connect Webhook Relay to Amazon S3 through a service connection to receive S3 object notifications as webhooks (input) or store incoming webhook data as S3 objects (output).

Prerequisites

IAM Permissions

For S3 Input (receive notifications):

  • s3:GetBucketNotificationConfiguration
  • s3:PutBucketNotificationConfiguration
  • s3:GetObject
  • s3:ListBucket

For S3 Output (upload objects):

  • s3:PutObject
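The permissions above can be expressed as an IAM policy attached to the user or role behind the service connection. The sketch below is illustrative, not a required policy: it assumes a bucket named my-bucket (replace with your own) and combines the input and output permissions, with bucket-level actions scoped to the bucket ARN and object-level actions scoped to its objects:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketNotificationConfiguration",
        "s3:PutBucketNotificationConfiguration",
        "s3:ListBucket"
      ],
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

If you only need one direction, drop the statements for the other: input-only connections do not need s3:PutObject, and output-only connections need only the s3:PutObject statement.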

S3 Input — Receive Object Notifications

S3 inputs relay notifications when objects are created or modified in your S3 bucket. Each notification is delivered as a webhook event into your Webhook Relay bucket.

Configuration

  • bucket_name (required): S3 bucket name
  • region (required): AWS region (e.g. us-east-1)
  • prefix (optional): filter objects by key prefix (e.g. uploads/)
  • file_format (optional): how to read files: json (default), body_only, or har

File Formats

  • json: full webhook data including HTTP method, headers, query params, and body (default)
  • body_only: only the raw file content, with no headers or metadata
  • har: HTTP Archive format (HAR 1.2 spec)
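To make the difference concrete, a file read with the json format carries the full webhook envelope, roughly like the sketch below. The field names here are illustrative only, not a guaranteed schema:

```json
{
  "method": "POST",
  "headers": { "Content-Type": ["application/json"] },
  "query": "source=ci",
  "body": "{\"status\": \"ok\"}"
}
```

With body_only, the same file would contain just the raw body string and nothing else.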

S3 Output — Upload Webhook Data to S3

S3 outputs store incoming webhook data as objects in your S3 bucket. Each webhook is saved as a separate file.

Configuration

  • bucket_name (required): S3 bucket name
  • region (required): AWS region (e.g. us-east-1)
  • prefix (optional): key prefix for uploaded objects (e.g. webhooks/)
  • file_format (optional): storage format: json (default), body_only, or har

Object Path

Objects are stored with a date-based path:

{prefix}/{year}/{month}/{day}/{log_id}.json

For example: webhooks/2026/02/24/whl_abc123.json
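The path scheme can be sketched as a small helper. objectKey is a hypothetical name used only to make the layout concrete; it is not part of the Webhook Relay API:

```javascript
// Hypothetical helper illustrating the date-based key layout
// {prefix}/{year}/{month}/{day}/{log_id}.json. Not part of the
// Webhook Relay API; shown only to make the scheme concrete.
function objectKey(prefix, date, logId) {
  const year = date.getUTCFullYear();
  const month = String(date.getUTCMonth() + 1).padStart(2, "0");
  const day = String(date.getUTCDate()).padStart(2, "0");
  return `${prefix}/${year}/${month}/${day}/${logId}.json`;
}

// → "webhooks/2026/02/24/whl_abc123.json"
console.log(objectKey("webhooks", new Date(Date.UTC(2026, 1, 24)), "whl_abc123"));
```

Because the date is the leading part of the key after the prefix, listing a single day's webhooks is a simple prefix query (e.g. webhooks/2026/02/24/).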

Example: Bridge GCP Pub/Sub to AWS S3

You can receive messages from a GCP Pub/Sub subscription and archive them as objects in an S3 bucket. This is useful for cross-cloud data archival:

  1. Create a GCP service connection with Pub/Sub subscriber permissions
  2. Create an AWS service connection with S3 write permissions
  3. Create a bucket in Webhook Relay
  4. Add a GCP Pub/Sub input on the bucket (messages flow in)
  5. Add an AWS S3 output on the bucket (messages get stored as objects)

Every message published to your Pub/Sub topic will automatically be archived as an S3 object.

Transform Before Storing

Attach a Function to the bucket to transform the payload before it reaches S3. For example, extract only the relevant fields from a Pub/Sub message:

const message = JSON.parse(r.body)

// Extract just the data you need
const simplified = {
    event_type: message.attributes.event_type,
    data: message.data,
    timestamp: message.publish_time
}

r.setBody(JSON.stringify(simplified))

See the JSON encoding guide for more transformation examples.

Example: S3 Event Notifications to Any HTTPS API

Forward S3 object notifications to any API endpoint. This eliminates the need to set up Lambda functions or SNS topics for simple webhook delivery:

  1. Create an AWS service connection
  2. Create a bucket with an S3 input pointing to your AWS bucket
  3. Add a public destination (any HTTPS URL) as an output

When new objects are uploaded to S3, Webhook Relay delivers the notification to your API. You can also use Functions to filter events by key prefix, transform the payload format, or add custom authentication headers before forwarding.
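The prefix-filtering idea can be sketched as a plain helper. The payload shape (a Records array with s3.object.key entries) follows AWS's standard S3 event notification format; filterByPrefix is an illustrative name, and inside a Function you would apply the result with r.setBody, as in the transform example above:

```javascript
// Illustrative filter for S3 event notifications (AWS's standard
// "Records" envelope). filterByPrefix is a hypothetical helper, not a
// Webhook Relay API; inside a Function you might apply it with
// r.setBody(filterByPrefix(r.body, "uploads/")).
function filterByPrefix(body, prefix) {
  const event = JSON.parse(body);
  const records = (event.Records || []).filter(
    rec => rec.s3.object.key.startsWith(prefix)
  );
  return JSON.stringify({ Records: records });
}
```

Dropping non-matching records before forwarding keeps your API from receiving notifications for objects outside the key space it cares about.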
