
AWS S3

Receive S3 object notifications as webhooks and upload webhook data to S3 buckets using Webhook Relay service connections.

Connect Webhook Relay to Amazon S3 to store incoming webhook data as S3 objects (output).

Prerequisites

IAM Permissions

For S3 Output (upload objects):

  • s3:PutObject
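A minimal IAM policy granting just this permission might look like the following (the bucket name in the ARN is a placeholder; scope the resource to your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-webhook-archive/*"
    }
  ]
}
```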

S3 Output — Upload Webhook Data to S3

S3 outputs store incoming webhook data as objects in your S3 bucket. Each webhook is saved as a separate file.

Configuration

Field        Required  Description
bucket_name  Yes       S3 bucket name
region       Yes       AWS region
prefix       No        Key prefix for uploaded objects (e.g. webhooks/)
file_format  No        Storage format: json (default), body_only, har
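As an illustration, a complete output configuration using the fields from the table could look like this (the values are placeholders; the surrounding request or UI form that carries this configuration is not shown here):

```json
{
  "bucket_name": "my-webhook-archive",
  "region": "us-east-1",
  "prefix": "webhooks/",
  "file_format": "json"
}
```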

Object Path

Objects are stored with a date-based path:

{prefix}/{year}/{month}/{day}/{log_id}.json

For example: webhooks/2026/02/24/whl_abc123.json
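The path layout above can be sketched in JavaScript. Note that the actual keys are generated by Webhook Relay; buildObjectKey below is a hypothetical helper shown only to make the template concrete:

```javascript
// Hypothetical helper illustrating the {prefix}/{year}/{month}/{day}/{log_id}.json layout.
// Not part of the Webhook Relay service; for illustration only.
function buildObjectKey(prefix, date, logId) {
  const pad = (n) => String(n).padStart(2, "0");
  const year = date.getUTCFullYear();
  const month = pad(date.getUTCMonth() + 1); // getUTCMonth() is 0-based
  const day = pad(date.getUTCDate());
  return `${prefix}/${year}/${month}/${day}/${logId}.json`;
}

// Reproduces the example path shown above:
const key = buildObjectKey("webhooks", new Date(Date.UTC(2026, 1, 24)), "whl_abc123");
console.log(key); // webhooks/2026/02/24/whl_abc123.json
```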

Example: Bridge GCP Pub/Sub to AWS S3

You can receive messages from a GCP Pub/Sub subscription and archive them as objects in an S3 bucket. This is useful for cross-cloud data archival:

  1. Create a GCP service connection with Pub/Sub subscriber permissions
  2. Create an AWS service connection with S3 write permissions
  3. Create a bucket in Webhook Relay
  4. Add a GCP Pub/Sub input on the bucket (messages flow in)
  5. Add an AWS S3 output on the bucket (messages get stored as objects)

Every message published to your Pub/Sub topic will automatically be archived as an S3 object.

Transform Before Storing

Attach a Function to the bucket to transform the payload before it reaches S3. For example, extract only the relevant fields from a Pub/Sub message:

const message = JSON.parse(r.body)

// Extract just the data you need
const simplified = {
    event_type: message.attributes.event_type,
    data: message.data,
    timestamp: message.publish_time
}

r.setBody(JSON.stringify(simplified))
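Outside the Webhook Relay runtime, the same transformation can be exercised locally with a mock `r` object. The sample message shape below is illustrative, not a guaranteed Pub/Sub payload format:

```javascript
// Minimal mock of the Webhook Relay request object, for local testing only.
const r = {
  body: JSON.stringify({
    attributes: { event_type: "object.created" }, // illustrative attribute
    data: "eyJmaWxlIjoicmVwb3J0LnBkZiJ9",         // Pub/Sub data is base64-encoded
    publish_time: "2026-02-24T10:15:00Z"
  }),
  setBody(newBody) { this.body = newBody; }
};

const message = JSON.parse(r.body);

// Keep only the fields we care about, as in the Function above
const simplified = {
  event_type: message.attributes.event_type,
  data: message.data,
  timestamp: message.publish_time
};

r.setBody(JSON.stringify(simplified));
console.log(r.body);
```

The object eventually written to S3 is the simplified JSON, not the full Pub/Sub envelope.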

See the JSON encoding guide for more transformation examples.
