---
title: "AWS S3 | WebhookRelay"
meta:
  "og:description": "Receive S3 object notifications as webhooks and upload webhook data to S3 buckets using Webhook Relay service connections."
  "og:title": "AWS S3"
  description: "Receive S3 object notifications as webhooks and upload webhook data to S3 buckets using Webhook Relay service connections."
---


# AWS S3

Receive S3 object notifications as webhooks and upload webhook data to S3 buckets using Webhook Relay service connections.

Connect Webhook Relay to **Amazon S3** to store incoming webhook data as S3 objects (output).

## Prerequisites

- An [AWS service connection](https://webhookrelay.com/docs/service-connections) with credentials that have S3 permissions
- An S3 bucket in your AWS account

### IAM Permissions

**For S3 Output (upload objects):**

- `s3:PutObject`
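
For reference, a minimal IAM policy granting this permission could look like the sketch below; `my-webhook-archive` is a placeholder bucket name:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-webhook-archive/*"
        }
    ]
}
```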

## S3 Output: Upload Webhook Data to S3

S3 outputs store incoming webhook data as objects in your S3 bucket. Each webhook is saved as a separate file.

### Configuration

| Field | Required | Description |
| --- | :---: | --- |
| `bucket_name` | Yes | S3 bucket name |
| `region` | Yes | AWS region |
| `prefix` | No | Key prefix for uploaded objects (e.g. `webhooks/`) |
| `file_format` | No | Storage format: `json` (default), `body_only`, `har` |
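
Putting the fields together, an S3 output configuration might look like the following sketch. The exact shape depends on how you create the output (web UI or API); the bucket name, region, and prefix here are placeholders:

```
{
    "bucket_name": "my-webhook-archive",
    "region": "us-east-1",
    "prefix": "webhooks/",
    "file_format": "json"
}
```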

### Object Path

Objects are stored with a date-based path:

```
{prefix}{year}/{month}/{day}/{log_id}.json
```

For example: `webhooks/2026/02/24/whl_abc123.json`
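
To verify uploads, you can list and fetch stored objects with the AWS CLI, assuming your local credentials also have `s3:ListBucket` and `s3:GetObject` on the bucket (`my-webhook-archive` is a placeholder):

```
# List the objects stored for a given day
aws s3 ls s3://my-webhook-archive/webhooks/2026/02/24/

# Print a single stored webhook to stdout
aws s3 cp s3://my-webhook-archive/webhooks/2026/02/24/whl_abc123.json -
```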

## Example: Bridge GCP Pub/Sub to AWS S3

You can receive messages from a GCP Pub/Sub subscription and archive them as objects in an S3 bucket. This is useful for cross-cloud data archival:

1. Create a [GCP service connection](https://webhookrelay.com/docs/service-connections) with Pub/Sub subscriber permissions
2. Create an [AWS service connection](https://webhookrelay.com/docs/service-connections) with S3 write permissions
3. Create a bucket in Webhook Relay
4. Add a **GCP Pub/Sub input** on the bucket (messages flow in)
5. Add an **AWS S3 output** on the bucket (messages get stored as objects)

Every message published to your Pub/Sub topic will automatically be archived as an S3 object.

### Transform Before Storing

Attach a [Function](https://webhookrelay.com/docs/webhooks/functions) to the bucket to transform the payload before it reaches S3. For example, extract only the relevant fields from a Pub/Sub message:

```
const message = JSON.parse(r.body)

// Extract just the data you need
const simplified = {
    event_type: message.attributes.event_type,
    data: message.data,
    timestamp: message.publish_time
}

r.setBody(JSON.stringify(simplified))
```
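
With this function in place, each S3 object contains only the simplified JSON rather than the full Pub/Sub message envelope, keeping archived objects small and easier to query downstream.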

See the [JSON encoding](https://webhookrelay.com/docs/webhooks/functions/manipulating-json) guide for more transformation examples.
