---
title: "Archive AWS GuardDuty Findings to GCP Cloud Storage | WebhookRelay"
meta:
  "og:description": "Route AWS GuardDuty security findings to Google Cloud Storage using Webhook Relay Service Connections for long-term retention and cross-cloud analysis"
  "og:title": "Archive AWS GuardDuty Findings to GCP Cloud Storage"
  description: "Route AWS GuardDuty security findings to Google Cloud Storage using Webhook Relay Service Connections for long-term retention and cross-cloud analysis"
---

![Stripes](https://webhookrelay.com/blog/guardduty-to-gcs-archival/images/stripes.svg)

# **Archive AWS GuardDuty Findings to GCP Cloud Storage**

Route AWS GuardDuty security findings to Google Cloud Storage using Webhook Relay Service Connections for long-term retention and cross-cloud analysis

AWS GuardDuty monitors your AWS accounts for malicious activity and unauthorized behavior. But what if your analytics stack lives in GCP? Maybe you run BigQuery for security analysis or Chronicle for SIEM. Getting GuardDuty findings into GCP usually means building custom Lambda functions, managing cross-cloud credentials, and maintaining glue code.

Webhook Relay [Service Connections](https://webhookrelay.com/blog/guardduty-to-gcs-archival/docs/service-connections) let you route GuardDuty findings to GCP Cloud Storage in minutes, no code required.

## Architecture Overview

The data flow:

```
AWS GuardDuty → EventBridge → SQS Queue → Webhook Relay → GCP Cloud Storage
```

GuardDuty publishes findings to Amazon EventBridge. An EventBridge rule routes them to an SQS queue. Webhook Relay polls the queue and stores each finding as a JSON file in your GCS bucket.

You could publish to AWS S3 instead; in this case we want the findings in GCP so BigQuery can query them afterwards.

## Why This Approach?

- **No custom code.** No Lambda functions to maintain, no credential rotation scripts, no deployment pipelines.
- **Long-term retention.** GCS lifecycle policies move findings to Coldline or Archive storage automatically. Keep years of security data cheaply.
- **Analytics-ready.** BigQuery can query your findings directly from GCS. Build dashboards, run threat hunting queries, or feed data into Chronicle.
- **Reliable delivery.** SQS provides durable message queuing. Webhook Relay handles retries.

## AWS Setup

### Step 1: Create an SQS Queue

First, create an [SQS queue](https://webhookrelay.com/blog/guardduty-to-gcs-archival/docs/service-connections/aws_sqs) to receive GuardDuty findings:

1. Go to **AWS Console → SQS → Create queue**
2. Choose **Standard** queue type
3. Name it `guardduty-findings`
4. Keep default settings and create the queue

Copy the **Queue URL**. It looks like:

```
https://sqs.us-east-1.amazonaws.com/123456789012/guardduty-findings
```
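The queue URL encodes the region, account ID, and queue name. A small hypothetical helper can derive the queue ARN from it, which you will need for the IAM policy in Step 3:

```python
from urllib.parse import urlparse

# Hypothetical helper: derive the queue ARN (used in the IAM policy
# later) from the queue URL. The hostname carries the region and the
# path carries the account ID and queue name.
def queue_url_to_arn(queue_url: str) -> str:
    parsed = urlparse(queue_url)
    region = parsed.hostname.split(".")[1]  # sqs.<region>.amazonaws.com
    account_id, queue_name = parsed.path.strip("/").split("/")
    return f"arn:aws:sqs:{region}:{account_id}:{queue_name}"

print(queue_url_to_arn(
    "https://sqs.us-east-1.amazonaws.com/123456789012/guardduty-findings"
))
# arn:aws:sqs:us-east-1:123456789012:guardduty-findings
```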

### Step 2: Create an EventBridge Rule

EventBridge routes GuardDuty findings to your SQS queue:

1. Go to **AWS Console → EventBridge → Rules → Create rule**
2. Name: `guardduty-to-sqs`
3. Event bus: `default`
4. Rule type: **Rule with an event pattern**

For the event pattern, select:

- AWS service: **GuardDuty**
- Event type: **GuardDuty Finding**

Or use this custom pattern to capture all GuardDuty findings:

```json
{
  "source": ["aws.guardduty"],
  "detail-type": ["GuardDuty Finding"]
}
```

For the target:

- Target type: **AWS service**
- Select target: **SQS queue**
- Queue: `guardduty-findings`

Create the rule. EventBridge will now route all GuardDuty findings to your SQS queue.
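The pattern above matches on exact values of two top-level event fields. A minimal local sketch of that matching logic (not the real EventBridge engine) can help sanity-check which events the rule would forward:

```python
# The EventBridge pattern from above: each key must equal one of the
# listed values. A minimal local sketch of that matching, not the
# actual EventBridge matcher.
PATTERN = {
    "source": ["aws.guardduty"],
    "detail-type": ["GuardDuty Finding"],
}

def matches(event: dict, pattern: dict) -> bool:
    return all(event.get(key) in allowed for key, allowed in pattern.items())

sample = {
    "source": "aws.guardduty",
    "detail-type": "GuardDuty Finding",
    "detail": {"severity": 8, "type": "Recon:EC2/PortProbeUnprotectedPort"},
}

print(matches(sample, PATTERN))                  # True
print(matches({"source": "aws.ec2"}, PATTERN))   # False
```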

### Step 3: Create an IAM User for Webhook Relay

Webhook Relay needs credentials to poll your SQS queue:

1. Go to **AWS Console → IAM → Users → Create user**
2. Name: `webhookrelay-sqs-reader`
3. Attach a policy with these permissions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage",
        "sqs:GetQueueAttributes"
      ],
      "Resource": "arn:aws:sqs:us-east-1:123456789012:guardduty-findings"
    }
  ]
}
```

4. Create an access key and save the **Access Key ID** and **Secret Access Key**
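If you manage several queues, the policy document above can be generated for any queue ARN rather than edited by hand. A short sketch:

```python
import json

# Build the read-only SQS policy shown above for a given queue ARN,
# so the account ID and region never need manual editing.
def sqs_reader_policy(queue_arn: str) -> str:
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": [
                "sqs:ReceiveMessage",
                "sqs:DeleteMessage",
                "sqs:GetQueueAttributes",
            ],
            "Resource": queue_arn,
        }],
    }
    return json.dumps(policy, indent=2)

print(sqs_reader_policy("arn:aws:sqs:us-east-1:123456789012:guardduty-findings"))
```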

## GCP Setup

We will use the [GCP GCS service connection](https://webhookrelay.com/blog/guardduty-to-gcs-archival/docs/service-connections/gcp_gcs) for the output side.

### Step 1: Create a GCS Bucket

1. Go to **GCP Console → Cloud Storage → Create bucket**
2. Name: `guardduty-archive` (or your preferred name)
3. Choose your region and storage class
4. Create the bucket

### Step 2: Create a Service Account

1. Go to **GCP Console → IAM → Service Accounts → Create**
2. Name: `webhookrelay-gcs-writer`
3. Grant the role: **Storage Object Creator** (`roles/storage.objectCreator`)
4. Create a JSON key and download it

## Webhook Relay Setup

Connect both clouds through Webhook Relay.

### Step 1: Add AWS Service Connection

1. Go to [Webhook Relay Service Connections](https://my.webhookrelay.com/service-connections)
2. Click **Add Connection**
3. Select **AWS**
4. Enter your IAM user credentials:
    - Access Key ID
    - Secret Access Key
5. Name it `aws-guardduty` and save

### Step 2: Add GCP Service Connection

1. Click **Add Connection**
2. Select **GCP**
3. Paste the contents of your service account JSON key
4. Name it `gcp-storage` and save

### Step 3: Create a Bucket with SQS Input and GCS Output

1. Go to [Buckets](https://my.webhookrelay.com/buckets) and create a new bucket
2. Name it `guardduty-archive`

Add the **SQS Input**:

1. Click **Add Input → AWS SQS**
2. Select your `aws-guardduty` connection
3. Enter the queue URL: `https://sqs.us-east-1.amazonaws.com/123456789012/guardduty-findings`

Add the **GCS Output**:

1. Click **Add Output → GCP Cloud Storage**
2. Select your `gcp-storage` connection
3. Bucket name: `guardduty-archive`
4. Prefix: `findings/`

![Selecting your output](https://webhookrelay.com/blog/guardduty-to-gcs-archival/images/blog/aws_guard_duty_gcs/gcs_bucket.png)

And then configure the bucket:

![Output configuration](https://webhookrelay.com/blog/guardduty-to-gcs-archival/images/blog/aws_guard_duty_gcs/output.png)

Webhook Relay will now poll your SQS queue and store each GuardDuty finding in GCS.

## Verify It Works

### Generate a Test Finding

GuardDuty can generate sample findings for testing:

1. Go to **AWS Console → GuardDuty → Settings**
2. Click **Generate sample findings**

This creates test findings that flow through your pipeline.

### Check GCS

After a minute or two, check your GCS bucket. You should see files at:

```
findings/2026/03/06/<finding-id>.json
```

Each file contains the full GuardDuty finding with all metadata, severity scores, and resource details.
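Assuming the date-based layout shown above (a `findings/YYYY/MM/DD/` prefix derived from the delivery date — an assumption matching the example path), the expected object name for a finding can be computed like so:

```python
from datetime import datetime, timezone

# Hypothetical sketch of the object layout shown above:
# <prefix><year>/<month>/<day>/<finding-id>.json
def expected_object_path(finding_id: str, when: datetime,
                         prefix: str = "findings/") -> str:
    return f"{prefix}{when:%Y/%m/%d}/{finding_id}.json"

path = expected_object_path(
    "52b9f1d0example", datetime(2026, 3, 6, tzinfo=timezone.utc)
)
print(path)  # findings/2026/03/06/52b9f1d0example.json
```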

## Optional: Transform Findings Before Storage

To reshape the data or extract specific fields before storing, add a [Function](https://webhookrelay.com/blog/guardduty-to-gcs-archival/docs/webhooks/functions) to your bucket:

```javascript
const event = JSON.parse(r.body)
const finding = event.detail

// Create a simplified structure for analysis
const archived = {
  id: finding.id,
  severity: finding.severity,
  type: finding.type,
  title: finding.title,
  description: finding.description,
  region: finding.region,
  accountId: finding.accountId,
  resourceType: finding.resource.resourceType,
  createdAt: finding.createdAt,
  updatedAt: finding.updatedAt,
  // Add custom fields
  archived_at: new Date().toISOString(),
  source: "aws-guardduty"
}

r.setBody(JSON.stringify(archived))
```

This extracts key fields and adds metadata, making BigQuery queries simpler.
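To sanity-check the transform locally before deploying it, here is a hedged Python equivalent of the same field extraction (the sample event below is illustrative, not a real finding):

```python
import json
from datetime import datetime, timezone

# Python equivalent of the Function above, useful for checking the
# output shape locally against a sample EventBridge event body.
def archive_finding(event_body: str) -> dict:
    finding = json.loads(event_body)["detail"]
    return {
        "id": finding["id"],
        "severity": finding["severity"],
        "type": finding["type"],
        "title": finding["title"],
        "description": finding["description"],
        "region": finding["region"],
        "accountId": finding["accountId"],
        "resourceType": finding["resource"]["resourceType"],
        "createdAt": finding["createdAt"],
        "updatedAt": finding["updatedAt"],
        "archived_at": datetime.now(timezone.utc).isoformat(),
        "source": "aws-guardduty",
    }

# Illustrative sample event, trimmed to the fields the transform reads
sample = json.dumps({"detail": {
    "id": "abc123", "severity": 8, "type": "Recon:EC2/Portscan",
    "title": "Port scan", "description": "Unprotected port probed",
    "region": "us-east-1", "accountId": "123456789012",
    "resource": {"resourceType": "Instance"},
    "createdAt": "2026-03-06T10:00:00Z",
    "updatedAt": "2026-03-06T10:00:00Z",
}})
print(archive_finding(sample)["severity"])  # 8
```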

## Query Findings with BigQuery

Once your findings are in GCS, BigQuery can query them directly:

```sql
-- Create an external table pointing to your GCS bucket
CREATE EXTERNAL TABLE `project.dataset.guardduty_findings`
OPTIONS (
  format = 'JSON',
  uris = ['gs://guardduty-archive/findings/*']
);

-- Query high-severity findings from the last 7 days
SELECT
  id,
  type,
  severity,
  title,
  createdAt
FROM `project.dataset.guardduty_findings`
WHERE severity >= 7
  AND DATE(createdAt) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
ORDER BY severity DESC;
```

## Use Cases

**Compliance and audit trails.** Many compliance frameworks require long-term retention of security events. GCS lifecycle policies can automatically transition old findings to Archive storage.
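For illustration, a GCS lifecycle configuration that transitions findings to Archive storage after a year could be generated like this (the 365-day age is an example value, not a recommendation):

```python
import json

# Example GCS lifecycle config: move objects to Archive storage after
# the given number of days. Ages here are illustrative only.
def archive_lifecycle(age_days: int = 365) -> str:
    config = {
        "rule": [{
            "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
            "condition": {"age": age_days},
        }]
    }
    return json.dumps(config, indent=2)

# Apply with: gsutil lifecycle set config.json gs://guardduty-archive
print(archive_lifecycle())
```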

**Cross-cloud SIEM.** If you're using Google Chronicle or another GCP-based SIEM, this pipeline feeds findings directly into your security analytics platform.

**Cost optimization.** AWS GuardDuty retains findings for 90 days. For longer retention, exporting to GCS with lifecycle policies is often cheaper than other archival solutions.

**Multi-cloud correlation.** Combine GuardDuty findings with security events from GCP and Azure in a single BigQuery dataset for unified threat analysis.

## Conclusion

Routing GuardDuty findings to GCS used to require custom Lambda functions, cross-cloud IAM roles, and ongoing maintenance. With Webhook Relay Service Connections, setup takes minutes:

1. Create an SQS queue and EventBridge rule in AWS
2. Create a GCS bucket and service account in GCP
3. Connect them through Webhook Relay with an SQS input and GCS output

Your findings are now archived in GCS, ready for BigQuery analysis, Chronicle ingestion, or compliance retention.

[Sign up for Webhook Relay](https://my.webhookrelay.com/register) to start archiving GuardDuty findings.