---
title: "GCP BigQuery | WebhookRelay"
meta:
  "og:description": "How to send data to BigQuery from Webhook Relay."
  "og:title": "GCP BigQuery"
  description: "How to send data to BigQuery from Webhook Relay."
---

# GCP BigQuery

How to send data to BigQuery from Webhook Relay.

Prerequisites:

- [Google Cloud Platform account](https://cloud.google.com/) (free trial available)
- Google Cloud project with BigQuery enabled (there's a generous free tier available for BigQuery)
- A dataset and table in BigQuery ([creating and using tables](https://cloud.google.com/bigquery/docs/tables)); an example schema is sketched below
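
The examples on this page insert rows with hub_user_id, category, action and label fields. A hypothetical schema file matching that payload (all names and types here are illustrative) could look like the following, and could be applied with `bq mk --table your-project-id:dataset-id.table-id schema.json`:

```json
[
  { "name": "hub_user_id", "type": "STRING", "mode": "NULLABLE" },
  { "name": "category",    "type": "STRING", "mode": "NULLABLE" },
  { "name": "action",      "type": "STRING", "mode": "NULLABLE" },
  { "name": "label",       "type": "STRING", "mode": "NULLABLE" }
]
```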

Webhook Relay provides a bigquery module that can stream writes into Google Cloud BigQuery. To start ingesting data from webhooks straight into your BigQuery table, create a new Function and initialize the module. In JavaScript the module is available globally; in Lua, import it with require first:

```js
// BigQuery module is available globally
// Initialize with your project details
bigquery.initialize("your-project-id", "dataset-id", "table-id")
```

```lua
-- Import BigQuery helper package
local bigquery = require('bigquery')
```

Once the module is imported, a new tab should appear asking you to set up credentials. Go to that tab and complete the following (a command-line alternative with gcloud is sketched after the list):

1. Create a [new service account](https://cloud.google.com/iam/docs/service-accounts) with **BigQuery Editor** permissions.
2. Download the service account key as a JSON file.
3. Copy & paste the JSON contents into the form and click save.
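
If you prefer the command line, the same setup can be sketched with gcloud (the service account name webhookrelay-bq is just an example; the BigQuery Editor role corresponds to roles/bigquery.dataEditor):

```bash
# Create a service account for Webhook Relay (the name is an example)
gcloud iam service-accounts create webhookrelay-bq --project your-project-id

# Grant it the BigQuery Data Editor role on the project
gcloud projects add-iam-policy-binding your-project-id \
  --member "serviceAccount:webhookrelay-bq@your-project-id.iam.gserviceaccount.com" \
  --role "roles/bigquery.dataEditor"

# Create and download a JSON key to paste into the credentials form
gcloud iam service-accounts keys create key.json \
  --iam-account webhookrelay-bq@your-project-id.iam.gserviceaccount.com
```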

## Streaming data into BigQuery

```js
// Parsing payload
const rowData = JSON.parse(r.body)

// Initializing BigQuery client
const initResult = bigquery.initialize("your-project-id", "dataset-id", "table-id")
if (initResult && initResult.error) {
    console.error("BigQuery init failed:", initResult.error)
}

// Receiving payload:
// {
//     "hub_user_id": "user-id-here",
//     "category": "signup",
//     "action": "click",
//     "label": "github auth"
// }

// Insert row:
const insertResult = bigquery.insert(rowData)
if (insertResult && insertResult.error) {
    console.error("Insert failed:", insertResult.error)
}
```

```lua
-- Import BigQuery helper package
local bigquery = require('bigquery')
local json = require("json")

-- Parsing payload
local rowData, err = json.decode(r.RequestBody)
if err then error(err) end

-- Initializing BigQuery client
err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

-- Receiving payload:
-- {
--     "hub_user_id": "user-id-here",
--     "category": "signup",
--     "action": "click",
--     "label": "github auth"
-- }

-- Insert row:
err = bigquery.insert(rowData)
if err then error(err) end
```

## Check if record exists

A simple query to check whether a row exists by matching a column with a value:

```js
bigquery.initialize("your-project-id", "dataset-id", "table-id")

const result = bigquery.recordExists("name", "john")
if (result && result.error) {
    console.error("Query failed:", result.error)
}

if (result.exists) {
    // OK
} else {
    console.error("Record not found")
}
```

```lua
local bigquery = require('bigquery')

local err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

local exists, err = bigquery.record_exists('name', 'john')
if err then error(err) end

if exists then
  -- OK
else
  error('Record not found')
end
```

Use cases:

- A webhook sends data about a user signing up and you want to check whether the user already exists in your database before inserting a new row (see the sketch below).
- Each unique webhook insert triggers an expensive operation and you want to skip that operation when the row already exists.
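
For the first use case, a minimal sketch combining the two calls shown above (the hub_user_id field follows the sample payload used earlier; error handling mirrors the previous examples):

```js
// Parse the incoming webhook payload
const user = JSON.parse(r.body)

bigquery.initialize("your-project-id", "dataset-id", "table-id")

// Insert the row only when no user with this ID exists yet
const result = bigquery.recordExists("hub_user_id", user.hub_user_id)
if (result && result.error) {
    console.error("Query failed:", result.error)
} else if (result && !result.exists) {
    const insertResult = bigquery.insert(user)
    if (insertResult && insertResult.error) {
        console.error("Insert failed:", insertResult.error)
    }
}
```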

## Execute any command

To execute any SQL command on your table:

```js
bigquery.initialize("your-project-id", "dataset-id", "table-id")

// Delete old records of the matching category. Method 'exec' can take an arbitrary
// number of arguments, depending on how many ? placeholders you have in your query.
const result = bigquery.exec("DELETE FROM `dataset-id.table-id` WHERE category = ? AND country = ?", "movies", "US")
if (result && result.error) {
    console.error("Query failed:", result.error)
}
```

```lua
local bigquery = require('bigquery')

local err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

-- Delete old records of the matching category. Method 'exec' can take an arbitrary
-- number of arguments, depending on how many ? placeholders you have in your query.
err = bigquery.exec('DELETE FROM `dataset-id.table-id` WHERE category = ? AND country = ?', 'movies', 'US')
if err then error(err) end
```

## BigQuery module API reference

| Lua method | JavaScript method | Parameter Type | Description |
| --- | --- | --- | --- |
| initialize(project, dataset, table) | initialize(project, dataset, table) | String, String, String | Initializes the client with your project ID, dataset ID and table ID. |
| insert(rowData) | insert(rowData) | Table/Object | Inserts a key-value object that represents the row data. |
| record_exists(column, value) | recordExists(column, value) | String, String | Checks whether a row with the matching column value exists. |
| exec(query, ...args) | exec(query, ...args) | String, ...any | Executes a DML query with positional parameters. |

## Limitations

Currently the module doesn't support nested objects. That means a row with a nested JSON structure such as:

```json
{
    "hub_user_id": "user-id-here",
    "category": "signup",
    "action": "click",
    "label": "github auth",
    "nested_data": {
      "location": "GB",
      "date": "2020-05-10"
    }
}
```

will not be inserted successfully. Therefore, flatten the structure in the function before inserting it, as in the sketch below.
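
For example, a minimal JavaScript sketch that flattens the payload above before inserting (it assumes the target table has top-level location and date columns):

```js
// Parse the nested payload
const payload = JSON.parse(r.body)

// Promote nested fields to top-level columns and drop the nested object
const row = {
    hub_user_id: payload.hub_user_id,
    category: payload.category,
    action: payload.action,
    label: payload.label,
    location: payload.nested_data.location,
    date: payload.nested_data.date
}

bigquery.initialize("your-project-id", "dataset-id", "table-id")

const insertResult = bigquery.insert(row)
if (insertResult && insertResult.error) {
    console.error("Insert failed:", insertResult.error)
}
```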

## Troubleshooting

A few things to note:

- Ensure that the project ID, dataset ID and table ID are all set correctly.
- The BigQuery table schema is defined by you. You don't have to write all the fields (most of them can be NULLABLE), but if you try to write a field that doesn't exist in the schema, BigQuery will refuse the write (see the sketch below for one way to guard against this).
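
One way to avoid rejected writes is to copy only known columns into the row before inserting. A sketch, assuming the example schema used throughout this page:

```js
const payload = JSON.parse(r.body)

// Columns that actually exist in the table schema (example list)
const columns = ["hub_user_id", "category", "action", "label"]

// Keep only the fields that map to existing columns
const row = {}
for (const key of columns) {
    if (payload[key] !== undefined) {
        row[key] = payload[key]
    }
}

bigquery.initialize("your-project-id", "dataset-id", "table-id")

const insertResult = bigquery.insert(row)
if (insertResult && insertResult.error) {
    console.error("Insert failed:", insertResult.error)
}
```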
