Google BigQuery Data Share permissions

How to configure Google BigQuery so that Funnel can successfully share your data to it.

Written by Niclas Bångman

1. Overview

Funnel's BigQuery connector lets you share your Funnel data to a BigQuery Dataset of your choice. This guide will cover what you need to do in your Google Cloud Console in order for Funnel to be able to share data there.

Before you get started, you need a Google Cloud project with billing enabled (https://console.cloud.google.com/billing).

2. Authentication

When authenticating you can choose between a Google account (via OAuth), your own service account, or Funnel's service account (legacy). The pros and cons of each option are listed in its section, together with instructions for how to set it up.

Required scopes

  • bigquery.datasets.get

  • bigquery.tables.create

  • bigquery.tables.updateData

  • bigquery.tables.get

  • bigquery.tables.update

  • bigquery.jobs.create

  • storage.buckets.create

  • storage.buckets.get

  • storage.objects.create

  • storage.objects.delete

  • storage.multipartUploads.*

Optional scopes

  • bigquery.datasets.create
    Not needed if you have manually created a BigQuery Dataset and use that ID in the Funnel Data Share.

  • bigquery.tables.delete
    A test table might be created before the export to validate access; this permission lets us delete it afterwards.
    It is also needed for certain changes, such as removing the date from the Data Share or replacing all fields at once, where we have to delete the existing table and create a new one. These auto-fix capabilities require this scope, but you can also delete the table manually and click "Run job" to resolve the same issues.

  • storage.buckets.update
    This allows us to unset the bucket's soft delete retention policy, which serves no purpose here because files only exist in the bucket temporarily while they are loaded into BigQuery. Unsetting it will also save you a little bit of money.
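
If you prefer to script the role setup instead of clicking through the console (this corresponds to step 2 in section 2.2 below), a custom role containing these permissions can be created through the IAM API. The sketch below is only an illustration, not something Funnel provides; the project ID, role ID and title are placeholders, and the storage.multipartUploads.* family is spelled out because custom roles need explicit permission names.

  # Sketch: create a custom IAM role bundling the permissions listed above.
  # Assumes Application Default Credentials and the google-api-python-client package.
  from googleapiclient import discovery

  PROJECT_ID = "my-gcp-project"  # placeholder
  FUNNEL_PERMISSIONS = [
      "bigquery.datasets.get", "bigquery.tables.create", "bigquery.tables.updateData",
      "bigquery.tables.get", "bigquery.tables.update", "bigquery.jobs.create",
      "storage.buckets.create", "storage.buckets.get",
      "storage.objects.create", "storage.objects.delete",
      "storage.multipartUploads.abort", "storage.multipartUploads.create",
      "storage.multipartUploads.list", "storage.multipartUploads.listParts",
      # Optional scopes, see above:
      "bigquery.datasets.create", "bigquery.tables.delete", "storage.buckets.update",
  ]

  iam = discovery.build("iam", "v1")
  iam.projects().roles().create(
      parent=f"projects/{PROJECT_ID}",
      body={
          "roleId": "funnelDataShare",  # placeholder role ID
          "role": {
              "title": "Funnel Data Share",
              "includedPermissions": FUNNEL_PERMISSIONS,
              "stage": "GA",
          },
      },
  ).execute()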

2.1 Use a Google account (OAuth)

The Google user that you use to authenticate needs access to Google Cloud Storage (GCS) and BigQuery, and billing needs to be enabled in your Google Cloud project. See the complete list of permissions in "Required scopes" and "Optional scopes" above. We need GCS access as well because we first put the files in a bucket that we create, load them from there into BigQuery, and then delete the files.
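
For context, the export roughly follows a stage-then-load pattern. The sketch below only illustrates that flow with the official Python clients; the bucket, dataset and table names are made up, and this is not Funnel's actual code.

  # Illustration of the stage-then-load flow: upload to GCS, load into BigQuery, clean up.
  from google.cloud import bigquery, storage

  storage_client = storage.Client()
  bq_client = bigquery.Client()

  # 1. Stage the export file in a temporary bucket (names are hypothetical).
  bucket = storage_client.bucket("funnel-staging-bucket")
  blob = bucket.blob("export/data.csv")
  blob.upload_from_filename("data.csv")

  # 2. Load the staged file into the destination table.
  load_job = bq_client.load_table_from_uri(
      "gs://funnel-staging-bucket/export/data.csv",
      "my-project.funnel_dataset.funnel_table",
      job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.CSV),
  )
  load_job.result()  # wait for the load to finish

  # 3. Remove the temporary file again.
  blob.delete()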

Pros: No configuration is needed to get the correct permissions, and ownership is clear. Access is only given to the specific Data Share(s) you add it to.

Cons: If the Funnel user connecting the credentials is removed from Funnel, the credentials will also be removed and the Data Share will have to be reconnected.

2.2 Use a service account [Recommended]

  1. In the Google Cloud console, go to IAM -> Roles.

  2. Create a new role and add the permissions listed in "Required scopes" above, and preferably also those in "Optional scopes".

  3. In the Google Cloud console, go to IAM -> Service Accounts.

  4. Create a new Service account

    1. Give it a name and ID

    2. Add the Role that you created in step 2

  5. Click on the created Service account, go to the Keys tab and add a JSON key.

  6. Open the downloaded JSON key file and paste its contents into Funnel.
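
If you want to sanity-check the key before pasting it into Funnel, one option is to use it with the BigQuery Python client. This is an optional check with placeholder names, and it assumes the dataset already exists (see section 3.2); if Funnel will create the dataset for you, skip it.

  # Optional check: confirm the service account key authenticates and can read the dataset.
  from google.cloud import bigquery
  from google.oauth2 import service_account

  credentials = service_account.Credentials.from_service_account_file("funnel-sa-key.json")
  client = bigquery.Client(credentials=credentials, project=credentials.project_id)

  # Fetching dataset metadata needs bigquery.datasets.get, which is in the required list above.
  dataset = client.get_dataset("my_funnel_dataset")  # placeholder dataset ID
  print(dataset.full_dataset_id, dataset.location)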

Pros: The credential is not tied to any individual user. Access is only given to the specific Data Share(s) you add it to. You own the credentials, which means you can edit, delete or disable access. You can also rotate the credential yourself to meet your organisation's requirements by creating a new key, pasting it into Funnel, and then removing the old one.

2.3 Use Funnel's service account

  1. Go to https://console.cloud.google.com/iam-admin/iam/ and select your project.

  2. Grant access to the Funnel service account from the table below and give it the roles BigQuery Data Editor and BigQuery Job User (a scripted sketch of this step follows the table).

App URL             Service account
app.funnel.io
app.eu.funnel.io
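
As a rough sketch, the same grants can also be made programmatically with the Resource Manager API. The project ID below is a placeholder, and the service account email is the one from the table above.

  # Sketch: grant the Funnel service account BigQuery Data Editor and Job User on the project.
  from google.cloud import resourcemanager_v3
  from google.iam.v1 import policy_pb2

  PROJECT = "projects/my-gcp-project"  # placeholder
  FUNNEL_SA = "serviceAccount:<service account from the table above>"

  client = resourcemanager_v3.ProjectsClient()
  policy = client.get_iam_policy(request={"resource": PROJECT})
  for role in ("roles/bigquery.dataEditor", "roles/bigquery.jobUser"):
      policy.bindings.append(policy_pb2.Binding(role=role, members=[FUNNEL_SA]))
  client.set_iam_policy(request={"resource": PROJECT, "policy": policy})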

Pros: Easy setup and the authentication is not tied to an individual user. The service account is maintained by us and the access keys are rotated every day.

Cons: Gives Funnel more permissions than necessary and you have no control over the service account.

3. Configure a BigQuery dataset

3.1 Let Funnel create the dataset

Funnel can create a dataset for you; this is the default behaviour.

3.1.1 BigQuery Dataset Location

For the BigQuery dataset location you can select either US (multi-region) or EU (multi-region). This is selected under "Advanced settings" when setting up a Data Share. If you want your data in another location, follow step 3.2 to create your own dataset and give Funnel permission to export to it.
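
If you do create the dataset yourself, a minimal sketch with the BigQuery Python client looks like this; the project, dataset ID and region are placeholders you should replace.

  # Sketch: create your own dataset in a specific region before pointing Funnel at it.
  from google.cloud import bigquery

  client = bigquery.Client(project="my-gcp-project")  # placeholder project
  dataset = bigquery.Dataset("my-gcp-project.funnel_export")  # placeholder dataset ID
  dataset.location = "europe-north1"  # any supported BigQuery region
  client.create_dataset(dataset, exists_ok=True)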

3.2 Give service account access to your own dataset

If you choose to manage the dataset yourself and are using service account authentication, you need to grant Funnel access to the dataset.

To grant the required access, add the Funnel service account found in the table above to the project with the role "BigQuery Job User", and on the dataset give the service account the role "BigQuery Data Owner".


3.2.1 How to grant BigQuery Data Owner on the dataset

  1. Go to the dataset you want to export to and press "Sharing", then "Permissions".

  2. Press "Add principal", then add the service account with the role "BigQuery Data Owner".
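
The same grant can also be made with the BigQuery Python client by updating the dataset's access entries. This is a sketch with placeholder names; the legacy "OWNER" entry corresponds to the BigQuery Data Owner role at dataset level.

  # Sketch: grant the service account BigQuery Data Owner on a single dataset.
  from google.cloud import bigquery

  client = bigquery.Client(project="my-gcp-project")  # placeholder project
  dataset = client.get_dataset("my-gcp-project.funnel_export")  # placeholder dataset ID

  entries = list(dataset.access_entries)
  entries.append(
      bigquery.AccessEntry(
          role="OWNER",  # dataset-level equivalent of BigQuery Data Owner
          entity_type="userByEmail",
          entity_id="<service account email>",
      )
  )
  dataset.access_entries = entries
  client.update_dataset(dataset, ["access_entries"])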

3.2.2 Domain Restrictions (optional)

If you are using a service account and have domain restrictions enabled in GCP, below are the organisation and workspace IDs for the service accounts used in Funnel Data Shares.

Organization resource ID:      1033907189250
Google Workspace customer ID:  C03lzhzjd

All done!

Having completed the two steps above, you can now configure your Funnel Data Share to send data to this BigQuery dataset. Do this from within your Funnel account.
