Estimated setup time

30 minutes

Methods

There are two ways to send data to Aampe from BigQuery:

  1. Direct read access from your BQ dataset [Link]
  2. Push to Google Cloud Storage bucket [Link]

1. Direct read access from your BQ dataset

This method is generally used when event data is exported directly from Firebase.

Aampe will access your BQ dataset with a dedicated service account, which we will provide to you.

To grant Aampe read-only access to your BQ dataset, you need to grant the following permissions to that service account:

bigquery.datasets.get
bigquery.readsessions.create
bigquery.readsessions.getData
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list

The default BigQuery Data Viewer role (the closest predefined role Google provides to what Aampe needs) is better suited to internal read access within your company, and is therefore too broad to grant to an external company.

We recommend that you:

  a) Create a custom role with the minimal permissions needed
  b) Grant this role to the Aampe service account

Make sure you have the “Role Administrator” and “Security Administrator” roles before you begin.

Step-by-step procedure

  1. Create the custom role: go to https://console.cloud.google.com/iam-admin/roles/create?project=
  2. Fill in the form (title: Aampe Custom BQ Read Access)
  3. Add the following permissions (a scripted alternative is sketched below):
     bigquery.datasets.get
     bigquery.readsessions.create
     bigquery.readsessions.getData
     bigquery.tables.get
     bigquery.tables.getData
     bigquery.tables.list
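
If you prefer to create the role programmatically rather than through the console, the minimal sketch below does the same thing through the IAM API. It assumes the google-api-python-client package and credentials that hold the Role Administrator role; the project ID is a placeholder.

    from googleapiclient import discovery

    PROJECT_ID = "your-project-id"  # placeholder: your GCP project ID

    PERMISSIONS = [
        "bigquery.datasets.get",
        "bigquery.readsessions.create",
        "bigquery.readsessions.getData",
        "bigquery.tables.get",
        "bigquery.tables.getData",
        "bigquery.tables.list",
    ]

    # Create the custom role at the project level (IAM API, projects.roles.create).
    iam = discovery.build("iam", "v1")
    role = iam.projects().roles().create(
        parent=f"projects/{PROJECT_ID}",
        body={
            "roleId": "AampeCustomBQReadAccess",  # any valid role ID works
            "role": {
                "title": "Aampe Custom BQ Read Access",
                "description": "Read-only BigQuery access for Aampe",
                "includedPermissions": PERMISSIONS,
                "stage": "GA",
            },
        },
    ).execute()
    print("Created role:", role["name"])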

Grant permissions:

  1. Go to https://console.cloud.google.com/iam-admin/iam?project=
  2. Press the Grant access button
  3. Fill the “Add principals” field with the Aampe service account (we will email you the service account to use for the integration)
  4. Fill the “Assign roles” field with Aampe Custom BQ Read Access (a scripted alternative is sketched below)
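
The same grant can be scripted. This is a minimal sketch using the Cloud Resource Manager API; the service account email is a placeholder for the one we email you.

    from googleapiclient import discovery

    PROJECT_ID = "your-project-id"  # placeholder
    AAMPE_SA = "aampe@example.iam.gserviceaccount.com"  # placeholder: the account Aampe emails you

    # Read-modify-write the project's IAM policy to bind the custom role
    # to the Aampe service account.
    crm = discovery.build("cloudresourcemanager", "v1")
    policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
    policy["bindings"].append({
        "role": f"projects/{PROJECT_ID}/roles/AampeCustomBQReadAccess",
        "members": [f"serviceAccount:{AAMPE_SA}"],
    })
    crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()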

All you need to do now is provide us with the following details:

  1. Project ID
  2. Dataset ID
  3. Data location
  4. Table names (all the tables you would like to share data from)
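
Before you send these details, you can optionally sanity-check that the dataset is readable with them. The sketch below assumes the google-cloud-bigquery package and placeholder project, dataset, and table names; it exercises the tables.list and tables.getData permissions, and running it with the Aampe service account's credentials would confirm the grant.

    from google.cloud import bigquery

    client = bigquery.Client(project="your-project-id")  # Project ID (placeholder)
    dataset = bigquery.DatasetReference("your-project-id", "your_dataset")  # Dataset ID (placeholder)

    # bigquery.tables.list: enumerate the tables you plan to share
    for table in client.list_tables(dataset):
        print(table.table_id)

    # bigquery.tables.getData: read a handful of rows from one shared table
    for row in client.list_rows(dataset.table("your_table"), max_results=5):
        print(dict(row))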

2. Push to Google Cloud Storage bucket

When the data size is very large, we recommend that you push data to a cloud storage bucket.
We will then copy this data from your GCS bucket to one set up by us for ingestion to our system.

If you prefer, you can directly push to a bucket created by us.

Steps

  1. Create a bucket on GCS or S3 (a GCS sketch of steps 1 and 2 follows this list).

  2. Provide Aampe with access to that bucket (we will send you the details of the account that needs access).

  3. Set up an export from BQ to the cloud storage bucket: Export Data

  4. Automate the export for each day of event data (a sketch of steps 3 and 4 follows this list).
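
A minimal sketch of steps 1 and 2 on GCS, assuming the google-cloud-storage package; the project, bucket name, location, and account email are placeholders:

    from google.cloud import storage

    client = storage.Client(project="your-project-id")  # placeholder project

    # Step 1: create the export bucket.
    bucket = client.create_bucket("your-aampe-export-bucket", location="US")

    # Step 2: grant read access to the account Aampe tells you to use.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectViewer",
        "members": {"serviceAccount:aampe@example.iam.gserviceaccount.com"},  # placeholder
    })
    bucket.set_iam_policy(policy)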
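
And a minimal sketch of steps 3 and 4: an EXPORT DATA statement that writes one day of events to the bucket, wrapped in a function you can call once a day from your scheduler of choice (Cloud Scheduler, Airflow, cron). Dataset, table, column, and bucket names are placeholders.

    import datetime
    from google.cloud import bigquery

    def export_day(day: datetime.date) -> None:
        """Export one day of event data to the GCS bucket as JSON."""
        client = bigquery.Client(project="your-project-id")  # placeholder
        client.query(f"""
            EXPORT DATA OPTIONS (
              uri = 'gs://your-aampe-export-bucket/events/{day:%Y%m%d}/*.json',
              format = 'JSON',
              overwrite = true
            ) AS
            SELECT *
            FROM `your-project-id.your_dataset.events`        -- placeholder table
            WHERE DATE(event_timestamp) = '{day:%Y-%m-%d}'    -- placeholder column
        """).result()  # block until the export job finishes

    # e.g. run once per day, exporting yesterday's events
    export_day(datetime.date.today() - datetime.timedelta(days=1))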

For reference on the data model and FAQs, please refer to Data Models and Event stream.