Setting up BigQuery
Data warehouse integrations are available as a premium add-on for our Web Experimentation and Feature Experimentation modules. For more information, please contact your Customer Success Manager.
We currently support data warehouse integrations with the following providers:
- BigQuery
- Snowflake
- Redshift
Support for the following provider is coming soon:
- Microsoft Azure
For more information, and if you want to be part of our early adopter program, please contact your Customer Success Manager.
Considerations
Keep these things in mind when using this integration:
- Data Volume: The volume of data you plan to interact with affects query performance and costs.
- Query Complexity: Complex queries may require more time and resources to execute. Optimize your queries for efficiency.
- Data Privacy: Ensure compliance with data privacy regulations when handling user data within your warehouse.
- Access Control: Implement proper access controls to limit who can configure and use the integration within your organization.
- Data Schema: Maintain a clear and consistent data schema to facilitate data retrieval and analysis.
- Monitoring: Regularly monitor your data warehouse usage to manage costs and performance effectively.
- Documentation: Maintain documentation for queries, configurations, and integration processes to facilitate collaboration and troubleshooting.
BigQuery
Prerequisites
To configure this integration, you need the following:
- Google Cloud Account: Users must have a valid Google Cloud account to access Google BigQuery and generate the necessary credentials.
- Google Service Account: A Google Service Account with the appropriate permissions to access BigQuery and create credentials is required.
- BigQuery Project: Users must have a Google BigQuery project set up where data will be stored and queried.
- Credential File: Users must generate a credential file from their Google Service Account, which will be used to securely access BigQuery.
Create a service account
Create a service account for Kameleoon within your project and grant it the BigQuery Data Viewer role. A scripted alternative is sketched after the steps below.
- Type service accounts in the global search bar, and click the suggested result.
- Click Create service account.
- Fill in the mandatory fields and click Create and Continue.
- Type "BigQuery data viewer" in the Select a role field, and click BigQuery Data Viewer when it appears in the dropdown search
- Click Done.
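If you prefer to provision the service account from code rather than the console, here is a minimal sketch using the google-cloud-iam Python client. The project ID, account ID, and display name are placeholders; it assumes the package is installed and that you are authenticated with permission to create service accounts. The BigQuery Data Viewer role still needs to be granted as described above (or with your usual IAM tooling).

```python
from google.cloud import iam_admin_v1
from google.cloud.iam_admin_v1 import types

# Hypothetical values: replace with your own project and naming.
PROJECT_ID = "your-gcp-project-id"
ACCOUNT_ID = "kameleoon-sa"

client = iam_admin_v1.IAMClient()

# Create the service account in the target project.
request = types.CreateServiceAccountRequest(
    name=f"projects/{PROJECT_ID}",
    account_id=ACCOUNT_ID,
    service_account=types.ServiceAccount(display_name="Kameleoon service account"),
)

account = client.create_service_account(request=request)
print(f"Created service account: {account.email}")
```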
Create a new dataset
Create a new dataset called "kameleoon" in your project (a scripted alternative is sketched below).
Select the region or multi-region that suits you best.
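If you would rather create the dataset from code, here is a minimal sketch using the google-cloud-bigquery Python client. The project ID and location are placeholders; it assumes you are authenticated with permission to create datasets in the project.

```python
from google.cloud import bigquery

# Hypothetical project ID: replace with your own.
client = bigquery.Client(project="your-gcp-project-id")

# Define the "kameleoon" dataset in the client's project.
dataset = bigquery.Dataset(f"{client.project}.kameleoon")
dataset.location = "EU"  # pick the region or multi-region that suits you best

dataset = client.create_dataset(dataset, exists_ok=True)
print(f"Dataset {dataset.full_dataset_id} is ready in {dataset.location}")
```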
Grant permissions to the service account
Grant the service account the BigQuery Data Owner role on the "kameleoon" dataset. You can also grant this role programmatically, as shown after the steps below.
- In the BigQuery dashboard, click the kameleoon dataset in the Explorer bar on the left.
- Click Sharing > Permissions.
- Click ADD PRINCIPAL.
- Type the Kameleoon service account name in the Add principals field.
- In the Assign Roles field, type "BigQuery Data Owner".
- Finish by clicking SAVE.
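The same grant can be scripted with the google-cloud-bigquery Python client by appending an access entry to the dataset. This is a sketch only; the project ID and service account email are placeholders, and it assumes you are authenticated as a user who can modify the dataset. The dataset-level OWNER basic role corresponds to BigQuery Data Owner.

```python
from google.cloud import bigquery

# Hypothetical values: replace with your own project and service account email.
PROJECT_ID = "your-gcp-project-id"
KAMELEOON_SA = "kameleoon-sa@your-gcp-project-id.iam.gserviceaccount.com"

client = bigquery.Client(project=PROJECT_ID)
dataset = client.get_dataset(f"{PROJECT_ID}.kameleoon")

# Append an OWNER access entry for the Kameleoon service account.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="OWNER",               # dataset-level equivalent of BigQuery Data Owner
        entity_type="userByEmail",  # service accounts are addressed by email
        entity_id=KAMELEOON_SA,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```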
Create a custom role
Create a custom role that has the **bigquery.jobs.create** permission. You can also create the role programmatically; see the sketch after the steps below.
- Navigate to Roles (you can find Roles by typing "Roles" in the console's global search field).
- Click CREATE ROLE.
- Fill in the configuration fields as you wish.
- Click ADD PERMISSIONS, and add bigquery.jobs.create from the drop-down list that appears in a pop-in window.
- Click ADD.
- Click CREATE to finalize the custom role.
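As a sketch, the same role can be created with the google-cloud-iam Python client; the project ID, role ID, title, and description below are placeholders, and it assumes you are authenticated with permission to administer roles in the project.

```python
from google.cloud import iam_admin_v1
from google.cloud.iam_admin_v1 import types

# Hypothetical values: replace with your own project and role naming.
PROJECT_ID = "your-gcp-project-id"
ROLE_ID = "kameleoonJobCreator"

client = iam_admin_v1.IAMClient()

# Create a custom role containing only the bigquery.jobs.create permission.
request = types.CreateRoleRequest(
    parent=f"projects/{PROJECT_ID}",
    role_id=ROLE_ID,
    role=types.Role(
        title="Kameleoon Job Creator",
        description="Allows creating BigQuery jobs for the Kameleoon integration.",
        included_permissions=["bigquery.jobs.create"],
    ),
)

role = client.create_role(request=request)
print(f"Created custom role: {role.name}")
```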
Add this custom role to the service account you created
- Navigate to IAM in the sidebar of the Google Cloud dashboard.
- You should see a list of Principals (a service account is a Principal). Find the Kameleoon service account in the list and click the pen icon to edit it.
- A configuration sidebar appears on the right. Click Add Another Role and find the custom role you created above in the drop-down auto-complete menu.
- Click SAVE.
Download the service account JSON credentials file to your computer.
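Before uploading the file to Kameleoon, you can optionally check that it grants the expected access. The sketch below uses the google-cloud-bigquery Python client with a hypothetical file name: listing the tables of the kameleoon dataset exercises the dataset-level access, and running a trivial query exercises the bigquery.jobs.create permission.

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Hypothetical file name: the JSON key downloaded for the Kameleoon service account.
credentials = service_account.Credentials.from_service_account_file(
    "kameleoon-service-account.json"
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# Dataset-level access (BigQuery Data Owner on "kameleoon"): list its tables.
for table in client.list_tables("kameleoon"):
    print(table.table_id)

# Custom role (bigquery.jobs.create): run a trivial query job.
job = client.query("SELECT 1 AS ok")
print(list(job.result()))
```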
Enabling the integration for your project
- Log in to your Kameleoon account and click Admin > Integrations.
- Select BigQuery.
- Choose the project on which you want to enable BigQuery.
- Click Upload JSON file, and select your JSON file.
- Click Validate to save your changes.
Managing Multiple Projects
If you have multiple projects and want to configure the BigQuery integration for each of them, you can do so from the same Kameleoon account. To switch between project configurations, use the project drop-down menu within the integration settings; it lets you select and manage the settings for each project.
Keep in mind that each project has its own configuration: you will need to upload a JSON credentials file for each project. However, if a single credentials file grants the permissions required for several projects, you can reuse it for those projects.
Once you have enabled the BigQuery integration for your project, you can:
- activate Use BigQuery as a source to access and use data stored in Google BigQuery within your Kameleoon campaigns;
- activate Use BigQuery as a destination to seamlessly send your Kameleoon campaigns' results to Google BigQuery.