Prerequisites
- A Google Cloud Project with BigQuery enabled
- A Google Cloud Service Account with the “BigQuery User” and “BigQuery Data Editor” roles in your GCP project
- A Service Account Key to authenticate into your Service Account
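The Service Account Key you download from GCP is a JSON file. Before pasting it in Step 2, you can sanity-check that it contains the fields a standard GCP service account key carries. A minimal Python sketch (the file name `key.json` is an assumption; `check_key` is an illustrative helper, not part of any product API):

```python
import json

# Fields present in a standard GCP service account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key(key_data: dict) -> bool:
    """Return True if the data looks like a service account key."""
    return (
        key_data.get("type") == "service_account"
        and REQUIRED_FIELDS.issubset(key_data)
    )

# Example usage with a hypothetical file name:
# with open("key.json") as f:
#     print(check_key(json.load(f)))
```

If the check fails, re-download the key from the Service Account's Keys tab in the GCP console rather than editing the file by hand.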
Step 1: Instance Naming
- Choose Google BigQuery from the Warehouse section under the Destination connector

- Choose the Meta CLO data type
- Name the connection and click Next
Step 2: Authorization
- An option to preview the required data format is available
- Setup can be completed using either a Table or a Query, as described below
Table

- Paste the Service Account JSON key
- Enter the Dataset and Project ID
- Enter the table name

- Click Validate to finish the setup
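BigQuery addresses a table by the fully qualified identifier `project.dataset.table`, which is what the Project ID, Dataset, and table name fields above combine into. A minimal sketch of that composition, with a basic check on the dataset ID (all names below are hypothetical):

```python
import re

# BigQuery dataset IDs may contain only letters, digits, and underscores.
DATASET_RE = re.compile(r"^[A-Za-z0-9_]+$")

def qualified_table(project_id: str, dataset: str, table: str) -> str:
    """Build the `project.dataset.table` identifier BigQuery expects."""
    if not DATASET_RE.match(dataset):
        raise ValueError(f"invalid dataset ID: {dataset!r}")
    return f"{project_id}.{dataset}.{table}"

# Hypothetical names for illustration:
print(qualified_table("my-gcp-project", "marketing", "events"))
# → my-gcp-project.marketing.events
```

If validation fails in the UI, mismatched or misspelled versions of these three values are the first thing to check.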
Query

- Paste the Service Account JSON key
- Enter the Dataset and Project ID
- Validate the credentials
- Click Next
- Enter the query
- Click Finish
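For the Query option, the query you enter is a BigQuery Standard SQL statement run against your dataset. A hypothetical example, shown here as a Python string so the fully qualified table name is built from the same Project ID and Dataset entered above (all table and column names are assumptions; substitute your own):

```python
# Hypothetical project, dataset, table, and column names for illustration only.
project_id = "my-gcp-project"
dataset = "marketing"
table = "events"

# Select recent rows using BigQuery Standard SQL.
query = f"""
SELECT event_id, event_timestamp, user_email
FROM `{project_id}.{dataset}.{table}`
WHERE event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
"""
print(query)
```

Note that BigQuery requires backticks around the fully qualified table name when the project ID contains hyphens.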