Google BigQuery can be used as a data source or a data destination in Mitto.

Source plugin example: Query

Destination plugin examples: CSV, Salesforce, SQL

Google BigQuery as a data destination

  • Mitto automatically creates the Google BigQuery dataset and tables if they don't exist
  • Mitto automatically determines data types for Google BigQuery columns
  • Mitto automatically adds new columns to Google BigQuery tables based on new fields in source systems

Google BigQuery specific setup

The Google BigQuery API requires a service account key, which is downloaded as a JSON file and stored in Mitto.

Below is an example of what this key file contains:

  {
    "type": "service_account",
    "project_id": "mitto-183418",
    "private_key_id": "...",
    "private_key": "...",
    "client_email": "...",
    "client_id": "...",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "..."
  }

The JSON key file contains the project_id, which is used in the database URL.

Below is the database URL structure for connecting to a Google BigQuery database (this follows the standard SQLAlchemy BigQuery dialect format, with the project_id taken from the key file):

  bigquery://{project_id}

Here's an example of using Google BigQuery as a destination in a CSV job:

NOTE: When outputting to Google BigQuery, leave "Schema" blank and append the dataset name to the end of the output database URL.
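
As a rough sketch, the output portion of such a job configuration could look like the following. This is an illustration only: the exact keys depend on your Mitto version, and the project, dataset, and table names here are placeholders, not values from this article.

```json
{
  "output": {
    "dbo": "bigquery://my-project-123456/my_dataset",
    "schema": null,
    "tablename": "sales_data"
  }
}
```

Note how the dataset name (my_dataset) is appended to the end of the database URL and the schema is left blank, per the note above.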


Mitto can send SQL statements to Google BigQuery. Use Google BigQuery syntax in these Mitto SQL jobs.
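
For instance, a SQL job could run a BigQuery Standard SQL statement like the one below. The dataset, table, and column names are placeholders for illustration.

```sql
-- Build a daily summary from a previously loaded table
-- (my_dataset, sales_data, and created_at are placeholder names)
CREATE OR REPLACE TABLE my_dataset.daily_totals AS
SELECT
  DATE(created_at) AS day,
  COUNT(*) AS row_count
FROM my_dataset.sales_data
GROUP BY day;
```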