

Using Google BigQuery to receive data from Azion Data Streaming

Data Streaming is one of Azion’s Observe products designed to help you access your content and application data in real time. To successfully use Data Streaming with connectors, you first need to set up the endpoints.

After completing the initial setup, you can use Google’s BigQuery to receive data from Azion Data Streaming. Once you finish connecting the endpoint, you can improve your monitoring and use other Azion products to continue exploring information on your data.

Continue reading this hands-on guide for a step-by-step walkthrough of how to connect Google’s BigQuery endpoint to Data Streaming.

  1. Requirements
  2. How to configure the new endpoint in Azion Data Streaming

1. Requirements

To get started with Google BigQuery, you must follow a few steps:

  1. Create a Google Cloud Platform account.
  2. Create a project on the Google Cloud Platform.
  3. Create a service account on Google Cloud Platform, filling in the required fields with your information.

BigQuery uses service accounts to authenticate third-party applications. The service account must have the BigQuery Admin role; make sure you select that option in the Role dropdown list.

For more details about the permissions assigned to each BigQuery role, access BigQuery Roles.

Creating a private key

Next, you must create a private key to continue your configuration.

  1. After the service account is created, access it.
  2. In the Keys menu, create a new key, choosing JSON as the Key type.
  3. After the confirmation, a .json file containing the credentials will be downloaded.

See below an example of the file’s content:

  "type": "service_account",
  "project_id": "azion-data",
  "private_key_id": "13e018d99d6ay9e3c9f3e21a7a7e0226a1ae082",
  "private_key": "-----BEGIN PRIVATE KEY-----\\nxxx\\n-----END PRIVATE KEY-----\\n",
  "client_email": "myemail@azion.com",

Enabling BigQuery API

Next, you must access the API Manager and enable the BigQuery API.

The BigQuery API supports an endpoint to stream rows into a table. However, this endpoint is not supported in the Free Tier version. To use it, it’s necessary to enable the full version of the platform with the Billing configuration.

Find more details on this step in the Billing management on projects documentation. You can consult the fees for this API in the pricing table, under the Streaming Inserts section.
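For context, the streaming endpoint in question is `tabledata.insertAll` from the BigQuery REST API, which Data Streaming calls on your behalf. As a rough sketch of the request it involves (built locally here, nothing is sent; the project, dataset, and table identifiers are placeholders):

```python
import json

def build_insert_all_request(project_id, dataset_id, table_id, rows):
    """Build the URL and JSON body of a BigQuery tabledata.insertAll call."""
    url = (
        "https://bigquery.googleapis.com/bigquery/v2"
        f"/projects/{project_id}/datasets/{dataset_id}/tables/{table_id}/insertAll"
    )
    # insertAll wraps each streamed record in a {"json": ...} envelope.
    body = {"rows": [{"json": row} for row in rows]}
    return url, json.dumps(body)

# Placeholder identifiers; in practice these come from your GCP project.
url, body = build_insert_all_request(
    "my-project", "my_dataset", "my_table",
    [{"host": "www.example.com", "status": "200"}],
)
```

Because this endpoint is billed per streamed data, the pricing note above applies to every such call made on your stream’s behalf.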

Creating a dataset

After enabling the API, you will need to create a dataset. To do so, you must first create a project in the Google Cloud Console. By default, BigQuery is already enabled in new projects.

After creating the project, follow these steps:

  1. Select your project and create a dataset for the selected project.
  2. Create a table for your dataset.
  3. Open the table you’ve created > select Edit schema.
  4. Add the structure of the data that will be inserted.

Once you create the table, it’s possible to ingest data through the BigQuery API.
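The schema you enter in Edit schema must match the fields your stream will later send. As an illustration, assuming the four fields used in the custom template later in this guide (STRING and NULLABLE for every column is an assumption; status could equally be an INTEGER), the schema in BigQuery’s JSON format can be produced like this:

```python
import json

# Field names match the Data Streaming custom template used later in
# this guide; STRING/NULLABLE for every column is an assumption.
FIELDS = ["host", "status", "request_uri", "remote_addr"]

schema = [{"name": name, "type": "STRING", "mode": "NULLABLE"} for name in FIELDS]
print(json.dumps(schema, indent=2))
```

The printed JSON can be pasted into the Edit schema text mode instead of adding each field by hand.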

2. How to configure the new endpoint in Azion Data Streaming

Next, you’ll follow these steps to configure the new endpoint you created in Google BigQuery in your Azion Data Streaming.

  1. If you are a new user of Data Streaming, access the Account Menu > Billing & Subscriptions in Real-Time Manager (RTM) and enable the product in the Subscriptions tab.
  2. On the upper left corner, select Products Menu > Data Streaming.
  3. Click the Add Streaming button.
  4. Choose a name for your Data Streaming.
  5. In the Data Source dropdown list, select Edge Applications.

For more information on creating Edge Applications, see the documentation page.

  6. Your dataset must match the table previously created in BigQuery. Therefore, in Template, select Custom Template and assemble the dataset according to the example below.

Tip: copy and paste the default template and delete or rename the fields you didn’t create. For more details about the variables available in Data Streaming, consult the product’s documentation.

	"host": "$host",
	"status": "$status",
	"request_uri": "$request_uri",
	"remote_addr": "$remote_addr"

The dataset created here is equivalent to the table created in Google BigQuery.
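To make the equivalence concrete, here is a minimal sketch of how the `$`-prefixed template variables turn into one table row (the request values are hypothetical; the actual substitution is performed by Data Streaming, not by you):

```python
import json
from string import Template

# The custom template from the step above; the $-placeholders are
# Data Streaming variables.
TEMPLATE = """{
    "host": "$host",
    "status": "$status",
    "request_uri": "$request_uri",
    "remote_addr": "$remote_addr"
}"""

# Hypothetical values for a single request observed at the edge.
values = {
    "host": "www.example.com",
    "status": "200",
    "request_uri": "/index.html",
    "remote_addr": "203.0.113.10",
}

# The rendered record is valid JSON whose keys match the table columns.
row = json.loads(Template(TEMPLATE).substitute(values))
```

Each rendered record becomes one row streamed into the BigQuery table.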

  7. In Options, you can:

    • Select Filter Domains and add the domains you want to receive logs from by selecting them in the Available Domains box.
    • Select All Domains.
  8. In the Destination configurations, select Google BigQuery from the Endpoint Type dropdown list.
  9. Fill in the following fields:
  • Project ID: the ID of your project in Google Cloud.
  • Dataset ID: the ID of the dataset you created on Google BigQuery.
  • Table ID: the ID of the table that will receive the streamed data.
  10. In Service Account Key, paste the content of the downloaded .json file that contains your private access key.

Here’s an example of what the private access key should look like:

	"type": "service_account",
	"project_id": "[name]",
	"private_key_id": "13c73d892hf6e8s04hjkloi6759f1e6df39f9038",
	"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvAIpDEryaqLPEuiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQCzL+bgfcynhWOx\nAKQ6wfsnwl/jEYsu5KxlPTtr11hmHLVtDAC68FVjAL029zfTjCRIG9d2ttm6fySY\nJm7Y1MwpahekDmFhMbISxA5UfN0KAF5Bs/uGU6hm17tq+ZDSA1L9f3UIvAJ5/cqu\n9CKhU1Dm1TChL8nxIfAb90G7ga6QJVve3ko/0KHpq7pdm3tp6VsVQ+fgwKNi7L+A\n4CvHFT0jX4jRDIFUKePRuxyleZV5p1Y3BHSLCIC1X+oe36a0RMLBCrWVdhHwAqBb\nbec3NYTen4Re+BidL0cfJ8IsVhjdWuibQTaT2/V+OzA+JgzXvpYSI0jWUvUiYtRK\njxz7AGaXAgMBAAECggEASA0bua76ElAjTg9ixKFg7u0/4P4cWfAM1cf64+e9zPJ6/\nH5NaW8cpWf+7C/MxlOdH/zojHKScMyWhXu0wvpKalGXWr+F5/mVCsu2wqfoIhPh\nzeAq72KB5MtBLI4ecPkbCnyGKbt9909TfRrrLBEl58EHNaUwEvRDzsmBpn1JDe75\nJ2ODNf714DsDtghG5Jy5nZ75Bk6ny5mYp67q6IdCUFJeLgJUwfNdtUJmcQ5x7lw3\nujR0vEyEWXpiSAsIhIi0XgMr5NSbBdH+e+P9gVUZwqtRbshdH6aPalIxh1rhdEtY\nJguGzK9nbYQtzm0Mdka3VZtUZIEQAqlg8OZe8xLpa+p392TU64sQlrJxQMZxPNtU\ntPuDwtDAgmwGZNGFxgBFIMuzN88QpL5zPFSBbJoHt5xJ3sGNmeuDF9SrBXNrFz\n9hmqUtoUa0iNheVNG+Y7smEnJNjuSYldAlBQ5qjqSr1IAJTwoUE0fF1P3SbFK9b2\nW6TJ73gqF78EQIJf6t3kOczm/QKB0pRMSuGK2ga45ig2CtMSklUHVjL3A+zcEP9NH\nosFRYkxZZShPqKj2j0PAdB2TcUgrl1a+I+6oA1oU/j0fuJiux9pxrz9I8QfTVwJQS\n/oCcHsKMrDngi0+DkETHDe9peDPTfO4MAh+G285MDPa3LegEG2iVGsqhp+5v8Jdm0Vl\nCyZQJ526IwKBgESw1npFyakE0sMGjlwBRjworH5HjajNPsJjZtspaU7TkCXsS7bt\nwFmLmm7205SKM+1N9C4owSn25uxIWbsb/wB6iuK+EyP+K3qnjPI/GsVRpDjXb1Ma\niBe4tZCUUP/lJGj8HvBk+kD/lQoFuFndD6cvwDze+PpUeN2oe7IiiZQBlAoGAcQUp\nHT3lCVmxXC049FKa8DyWTJIQJhkJmDADeqlYaCFaUe9YC490Y+BtYZHX0UNDXCnFZ\nLIBTtRTPfFU02kUBAcGn0ALc74QwUnJlImvuOeYOlgGwy6QzcRQ6dtfsDWROwKk\nNCAAjYBylKF2QcuZC3rwe0qN5EIe/0DoFmWUD7ELCgYBIKy2ojKY2d+IByJakBOXt\nojwlCj+I5GpDtDeVhzw9u+74j7KoLsKE057DnMGgouGdVH2xCKih7E71iDKPx1Li\nar9Dz3LsPzHGYXt0LBa+0RBm8mRVb68AlFuN3XJ7g9H8tXPZl38hwLKM\EkDJruapG84nuOcgrp2zGHwYtp9S7DfUg==\n-----END PRIVATE KEY-----\n",
	"client_email": "email@myemailaccount.com",
	"client_id": "1835090363570189530221",
	"auth_uri": "https://accounts.google.com/o/oauth2/auth",
	"token_uri": "https://oauth2.googleapis.com/token",
	"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
	"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/mytest.com"
  11. Make sure the Active switch is on.
  12. Click the Save button.

All authentication with Google OAuth2 and the generation of JWT tokens will be performed by the Data Streaming backend systems.

You can also consult data directly from Google BigQuery.

After saving the configurations, you can keep track of the calls made by Data Streaming to BigQuery in the Real-Time Events product, available in the Products Menu in RTM. To do so, select Data Source > Data Streaming and choose the filter options as desired.


Google BigQuery and Google Cloud Platform are registered trademarks of Google LLC in the US and/or other countries.

Didn’t find what you were looking for? Open a support ticket.