GraphQL API Features

The GraphQL API features consist of datasets, filtering, sorting, and pagination. They provide easy access to your data, and combining them lets you build more personalized and specific queries to request exactly the information you need.

The next sections detail each available feature and how to use it:

  1. Datasets
  2. Filtering
  3. Sorting
  4. Pagination

1. Datasets

Azion GraphQL API uses defined datasets to indicate which requests you can run through queries, fetching data from Real-Time Metrics and Real-Time Events. Each dataset is an organized table exposing your data.

The following table lists each available dataset and what it requests:

Dataset | Description
httpMetrics | Request events registered by Edge Application and Edge Firewall.
l2CacheMetrics | Request events registered by L2 Caching.
edgeFunctionsMetrics | Events executed by Edge Functions.
imagesProcessedMetrics | Image processing events by Image Processor.
idnsQueriesMetrics | Query events performed on Intelligent DNS.
dataStreamedMetrics | Events of data sent by Data Streaming to the clients’ endpoint.
httpEvents | Request events registered by Edge Application and Edge Firewall.
l2CacheEvents | Request events registered by L2 Caching.
edgeFunctionsEvents | Events executed by Edge Functions.
imagesProcessedEvents | Image processing events by Image Processor.
idnsQueriesEvents | Query events performed on Intelligent DNS.
dataStreamedEvents | Events of data sent by Data Streaming to the clients’ endpoint.
cellsConsoleEvents | Event logs from applications using Edge Runtime, returned by the Cells Console.

To see which fields are available for each dataset, you can run an Introspection Query to consult metadata. Find out more on the How to query metadata with GraphQL API guide.
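
For illustration, a minimal introspection sketch could look like the one below. The type name passed to __type is an assumption and may not match the schema’s actual naming, so confirm the exact names through the guide above:

query DatasetFields {
  __type(name: "httpMetrics") {
    name
    fields {
      name
      description
      type {
        name
      }
    }
  }
}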

Find out more on the How to select Top X queries with GraphQL API guide.

While using datasets, it’s important to note you can request both raw and aggregated data models. By running a request for raw information, using the Events datasets, you receive as a response the raw numbers related to that specific dataset. By running a request for aggregated information, using the Metrics datasets, you receive your information in the form of graphs related to that specific dataset.
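
As a point of comparison with the aggregated example below, a raw request only selects fields from an Events dataset over a time range. The following sketch assumes the httpEvents dataset exposes ts, host, and status fields; adjust them to the fields your introspection query returns:

query HttpRawEvents {
  httpEvents(
    limit: 10
    filter: {
      tsRange: {begin:"2022-10-20T10:10:10", end:"2022-10-23T10:10:10"}
    }
  )
  {
    ts
    host
    status
  }
}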

To specify that you want to aggregate data over a time interval, add the aggregate operator to your query along with the groupBy field. For example, the following query aggregates data:

query IdnsQuery {
  idnsQueriesMetrics(
    limit: 10
    aggregate: {sum: requests}
    groupBy: [ts]
    filter: {
      tsRange: {begin:"2022-10-20T10:10:10", end:"2022-10-23T10:10:10"}
    }
  )
  {
    ts
    sum
  }
}

Find out more on how to aggregate data with the How to query aggregated data with GraphQL API guide.

It’s important to note that with the aggregated model you’ll receive data grouped by a time interval defined through an adaptive resolver. Currently, there are three possible intervals to fetch your results: minute, hour, and day.

Each query interval is used according to the following definitions:

  • Minute: used for queries in intervals of up to 3 days.
  • Hour: used for queries in intervals between 3 and 60 days.
  • Day: used for queries in intervals of over 60 days.

To successfully receive data as a response, you must inform a time interval in your queries, either through a tsRange or a tsGt + tsLt field, using a valid date and time format. If you use tsRange, you’ll receive data greater than or equal to the beginning date and less than or equal to the ending date, including both dates you’ve informed.

Use the following example as a basis to apply a tsRange in your request:

tsRange: {begin:"2022-06-23T09:10:10", end:"2022-06-23T16:10:10"}

If you use tsGt + tsLt, you’ll receive data greater than the beginning date and less than the ending date, not including the dates you’ve informed.

Use the following example as a basis to apply a tsGt + tsLt in your request:

{
  "tsGt": "2022-07-22T10:10:10",
  "tsLt": "2022-09-19T10:10:10"
}
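
Inside a query, these fields go directly in the filter block. The sketch below reuses the earlier idnsQueriesMetrics example and only swaps the time-range fields; treat it as an illustration rather than a canonical example:

query IdnsQueryOpenInterval {
  idnsQueriesMetrics(
    limit: 10
    aggregate: {sum: requests}
    groupBy: [ts]
    filter: {
      tsGt: "2022-07-22T10:10:10"
      tsLt: "2022-09-19T10:10:10"
    }
  )
  {
    ts
    sum
  }
}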

Defining and informing a time range interval in your queries is essential to fetch data from the available datasets of the GraphQL API and to keep running requests with the other available features.


2. Filtering

With filtering parameters, the responses returned by your queries can match your set of data more accurately. You can filter by any field available in the dataset you’re consulting.

Requesting complex or large amounts of data can make responses noisy and harder to use, so filtering your queries helps you get exact, direct data from your requests. For example, if you’re using the following query to request httpMetrics:

query HttpQuery {
  httpMetrics(
    limit: 10
    filter: {
      tsRange: {begin:"2022-10-20T10:10:10", end:"2022-10-23T10:10:10"}
    }
  )
  {
    ts
    sourceLocPop
  }
}

You can filter the query by requesting data specific to the sourceLocPop field:

query HttpQuery {
  httpMetrics(
    limit: 10
    filter: {
      tsRange: {begin:"2022-10-20T10:10:10", end:"2022-10-23T10:10:10"}
      sourceLocPop: "lax-bso"
    }
  )
  {
    ts
    sourceLocPop
  }
}

Feel free to update your request to use any field of interest among the fields of the dataset you’re consulting.

You can also combine multiple conditions with the and and or parameters. Make sure you define all the fields you’re filtering by inside the parameter, as in the following example:

query totalImagesProcessedRequests {
  imagesProcessedMetrics(
    aggregate: {sum: requests}
    limit: 100
    filter: {
      tsRange: {begin:"2023-03-20T19:52:00", end:"2023-03-20T20:52:00"}
      or: {
          status:304
          statusRange: {begin: 200, end: 299}
      }
    }
    groupBy:[ts]
    orderBy:[ts_ASC]
  )
  {
    ts
    sum
  }
}
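
Assuming the and parameter mirrors the shape of or above, the same structure applies when all conditions must hold at once. In this sketch the httpMetrics fields statusRange and requestMethod are assumptions; replace them with fields you’ve confirmed through introspection:

query httpGetSuccessRequests {
  httpMetrics(
    aggregate: {sum: requests}
    limit: 100
    filter: {
      tsRange: {begin:"2023-03-20T19:52:00", end:"2023-03-20T20:52:00"}
      and: {
          statusRange: {begin: 200, end: 299}
          requestMethod: "GET"
      }
    }
    groupBy: [ts]
    orderBy: [ts_ASC]
  )
  {
    ts
    sum
  }
}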

3. Sorting

The sorting feature lets you organize the data received from a dataset according to a chosen order. For example, if you’re receiving the host field data as a response to your API request, you can sort the data to receive it in:

  • An ascending order (ASC)
  • A descending order (DESC)

Whenever you use the orderBy property, you must add either the ASC or the DESC specification.

For example, to use the ascending order sorting feature, you need to add orderBy in your query and the field you want to sort + ASC:

{
    orderBy: [host_ASC]
}

To sort the data according to a descending order (DESC), you need to add the field you want to sort + DESC:

{
    orderBy: [ts_DESC]
}

Say you’re using this query with DESC:

query SumBytesSentByHost {
  httpMetrics(
    limit: 1000
    filter: {
      tsRange: {begin:"2023-01-01T17:03:00", end:"2023-06-01T18:05:00"}
    }
    aggregate: {sum: bytesSent}
    groupBy: [host]
    orderBy: [sum_DESC]
  ) 
  {        
    host
    sum
  }
}

You’ll get a response similar to this:

{
    "data": {
        "httpMetrics": [
            {
                "host": "g1sdetynmxe0ao.map.azionedge.net",
                "sum": 606226
            },
            {
                "host": "uaykhefjdk9or.map.azionedge.net",
                "sum": 583059
            },
            {
                "host": "wz0ywpod397zk.map.azionedge.net",
                "sum": 567633
            },
            {
                "host": "zi1435nbhec7.map.azionedge.net",
                "sum": 96002
            }
        ]
    }
}

And if you’re using this query with ASC:

query AvgRequestTimeByHost {
  httpMetrics(
    limit: 1000
    filter: {
      tsRange: {begin:"2023-01-01T17:03:00", end:"2023-06-01T18:05:00"}
    }
    aggregate: {avg:requestTime}
    groupBy: [ts, host]
    orderBy: [avg_ASC]
  ) 
  {        
    ts
    host
    avg
  }
}

You’ll get a response similar to this:

{
    "data": {
        "httpMetrics": [
            {
                "ts": "2023-04-21T00:00:00Z",
                "host": "zipo145nbhc7.map.azionedge.net",
                "avg": 0.04871428571428572
            },
            {
                "ts": "2023-04-13T00:00:00Z",
                "host": "g1snmxepa0ao.map.azionedge.net",
                "avg": 0.561
            },
            {
                "ts": "2023-04-11T00:00:00Z",
                "host": "g1syinmxe2ao.map.azionedge.net",
                "avg": 4.101833333333333
            },
            {
                "ts": "2023-04-11T00:00:00Z",
                "host": "wz1yd307zk.map.azionedge.net",
                "avg": 8.705666666666668
            },
            {
                "ts": "2023-05-22T00:00:00Z",
                "host": "uaifjdk6or.map.azionedge.net",
                "avg": 31.493818181818185
            }
        ]
    }
}

4. Pagination

Pagination is a feature designed to help you decide where your results begin and how many results you want to see. Currently, Azion GraphQL API uses offset and limit pagination to provide the feature.

Pagination is useful when you get a large amount of data in response to your API request. You can use the feature by setting an offset and a limit, so the API knows it needs to return data within the specific range you’ve set.

The offset parameter sets the number of records you want to skip in your data response, and the limit parameter sets the number of results you want to receive.

Setting the offset and limit parameters isn’t mandatory. If you don’t set them, the GraphQL API automatically sets the offset to 0 and the limit to 10.

See the following example on how to set offset and limit parameters:

query HttpQuery {
  httpMetrics(
    offset: 15
    limit: 30
    filter: {
      tsRange: {begin:"2022-10-20T10:10:10", end:"2022-10-23T10:10:10"}
    }
  )
  {
    ts
    sourceLocPop
  }
}

The offset is set to 15, meaning your response will start at the 16th result, and the limit is set to 30, meaning your response will contain a total of 30 results. In this case, you’ll receive results from the 16th through the 45th.
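
To walk through subsequent pages, keep the limit and move the offset forward by that amount on each request. The following sketch would fetch the next page, starting at the 46th result:

query HttpQueryNextPage {
  httpMetrics(
    offset: 45
    limit: 30
    filter: {
      tsRange: {begin:"2022-10-20T10:10:10", end:"2022-10-23T10:10:10"}
    }
  )
  {
    ts
    sourceLocPop
  }
}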

If your data is constantly updated, using pagination may cause missing or duplicate data when you run more than one request using the feature.

