OBSERVE

Data Stream

Stream logs and events in real time to your SIEM, big data, and analytics platforms. Power security monitoring, compliance, and business intelligence with near real-time data delivery.


60-second maximum interval for near real-time streaming

Minutes to configure with plug-and-play setup

10+ connectors, including IBM QRadar, Grafana, Splunk, and Google BigQuery

Real-time observability for modern applications

Stream data early in the request flow to your analytics platforms with minimal latency and maximum reliability.

Stream events in near real-time

Deliver logs to SIEM and analytics platforms with up to 60-second latency for real-time threat monitoring and analysis.

Ensure compliance and data retention

Retain logs for audits and investigations with secure, continuous backups that meet regulatory requirements.

Pay only for what you use

Usage-based pricing with no hidden fees, no infrastructure to manage, and no vendor lock-in.

Connect to multiple destinations simultaneously

Stream to AWS Kinesis, Splunk, S3, BigQuery, Elasticsearch, Datadog, and more. Use multi-protocol support to send data to multiple endpoints in parallel.

Filter and transform data before delivery

Apply filtering, sampling, and transformations to control data volume and reduce costs before streaming.

Scale automatically with your traffic

Handle traffic spikes seamlessly across a globally distributed platform without configuration changes or performance degradation.

DNZ
Axur
Radware
Arezzo
Contabilizei
Magazine Luiza
Fourbank
Amazon Prime Video
Crefisa
Netshoes
Dafiti
Global Fashion Group
GPA

"Azion shielded us from sophisticated cyberattacks and empowered us to modernize our infrastructure, reduce costs, and deliver the best shopping experiences to millions of customers across Latin America."

Allan Monteiro

CISO & Head of Technology

Complete data pipeline from edge to analytics

Collect, process, and deliver event data with enterprise-grade reliability.

Collect event data from multiple sources

Automatically collect logs from Activity History, Applications, Functions, and WAF Events, filtering product and domain data based on predefined monitoring parameters. Select specific variables using ready-made templates or custom configurations to capture exactly the data you need.
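As a rough sketch, a stream pairs one data source with the variables it should capture, either from a ready-made template or a custom list, optionally scoped to specific domains. The field and variable names below are illustrative assumptions, not the exact Data Stream schema; check the docs for the real template and variable names.

```typescript
// Illustrative shape of a stream definition: one data source plus the
// variables to capture. Field and variable names here are assumptions
// for illustration only; consult the Data Stream docs for the real schema.
interface StreamDefinition {
  name: string;
  dataSource: "http" | "waf" | "functions" | "activity";
  // Either a ready-made template or an explicit list of variables.
  template?: string;
  variables?: string[];
  // Optional domain filter so only matching traffic is streamed.
  domains?: string[];
}

const wafToSiem: StreamDefinition = {
  name: "waf-events-to-siem",
  dataSource: "waf",
  variables: ["$time", "$host", "$remote_addr", "$waf_score", "$waf_block"],
  domains: ["shop.example.com"],
};

console.log(wafToSiem.name);
```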


Data collection from multiple sources with variable selection interface

Reliable pipeline architecture and data processing

Build reliable pipelines for continuous near real-time data streaming with 100% data capture or optional sampling. Connect to multiple destinations using multi-protocol support, with built-in filtering and transformation before delivery to analytics, SIEM, and BI platforms.


Secure pipeline architecture with stream processing capabilities

Multiplatform integration and flexible data delivery

Stream structured event data to AWS Kinesis Data Firehose, Splunk, S3, Google BigQuery, Elasticsearch, Datadog, IBM QRadar, Azure Monitor, and HTTP/HTTPS endpoints.

Customize payloads and formats, and control delivery with configurable limits of up to 2,000 records or 60-second intervals across the distributed platform.
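As a concrete illustration of those limits, the sketch below models the delivery trigger: a batch ships once it holds 2,000 records or once 60 seconds have elapsed since the first record was buffered, whichever comes first. This is a conceptual model of the behavior described above, not Azion's internal implementation.

```typescript
// Conceptual model of the delivery trigger: a batch is flushed when it
// reaches 2,000 records or when 60 seconds have passed since the first
// record was buffered, whichever happens first. Illustration only.
const MAX_RECORDS = 2_000;
const MAX_INTERVAL_MS = 60_000;

function shouldFlush(bufferedRecords: number, firstRecordAt: number, now: number): boolean {
  return bufferedRecords >= MAX_RECORDS || now - firstRecordAt >= MAX_INTERVAL_MS;
}

// Example: only 500 records buffered, but 60 seconds elapsed -> the batch ships.
console.log(shouldFlush(500, 0, 60_000)); // true
```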


Integration with SIEM and analytics platforms with real-time delivery

Frequently Asked Questions

What is Data Stream?

Data Stream is Azion's real-time log streaming service that feeds your SIEM, big data, and analytics platforms with event logs from your applications. It collects, processes, and delivers data from multiple sources including Applications, Functions, WAF Events, and Activity History to endpoints like AWS Kinesis, Splunk, S3, BigQuery, and more.

How fast is data delivery?

Data Stream delivers logs in near real-time with a maximum interval of 60 seconds or when 2,000 records are accumulated, whichever comes first. This ensures timely data availability for security monitoring, compliance, and analytics while optimizing throughput and reducing overhead.

Which connectors and endpoints are supported?

Data Stream supports 10+ connectors including AWS Kinesis Data Firehose, Splunk, Amazon S3, Google BigQuery, Elasticsearch, Datadog, IBM QRadar, Azure Monitor, Azure Blob Storage, Apache Kafka, and standard HTTP/HTTPS POST endpoints. You can connect to multiple destinations simultaneously.
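If you point a stream at a plain HTTP/HTTPS POST endpoint, the receiving side can be as small as the sketch below. It assumes a newline-delimited JSON payload; the actual body format depends on how the stream's payload template is configured.

```typescript
// Minimal receiver for a stream configured with an HTTP/HTTPS POST endpoint.
// Assumes one JSON record per line; adjust parsing to match the payload
// format configured on the stream.
import { createServer } from "node:http";

createServer((req, res) => {
  let body = "";
  req.setEncoding("utf8");
  req.on("data", (chunk) => {
    body += chunk;
  });
  req.on("end", () => {
    const records = body
      .split("\n")
      .filter((line) => line.trim() !== "")
      .map((line) => JSON.parse(line));
    console.log(`received batch of ${records.length} records`);
    res.writeHead(200);
    res.end();
  });
}).listen(8080);
```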

Can I filter and customize the data before streaming?

Yes. Data Stream allows you to select specific variables using ready-made templates or create custom configurations. You can filter by domains, apply sampling to control volume, and transform data before delivery. This helps reduce costs and ensures you only stream the data you need.
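Conceptually, domain filtering and sampling behave like the sketch below: records that don't match the domains you care about are dropped, and only a percentage of the rest is kept. Azion applies the equivalent logic server-side before delivery; this code is purely illustrative of the effect on data volume.

```typescript
// Purely illustrative: what domain filtering plus percentage sampling do
// to a set of records before delivery. Not the platform's implementation.
interface EventRecord {
  host: string;
  [field: string]: unknown;
}

function filterAndSample(
  records: EventRecord[],
  domains: string[],
  samplePercent: number
): EventRecord[] {
  return records
    .filter((record) => domains.includes(record.host))
    .filter(() => Math.random() * 100 < samplePercent);
}

const allEvents: EventRecord[] = [
  { host: "shop.example.com", status: 200 },
  { host: "other.example.net", status: 404 },
];

// Keep roughly 10% of events for the domain of interest.
const kept = filterAndSample(allEvents, ["shop.example.com"], 10);
console.log(`kept ${kept.length} of ${allEvents.length} records`);
```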

How does Data Stream handle high traffic volumes?

Data Stream scales automatically with your traffic using Azion's distributed global architecture. You can choose 100% data capture or apply optional sampling to manage volume. The service handles traffic spikes seamlessly without configuration changes or performance degradation.

Is Data Stream suitable for compliance and audit requirements?

Yes. Data Stream enables continuous log retention for historical analysis and incident investigations. You can stream to secure storage endpoints like S3 or Azure Blob Storage for long-term retention, helping meet PCI-DSS, HIPAA, and other regulatory requirements.

How do I integrate Data Stream with my SIEM?

Configure a stream in Azion Console or via API, select your data source (Applications, WAF Events, etc.), choose a template or customize variables, and connect to your SIEM endpoint (Splunk, IBM QRadar, Datadog, etc.). The integration takes minutes to set up. See our SIEM integration guide for detailed steps.
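A stream can also be created programmatically. The sketch below shows the general shape of such a call with `fetch`; the endpoint path, header format, and body fields are assumptions for illustration only, so take the exact URL and schema from the Azion API reference.

```typescript
// Hypothetical sketch of creating a stream via the API. The URL, headers,
// and body fields below are illustrative assumptions; consult the Azion
// API reference for the exact endpoint and schema.
const response = await fetch("https://api.azionapi.net/data_streaming/streamings", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Token ${process.env.AZION_TOKEN}`,
  },
  body: JSON.stringify({
    name: "waf-events-to-splunk",
    data_source: "waf",        // source selected for this stream (illustrative value)
    template_id: 2,            // or a custom variable set (illustrative value)
    endpoint: {
      endpoint_type: "splunk", // connector type (illustrative value)
      splunk_url: "https://splunk.example.com:8088/services/collector",
      splunk_api_key: process.env.SPLUNK_HEC_TOKEN,
    },
  }),
});

console.log(response.status);
```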

What is the pricing model for Data Stream?

Data Stream uses transparent usage-based pricing. You pay only for the data you stream with no hidden fees, infrastructure costs, or vendor lock-in. Pricing is based on the volume of data transferred. Check the pricing page for detailed information.

What types of data can be exported for compliance and audit?

Data Stream exports comprehensive field types to support compliance and audit requirements. For Applications, you can capture request data (timestamp, client IP, HTTP method, URI, headers, user agent), response data (status codes, bytes sent, cache status), and performance metrics (upstream response time, request time). For WAF Events, fields include attack signatures, threat scores, blocked requests, rule IDs, and geographic data. Activity History provides user actions, API calls, configuration changes, and authentication events. All data includes precise timestamps, session identifiers, and can be filtered by domain, allowing you to meet requirements for PCI-DSS, HIPAA, SOC 2, and GDPR with detailed audit trails.
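To make those field groups concrete, the sketch below models one exported application request record as a TypeScript type. The property names are illustrative stand-ins for the groups described above, not the exact variable names Data Stream emits.

```typescript
// Illustrative model of one exported record for an application request.
// Property names are stand-ins for the field groups described above, not
// the exact variable names emitted by Data Stream.
interface HttpRequestRecord {
  // Request data
  timestamp: string;        // precise event time, e.g. "2024-05-01T12:00:00Z"
  clientIp: string;
  httpMethod: string;
  uri: string;
  userAgent: string;
  // Response data
  status: number;
  bytesSent: number;
  cacheStatus: string;
  // Performance metrics
  upstreamResponseTimeMs: number;
  requestTimeMs: number;
  // Correlation and filtering
  host: string;             // enables filtering by domain
  sessionId?: string;
}
```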


Access to all features.

$300 free credits

Modernize your Application Security