Logs from the Web Application Firewall (WAF) can be integrated with SIEM platforms through Data Stream to monitor your applications' behavior, performance, and security.
Go to the Data Stream reference for more details.

- Access Azion Console > Data Stream.
- Click + Stream.
- Give your stream a unique and easy-to-remember name.
- On the Source dropdown menu, select Applications.
- On the Template dropdown menu, select Applications + WAF Event Collector.
- On Option, select between Filter Domains or All Current and Future Domains.
- Find more information about each option in How to associate domains on Data Stream.
- On the Destination section, select a Connector on the dropdown menu: Standard HTTP/HTTPS POST, Apache Kafka, Simple Storage Service (S3), Google BigQuery, Elasticsearch, Splunk, AWS Kinesis Data Firehose, Datadog, IBM QRadar, Azure Monitor, or Azure Blob Storage.
- You'll see different fields depending on the connector you choose. Find more information about each of them in the specific endpoint guide in the Observe guides section.
- Click the Save button.
- Run the following POST request, replacing [TOKEN VALUE] with your personal token:

```bash
curl --request POST \
  --url https://api.azion.com/v4/data_stream/streams \
  --header 'Accept: application/json' \
  --header 'Authorization: Token [TOKEN VALUE]' \
  --header 'Content-Type: application/json' \
  --data '{
    "name": "Conector Kafka",
    "active": true,
    "inputs": [
      { "type": "raw_logs", "attributes": { "data_source": "http" } }
    ],
    "transform": [
      { "type": "sampling", "attributes": { "rate": 100 } }
    ],
    "outputs": [
      {
        "type": "kafka",
        "attributes": {
          "bootstrap_servers": "infra.my.net:9094,infra.my.net:9094",
          "topic": "mykafka.dts.topic"
        }
      }
    ]
  }'
```

Wait a few minutes for the changes to propagate and your stream will be created.
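If you'd rather script the same call, here is a minimal Python sketch using only the standard library. The payload fields mirror the curl request above; `build_stream_payload` and `create_stream` are illustrative helper names, not part of any Azion SDK, and the server list and topic are placeholder values you should replace with your own.

```python
import json
import urllib.request

# Azion Data Stream API endpoint, as shown in the curl example above.
API_URL = "https://api.azion.com/v4/data_stream/streams"

def build_stream_payload(name, bootstrap_servers, topic, sampling_rate=100):
    """Build the JSON body for the stream-creation request."""
    return {
        "name": name,
        "active": True,
        # Collect raw HTTP logs as the stream input.
        "inputs": [{"type": "raw_logs", "attributes": {"data_source": "http"}}],
        # Sample 100% of events by default.
        "transform": [{"type": "sampling", "attributes": {"rate": sampling_rate}}],
        # Deliver to an Apache Kafka connector.
        "outputs": [
            {
                "type": "kafka",
                "attributes": {
                    "bootstrap_servers": bootstrap_servers,
                    "topic": topic,
                },
            }
        ],
    }

def create_stream(token, payload):
    """POST the payload with a personal token; returns the decoded JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Accept": "application/json",
            "Content-Type": "application/json",
            "Authorization": f"Token {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (replace the placeholders before running):
# payload = build_stream_payload("Conector Kafka",
#                                "infra.my.net:9094", "mykafka.dts.topic")
# create_stream("[TOKEN VALUE]", payload)
```

As with the curl version, allow a few minutes for the changes to propagate after the request succeeds.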