HAQM Data Firehose now delivers real-time streaming data into HAQM S3 Tables

Posted on: Mar 14, 2025

Today, we are excited to announce the general availability of the HAQM Data Firehose (Firehose) integration with HAQM S3 Tables, a feature that enables customers to deliver real-time streaming data into HAQM S3 Tables without writing code or building multi-step pipelines.

Firehose can acquire streaming data from HAQM Kinesis Data Streams, HAQM MSK, the Direct PUT API, and AWS services such as AWS WAF web ACL logs and HAQM VPC Flow Logs. It can then deliver this data to destinations like HAQM S3, HAQM Redshift, OpenSearch, Splunk, Snowflake, and others for analytics. Now, with the HAQM S3 Tables integration, customers can stream data from any of these sources directly into HAQM S3 Tables. As a serverless service, Firehose allows customers to simply set up a stream by configuring the source and destination properties, and pay based on bytes processed.
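As a sketch of what "configuring the source and destination properties" looks like programmatically, the snippet below assembles a request for a Direct PUT stream that delivers into a table catalog. It is a minimal illustration, assuming the boto3 `create_delivery_stream` call and its Apache Iceberg destination shape (which S3 Tables delivery builds on); the stream name, ARNs, database, and table names are hypothetical placeholders.

```python
import json

def build_s3_tables_stream_config(stream_name, catalog_arn, role_arn, backup_bucket_arn):
    """Assemble a create_delivery_stream request body (shape is an assumption
    based on Firehose's Apache Iceberg destination; verify against the docs)."""
    return {
        "DeliveryStreamName": stream_name,
        "DeliveryStreamType": "DirectPut",  # source: Direct PUT API
        "IcebergDestinationConfiguration": {
            "CatalogConfiguration": {"CatalogARN": catalog_arn},
            "RoleARN": role_arn,
            # Default table for records that carry no routing metadata.
            "DestinationTableConfigurationList": [
                {
                    "DestinationDatabaseName": "analytics",   # hypothetical
                    "DestinationTableName": "clickstream",    # hypothetical
                }
            ],
            # Backup S3 bucket for records Firehose could not deliver.
            "S3Configuration": {
                "RoleARN": role_arn,
                "BucketARN": backup_bucket_arn,
            },
        },
    }

config = build_s3_tables_stream_config(
    "clickstream-to-s3-tables",
    "arn:aws:glue:us-east-1:111122223333:catalog",
    "arn:aws:iam::111122223333:role/FirehoseS3TablesRole",
    "arn:aws:s3:::firehose-error-backup",
)
# A boto3 client would then create the stream:
# boto3.client("firehose").create_delivery_stream(**config)
print(json.dumps(config, indent=2))
```

Because Firehose is serverless, this one request is the whole setup: there is no cluster to size or pipeline code to deploy, and billing follows bytes processed.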

The new feature also enables customers to route records in a data stream to different HAQM S3 tables based on the content of each incoming record. Additionally, customers can automate data-correction and right-to-forget workflows by applying row-level update or delete operations to the destination S3 tables.
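The routing and row-level operations described above can be expressed per record. The sketch below builds Direct PUT records whose JSON carries a metadata envelope naming the target database, table, and operation; the `otfMetadata` field names follow the pattern Firehose documents for its Apache Iceberg table delivery, but treat them as an assumption to verify, and the stream, database, and table names are hypothetical.

```python
import json

def make_record(database, table, operation, payload):
    """Build one Direct PUT record: routing metadata plus the row itself.
    The metadata envelope is an assumption based on Firehose's Iceberg delivery."""
    body = {
        "otfMetadata": {
            "destinationDatabaseName": database,
            "destinationTableName": table,
            "operation": operation,  # "insert", "update", or "delete"
        },
    }
    body.update(payload)
    return {"Data": (json.dumps(body) + "\n").encode("utf-8")}

# Route a data correction to one table and a right-to-forget
# delete to another, from the same stream of records.
records = [
    make_record("analytics", "orders", "update",
                {"order_id": 1001, "status": "refunded"}),
    make_record("analytics", "customers", "delete",
                {"customer_id": 42}),
]
# A boto3 client would then send the batch:
# boto3.client("firehose").put_record_batch(
#     DeliveryStreamName="clickstream-to-s3-tables", Records=records)
for record in records:
    print(record["Data"].decode("utf-8").strip())
```

Because the target table and operation travel with each record, one stream can fan records out to many tables and mix inserts, corrections, and deletions without separate pipelines.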

To get started, visit the HAQM Data Firehose documentation and console.