AWS Database Blog

Category: Intermediate (200)

How Aqua Security exports query data from HAQM Aurora to deliver value to their customers at scale

Aqua Security is a pioneer in securing containerized cloud native applications from development to production. Like many organizations, Aqua faced the challenge of efficiently exporting and analyzing large volumes of data to meet their business requirements. Specifically, Aqua needed to export and query data at scale to share with their customers for continuous monitoring and security analysis. In this post, we explore how Aqua addressed this challenge by using the aws_s3.query_export_to_s3 function with HAQM Aurora PostgreSQL-Compatible Edition and AWS Step Functions to streamline their query output export process, enabling scalable and cost-effective data analysis.
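At its core, aws_s3.query_export_to_s3 takes a SQL query and an S3 destination and has Aurora write the query's results directly to the bucket. The minimal sketch below composes such an export statement in Python; the table, bucket, prefix, and region are illustrative placeholders, not values from Aqua's deployment.

```python
# Hypothetical sketch of composing an aws_s3.query_export_to_s3 call
# for Aurora PostgreSQL. All object names here are placeholders.

def build_export_statement(query: str, bucket: str, prefix: str, region: str) -> str:
    """Compose the SQL that asks Aurora to stream a query's results to S3 as CSV."""
    escaped = query.replace("'", "''")  # escape single quotes for the SQL string literal
    return (
        "SELECT * FROM aws_s3.query_export_to_s3("
        f"'{escaped}', "
        f"aws_commons.create_s3_uri('{bucket}', '{prefix}', '{region}'), "
        "options := 'format csv')"
    )

statement = build_export_statement(
    "SELECT id, finding, severity FROM scan_results WHERE severity >= 7",
    bucket="example-export-bucket",
    prefix="exports/scan_results.csv",
    region="us-east-1",
)
print(statement)

# Run against the cluster (for example via psycopg2), Aurora writes the rows
# straight to S3 and reports rows_uploaded, files_uploaded, and bytes_uploaded.
```

In a pipeline like the one described, a Step Functions state would typically issue this statement per export job, so the database does the heavy lifting of serialization and upload rather than the application tier.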

Monitor the health of HAQM Aurora PostgreSQL instances in large-scale deployments

In this post, we show you how to achieve better visibility into the health of your HAQM Aurora PostgreSQL instances, proactively address potential issues, and maintain the smooth operation of your database infrastructure. The solution is designed to scale with your deployment, providing robust and reliable monitoring for even the largest fleets of instances.

How Iterate.ai uses HAQM MemoryDB to accelerate and cost-optimize their workforce management conversational AI agent

Iterate.ai is an enterprise AI platform company delivering innovative AI solutions to industries such as retail, finance, healthcare, and quick-service restaurants. Among its standout offerings is Frontline, an AI-powered workforce management platform designed to support and empower frontline workers. Available on both the Apple App Store and Google Play, Frontline uses advanced AI tools to streamline operational efficiency and enhance communication among dispersed workforces. In this post, we give an overview of durable semantic caching in HAQM MemoryDB, and share how Iterate used this functionality to accelerate and cost-optimize Frontline.

Diving deep into the new HAQM Aurora Global Database writer endpoint

On October 22, 2024, we announced the availability of the Aurora Global Database writer endpoint, a highly available and fully managed endpoint for your global database. Aurora automatically updates this endpoint to point to the current writer instance in your global cluster after a cross-Region switchover or failover, alleviating the need for application changes and simplifying routing requests to the writer instance. In this post, we dive deep into the new Global Database writer endpoint, covering its benefits and key considerations for using it with your applications.

Use HAQM Neptune Analytics to analyze relationships in your data faster, Part 1: Introducing Parquet and CSV import and export

In this two-part series, we show how you can import and export using Parquet and CSV to quickly gather insights from your existing graph data. Part 1 introduces the import and export functionalities, and walks you through how to quickly get started with them. In Part 2, we show how you can use the new data mobility improvements in Neptune Analytics to enhance fraud detection.

How Skello uses AWS DMS to synchronize data from a monolithic application to microservices

Skello is a human resources (HR) software-as-a-service (SaaS) platform that focuses on employee scheduling and workforce management. It caters to various sectors, including hospitality, retail, healthcare, construction, and industry. In this post, we show how Skello uses AWS Database Migration Service (AWS DMS) to synchronize data from a monolithic architecture to microservices and to ingest data from both the monolithic architecture and the microservices into their data lake.

How Orca Security optimized their HAQM Neptune database performance

Orca Security, an AWS Partner, is an independent cybersecurity software provider whose patented agentless-first cloud security platform is trusted by hundreds of enterprises globally. At Orca Security, we use a variety of metrics to assess the significance of security alerts on cloud assets. Our HAQM Neptune database plays a critical role in calculating the exposure of individual assets within a customer’s cloud environment. By building a graph that maps assets and their connectivity to one another and to the broader internet, the Orca Cloud Security Platform can evaluate both how an asset is exposed and how an attacker could potentially move laterally within an account. In this post, we explore some of the key strategies we’ve adopted to maximize the performance of our HAQM Neptune database.

Vacasa’s migration to HAQM Aurora for a more efficient Property Management System

Vacasa is North America’s leading vacation rental management platform, revolutionizing the rental experience with advanced technology and expert teams. In the competitive short-term vacation property management industry, efficient systems are critical. To maintain its edge and continue providing top-notch service, Vacasa needed to modernize its primary transactional database to improve performance, provide high availability, and reduce costs. In this post, we share Vacasa’s journey from HAQM Relational Database Service (HAQM RDS) for MariaDB to HAQM RDS for MySQL, and finally to HAQM Aurora, highlighting the technical steps taken and the outcomes achieved.

Enhance the reliability of airlines’ mission-critical baggage handling using HAQM DynamoDB

In the world of air travel, baggage handling isn’t just about keeping track of baggage; it’s a seamless orchestration of different processes to improve the passenger baggage experience. A key component to making this happen is a strong database management strategy. In this post, we discuss how AWS Partner IBM Consulting developed an initiative to modernize a traditional baggage database architecture using HAQM DynamoDB and other HAQM Web Services (AWS) managed services, addressing the evolving needs of the airline industry.

Transition from AWS DMS to zero-ETL to simplify real-time data integration with HAQM Redshift

Zero-ETL integrations for HAQM Redshift are designed to automate data movement into HAQM Redshift, eliminating the need for traditional ETL pipelines. With zero-ETL integrations, you can reduce operational overhead, lower costs, and accelerate your data-driven initiatives, letting organizations focus more on deriving actionable insights and less on managing the complexities of data integration. In this post, we discuss best practices for migrating your ETL pipeline from AWS DMS to zero-ETL integrations for HAQM Redshift.