AWS Big Data Blog
Category: HAQM Redshift
HAQM SageMaker Lakehouse now supports attribute-based access control
HAQM SageMaker Lakehouse now supports attribute-based access control (ABAC) with AWS Lake Formation, using AWS Identity and Access Management (IAM) principals and session tags to simplify data access, grant creation, and maintenance. In this post, we demonstrate how to get started with SageMaker Lakehouse with ABAC.
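Conceptually, ABAC grants reference session tags that are attached when a principal assumes its IAM role, so the same grant can cover many users. The following is a minimal sketch of attaching session tags at role assumption with `sts:AssumeRole`; the role ARN, session name, and tag keys are hypothetical placeholders, and the corresponding Lake Formation ABAC grant conditions would reference the same tag keys:

```python
def build_session_tags(attributes: dict) -> list:
    """Convert user attributes (for example, team or region) into the
    Tags structure expected by sts:AssumeRole."""
    return [{"Key": key, "Value": value} for key, value in attributes.items()]


def assume_role_with_tags(role_arn: str, session_name: str, attributes: dict):
    """Assume an IAM role with session tags so that ABAC grants keyed
    on those tags apply to the resulting session."""
    import boto3  # imported here so the helper above stays testable offline

    sts = boto3.client("sts")
    return sts.assume_role(
        RoleArn=role_arn,
        RoleSessionName=session_name,
        Tags=build_session_tags(attributes),
    )


if __name__ == "__main__":
    # Hypothetical attribute values for illustration only.
    print(build_session_tags({"Team": "analytics"}))
```

The session returned by `assume_role` carries the tags for its lifetime, so grant maintenance moves from per-user permissions to managing the attribute values themselves.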
Accelerate your analytics with HAQM S3 Tables and HAQM SageMaker Lakehouse
HAQM SageMaker Lakehouse is a unified, open, and secure data lakehouse that now seamlessly integrates with HAQM S3 Tables, the first cloud object store with built-in Apache Iceberg support. In this post, we show you how to use various analytics services through the integration of SageMaker Lakehouse with S3 Tables.
Integrate ThoughtSpot with HAQM Redshift using AWS IAM Identity Center
In this post, we walk you through the process of setting up ThoughtSpot integration with HAQM Redshift using IAM Identity Center authentication. The solution provides a secure, streamlined analytics environment that empowers your team to focus on what matters most: discovering and sharing valuable business insights.
Enhance Agentforce data security with Private Connect for Salesforce Data Cloud and HAQM Redshift – Part 3
In this post, we discuss how to create AWS endpoint services to improve data security with Private Connect for Salesforce Data Cloud.
Using HAQM S3 Tables with HAQM Redshift to query Apache Iceberg tables
In this post, we demonstrate how to get started with S3 Tables and HAQM Redshift Serverless for querying data in Iceberg tables. We show how to set up S3 Tables, load data, register them in the unified data lake catalog, set up basic access controls in SageMaker Lakehouse through AWS Lake Formation, and query the data using HAQM Redshift.
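As a rough illustration of the final querying step, the sketch below submits a SELECT against an Iceberg table through the HAQM Redshift Data API, reaching the table via three-part naming into the registered catalog. The workgroup, catalog, schema, and table names are placeholders, not values from the post:

```python
def build_iceberg_query(catalog: str, schema: str, table: str, limit: int = 10) -> str:
    """Build a SELECT that uses three-part naming to reach an Iceberg
    table registered in the lakehouse catalog."""
    return f'SELECT * FROM "{catalog}"."{schema}"."{table}" LIMIT {limit};'


def run_query(sql: str, workgroup: str, database: str) -> str:
    """Submit the statement to Redshift Serverless via the Data API and
    return the statement ID; poll describe_statement for completion."""
    import boto3  # imported here so build_iceberg_query stays testable offline

    client = boto3.client("redshift-data")
    response = client.execute_statement(
        WorkgroupName=workgroup,
        Database=database,
        Sql=sql,
    )
    return response["Id"]


if __name__ == "__main__":
    # Placeholder identifiers for illustration only.
    print(build_iceberg_query("s3tablescatalog/my-table-bucket", "sales", "orders"))
```

Because access is governed in SageMaker Lakehouse through Lake Formation, the query succeeds only for principals granted permission on the registered table.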
Unlock the power of optimization in HAQM Redshift Serverless
In this post, we demonstrate how HAQM Redshift Serverless AI-driven scaling and optimization impacts performance and cost across different optimization profiles.
Build a secure data visualization application using the HAQM Redshift Data API with AWS IAM Identity Center
In this post, we dive into the newly released HAQM Redshift Data API support for single sign-on (SSO), HAQM Redshift role-based access control (RBAC) for row-level security (RLS) and column-level security (CLS), and trusted identity propagation with AWS IAM Identity Center, which lets corporate identities connect to AWS services securely. We demonstrate how to integrate these services to build a data visualization application using Streamlit, providing secure, role-based access that simplifies user management while helping your organization make data-driven decisions with enhanced security and ease.
HAQM Redshift announces history mode for zero-ETL integrations to simplify historical data tracking and analysis
This post explores a brief history of zero-ETL, why it matters for customers, and introduces an exciting new feature: history mode for HAQM Aurora PostgreSQL-Compatible Edition, HAQM Aurora MySQL-Compatible Edition, HAQM Relational Database Service (HAQM RDS) for MySQL, and HAQM DynamoDB zero-ETL integrations with HAQM Redshift.
HAQM Redshift Serverless adds higher base capacity of up to 1024 RPUs
In this post, we explore the new higher base capacity of 1024 RPUs in Redshift Serverless, which doubles the previous maximum of 512 RPUs. This enhancement helps you achieve high performance for workloads with highly complex queries and write-intensive operations, including concurrent data ingestion and transformation tasks that require high throughput and low latency.
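For context, base capacity is a per-workgroup setting, so adopting the higher maximum is a single configuration change. A minimal sketch, assuming the boto3 `redshift-serverless` client's `update_workgroup` call and a hypothetical workgroup name:

```python
def build_update_request(workgroup_name: str, base_capacity: int) -> dict:
    """Build the keyword arguments for updating a workgroup's base
    capacity (measured in RPUs)."""
    if base_capacity <= 0:
        raise ValueError("base capacity must be a positive RPU value")
    return {"workgroupName": workgroup_name, "baseCapacity": base_capacity}


def set_base_capacity(workgroup_name: str, base_capacity: int):
    """Apply the new base capacity via the redshift-serverless API."""
    import boto3  # imported here so build_update_request stays testable offline

    client = boto3.client("redshift-serverless")
    return client.update_workgroup(**build_update_request(workgroup_name, base_capacity))


if __name__ == "__main__":
    # Hypothetical workgroup name for illustration only.
    print(build_update_request("my-workgroup", 1024))
```
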
How Open Universities Australia modernized their data platform and significantly reduced their ETL costs with AWS Cloud Development Kit and AWS Step Functions
At Open Universities Australia (OUA), we empower students to explore a vast array of degrees from renowned Australian universities, all delivered through online learning. In this post, we show you how we used AWS services to replace our existing third-party ETL tool, improving the team’s productivity and significantly reducing our ETL operational costs.