
2024

Miro Drives Hypergrowth with Amazon EKS, Reducing Costs by 80%

Discover how Amazon EKS helped Miro, the visual workspace for innovation, scale its infrastructure and cut time to market during its hypergrowth phase

80%

reduction in compute cost compared to self-managed clusters

60–70%

reduction in developer overhead from maintenance tasks

70 million users

 supported worldwide with continuous innovation

Minutes to deploy

rather than weeks, boosting deployment velocity

Scales seamlessly

from hundreds to thousands of nodes on demand

Overview

After a period of rapid growth, Miro, the innovation workspace, needed to drastically reduce the operational overhead for its engineering team to meet customer demand for new features. The company was hosted in the cloud on Amazon Web Services (AWS), but it was managing its own clusters, and its engineers were spending 60–70 percent of their time on maintenance.

So Miro began splitting its monolithic software into containerized microservices. Using Amazon Elastic Kubernetes Service (Amazon EKS), a managed service to run Kubernetes on AWS, Miro engineers were able to refocus on innovating and building new features. Miro also created self-service infrastructure for its developers, decreased time to market, increased scalability, and cut compute costs by 80 percent.

Visual Collaboration Platform

Opportunity | Using Amazon EKS to Reduce Maintenance and Manage Growth for Miro

Miro provides an online, visual workspace for teams to collaborate throughout the innovation process. The workspace offers tools for brainstorming, analyzing, planning, diagramming, and task management, and these capabilities are enhanced by generative artificial intelligence. Supporting over 70 million users and 200,000 organizations globally, Miro has been at the forefront of innovation and teamwork across various industries since 2011.

Although Miro has always been hosted on AWS, it initially managed its infrastructure in-house. But as remote work surged in 2020, Miro’s user base expanded dramatically. The in-house approach became unsustainable as growth accelerated, quickly driving up overhead costs. Engineers at Miro found themselves dedicating 60–70 percent of their time to maintenance tasks, diverting essential resources away from new feature development. Furthermore, self-managing the infrastructure led to recurring errors in resource allocation, among other critical challenges, increasing the pressure on the company’s operations. “We had to find a way of scaling our products, and we needed a more robust solution to support our growth,” says Rodrigo Fior Kuntzer, staff software engineer and infrastructure engineer at Miro.

Miro pivoted to a microservices architecture for more flexibility, agility, and ownership for developers. Miro first used self-managed Docker containers on Amazon Elastic Compute Cloud (Amazon EC2), which offers secure and resizable compute capacity for virtually any workload. This initial attempt at a microservices architecture was difficult to scale, so Miro then adopted Kubernetes, the open-source container orchestration system. Miro also shifted to managed services to free up its infrastructure team by migrating to Amazon EKS. “Amazon EKS was the natural choice for us as a cloud offering so that we didn’t have to manage our own infrastructure,” says Ilia Medvetchii, senior engineering manager at Miro. “This was part of an overall paradigm shift for the company to adopt more managed AWS offerings so that we could hand over the complexity of day-to-day operations to AWS and spend our time helping our customers.”
 


“All the features that Amazon EKS provides, including the managed control plane, made it superior compared with managing our own Kubernetes clusters.”

Ilia Medvetchii
Senior Engineering Manager, Miro

Solution | Reducing Time to Market from Weeks to Minutes at 80 Percent Lower Cost

Using Amazon EKS, Miro accelerated time to market for the new, innovative features in its Intelligent Canvas, a significant product advancement launched in 2024. Miro now performs upgrades, deploys applications, and manages changes in minutes instead of weeks. The team uses out-of-the-box management features within Amazon EKS, such as automated cluster provisioning and upgrades, to quickly and easily spin up clusters and deploy to new regions, managing cluster lifecycles using infrastructure as code. Additionally, Miro takes advantage of built-in monitoring and metrics in Amazon EKS. “All the features that Amazon EKS provides, including the managed control plane, made it superior compared with managing our own Kubernetes clusters,” says Medvetchii.
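The story does not name the infrastructure-as-code tooling Miro uses, so the following is only a minimal sketch of the pattern described above, written with the AWS CDK in TypeScript: a managed Amazon EKS control plane plus a small managed node group, defined entirely in code so new clusters and regions can be stamped out repeatably. The construct names, Kubernetes version, and instance sizes are illustrative assumptions, not Miro's configuration.

// Minimal AWS CDK sketch of an EKS cluster managed as infrastructure as code.
// All names and sizes below are hypothetical.
import { App, Stack, StackProps } from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as eks from 'aws-cdk-lib/aws-eks';
import { KubectlV29Layer } from '@aws-cdk/lambda-layer-kubectl-v29';
import { Construct } from 'constructs';

class WorkloadClusterStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Network spanning three Availability Zones, matching the multi-AZ layout
    // described later in this story.
    const vpc = new ec2.Vpc(this, 'ClusterVpc', { maxAzs: 3 });

    // Managed control plane: AWS operates, patches, and scales the Kubernetes
    // masters. The Kubernetes version and kubectl layer are examples.
    const cluster = new eks.Cluster(this, 'WorkloadCluster', {
      vpc,
      version: eks.KubernetesVersion.V1_29,
      kubectlLayer: new KubectlV29Layer(this, 'KubectlLayer'),
      defaultCapacity: 0, // worker capacity is added explicitly below
    });

    // Small managed node group to host cluster-critical add-ons such as Karpenter.
    cluster.addNodegroupCapacity('system-nodes', {
      instanceTypes: [new ec2.InstanceType('m6i.large')],
      minSize: 2,
      maxSize: 4,
    });
  }
}

const app = new App();
new WorkloadClusterStack(app, 'WorkloadClusterStack');

Because the whole cluster definition lives in code, deploying to another region is a matter of synthesizing the same stack with a different environment, which is what turns weeks of manual cluster work into minutes of pipeline time.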

By using Amazon EKS to manage its clusters, Miro decreased its operational overhead significantly. Amazon EKS simplifies the management of the Kubernetes control plane by ensuring it is always operational, up to date, and automatically scaled to meet Miro’s needs. “Using the managed control plane that Amazon EKS offers, our small team can focus on scaling, building new features, and providing developers with self-service,” Medvetchii adds.

Miro has created self-service options for developers to deploy infrastructure using Amazon EKS. To set up standardized deployments, the team implemented tools such as Karpenter, an open-source Kubernetes node autoscaler, and Kyverno, a policy engine designed for Kubernetes. The infrastructure team uses these tools to preset aspects like compliance, Domain Name System (DNS) registration, secrets management, and policies, so developers can select the correct instance and resource types for their workloads. “Now developers can create new microservices on the fly without having to request help or approval from a centralized team,” says Fior Kuntzer. The governance that Miro put in place not only facilitated self-service but also standardized microservice architecture and resource deployment strategies, simplifying troubleshooting and the onboarding of new team members.
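As an illustration of the kind of guardrail this enables, the sketch below continues the CDK example above (cluster is the eks.Cluster object defined there) and applies a Kyverno ClusterPolicy that rejects Deployments without an owning-team label. The specific rule and the miro.example/team label key are hypothetical; Miro's actual compliance, DNS, and secrets policies are not detailed in the story.

// Hypothetical Kyverno guardrail: every Deployment must declare an owning team.
cluster.addManifest('require-team-label', {
  apiVersion: 'kyverno.io/v1',
  kind: 'ClusterPolicy',
  metadata: { name: 'require-team-label' },
  spec: {
    validationFailureAction: 'Enforce',
    rules: [
      {
        name: 'check-team-label',
        match: { any: [{ resources: { kinds: ['Deployment'] } }] },
        validate: {
          message: 'Every Deployment must carry a miro.example/team label.',
          pattern: {
            metadata: { labels: { 'miro.example/team': '?*' } }, // '?*' = any non-empty value
          },
        },
      },
    ],
  },
});

With policies like this enforced cluster-wide, a developer can ship a new microservice without a central review, while the platform team still gets consistent metadata for troubleshooting and onboarding.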

To build a scalable and highly available Kubernetes cluster, Miro uses Amazon EKS with a managed node group that hosts Karpenter across three Availability Zones. Karpenter simplifies Kubernetes infrastructure management by launching right-sized nodes based on workload requirements. To optimize performance and efficiency, Miro employs open-source components like KEDA, a Kubernetes-based event-driven autoscaler that scales pods to match workload demand. The compute infrastructure integrates with several AWS services, including AWS Secrets Manager, Amazon ECR, and Elastic Load Balancing (ELB). For secure operations, Miro uses IAM roles for service accounts to manage credentials for workloads, allowing them to make authorized API calls to AWS services. To achieve high compute flexibility, Miro uses Karpenter NodePools and dynamically configures Amazon EC2 Spot Instances, Graviton instances, or x86 instances based on specific workload requirements.
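A rough sketch of two of these pieces, again extending the CDK example above: an IAM role for a service account, so pods receive scoped AWS credentials without long-lived keys, and a Karpenter NodePool that lets Karpenter choose among Spot, On-Demand, Graviton (arm64), and x86 (amd64) capacity. The service name, secret ARN pattern, and NodePool limits are assumptions, and the Karpenter API version shown depends on the installed release.

import * as iam from 'aws-cdk-lib/aws-iam';

// IAM role for a service account (IRSA): pods running as this service account
// can read only the secrets they need. Name and ARN pattern are hypothetical.
const boardSa = cluster.addServiceAccount('board-service-sa', {
  name: 'board-service',
  namespace: 'default',
});
boardSa.role.addToPrincipalPolicy(new iam.PolicyStatement({
  actions: ['secretsmanager:GetSecretValue'],
  resources: ['arn:aws:secretsmanager:*:*:secret:board-service/*'],
}));

// Karpenter NodePool mixing purchase options and CPU architectures, so Karpenter
// can launch whatever right-sized capacity is cheapest and available.
cluster.addManifest('general-purpose-nodepool', {
  apiVersion: 'karpenter.sh/v1beta1', // use the version matching your Karpenter release
  kind: 'NodePool',
  metadata: { name: 'general-purpose' },
  spec: {
    template: {
      spec: {
        requirements: [
          { key: 'karpenter.sh/capacity-type', operator: 'In', values: ['spot', 'on-demand'] },
          { key: 'kubernetes.io/arch', operator: 'In', values: ['arm64', 'amd64'] },
        ],
        nodeClassRef: { apiVersion: 'karpenter.k8s.aws/v1beta1', kind: 'EC2NodeClass', name: 'default' },
      },
    },
    limits: { cpu: '1000' }, // illustrative ceiling on total provisioned vCPU
  },
});

Because the requirements list both capacity types and both architectures, Karpenter is free to pick Spot Graviton instances when they are available and fall back to On-Demand x86 capacity when they are not.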

Miro also achieves greater scalability and cost optimization using Amazon EKS. The company uses automatic scaling tools and features, including Karpenter and KEDA, to absorb traffic spikes, scaling from a few hundred nodes overnight to thousands during the day. “The ability to scale up and down according to usage helps us reduce costs and be more efficient in how we use our compute,” says Medvetchii. By transitioning to Kubernetes on Amazon EKS from its previous, self-managed container infrastructure, Miro reduced costs by 80 percent. The company further reduced costs by 70 percent for nonproduction workloads and 50 percent for production workloads by using Karpenter to automatically launch the appropriate Amazon EC2 instances, including Amazon EC2 Spot Instances, which offer discounts of up to 90 percent compared with Amazon EC2 On-Demand prices.
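The day-and-night scaling pattern described here could look roughly like the KEDA ScaledObject below, still applied through the CDK cluster object from the earlier sketch. The target Deployment name, replica bounds, CPU threshold, and working-hours window are all illustrative assumptions rather than Miro's real configuration.

cluster.addManifest('board-service-scaler', {
  apiVersion: 'keda.sh/v1alpha1',
  kind: 'ScaledObject',
  metadata: { name: 'board-service', namespace: 'default' },
  spec: {
    scaleTargetRef: { name: 'board-service' }, // hypothetical Deployment
    minReplicaCount: 2,   // overnight floor
    maxReplicaCount: 200, // daytime ceiling
    triggers: [
      // Scale out as CPU utilization climbs with daytime traffic...
      { type: 'cpu', metricType: 'Utilization', metadata: { value: '60' } },
      // ...and keep a larger baseline of replicas warm during working hours.
      {
        type: 'cron',
        metadata: {
          timezone: 'Europe/Amsterdam',
          start: '0 7 * * *',
          end: '0 20 * * *',
          desiredReplicas: '40',
        },
      },
    ],
  },
});

As KEDA adds and removes pods, Karpenter adds and removes the nodes underneath them, which is how a cluster can drift from a few hundred nodes overnight to thousands at peak without manual intervention.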

Architecture Diagram

Figure 1: Diagram of Amazon EKS infrastructure at Miro

Outcome | Accelerating Global Expansion on Amazon EKS

Miro runs 15–20 Kubernetes clusters on Amazon EKS, and it expects to continue growing. Strategically, the company aims to grow its capability to support a much larger, global fleet using additional tools such as Knative, an open-source framework for building and running serverless workloads on Kubernetes.
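For a sense of what that direction looks like, a Knative Service collapses the Deployment, autoscaling, and routing pieces into a single resource that can scale to zero when idle. The sketch below, continuing the CDK example, deploys Knative's public helloworld-go sample image; the service name, scale bounds, and namespace are assumptions, and it presumes Knative Serving is already installed on the cluster.

cluster.addManifest('hello-knative-service', {
  apiVersion: 'serving.knative.dev/v1',
  kind: 'Service',
  metadata: { name: 'hello', namespace: 'default' },
  spec: {
    template: {
      metadata: {
        annotations: {
          // Idle revisions scale to zero; these bounds are illustrative.
          'autoscaling.knative.dev/min-scale': '0',
          'autoscaling.knative.dev/max-scale': '50',
        },
      },
      spec: {
        containers: [
          {
            image: 'ghcr.io/knative/helloworld-go:latest', // Knative sample image
            env: [{ name: 'TARGET', value: 'Miro' }],
          },
        ],
      },
    },
  },
});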

“By adopting Kubernetes and Amazon EKS, we are no longer overwhelmed with the details of provisioning infrastructure for teams,” says Fior Kuntzer. “We can celebrate when a new microservice is launched without spending hours supporting the team that was launching it.”

About Miro

Miro is the innovation workspace that helps distributed teams of any size build the next big thing. The platform’s infinite canvas, which serves more than 70 million users worldwide, helps teams lead engaging workshops and meetings, brainstorm, design and deliver products, and more.

Miro's Team

Ilia Medvetchii

Marina Cherkalina

Rodrigo Fior Kuntzer

AWS Services Used

Amazon EKS

Amazon Elastic Kubernetes Service (Amazon EKS) is a managed Kubernetes service to run Kubernetes in the AWS Cloud and on-premises data centers.


Amazon ECR

Amazon Elastic Container Registry (Amazon ECR) is a fully managed container registry offering high-performance hosting, so you can reliably deploy application images and artifacts anywhere.


Amazon EC2

Amazon Elastic Compute Cloud (Amazon EC2) offers the broadest and deepest compute platform, with over 750 instances and choice of the latest processor, storage, networking, operating system, and purchase model to help you best match the needs of your workload.


AWS Secrets Manager

AWS Secrets Manager helps you protect secrets needed to access your applications, services, and IT resources.



Get Started

Organizations of all sizes across all industries are transforming their businesses and delivering on their missions every day using AWS. Contact our experts and start your own AWS journey today.