Building a sophisticated CodePipeline with CDK in a Monorepo Setup

The video discusses using AWS CDK to create CI/CD pipelines in a monorepo setup. It explains how to set up a CodePipeline with stages and actions, using CodeCommit as the source. The video covers deploying to multiple AWS accounts and managing frontend and backend deployments with separate pipelines. It also describes how to use EventBridge and AWS Lambda to trigger pipelines based on code changes. The talk highlights the use of environment variables to control pipeline triggers and the role of manual approval in production stages. Viewers will learn about creating efficient pipelines using CDK's abstraction methods.

From Author:

Many companies are going all-in on AWS and thus adopting its complete CodeSuite for their CI/CD processes. However, while CodePipeline is the platform for this process, it may not be the most user-friendly. In a monorepo setup, it's typical to create a separate CI/CD pipeline for each package. However, there are several caveats to be aware of. For instance, you may encounter scenarios where multiple pipelines get triggered even though you only modified one file, or you may question the need to create multiple branches for each pipeline. In this talk, we provide valuable tips for building a sophisticated CodePipeline using CDK in a monorepo environment. The techniques discussed in this talk are also transferable to other CI/CD tools.

This talk has been presented at DevOps.js Conf 2024. Check out the latest edition of this Tech Conference.

FAQ

The primary goal is to implement a fully automated CI/CD pipeline on AWS for a monorepo containing a backend and frontend, both written in TypeScript.

AWS CDK was chosen because it allows the use of TypeScript, which is the same language used by the developers, and it provides a nice abstraction that developers can easily understand.

EventBridge and AWS Lambda are used to trigger different pipelines based on changes detected in the code repository. EventBridge captures events from AWS services, and Lambda functions are used to determine which pipeline to trigger based on the changes.
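As a rough illustration, here is a minimal CDK sketch of how such a rule and dispatcher Lambda could be wired up inside a stack. The repository name, asset path, and construct IDs are assumptions for the example, not taken from the talk.

```ts
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as codecommit from 'aws-cdk-lib/aws-codecommit';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as targets from 'aws-cdk-lib/aws-events-targets';

export class PipelineTriggerStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Dispatcher Lambda that decides which pipeline to start
    // (see the handler sketch further below).
    const dispatcher = new lambda.Function(this, 'PipelineDispatcher', {
      runtime: lambda.Runtime.NODEJS_20_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('lambda/pipeline-dispatcher'),
    });
    // Note: the function's role also needs codecommit:GetDifferences and
    // codepipeline:StartPipelineExecution permissions.

    // EventBridge rule: every push to main invokes the dispatcher instead of
    // letting each pipeline react to the repository directly.
    const repo = codecommit.Repository.fromRepositoryName(this, 'Monorepo', 'my-monorepo');
    repo.onCommit('OnMainCommit', {
      branches: ['main'],
      target: new targets.LambdaFunction(dispatcher),
    });
  }
}
```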

Each pipeline contains stages, and each stage contains actions. Actions result in outputs called artifacts, which can be stored in an S3 bucket.

CodeCommit is used as the source repository where the monorepo is hosted. It triggers the pipeline whenever changes are made to the main branch.

Deployments to different AWS accounts are handled by looping over all accounts and making use of CDK's abstractions. A method called createPipelineProject is used to simplify this process.

Frontend and backend deployments are managed by creating separate pipelines for each. The appropriate pipeline is triggered based on the changes detected in the respective folders.

The Lambda function uses the CodeCommit GetDifferences command to check which files have changed. It then triggers the appropriate pipeline based on predefined paths in an object that maps file changes to pipelines.
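A minimal sketch of such a dispatcher handler, assuming the EventBridge event detail carries repositoryName, commitId, and oldCommitId, and that the packages live under packages/frontend and packages/backend; the pipeline names and path prefixes are illustrative (pagination of GetDifferences is omitted for brevity).

```ts
import { CodeCommitClient, GetDifferencesCommand } from '@aws-sdk/client-codecommit';
import { CodePipelineClient, StartPipelineExecutionCommand } from '@aws-sdk/client-codepipeline';

const codecommit = new CodeCommitClient({});
const codepipeline = new CodePipelineClient({});

// Maps a folder prefix in the monorepo to the pipeline it should trigger.
const PATH_TO_PIPELINE: Record<string, string> = {
  'packages/frontend/': 'frontend-pipeline',
  'packages/backend/': 'backend-pipeline',
};

export const handler = async (event: any) => {
  const { repositoryName, commitId, oldCommitId } = event.detail;

  // Ask CodeCommit which files changed between the old and new commit.
  const { differences = [] } = await codecommit.send(new GetDifferencesCommand({
    repositoryName,
    beforeCommitSpecifier: oldCommitId,
    afterCommitSpecifier: commitId,
  }));

  // Collect the paths of all changed files (before or after the change).
  const changedPaths = differences.flatMap(d =>
    [d.beforeBlob?.path, d.afterBlob?.path].filter((p): p is string => !!p),
  );

  // Work out which pipelines are affected and start each one exactly once.
  const pipelinesToStart = new Set<string>();
  for (const path of changedPaths) {
    for (const [prefix, pipelineName] of Object.entries(PATH_TO_PIPELINE)) {
      if (path.startsWith(prefix)) pipelinesToStart.add(pipelineName);
    }
  }

  await Promise.all([...pipelinesToStart].map(name =>
    codepipeline.send(new StartPipelineExecutionCommand({ name })),
  ));
};
```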

The project uses AWS CDK, CodeCommit, CodePipeline, EventBridge, AWS Lambda, and the JavaScript SDK for CodeCommit and CodePipeline.

The createPipelineProject method abstracts repetitive code, making it easier to create new pipelines.

John Nguyen
8 min
15 Feb, 2024


Video Transcription

1. Introduction to AWS CDK and Pipeline Creation

Short description:

Imagine starting as an AWS DevOps engineer in a small company. Your boss wants an all-in AWS approach, with CI/CD entirely on AWS for automated deployments. Use AWS CDK to create a pipeline, stages, and actions. Import the repository from CodeCommit, then create stages with a CodeCommit source action and a CodeBuild action. Next, deploy to different AWS accounts.

Hi. Imagine you started as an AWS DevOps engineer in a small company and your boss wants you to go all-in on AWS. Of course, the CI/CD should be entirely on AWS for automated deployments. All the infrastructure should be put into code. And we are using a monorepo containing a backend and a frontend, both in TypeScript.

Great. You have been working with GitLab CI or GitHub Actions, where you just need to define a YAML file and the platform creates the pipeline for you. On AWS, it's a bit different: you have to create each resource yourself. So you decide to use AWS CDK, because you can use TypeScript, you can use the same tooling as the developers, you are able to put the code inside their monorepo, and you also get a nice abstraction that the developers can understand.

Let's get your hands dirty. You start by creating a pipeline using the new Pipeline abstraction. Each CodePipeline contains stages, and a stage contains actions. Actions produce outputs called artifacts, and these artifacts can be stored in an S3 bucket. So you import the code from CodeCommit, where the monorepo is hosted, and you create your first stage using addStage. Here you create the CodeCommit source action, which is triggered whenever something happens on the main branch. Then you create a second stage using a CodeBuild action, and in there you use a CodeBuild project. In the CodeBuild project you can define the phases; in this case, I use the build phase, where I run install, linting, and unit tests. The result of this action is then stored in the specified output folder. The next stages are the deployment stages, where you want to deploy to different AWS accounts.
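A rough sketch of what this first part could look like in CDK (assuming aws-cdk-lib v2; the repository, pipeline, and folder names are illustrative, not the speaker's exact code):

```ts
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as codecommit from 'aws-cdk-lib/aws-codecommit';
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as codepipeline_actions from 'aws-cdk-lib/aws-codepipeline-actions';

export class BackendPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Import the existing CodeCommit repository that hosts the monorepo.
    const repo = codecommit.Repository.fromRepositoryName(this, 'MonorepoRepo', 'my-monorepo');

    const sourceOutput = new codepipeline.Artifact('SourceOutput');
    const buildOutput = new codepipeline.Artifact('BuildOutput');

    const pipeline = new codepipeline.Pipeline(this, 'BackendPipeline', {
      pipelineName: 'backend-pipeline',
    });

    // Stage 1: source action, triggered by changes on the main branch.
    pipeline.addStage({
      stageName: 'Source',
      actions: [
        new codepipeline_actions.CodeCommitSourceAction({
          actionName: 'CodeCommit',
          repository: repo,
          branch: 'main',
          output: sourceOutput,
        }),
      ],
    });

    // Stage 2: CodeBuild action running install, lint, and unit tests.
    const buildProject = new codebuild.PipelineProject(this, 'BuildProject', {
      buildSpec: codebuild.BuildSpec.fromObject({
        version: '0.2',
        phases: {
          build: {
            commands: ['npm ci', 'npm run lint', 'npm run test'],
          },
        },
        // Store the build result as an artifact from this folder.
        artifacts: { 'base-directory': 'packages/backend', files: ['**/*'] },
      }),
    });

    pipeline.addStage({
      stageName: 'Build',
      actions: [
        new codepipeline_actions.CodeBuildAction({
          actionName: 'LintAndTest',
          project: buildProject,
          input: sourceOutput,
          outputs: [buildOutput],
        }),
      ],
    });
  }
}
```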

2. Handling Multiple Pipelines and Manual Approval

Short description:

Deploy to different accounts using CDK's abstractions and a custom method for pipeline project creation, assuming roles and deploying with CDK. Handle manual approval for staging and production environments. Create separate pipelines for frontend and backend deployments. Use environment variables to control which pipeline is created and triggered.

So you loop over all accounts, and you can also make use of CDK's abstractions. You can create your own method, in this case createPipelineProject. Instead of repeating the same code, you make it easier by abstracting it into your own method and then just defining each phase. First you assume the role, and then you run the CDK deploy command. After the deployment, you can run integration tests; you can also use canary deployments. In the staging and production environments, we have a manual approval. This manual approval action can be merged together with the CodeBuild action, but only if it's not the dev environment; otherwise, we just skip the manual approval action. But each time there's a change on the frontend, the backend will also be deployed. To handle that, you need to create multiple pipelines. Now you have a pipeline for the frontend and another pipeline for the backend. This is what the code could look like: you define multiple pipelines and control which one is created by using environment variables.
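A hedged sketch of that idea, continuing from the previous snippet: the createPipelineProject helper, the accounts array, the role ARNs, and the assume-role commands are illustrative assumptions, not the speaker's exact code.

```ts
import { Construct } from 'constructs';
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as codepipeline_actions from 'aws-cdk-lib/aws-codepipeline-actions';

// Illustrative shape of a target account; values would come from your own config.
interface AccountConfig {
  name: 'dev' | 'staging' | 'prod';
  deployRoleArn: string;
}

// Helper that abstracts the repeated CodeBuild setup: assume a role in the
// target account, then run `cdk deploy` there.
// (The CodeBuild project's role must be allowed to assume deployRoleArn.)
function createPipelineProject(scope: Construct, account: AccountConfig): codebuild.PipelineProject {
  return new codebuild.PipelineProject(scope, `DeployProject-${account.name}`, {
    buildSpec: codebuild.BuildSpec.fromObject({
      version: '0.2',
      phases: {
        build: {
          commands: [
            // Assume the deployment role in the target account...
            `CREDS=$(aws sts assume-role --role-arn ${account.deployRoleArn} --role-session-name deploy)`,
            'export AWS_ACCESS_KEY_ID=$(echo $CREDS | jq -r .Credentials.AccessKeyId)',
            'export AWS_SECRET_ACCESS_KEY=$(echo $CREDS | jq -r .Credentials.SecretAccessKey)',
            'export AWS_SESSION_TOKEN=$(echo $CREDS | jq -r .Credentials.SessionToken)',
            // ...then deploy with CDK; integration tests could follow here.
            'npx cdk deploy --all --require-approval never',
          ],
        },
      },
    }),
  });
}

// Inside the pipeline stack: one deploy stage per account, with a manual
// approval action merged into the same stage for staging and production only.
declare const pipeline: codepipeline.Pipeline;    // from the earlier sketch
declare const buildOutput: codepipeline.Artifact; // from the earlier sketch
declare const stack: Construct;

const accounts: AccountConfig[] = [
  { name: 'dev', deployRoleArn: 'arn:aws:iam::111111111111:role/deploy' },
  { name: 'staging', deployRoleArn: 'arn:aws:iam::222222222222:role/deploy' },
  { name: 'prod', deployRoleArn: 'arn:aws:iam::333333333333:role/deploy' },
];

for (const account of accounts) {
  const actions: codepipeline.IAction[] = [];
  if (account.name !== 'dev') {
    actions.push(new codepipeline_actions.ManualApprovalAction({
      actionName: `Approve-${account.name}`,
      runOrder: 1,
    }));
  }
  actions.push(new codepipeline_actions.CodeBuildAction({
    actionName: `Deploy-${account.name}`,
    project: createPipelineProject(stack, account),
    input: buildOutput,
    runOrder: 2,
  }));
  pipeline.addStage({ stageName: `Deploy-${account.name}`, actions });
}
```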

Check out more articles and videos

We constantly think of articles and videos that might spark Git people's interest, skill us up, or help build a stellar career

Levelling up Monorepos with npm Workspaces
DevOps.js Conf 2022
33 min
Top Content
NPM workspaces help manage multiple nested packages within a single top-level package and have been improving since the release of NPM CLI 7.0. You can easily add dependencies to workspaces and handle duplications. Running scripts and orchestration in a monorepo is made easier with NPM workspaces. The npm pkg command is useful for setting and retrieving keys and values from package.json files. NPM workspaces offer benefits compared to Lerna and future plans include better workspace linking and adding missing features.
End the Pain: Rethinking CI for Large Monorepos
DevOps.js Conf 2024
25 min
Today's Talk discusses rethinking CI in monorepos, with a focus on leveraging the implicit graph of project dependencies to optimize build times and manage complexity. The use of NX Replay and NX Agents is highlighted as a way to enhance CI efficiency by caching previous computations and distributing tasks across multiple machines. Fine-grained distribution and flakiness detection are discussed as methods to improve distribution efficiency and ensure a clean setup. Enabling distribution with NX Agents simplifies the setup process, and NX Cloud offers dynamic scaling and cost reduction. Overall, the Talk explores strategies to improve the scalability and efficiency of CI pipelines in monorepos.
AWS Lambda under the hood
Node Congress 2023
22 min
Top Content
In this Talk, key characteristics of AWS Lambda functions are covered, including service architecture, composition, and optimization of Node.js code. The two operational models of Lambda, asynchronous and synchronous invocation, are explained, highlighting the scalability and availability of the service. The features of Lambda functions, such as retries and event source mapping, are discussed, along with the micro VM lifecycle and the three stages of a Lambda function. Code optimization techniques, including reducing bundle size and using caching options, are explained, and tools like webpack and Lambda Power Tuning are recommended for optimization. Overall, Lambda is a powerful service for handling scalability and traffic spikes while enabling developers to focus on business logic.
Federated Microfrontends at Scale
React Summit 2023
31 min
Top Content
This Talk discusses the transition from a PHP monolith to a federated micro-frontend setup at Personio. They implemented orchestration and federation using Next.js as a module host and router. The use of federated modules and the integration library allowed for a single runtime while building and deploying independently. The Talk also highlights the importance of early adopters and the challenges of building an internal open source system.
AWS Lambda Performance Tuning
Node Congress 2024
25 min
This Talk covers various optimization techniques for Lambda functions, including parameter fetching, code minification and bundling, observability with Power Tools and X-Ray, baseline testing with load testing tools, caching with Elastic Cache and Redis, and optimizing code size and memory usage. The importance of library choices, power tuning for cost and performance, leveraging subprocesses and sandboxes, and adjusting concurrency limits are also discussed. Overall, these techniques can significantly improve Lambda function performance.
Scale Your React App without Micro-frontends
React Summit 2022
21 min
This Talk discusses scaling a React app without micro-frontends and the challenges of a growing codebase. Nx is introduced as a tool for smart rebuilds and computation caching. The importance of libraries in organizing code and promoting clean architecture is emphasized. The use of caching, Nx Cloud, and incremental builds for optimization is explored. Updating dependencies and utilizing profiling tools are suggested for further performance improvements. Splitting the app into libraries and the benefits of a build system like Nx are highlighted.

Workshops on related topic

React at Scale with Nx
React Summit 2023
145 min
Top Content
Featured Workshop (Free)
Isaac Mann
We're going to be using Nx and some of its plugins to accelerate the development of this app.
Some of the things you'll learn:
- Generating a pristine Nx workspace
- Generating frontend React apps and backend APIs inside your workspace, with pre-configured proxies
- Creating shared libs for re-using code
- Generating new routed components with all the routes pre-configured by Nx and ready to go
- How to organize code in a monorepo
- Easily move libs around your folder structure
- Creating Storybook stories and e2e Cypress tests for your components
Table of contents:
- Lab 1 - Generate an empty workspace
- Lab 2 - Generate a React app
- Lab 3 - Executors
- Lab 3.1 - Migrations
- Lab 4 - Generate a component lib
- Lab 5 - Generate a utility lib
- Lab 6 - Generate a route lib
- Lab 7 - Add an Express API
- Lab 8 - Displaying a full game in the routed game-detail component
- Lab 9 - Generate a type lib that the API and frontend can share
- Lab 10 - Generate Storybook stories for the shared ui component
- Lab 11 - E2E test the shared component
Node Monorepos with Nx
Node Congress 2023
160 min
Top Content
Workshop (Free)
Isaac Mann
Multiple APIs and multiple teams all in the same repository can cause a lot of headaches, but Nx has you covered. Learn to share code, maintain configuration files, and coordinate changes in a monorepo that can scale as large as your organisation does. Nx allows you to bring structure to a repository with hundreds of contributors and eliminates the CI slowdowns that typically occur as the codebase grows.
Table of contents:
- Lab 1 - Generate an empty workspace
- Lab 2 - Generate a node api
- Lab 3 - Executors
- Lab 4 - Migrations
- Lab 5 - Generate an auth library
- Lab 6 - Generate a database library
- Lab 7 - Add a node cli
- Lab 8 - Module boundaries
- Lab 9 - Plugins and Generators - Intro
- Lab 10 - Plugins and Generators - Modifying files
- Lab 11 - Setting up CI
- Lab 12 - Distributed caching
Building Serverless Applications on AWS with TypeScript
Node Congress 2021
245 min
Workshop
Slobodan Stojanović
This workshop teaches you the basics of serverless application development with TypeScript. We'll start with a simple Lambda function, set up the project and the infrastructure as code (AWS CDK), and learn how to organize, test, and debug a more complex serverless application.
Table of contents:
- How to set up a serverless project with TypeScript and CDK
- How to write a testable Lambda function with hexagonal architecture
- How to connect a function to a DynamoDB table
- How to create a serverless API
- How to debug and test a serverless function
- How to organize and grow a serverless application


Materials referred to in the workshop:
https://excalidraw.com/#room=57b84e0df9bdb7ea5675,HYgVepLIpfxrK4EQNclQ9w
Alex DeBrie's DynamoDB guide: https://www.dynamodbguide.com/
Excellent book on DynamoDB: https://www.dynamodbbook.com/
https://slobodan.me/workshops/nodecongress/prerequisites.html
Frontend to the Cloud Made Easy - A ReactJS + AWS Workshop
DevOps.js Conf 2024
59 min
Workshop
Eyal Keren
This workshop teaches you how to develop React applications and then deploy them to the cloud (or build them to the console), coupled with a fully abstracted backend that requires no complex configuration, simplifying the building and deployment of frontend and web apps to the cloud.