The Rise of the Dynamic Edge


Over the last few years, the JS web app community has seen a growing awareness of and focus on performance & scalability. The days when prominent production sites served entirely blank pages while waiting on monolithic, multi-megabyte bundles of JavaScript are (mostly!) behind us.

A big part of that has been deeper integration with CDNs; after all, round-trip latency is one of the primary determinants of performance for a global audience. But frameworks, and the companies that support them, take different approaches to how CDNs can be used, and the operational complexities their strategies introduce have real consequences.

But what if, instead of a dynamic origin sending instructions to a static CDN, you could run your application directly on the edge? As it turns out, that doesn't just improve performance, it also vastly simplifies our deployment and maintenance lives.

This talk was presented at DevOps.js Conf 2021. Check out the latest edition of this tech conference.

FAQ

Glen Maddern is a speaker and developer who has worked on various open-source projects in the React space, including CSS Modules and styled-components. He is the creator of Frontend Application Bundles (FABs) and recently joined Cloudflare, where he works on Cloudflare Workers.

Frontend Application Bundles (FABs) are a project created by Glen Maddern that focuses on production performance and deployment. FABs aim to compile any application, built with any framework, into a single server entry point and a directory full of assets, making it possible to deploy apps globally.
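
To make that shape concrete, here is a hypothetical sketch of what such a single server entry point could look like. It follows the spirit of the FAB design but is not the actual fab.dev API: the render signature, settings object, and /_assets convention are assumptions for illustration.

```ts
// Hypothetical sketch of a FAB-style server entry point (not the actual fab.dev API).
// The whole app compiles down to one request handler plus a directory of static assets.

interface FabSettings {
  [key: string]: string; // environment-specific settings injected at deploy time (assumed shape)
}

export async function render(request: Request, settings: FabSettings): Promise<Response> {
  const url = new URL(request.url);

  // Fingerprinted static assets live in the bundled asset directory (assumed convention: /_assets/*).
  if (url.pathname.startsWith('/_assets/')) {
    return new Response('/* asset bytes would be streamed here */', {
      headers: { 'cache-control': 'public, max-age=31536000, immutable' },
    });
  }

  // Everything else is handled by the framework's compiled server-side code.
  const env = settings.ENVIRONMENT ?? 'production';
  return new Response(`<html><body>Rendered for ${url.pathname} (${env})</body></html>`, {
    headers: { 'content-type': 'text/html' },
  });
}
```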

Latency significantly impacts download speeds. In an experiment conducted by Glen Maddern, it was observed that as latency increases, the time to download files grows disproportionately, even for relatively small files. This is due to the way TCP works: it starts slow and ramps up as it detects good network conditions.

CDNs are crucial for frontend app workflows because they help reduce latency by caching content closer to users. This geographical distribution improves performance and ensures faster load times for web applications.

The stale-while-revalidate header allows a CDN to serve stale content while fetching a new version from the origin server. This ensures that users receive fast responses while the content is being updated in the background, minimizing load on the origin server.
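
In practice this is just a Cache-Control directive that the origin attaches to its responses. Here is a minimal sketch; the route and the timings are illustrative choices, not values from the talk.

```ts
// Minimal sketch of an origin handler opting into stale-while-revalidate.
// Uses the web-standard Request/Response types (available in Cloudflare Workers and Node 18+).

export async function handleProductPage(request: Request): Promise<Response> {
  const { pathname } = new URL(request.url);
  const html = `<html><body>Product page for ${pathname}</body></html>`;

  return new Response(html, {
    headers: {
      'content-type': 'text/html',
      // Fresh for 60 seconds; after that a CDN may keep serving the stale copy for up to
      // 10 more minutes while it fetches a new version from the origin in the background.
      'cache-control': 'public, max-age=60, stale-while-revalidate=600',
    },
  });
}
```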

JAMstack is a modern web development architecture that involves serving static files from a CDN. It simplifies the deployment process and enhances site stability by eliminating the need for a live origin server. However, it can be inefficient for large sites with frequently changing content.

JAMstack can be inefficient for large sites as every change to the site requires generating the entire site afresh. This can impede quick iterations and increase build times, especially for websites with thousands of pages.
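
A back-of-the-envelope sketch makes the scaling problem visible (the per-page cost and page count below are assumptions for illustration, not measurements): when a deploy re-renders every page, build time grows with total page count rather than with how much actually changed.

```ts
// Illustrative model of full static rebuilds vs. rendering only what changed.
// All numbers are assumptions chosen to show the shape of the problem, not measurements.

const TOTAL_PAGES = 10_000;  // pages on the site
const MS_PER_PAGE = 50;      // assumed average cost to render and write one page
const PAGES_CHANGED = 1;     // a single content edit

const fullRebuildMinutes = (TOTAL_PAGES * MS_PER_PAGE) / 1000 / 60;
const changedOnlySeconds = (PAGES_CHANGED * MS_PER_PAGE) / 1000;

console.log(`Full rebuild for one edit: ~${fullRebuildMinutes.toFixed(1)} minutes`);
console.log(`Rendering only the changed page: ~${changedOnlySeconds.toFixed(2)} seconds`);
```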

The dynamic edge is an emerging concept where the CDN is not just a static cache but can run entire applications close to the users. This eliminates the need for a traditional origin server and leverages global edge locations to host frontend code, improving performance and reducing latency.

Deploying apps to the edge using Cloudflare Workers offers several advantages, including no cold-start impact and the ability to run JavaScript in V8 isolates across over 200 global locations, in exchange for tighter CPU and RAM limits than a traditional server. This setup ensures faster response times and better performance.
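
As a reference point, the programming model is essentially a fetch handler that runs at every edge location. A minimal Workers-style sketch follows (service-worker syntax; it assumes @cloudflare/workers-types for the FetchEvent typing, and the routing logic is purely illustrative).

```ts
// Minimal Cloudflare Workers-style fetch handler (service-worker syntax).
// Assumes @cloudflare/workers-types for the FetchEvent typing; routing logic is purely illustrative.

addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);

  if (url.pathname.startsWith('/api/')) {
    // A dynamic response computed right at the edge, with no origin round trip.
    return new Response(JSON.stringify({ servedAt: new Date().toISOString() }), {
      headers: { 'content-type': 'application/json' },
    });
  }

  // HTML (or any other asset) can also be rendered or served straight from the edge.
  return new Response('<html><body>Hello from the edge</body></html>', {
    headers: { 'content-type': 'text/html' },
  });
}
```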

According to Glen Maddern, the future of frontend hosting lies in the dynamic edge, where applications are hosted globally at the edge locations of CDNs, on platforms like Cloudflare Workers. This approach eliminates the need for traditional origin servers and offers significant performance improvements.

Glen Maddern
32 min
01 Jul, 2021


Video Summary and Transcription

The Talk discusses the rise of the dynamic edge and the past, present, and future of frontend hosting. It emphasizes the impact of latency on CDN usage and the relevance of CDNs in JavaScript application development. The use of CDNs for rapidly changing content and the benefits of the Jamstack approach are explored. The future of the dynamic edge lies in platforms like Cloudflare Workers. The Talk also highlights the performance benefits of running Frontend Application Bundles (FABs) on the edge and the challenges faced in achieving optimal performance.
Available in Spanish: El Auge del Borde Dinámico

1. Introduction to the Rise of the Dynamic Edge

Short description:

Hello. My name is Glen. My talk today is the rise of the dynamic edge, or another way to talk about it would be the past, present, and future of frontend hosting. I've done a couple of open source projects in the React space. More recently, I started a project called frontend application bundles, or FABs, which is at FAB.dev, as well as a product around deployments called linc.sh. Last year Linc was acquired by Cloudflare. Now I get to approach the same problem but from the point of view of an entire platform, an entire global platform, which is pretty exciting.

Okay. Hello. My name is Glen. My talk today is the rise of the dynamic edge, or another way to talk about it would be the past, present, and future of frontend hosting. If you don't know me, my name is Glen Maddern, that's me on Twitter, that's probably the easiest way to get in touch. I've done a couple of open source projects in the React space. A couple on styling: CSS Modules and styled-components. More recently, a couple of years ago, I switched gears and started thinking about production performance and deployment and started a project called frontend application bundles, or FABs, which is at FAB.dev, as well as a product around deployments called linc.sh. Fairly excitingly, last year Linc was acquired by Cloudflare. I've only been there a couple of months, but now I get to kind of approach the same problem but from the point of view of an entire platform, an entire global platform, which is pretty exciting.

2. The Impact of Latency on CDN Usage

Short description:

Today, I will discuss how CDNs have become an integral part of our front end app workflows. CDNs are widely used due to their geographical distribution, which plays a crucial role in reducing latency. I conducted an experiment comparing download speeds from different locations and found that even a small increase in latency can significantly impact download times. This is because of the way TCP works, where the initial data transfer is slower and gradually ramps up. Therefore, being local to the server is essential for optimal performance.

So today I wanted to drill into something that I've found really interesting over the last few years getting into this stuff, which is how we've come to depend on CDNs and how they've become a part of our front end app workflows. So just to recap, a traditional CDN architecture has the CDN in between your users and your origin server, your actual host. Requests flow through and responses flow back. The CDN will keep copies of those responses, depending on some algorithms, some directives. Your origin server is the ground truth.
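
To picture that flow in code, here is a deliberately simplified sketch of what each edge location does per request. It is not any particular CDN's implementation; the cache keying and directive parsing are reduced to the bare minimum.

```ts
// Deliberately simplified model of what a traditional CDN edge does per request.
// Real CDNs add Vary handling, purging, tiered caches, and much more.

const edgeCache = new Map<string, { response: Response; expiresAt: number }>();

export async function handleAtEdge(request: Request, origin: string): Promise<Response> {
  const key = new URL(request.url).pathname;
  const cached = edgeCache.get(key);

  // Serve from the edge cache while the stored copy is still fresh.
  if (cached && cached.expiresAt > Date.now()) {
    return cached.response.clone();
  }

  // Otherwise go back to the origin server: the ground truth.
  const response = await fetch(origin + key);

  // Honour the origin's directives, e.g. "Cache-Control: public, max-age=300".
  const maxAge = /max-age=(\d+)/.exec(response.headers.get('cache-control') ?? '');
  if (maxAge) {
    edgeCache.set(key, {
      response: response.clone(),
      expiresAt: Date.now() + Number(maxAge[1]) * 1000,
    });
  }
  return response;
}
```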

So why do people use CDNs? Well, they're everywhere, right? This is Cloudflare's network. It's over 200 locations. But it might be a little bit surprising to see just how important that geographical distribution is. Why do they need to be in so many locations? So I wanted to start today's talk by looking over something I'd actually looked at a couple of years ago, which is about the impact of latency. This was an experiment I ran for a web series I was doing called Frontend Center, where I ran a bandwidth test, or a download speed test, from Melbourne, where I was living at the time, against three different locations: Sydney, San Jose, and London. Now, Sydney's only 15 milliseconds away. San Jose is on the other side of the Pacific. And London is 280 milliseconds by speed of light, or, as I live there now, a lot longer by plane, let me tell you.

So when you have a small file, you get download speeds, or total download times, pretty much exactly what you'd expect. It's just one single round trip to the server. So the further the server is away, the longer it takes for the file to download. But what might be surprising is what happens with a fast connection to a local box, and this is between two data centers, so there are no bandwidth constraints here at all, really. For a 250 kilobyte file, we're still at a fraction of a second. But when you add some latency into this picture, things start to get pretty different. At 200 kilobytes, you're now looking at 2 seconds in the best case scenario to download that file. And if you double the latency, the same effect is doubled. Now this might be surprising, because those servers are only, you know, 100 or 200 milliseconds further away, and yet the download times are taking 10 times longer, or 30 times longer in some cases. And these steps are actually the latency between those hops. So each jump on the graph is 160 milliseconds. Each jump on the red line is 280. This is because of the way TCP, the protocol, works underneath everything else, where it starts slow and ramps up as it detects that the network conditions are good enough. This means that the first 100 kilobytes cost a lot, and every 100 kilobytes from then on can increasingly cost you performance. And much more so than you might think. So being local is really important.
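
A rough model makes the shape of those curves clearer. This is a simplified sketch of TCP slow start: it ignores connection and TLS handshakes, packet loss, and pacing, and the initial window size is a typical assumption rather than a measured value. The sender starts with a small congestion window and doubles it every round trip, so total time is dominated by round trips rather than raw bandwidth.

```ts
// Simplified TCP slow-start model (illustrative only: ignores connection/TLS handshakes, loss and pacing).
// Assumes a typical initial congestion window of 10 packets of ~1460 bytes, doubling each round trip.

function downloadTimeMs(fileKB: number, rttMs: number): number {
  const mss = 1460;            // bytes per packet (typical maximum segment size)
  let cwndBytes = 10 * mss;    // initial congestion window (~14.6 KB)
  let remaining = fileKB * 1024;
  let roundTrips = 1;          // one round trip for the request plus the first window of data

  while (remaining > cwndBytes) {
    remaining -= cwndBytes;
    cwndBytes *= 2;            // slow start: the window doubles every round trip
    roundTrips += 1;
  }
  return roundTrips * rttMs;
}

console.log(downloadTimeMs(250, 15));   // ~75 ms   (Sydney-ish latency)
console.log(downloadTimeMs(250, 160));  // ~800 ms  (trans-Pacific-ish latency)
console.log(downloadTimeMs(250, 280));  // ~1400 ms (Melbourne-to-London-ish latency)
```

Even in this generous model, the same 250 kilobyte file goes from well under 100 milliseconds against a nearby server to well over a second against a distant one, before bandwidth is ever the limiting factor.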


Check out more articles and videos

We constantly curate articles and videos that might spark people's interest, skill us up, or help build a stellar career.

A Guide to React Rendering Behavior
React Advanced Conference 2022
25 min
A Guide to React Rendering Behavior
Top Content
This transcription provides a brief guide to React rendering behavior. It explains the process of rendering, comparing new and old elements, and the importance of pure rendering without side effects. It also covers topics such as batching and double rendering, optimizing rendering and using context and Redux in React. Overall, it offers valuable insights for developers looking to understand and optimize React rendering.
Speeding Up Your React App With Less JavaScript
React Summit 2023
32 min
Speeding Up Your React App With Less JavaScript
Top Content
Miško Hevery, the creator of Angular and AngularJS, discusses the challenges of website performance and JavaScript hydration. He explains the differences between client-side and server-side rendering and introduces Qwik as a solution for efficient component hydration. Miško demonstrates examples of state management and intercommunication using Qwik. He highlights the performance benefits of using Qwik with React and emphasizes the importance of reducing JavaScript size for better performance. Finally, he mentions the use of Qwik in both MPA and SPA applications for improved startup performance.
React Concurrency, Explained
React Summit 2023
23 min
React Concurrency, Explained
Top Content
React 18's concurrent rendering, specifically the useTransition hook, optimizes app performance by allowing non-urgent updates to be processed without freezing the UI. However, there are drawbacks such as longer processing time for non-urgent updates and increased CPU usage. The useTransition hook works similarly to throttling or debouncing, making it useful for addressing performance issues caused by multiple small components. Libraries like React Query may require the use of alternative APIs to handle urgent and non-urgent updates effectively.
Levelling up Monorepos with npm Workspaces
DevOps.js Conf 2022
33 min
Levelling up Monorepos with npm Workspaces
Top Content
NPM workspaces help manage multiple nested packages within a single top-level package, and they have kept improving since the release of NPM CLI 7.0. You can easily add dependencies to workspaces and handle duplications. Running scripts and orchestration in a monorepo is made easier with NPM workspaces. The npm pkg command is useful for setting and retrieving keys and values from package.json files. NPM workspaces offer benefits compared to Lerna, and future plans include better workspace linking and adding missing features.
The Future of Performance Tooling
JSNation 2022
21 min
The Future of Performance Tooling
Top Content
Today's Talk discusses the future of performance tooling, focusing on user-centric, actionable, and contextual approaches. The introduction highlights Addy Osmani's expertise in performance tools and his passion for DevTools features. The Talk explores the integration of user flows into DevTools and Lighthouse, enabling performance measurement and optimization. It also showcases the import/export feature for user flows and the collaboration potential with Lighthouse. The Talk further delves into the use of flows with other tools like WebPageTest and Cypress, offering cross-browser testing capabilities. The actionable aspect emphasizes the importance of metrics like Interaction to Next Paint and Total Blocking Time, as well as the improvements in Lighthouse and performance debugging tools. Lastly, the Talk emphasizes the iterative nature of performance improvement and the user-centric, actionable, and contextual future of performance tooling.
Optimizing HTML5 Games: 10 Years of Learnings
JS GameDev Summit 2022
33 min
Optimizing HTML5 Games: 10 Years of Learnings
Top Content
PlayCanvas is an open-source game engine used by game developers worldwide. Optimization is crucial for HTML5 games, focusing on load times and frame rate. Texture and mesh optimization can significantly reduce download sizes. GLTF and GLB formats offer smaller file sizes and faster parsing times. Compressing game resources and using efficient file formats can improve load times. Framerate optimization and resolution scaling are important for better performance. Managing draw calls and using batching techniques can optimize performance. Browser DevTools, such as Chrome and Firefox, are useful for debugging and profiling. Detecting device performance and optimizing based on specific devices can improve game performance. Apple is making progress with WebGPU implementation. HTML5 games can be shipped to the App Store using Cordova.

Workshops on related topic

React Performance Debugging Masterclass
React Summit 2023
170 min
React Performance Debugging Masterclass
Top Content
Featured Workshop
Free
Ivan Akulov
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
Building WebApps That Light Up the Internet with QwikCity
JSNation 2023
170 min
Building WebApps That Light Up the Internet with QwikCity
Featured Workshop
Free
Miško Hevery
Building instant-on web applications at scale has been elusive. Real-world sites need tracking, analytics, and complex user interfaces and interactions. We always start with the best intentions but end up with a less-than-ideal site.
QwikCity is a new meta-framework that allows you to build large-scale applications with constant startup performance. We will look at how to build a QwikCity application and what makes it unique. The workshop will show you how to set up a QwikCity project, how routing works with layout, how the demo application fetches data and presents it to the user in an editable form, and finally how one can use authentication: all of the basic parts for any large-scale application.
Along the way, we will also look at what makes Qwik unique, and how resumability enables constant startup performance no matter the application complexity.
Next.js 13: Data Fetching Strategies
React Day Berlin 2022
53 min
Next.js 13: Data Fetching Strategies
Top Content
Workshop
Free
Alice De Mauro
- Introduction
- Prerequisites for the workshop
- Fetching strategies: fundamentals
- Fetching strategies – hands-on: fetch API, cache (static VS dynamic), revalidate, suspense (parallel data fetching)
- Test your build and serve it on Vercel
- Future: Server components VS Client components
- Workshop easter egg (unrelated to the topic, calling out accessibility)
- Wrapping up
React Performance Debugging
React Advanced Conference 2023
148 min
React Performance Debugging
Workshop
Ivan Akulov
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
Deploying React Native Apps in the Cloud
React Summit 2023
88 min
Deploying React Native Apps in the Cloud
Workshop
Free
Cecelia Martinez
Deploying React Native apps manually on a local machine can be complex. The differences between Android and iOS require developers to use specific tools and processes for each platform, including hardware requirements for iOS. Manual deployments also make it difficult to manage signing credentials, environment configurations, track releases, and to collaborate as a team.
Appflow is the cloud mobile DevOps platform built by Ionic. Using a service like Appflow to build React Native apps not only provides access to powerful computing resources, it can simplify the deployment process by providing a centralized environment for managing and distributing your app to multiple platforms. This can save time and resources, enable collaboration, as well as improve the overall reliability and scalability of an app.
In this workshop, you’ll deploy a React Native application for delivery to Android and iOS test devices using Appflow. You’ll also learn the steps for publishing to Google Play and Apple App Stores. No previous experience with deploying native applications is required, and you’ll come away with a deeper understanding of the mobile deployment process and best practices for how to use a cloud mobile DevOps platform to ship quickly at scale.