Parsing Millions of URLs per Second


With the end of Dennard scaling, the cost of computing is no longer falling at the hardware level: to improve efficiency, we need better software. Competing JavaScript runtimes are sometimes faster than Node.js: can we bridge the gap? We show that Node.js can not only match faster competitors but even surpass them given enough effort. URLs are among the most fundamental elements of web applications, yet Node.js 16 was significantly slower than competing runtimes (Bun and Deno) at URL parsing. By reducing the number of instructions and vectorizing sub-algorithms, we tripled the speed of URL parsing in Node.js (as of Node.js 20). If you have upgraded Node.js, you have the JavaScript runtime with the fastest URL parsing in the industry, with uncompromising support for the latest WHATWG URL standard. We share our strategies for accelerating both C++ and JavaScript processing in practice.

This talk has been presented at Node Congress 2024, check out the latest edition of this JavaScript Conference.

FAQ

Optional components of a URL include the username, password, hostname, port number, pathname, search query, and hash. Even the hostname is optional if you have a file URL.

The Ada URL parser is significantly faster than alternatives, capable of parsing 6 million URLs per second. It achieves these results through various optimizations and efficient coding practices.

The URL specification supports various types of URLs including non-ASCII format URLs, file-based URLs, JavaScript URLs, percent-encoded URLs, and URLs with IPv4 and IPv6 addresses.

The Ada URL parser's benchmarks and source code are available on GitHub under the ada-url organization (github.com/ada-url).

Yagiz Nizipli is a senior software engineer at Sentry, a Node.js technical steering committee member, and an OpenJS Foundation cross-project council member.

The purpose of the talk by Yagiz Nizipli is to discuss how to parse millions of URLs per second and to explain the improvements in URL parsing performance in Node.js.

Node.js 18 introduced a new URL parsing dependency which resulted in performance improvements of up to 400% in URL parsing.

The Ada URL parser is a high-performance URL parser named after Yagiz Nizipli's daughter. It supports the full WHATWG URL specification, has no dependencies, and is highly portable. It can parse 6 million URLs per second and is used by Node.js and Cloudflare Workers.

The Ada URL parser uses several optimizations, including perfect hashing, memoization tables to reduce the number of branches, and vectorization to process multiple bytes at once instead of one by one.
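As a rough illustration of the memoization-table idea, here is a sketch in JavaScript (Ada's actual implementation is C++ and differs in detail): classifying a character against a set normally costs several comparisons and branches, but a precomputed 256-entry table turns it into a single array read. The character set below is a hypothetical example, not the exact set Ada uses.

```javascript
// Hypothetical lookup table: mark a few code points that are not
// allowed in a hostname. Built once, then reused for every URL.
const FORBIDDEN_IN_HOST = new Uint8Array(256);
for (const ch of ' #/:<>?@[\\]^|') {
  FORBIDDEN_IN_HOST[ch.charCodeAt(0)] = 1;
}

// Per character: one bounds check and one table lookup, instead of
// a chain of range comparisons (i.e., far fewer branches).
function hostHasForbiddenChar(host) {
  for (let i = 0; i < host.length; i++) {
    const c = host.charCodeAt(i);
    if (c < 256 && FORBIDDEN_IN_HOST[c] === 1) return true;
  }
  return false;
}

console.log(hostHasForbiddenChar('example.com'));  // false
console.log(hostHasForbiddenChar('exa mple.com')); // true
```

The same table-driven approach vectorizes well in C++, since the classification of many bytes can be checked in a single SIMD pass.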

You can reach Yagiz Nizipli via his GitHub account or via X (formerly known as Twitter).

Yagiz Nizipli
14 min
04 Apr, 2024


Video Summary and Transcription

Today's talk explores the performance of URL parsing in Node.js and introduces the Ada URL parser, which can parse 6 million URLs per second. The Ada URL parser includes optimizations such as perfect hashing, memoization tables, and vectorization. It has bindings for several popular programming languages. See the Ada URL project and Daniel Lemire's blog for more information.

1. URL Parsing and Performance

Short description:

Today's talk is about parsing millions of URLs per second and achieving a 400% improvement. We will explore the state of Node.js performance in 2023 and the impact of a new URL parsing dependency. We'll also discuss the structure of a URL and the various components involved.

Hello. Today I'm going to talk about parsing millions of URLs per second. My name is Yagiz Nizipli and I'm a senior software engineer at Sentry. I'm a Node.js technical steering committee member and an OpenJS Foundation cross-project council member. You can reach me through my GitHub account and on X, formerly known as Twitter.

Software performance in the last decade has changed drastically. The main goal has been to reduce cost in cloud environments such as AWS, Azure, or Google Cloud. Latency has been a problem, and in order to improve it, we need to optimize our code more than ever. Performance work brings reduced complexity, parallelism, and caching. And, most importantly, there's climate change: faster software means less computation, which means better tomorrows and a better climate.

So, the State of Node.js Performance 2023. This is a quote from that report: since Node.js 18, a new URL parsing dependency, Ada, was added to Node.js. This addition bumped Node.js's URL parsing performance to a new level; some results reached an improvement of up to 400%. The State of Node.js Performance 2023 was written by Rafael Gonzaga, a Node.js technical steering committee member. This talk is about how we reached that 400% improvement in URL parsing. Another quote, from James Snell of Cloudflare, also a Node.js TSC member: "Just ran a benchmark for a code change: it went from 11 seconds to complete down to about half a second to complete. This makes me very happy." This was in reference to adding Ada URL to Cloudflare.

So let's start with the structure of a URL. For example, take https://user:pass@example.com:1234/foo/bar?baz#qux. It starts with the protocol: https is the protocol, followed by the colon and two slashes. Then we have the username and password; these are optional fields in all URLs. Then we have the hostname, which is example.com. Then we have the port, which is 1234. And then we have the pathname, which is /foo/bar.

2. URL Parsing and Assumptions

Short description:

URLs have various optional components, different encodings, and support for different types of URLs like file-based URLs, JavaScript URLs, and path names with dots. Implementations like PHP, Python, curl, and Go follow different URL parsing specifications. We challenge the assumptions that URL parsing doesn't matter and URLs are free.

And then we see the search, which starts with a question mark: ?baz. And then we have the hash: #qux. So the port number, pathname, search, hash, username, and password are all optional. Even the hostname is optional if you have a file URL. But this is just an example of what the structure of a URL looks like.

Beyond the structure of the URL, there are also different encodings that the URL specification supports, such as non-ASCII format, which is the first one. Then we support file-based URLs, which is what you see in Unix-based systems: file:/// followed by a path. Then we have JavaScript URLs, such as javascript:alert. Then we have percent-encoding, for URLs that contain substrings with a percent character. And then we have pathnames with dots, like example.org/./a/../b, which resolves into a different URL according to the URL specification. Then we have IPv4 addresses with hex and octal digits, which normalize to dotted decimal such as 127.0.0.1. And we have IPv6 addresses, and so on and so forth.

If we take an input URL with a non-ASCII hostname, in PHP it's unchanged. In Python, it's unchanged. In WHATWG URL, which is implemented by Chrome, Safari, and all the browsers, including Ada, the hostname becomes punycode: xn-- and so on and so forth. In curl, it's a lot different. And as you see, in the Go runtime, it's a lot different as well. This is mostly because of different implementations; the other languages and subsystems don't follow WHATWG URL strictly. PHP and Python basically parse the URL from the start of the string without making any allocations. And curl and Go implement a different specification, RFC 3986, or something similar; I'm not really quite sure.

So we had these old assumptions: does URL parsing really matter? Is it the bottleneck to some performance metric? URLs are free; you don't gain anything by optimizing them. These were the assumptions that we broke with our work, and you will see why.
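The normalization behavior described above (dot-segment resolution, percent-encoding, IPv4 canonicalization) can be observed directly with Node.js's WHATWG `URL` class; the inputs below are illustrative, not the exact strings from the talk:

```javascript
// The WHATWG parser normalizes its input instead of keeping it verbatim.

// Dot segments in the path are resolved per the specification:
// '.' is dropped and '..' pops the previous segment.
console.log(new URL('https://example.org/a/./b/../c').pathname); // '/a/c'

// Characters like a space in the path are percent-encoded.
console.log(new URL('https://example.org/a b').pathname); // '/a%20b'

// IPv4 addresses written with hex components are canonicalized
// to dotted decimal.
console.log(new URL('http://0x7f.0.0.1/').hostname); // '127.0.0.1'
```

Parsers that treat the URL as a plain string (as PHP and Python do) return these inputs unchanged, which is why the same input yields different results across runtimes.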