1. Introduction to Node Features
Hello, JS Nation. What's new in Node? In this talk, I'll be walking you through some features from version 19, a few from 18, and definitely a lot from 20. Node version 20 introduces an experimental permission model, giving you control over the process. You can allow or deny permissions for the file system, child processes, or worker threads. Another feature is single executable applications, which lets you package a standalone application into a single binary that can run on an operating system without Node installed.
Hello, JS Nation. What's new in Node? I'm Hemanth. I'm a senior staff engineering manager at PayPal, a TC39 delegate, and a Google Developer Expert for the web and payments domains. You can find me at h3manth.com or tweet to me at @gnumanth.
If you look at the timeline today for the different versions of Node and what state they are in, the current version is 19, and we have version 20, 20.0.1 to be precise, that was released recently. In this talk, I'll be walking you through some features that you might be aware of or might have missed from version 19, a few from 18, and definitely a lot from 20, and a few of them are probably still on the main branch and might be unstable. So, let's have a look.
Node version 20. First, I'm going to talk about the experimental permission model, which gives you control over what the process can do. You can allow it to read a particular file system path or spawn a child process. In this example, we run node with the --experimental-permission flag and say --allow-fs-write=/tmp and --allow-fs-read=/home/index.js, and that index.js would only be able to write to /tmp and read from /home/index.js. We can also say --allow-child-process on index.js, and process.permission.has('fs.write') would be true because it's allowed. It also takes a second parameter where you can pass a path and check whether the current process has permission to write to that path or not, and that returns true. If you execute index.js with --experimental-permission and it tries something it wasn't granted, it throws an error saying access to the API has been restricted. So the file system, child processes, and worker threads can all be controlled. You can also deny permissions or check whether a permission is present. So you have total control over what the process has access to. It's comparable to other runtimes like Deno, but it's super cool to see it in Node.
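Here is a minimal sketch of checking permissions at runtime, based on the Node 20 permission-model API. It guards on `process.permission`, which only exists when the process was started with the experimental flag, so the snippet runs either way:

```javascript
// Sketch of the Node 20 permission model (experimental).
// process.permission is only defined when Node is started with the flag, e.g.:
//   node --experimental-permission --allow-fs-write=/tmp --allow-fs-read=/home/index.js index.js
if (process.permission) {
  // The second argument narrows the check to a specific path.
  console.log('fs.write /tmp:', process.permission.has('fs.write', '/tmp'));
  console.log('child process:', process.permission.has('child'));
} else {
  console.log('permission model not enabled; run with --experimental-permission');
}
```

Without the flag this just prints the hint, which makes it safe to drop into any script while experimenting.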
Next up we have single executable applications. Let's take this example where I echo a console.log of hello with process.argv[2] into hello.js, and then copy the node binary to hello. Then we use postject via npx, and based on what operating system you are on, OS X or other systems, we just use this command to inject that script into the binary, and then we have a single executable file. If we run ./hello and pass in "world", we get "hello world". So this makes it very interesting for folks who are writing standalone applications in Node: they can package them into a single binary and ship it to any operating system, and the operating system need not have Node installed. The whole thing is packed with Node, and it just runs.
2. Experimental Loader Flag and import.meta.resolve
This part introduces the experimental loader flag, which gives developers control over the module loading process. By hooking into different phases of the loading process, such as resolving, getting the source, getting the format, and transforming the source, developers can customize how modules are loaded. Additionally, the import.meta.resolve feature allows for resolving file paths or URLs based on the current environment, providing flexibility in module resolution.
So this makes it super interesting for folks who are writing standalone applications. We can just package it, bundle it, and send it over, and the user just clicks and it works fine. We also have the --experimental-loader flag. What the experimental loader flag basically does is give you control over the loading process. Whenever a module is loaded, it goes through different phases, resolve, get source, get format, transform source, and you can hook into all of these and take total control over that process. In this example, we have loader.js with resolve, getFormat, and getSource hooks, which eventually call the default getFormat with the URL and context, but you can apply your own logic there to control how the loader works. This gives us total control over loading modules and injecting behavior based on our needs. As part of the move of ESM loaders off-thread, we have import.meta.resolve, which works across environments. So whether you're in a browser or in Node, if you were to call import.meta.resolve on 'foo' in this particular example, it resolves to that file path in Node, and if you were in a browser, it would resolve to that URL. So import.meta.resolve works irrespective of which environment we are in.
3. Node 18.6.0 Test Suite and Array Manipulation
In Node 18.6.0, an experimental test runner was exposed, allowing the use of describe and it within Node. This feature is now stabilized and released in version 20. It also introduces the ability to mock methods while testing. Another feature implemented in version 20 is the addition of the isWellFormed and toWellFormed methods on the string prototype, which make it easier to check and repair Unicode strings. Additionally, version 20 includes methods like toSorted, toReversed, and toSpliced, which allow for array manipulation without modifying the original array. The change-array-by-copy proposal, also implemented in version 20, provides constructs for operating on arrays without mutation. Resizable ArrayBuffers and growable SharedArrayBuffers are also available in this version.
So in Node 18.6.0, an experimental test runner was exposed on node:test, from which you can import describe and it. Here's a simple example where we import describe and it from node:test, and then we say describe('unary + operator'), it('should add two numbers'), and assert.strictEqual. In the tweet this came from, I on purpose labeled it as unary plus, but it is a binary operation that's happening there; you get the point. Now you can use describe and it within Node using the built-in test runner, and it gels well with your existing test runners. And we have the node: prefix on node:test to make sure it's the Node built-in we are using. This was exposed in 18.6.0, but it's stabilized in 20. It also gives you a feature to mock. In this example we have a constant number with value five, which has an add method on it. We can use the test context's mock.method to mock that add within the number object. So in this case we have mocked add, and we can make sure the mock was invoked by checking that number.add.mock.calls.length is one, because we executed it once in the previous step. So that allows us to mock methods while testing.
isWellFormed and toWellFormed are two other methods on the string prototype implemented in 20. Say you have strings containing Unicode and you want to check whether they are well formed: as we know, a character outside the Basic Multilingual Plane is encoded as two surrogate parts, and if one of the parts is missing, you can assume the string is not well formed. So in this case we map over each of the strings in the array and call isWellFormed, which returns true or false accordingly, and in the second part of the example we map over all of those strings and convert each one with toWellFormed. So this makes it easy to identify whether a given Unicode string is well formed and then to convert it with toWellFormed.

In the next example, we have an unsorted numbers array, and we call toSorted with a comparator function, just like we do for numbers.sort, and it returns a sorted array, but numbers stays intact; it's not modified. Then we have toReversed: it reverses the array, but the original array remains. Then we have toSpliced: we say (2, 1), and we see that the element at index two is gone, but the original numbers array still stays. Same with `with`: you can replace a particular index with a value, 'h' in this case; the returned array has the 'h' and the original array remains. This is the change-array-by-copy TC39 proposal, where you can operate on an array without modifying it, and it gives you all of the constructs we saw in this demo. This is implemented in 20. We also have resizable ArrayBuffers and growable SharedArrayBuffers.
4. Notable Updates and Strategic Initiatives
In Node version 20, notable updates include the introduction of resizable ArrayBuffers and growable SharedArrayBuffers. The implementation of the RegExp v flag simplifies handling complex strings. Tail calls in WebAssembly enhance performance. Another significant feature is top-level await, which eliminates the need for an async wrapper in MJS files. The proposal underwent discussions and was deemed not a footgun. Additionally, the --experimental-repl-await flag was removed, allowing REPL await to run without the flag. Strategic initiatives are ongoing discussions and suggestions for improving Node.
You can pass maxByteLength to it, and here in this example it creates an ArrayBuffer of size 8 with a maxByteLength of 16. You can check whether it's resizable using the resizable property and then just call resize() to resize it. Same with SharedArrayBuffers, which are growable.
We have also implemented the RegExp v flag, which brings set notation and properties of strings, making it super easy to handle complex strings, be they Unicode or ASCII or non-ASCII or decimals. In this example, we have a pretty complex regex which matches an emoji keycap sequence, and you can also use an emoji literally in the pattern (here I'm using a flag emoji), or "in" followed by x, y, z, or zero to nine. And if you look at the tests, it passes on the keycap four, it passes on the flag emoji, it passes on "in" followed by "x4", and so on. So the entire thing is baked into one simple, expressive regular expression. With the /v flag, it's possible today to construct a pattern like this and operate on emoji sequences and other properties of strings.
The other interesting feature we have is tail calls in WebAssembly. If you look at this example, at any given time there will be only one fib_rec frame on the stack, because the frame unwinds itself before performing the next recursive call. So if you look at the second part of the example, where we make the recursive call to fib_rec, only one instance of it will be on the stack, thereby making it more performant. And this is, of course, tail recursion. So those were some of the updates from version 20. There are a few minor updates too, but these are some of the notable changes. So let's look at some other interesting features across different versions of Node.
The top-level await proposal was part of TC39 and reached stage 4, and quite a few folks I have talked to on social media and in other interactions missed this part, so I wanted to highlight it here. So let's look at how you resolve a promise using await syntax. If you were to just say await Promise.resolve(), you'd see a syntax error saying that await is only valid in an async function. So then, of course, you'd go and wrap that in an async function, await the Promise.resolve, console.log it, and things work fine. What if we could just await the Promise.resolve and console.log it at the top level, and it would work? It does, in MJS. So if you have a file named index.mjs and you say await Promise.resolve('look-ma') and console.log it, it works fine; or if you have type set to module, then await Promise.resolve('look-ma') again works fine. And if you want to test this on the CLI, you can say --input-type=module and just evaluate code with await inline, and it works fine. So we no longer need an async wrapper in MJS files, though this doesn't apply to CJS, and that makes things simpler with top-level await. And there's an interesting conversation: there was an article which spoke about how top-level await was a footgun, and then there was a discussion between the champion of this proposal and the individual who wrote the article, and they came to a consensus on how it is not a footgun. So it's very interesting to follow the history of this proposal. And another plug here: I enabled REPL await in the Node source. What I basically did was a simple PR to get rid of the --experimental-repl-await flag. So if you were to run ./node and use await Promise.resolve() without --experimental-repl-await, it should work, and that PR went in.
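The whole feature fits in two lines; this sketch assumes an ES module context (an .mjs file, `"type": "module"`, or `--input-type=module`):

```javascript
// index.mjs — in an ES module, no async wrapper is needed around await:
const value = await Promise.resolve('look-ma');
console.log(value); // look-ma
```

The same two lines inside a .cjs file still raise "await is only valid in async functions", which is exactly the distinction the talk draws.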
And it's possible today to use await in the REPL without the experimental flag.
Strategic initiatives. So what are strategic initiatives? Consider strategic initiatives to be some of the long-running discussions and suggestions, basically ways to improve Node.
5. Node Foundation Initiatives and Promisified APIs
The Node project has various initiatives, including core promise APIs and single executable applications. These initiatives aim to evolve different APIs and proposals through open governance. The upcoming Next 10 discussions are also worth exploring. The strategic initiatives include promisified APIs such as dns.promises and fs.promises, which simplify asynchronous operations. The use of timers and streams with promises is also demonstrated. Additionally, the availability of the Web Crypto API in Node.js aligns with web standards and the strategic initiatives.
If you look at the complete list of initiatives, you can get an idea of the different directions it takes. It can be in terms of governance, or in terms of workers, open web standards, OpenSSL evolution, and the like. Currently, some of the initiatives in progress are core promise APIs, and we did see single executable apps. So there are these different initiatives that the Node project takes up, and it's all open governance. It's not run by one particular set of entities; you can go propose, participate, and help evolve different APIs and proposals. It's a very interesting space. Next 10 is also interesting: have a look at Next 10, which speaks about the upcoming features, with a lot of open discussions there.
In the next few slides, we'll be talking about some of the promisified APIs that are part of the strategic initiatives. In this case, we require dns.promises and create a new Resolver, and for that resolver we set a DNS server, and then we say resolver.resolve('example.org'), and we can use .then on it, or top-level await if we're in MJS, or an async function if we're not. So basically, the built-in DNS module is promisified, and you can use it within your code just by reaching for .promises. And we have node:fs/promises; the node: prefix makes sure it's not a library from userland but for sure a module within Node. So you'd require node:fs/promises to get the promisified open for fs. You can await open on some file and go ahead and read it. In this example, we're also using a readable web stream on the file, and with a for-await-of loop we log each of the chunks, and finally we await closing the file with file.close(). This makes it very precise and easy to read. Also, you can use await at the top level and have the promisified power of fs/promises. Similarly, we have timers which are promisified: you can use setTimeout, setImmediate, and setInterval from timers/promises. In this example, I'm using setTimeout, where you can say await setTimeout(100, result), and it brings back the result after the timeout.
Here's an example of using Readable streams with fs promises. All we're doing here is going through those big files, file1, file2, file3, and for each file we take the stat and see whether the size of that file is greater than 1024 * 1024, with a concurrency of two, and mark those as big files, and we filter through to all the files which match that size criterion. Then there was a request on the Node.js repo, an issue saying, hey, we need the Web Crypto API. The request was: we have the Web Crypto API available in Chrome, why not have it in Node and meet the web standards, right? Which is also part of the strategic initiatives. So today we do have it: you can require webcrypto from node:crypto. And in this case, we are generating a key using HMAC with SHA-256, and the length is 256.
6. Web Crypto API, Abort Control, and Streams
You can use the Web Crypto API in Node.js to sign, verify, and encode data. However, caution must be exercised, as it's easy to misuse the low-level cryptographic primitives. AbortController allows for better control over asynchronous operations, such as setting timeouts and handling abort signals. The node:util aborted helper provides a convenient way to listen for abort signals and take action accordingly. Additionally, Node.js supports web streams, which can be used to implement features like readable streams combined with performance monitoring.
And you can sign and verify: use a new TextEncoder and encode, in this case, the string 'I love cupcakes'. Then you await subtle.sign, give it the name 'HMAC', pass in the key and the message, and you get back the digest. So basically, what Web Crypto does on the web is possible in Node today. But that comes with a warning: the Web Crypto API provides a number of low-level cryptographic primitives, it's very easy to misuse them, and the pitfalls involved can be very subtle. So if you are operating with something like Web Crypto, you should be aware of these nuances and be vigilant about the security implications.
Here's the PR that went in for the experimental AbortController, which is implemented today. You can create a new AbortController, which gives you a signal. In this case, we're using readline and asking the user a question on the CLI, "What's your favorite food?", and we pass the signal to it. Then we log "your favorite food is" whatever the user answers. On the signal, you can add an event listener for 'abort', and if the abort signal gets triggered, it comes into this callback and logs that the food question timed out. We set a timeout of so many milliseconds and then call ac.abort(), so if the user doesn't answer within that time, it says the food question timed out. So you have total control with AbortController. It has other features, like playing well with fetch: for example, if the user hits download on a batch of files and then cancels, you want to abort that, and the like, so you have control over when and how things get aborted. So you can make a promise and not break it, but we can still abort it, right? The node:util module also exposes an aborted() method, which gives you a callback when something you depend on is aborted. In this case, we obtain something that's abortable, and when the dependency triggers abort, we land in the aborted callback, and you have control there. util.aborted() makes it easier to listen for that particular signal, and whenever the abort is triggered, you can do whatever you have decided to do on abort. So this util makes that flow easier.
WHATWG web streams were also implemented, as an experimental implementation. We can use parts of web streams if not the entire implementation. Here's an example where we import ReadableStream from node:stream/web, making sure it's the web-standard readable stream. Then we take setInterval as "every" from node:timers/promises, and we're also using node:perf_hooks to get performance. We create a ReadableStream whose controller enqueues performance.now() every second. Then we use a for-await-of loop and log each value from the stream.
7. Node Stream, Fetch, Form Data, and More
Readable streams make things easier with all the powers of streams while adhering to web standards, and you can use the same logic in Node today. Another example involves using the node:stream/promises module to create a pipeline and convert archive.tar to archive.tar.gz using gzip. The experimental fetch feature allows you to make HTTP requests in Node, and today you can use it without the experimental flag. FormData, the Request API, TextDecoder and TextEncoder, and the experimental network imports are other useful features available in Node.
So it's like a ticker, giving the performance value every second in this process, right? So you see how a readable stream makes things easier, with all the powers of streams, while also adhering to the web standard. So if you know these things from the web standards, you can use the same logic in Node today.
Here's another example where we take pipeline from node:stream/promises along with fs, and all this function, just three lines, is doing is taking archive.tar and gzipping it into archive.tar.gz. When we call the pipeline, we create a read stream on archive.tar, use gzip to create the gzip stream, and then create a write stream to archive.tar.gz, and that's it. You just run it, and you can add a .catch, in this case just console.error, for any error that appears while converting the tar to tar.gz.
For fetch in Node: in this example, I had hosted a server on Heroku a long time back which would send a random XKCD image, so we can use fetch on Node today. It was behind --experimental-fetch in version 18; you use await (await fetch(...)).json() and you get JSON back. Of course, await await inline is not advised for readability reasons, but in this example I just wanted to demo that we could use fetch within the Node REPL, using top-level await and doing a .json() to get the response. In the Node nightlies, fetch was right there on 18 without needing the flag, and if you used --no-experimental-fetch, you would get "fetch is not defined". And today you can use it and it's not behind the flag; if you look at the PR that got merged, it enabled fetch by default. So you can just call fetch, and it should work fine.
With that, we also get FormData. FormData is the web-standard API, and it behaves the same way here. In this example, I'm creating a new FormData and appending a name, and then you can use append, entries, getAll, set, values, and all of that, which you do with FormData on the web, making handling and submitting forms simpler within Node. You also get the Request API. In this example, we create a new Request on example.com with method POST and body 'foobar', and we can use request.url to get the URL, request.method to get the method, request.credentials, the body, and the like. So basically everything that's in the standard Request API is possible in Node today. Next, we have TextDecoder and TextEncoder; we saw part of this in previous examples. We have a Uint8Array of bytes, and we say new TextDecoder().decode() on those bytes, which gives "Hullabaloo". In the other direction, we take the string to encode, which is "Hullabaloo", create a new TextEncoder, and call encoder.encode(stringToEncode), and we get back the Uint8Array of bytes we started with. So this helps us decode and encode text. The experimental network imports are interesting too, where you can import MJS modules over the network. In this example, I hosted a hello-world MJS file on my domain; you just import the function from that URL and invoke it, and it logs the greeting. And this is possible on your localhost server for testing.
8. CDN, Assert on Type JSON, and Dynamic Imports
Imagine your CDN, or breaking up dependencies into multiple pieces within your network. There are arguments regarding malicious code at URLs. The assert on type JSON ensures imported content is JSON, preventing importing JS disguised as JSON. This applies to dynamic imports as well.
So imagine this more like your CDN, or if you want to break your dependencies into multiple pieces and share them across your network, so they can be used by different teams which depend on the same dependency. Right.
So this is interesting, but there are a few arguments too, with folks asking what if there is malicious code at the URL, how do we handle that, and things like that. We also have assert with type JSON, which is very much in line with what we were just discussing. Dynamic import is also possible, but with an assert saying that whatever we're importing is for sure JSON. So you can't just import anything from a URL, say a package.json that might actually be JS merely named .json; the assertion makes sure the type is JSON, and we can do it with dynamic imports too.
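A hedged sketch of a dynamic JSON import; a `data:` URL is used so the example is self-contained, where you would normally point at a local ./config.json or a network URL. Newer Node versions spell the option `with` (import attributes) instead of the older `assert`:

```javascript
// Dynamic import of JSON with an import attribute (Node 20.10+ spelling).
(async () => {
  const mod = await import('data:application/json,{"answer":42}', {
    with: { type: 'json' },
  });
  console.log(mod.default.answer); // 42
})().catch(console.error);
```

If the resource is not actually JSON, the import rejects instead of silently executing JavaScript, which is the safety property being discussed.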
9. File API, Array.findLast, Intl, and Shadow Realm
There is a File API that allows easy operation with files. In version 18, the Array.prototype.findLast method was implemented, which finds the last element in an array matching a predicate. The Intl object provides various localization features, such as calendars, numbering systems, and time zones. It also offers Intl.NumberFormat, which simplifies number formatting based on the desired locale and style. Additionally, there is an experimental feature called ShadowRealm, which allows for encapsulation and separate global scopes.
There was another request on the Node source with an issue saying we need the File API, and yes, today we have the File API. In this example, we create a new File named dummy.txt.exe and assert a few things on what it returns: the file name should be there, which it is; the file size is zero; file.lastModified is a number; and file.lastModified is less than Date.now(), of course, because it was created before the Date.now() call. So it gives you all the features the File API provides, and it's easy to operate with files.
On v18, here's an example where we have an array of objects with x values repeating one, two, one, two, and suppose you want to find the last match. This is a TC39 proposal that got implemented: array.findLast, where we check o.x === 1, which returns the element at the third position, array[2], right? You can also find the index: findIndex gives zero, because the element at index zero also has x with value one, but findLastIndex finds the last one in the array whose x is one, which is at index two. So you can do findLast and findLastIndex on arrays.
We have Intl.Locale, where you can pass in the locale. Here we're passing the Arabic Egyptian locale, which returns its calendars, collations, hour cycles, numbering systems, and time zones. It also has this easy util method, Intl.supportedValuesOf, to which you can pass calendar, collation, currency, numberingSystem, timeZone, or unit, and that returns all the supported values, say the Buddhist and Chinese calendars, or Celsius and centimeters among the units. So it makes it super easy to handle internationalization efforts, be it currency, calendar, numbering system, time zones, and the like. Say you have the number 123456.789: you can do an Intl.NumberFormat, pass in the locale you want with style currency and the currency set to pounds, and it formats based on that locale and also puts in the pound sign: £123,456.79. Same thing for Japanese. You can also limit it to three significant digits; here's an example for en-IN where we format with maximumSignificantDigits of three, and it says 1,23,000. So it makes it super easy to handle numbers, for example currencies, which is a pretty common use case.

We also have the experimental ShadowRealm. In this case, we create a new ShadowRealm, and within the realm we call evaluate, where we set globalThis.realmValue = 'inner'. Then we have a getter which returns the realm's globalThis.realmValue. When we call the getter, we get 'inner', but if we check for realmValue in our own globalThis, it's just false.
So you can think of a ShadowRealm as a box which is not exposed to your globalThis but has its own globalThis, where you can operate and add values without leaking into the outer global scope. So that value is not in our globalThis, but the getter returns it from the realm, which has its own global. So ShadowRealm makes encapsulation interesting, and of course it brings in all the other aspects of realms from the standard.
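The Intl examples above can be sketched like this (the numbers and locales follow the talk; the ShadowRealm part is omitted since it still needs an experimental flag):

```javascript
// Intl.Locale: inspect locale data for Arabic (Egypt).
const locale = new Intl.Locale('ar-EG');
console.log(locale.language); // ar

// Enumerate supported values for calendars, currencies, units, etc.
console.log(Intl.supportedValuesOf('calendar').includes('buddhist')); // true
console.log(Intl.supportedValuesOf('currency').includes('GBP')); // true

// Currency formatting: grouping and symbol come from the locale.
const gbp = new Intl.NumberFormat('en-GB', { style: 'currency', currency: 'GBP' });
console.log(gbp.format(123456.789)); // £123,456.79

// en-IN groups by lakhs/crores; here limited to three significant digits.
const inr = new Intl.NumberFormat('en-IN', { maximumSignificantDigits: 3 });
console.log(inr.format(123456.789)); // 1,23,000
```

All of this ships with Node's full-ICU build, so no locale data packages need to be installed.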
10. Node Features and Enhancements
A Blob class for buffers was introduced in Node 18 and above. The MessageChannel example demonstrates passing a Blob and operating on it. structuredClone allows for deep cloning while keeping the original free of mutations. node:util provides an easy way to parse args. process.on('multipleResolves') helps catch promises resolved multiple times. os.availableParallelism() gives a hint on parallelism capacity. Replacing the URL parser with Ada improves URL parsing speed. The node --watch feature allows for hot-module-reload-like behavior. These are some of the cool new features in Node.
So there was a PR which said graduate Blob from experimental and expose it on global. So if you're on 18 and above, you have a Blob class exposed on global, and you can also import it from node:buffer as Blob.
In this example, I'm using the two ports of a MessageChannel and have a Blob with 'hello there'. All we're doing is passing that Blob across the channel with port1.postMessage(blob), and the other port receives and console.logs whenever it gets the message, with the data being the Blob. The interesting part is that the Blob is still usable after posting: you can do blob.text(), and the text value of the Blob gets logged in this case. So this is an example of using Blobs, buffers, and MessageChannel to show how you can pass the Blob around and operate on it.
structuredClone is another example. Say I have a veg pizza with toppings: tomatoes, capsicum, onions, and corn. After taking a structuredClone of vegPizza, you can push onto the clone's toppings, so vegPizza's toppings stay as they are. I'm pushing a pineapple; I know many people wouldn't like that. So I have vegPizzaPlus, whose toppings log with pineapple, but the vegPizza toppings remain untouched. Thereby you avoid mutating the original and also get a deep clone, which wasn't really this easy until we had structuredClone.
node:util also gives us parseArgs. Say you have args of --foo, --bar, and b, and you have options declaring the types for foo and bar; you can just call parseArgs with those args and options, and you get back the values and positionals. In this example, it prints an object with a null prototype where foo is true and bar is 'b', because those are the options we declared and the args we passed. So it makes it super easy to parse args using the built-in util rather than depending on an external node module.
Sometimes your promise might get resolved multiple times. It's easier to catch that today with process.on('multipleResolves'), which gets triggered when a promise is resolved multiple times, and it makes it easier to debug situations where a promise is resolved twice. On v19.7 and above, we have os.availableParallelism(), which gives a hint on the parallelism capacity of the particular machine, and returns 16 in this case. There's also this interesting PR that went in which replaced the URL parser with Ada, gaining around 87% faster URL parsing in benchmarks compared to how the old parser was doing. For the final example, I have node --watch, and I do a watch on foo.js. At the top we see a console.log of 'meow', and it also gives us an experimental warning. I go ahead and change 'meow' to something else, say 'woof', and save it; it restarts foo.js, and 'woof' is printed. So it's more like hot module reload, right? You're not closing the process and restarting it again after saving; it's watching the file for changes and re-executing Node. It's like nodemon, but integrated within Node today with --watch, and you can also watch directories and the like. So these were some of the cool features that are new in Node. I hope you liked a few, and some of them were really new to you.