Parse, Don’t Validate

Most JavaScript applications use JSON.parse to create an untyped object first, and then validate and narrow it to the expected data type. This approach has performance and security problems: even if the data is invalid, the whole JSON string has to be parsed before the data is validated, instead of failing at the JSON parsing stage (e.g., if a number is passed instead of a string in some property).

Many languages support parsing of JSON strings directly into the expected types, but it is not natively supported in JavaScript or TypeScript.

In this talk we will show how the powers of TypeScript combined with the new specification JSON Type Definition (RFC 8927) and the Ajv library can be used to parse your data directly into the expected application-defined type faster than JSON.parse, and also how to serialize data of a known type approximately 10 times faster than JSON.stringify.
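As a taste of what that looks like, here is a minimal sketch assuming Ajv v8 in JTD mode; the User type, the isUser guard and the field names are illustrative, not from the talk.

```ts
import Ajv, { JTDSchemaType } from "ajv/dist/jtd";

// Hypothetical application type, for illustration only
interface User {
  id: number;
  name: string;
}

const input = '{"id": 1, "name": "Mario"}';

// Typical approach: parse the whole string, then validate and narrow separately
function isUser(data: unknown): data is User {
  const d = data as Partial<User> | null;
  return typeof d === "object" && d !== null &&
    typeof d.id === "number" && typeof d.name === "string";
}
const raw: unknown = JSON.parse(input); // the full string is parsed even if the data is invalid
if (!isUser(raw)) throw new Error("invalid user");

// Parse-don't-validate: the schema is checked against the type, and parsing,
// validation and typing happen in a single pass
const schema: JTDSchemaType<User> = {
  properties: {
    id: { type: "int32" },
    name: { type: "string" },
  },
};

const ajv = new Ajv();
const parseUser = ajv.compileParser<User>(schema);         // (json: string) => User | undefined
const serializeUser = ajv.compileSerializer<User>(schema); // (user: User) => string

const user = parseUser(input); // typed User | undefined; fails fast on invalid input
const json = serializeUser({ id: 2, name: "Luigi" }); // much faster than JSON.stringify when reused
```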

 

Video Summary and Transcription
In this video, the focus is on the concept of 'parse don't validate' in the context of JavaScript and TypeScript applications. The discussion highlights the challenges of using JSON Schema for data validation, including its flexibility that can lead to debugging issues. The alternative, JSON Type Definition (JTD), is presented as a simpler solution for most API use cases, offering better performance and reliability. The talk explains the benefits of using JTD over JSON Schema, especially in terms of type alignment, which ensures that data parsing and serialization are directly linked to specific data types, improving security and performance. Fastify is introduced as a library that enhances JavaScript application performance by efficiently handling serialization using JSON Schema. The issue of JSON's wastefulness and security vulnerabilities, like potential call stack exhaustion and DDoS attacks, is addressed. The video also covers how JTD can be integrated with TypeScript for effective data validation and parsing, making it a valuable tool for developers dealing with complex data structures. The speaker illustrates these points with an example of a Mario Kart character data schema, demonstrating how JTD ensures data validity and improves application efficiency.

This talk has been presented at Node Congress 2023, check out the latest edition of this JavaScript Conference.

Available in Español: Analizar, no validar

FAQ

Simplex Chat is a unique messaging platform founded by Evgeniy, which is distinctive for not having user identifiers.

The primary challenges include security, reliability, and data validation, which are consistent issues across various projects.

JSON Schema is a way to define the format and structure of data, widely adopted since 2009. JTD (JSON Type Definition) is a simpler alternative started in 2020, supporting discriminated unions but with some limitations compared to JSON Schema.

JSON can be wasteful as it requires parsing the entire data piece before validation, it's time-consuming due to its complex, nested nature, and it has security issues like potential for call stack exhaustion and DDoS attacks.

Fastify enhances performance by efficiently handling serialization of responses, understanding the structure of the data it returns, which minimizes loops and accelerates property access.

AJV is a library used for data validation with JSON Schema; since starting in 2015 it has grown to 350 million downloads every month. It helps manage data correctness and security in JavaScript applications.

For API development, JTD (JSON Type Definition) is recommended for most use cases because of its simplicity and alignment with data types, providing clearer and more efficient data handling.

Type alignment ensures that data parsing and serialization are directly linked to specific data types, which improves performance, security, and reliability by ensuring data is correctly typed and structured.

1. Introduction to JavaScript and Data Correctness

Short description:

Hello. We're going to talk today about JavaScript and how to ensure data correctness. We've worked together on various projects, including MailOnline and Threads. I'll hand over to Jason to introduce himself. Jason is the director of technology at Threads Styling and has extensive experience with data validation using JSON Schema. We've encountered common problems in our projects, such as security, reliability, and data validation. We'll discuss an alternative approach to validation that involves parsing and carrying the proof of validity. JSON, while flexible, has its challenges and can be wasteful.

Hello. I'm Evgeniy and this is Jason. We're going to talk today about JavaScript and how to ensure data correctness, but before that we'll give a brief introduction.

So, we've done lots of great things together. We worked together at MailOnline and at Threads, and when I built the Ajv library, Jason also helped a lot. Currently I've founded SimpleX Chat, which is a messaging platform that is the only one of its kind that doesn't have user identifiers, but this is not what the talk is about.

So, I'll hand over to Jason to introduce himself. Thanks, Evgeniy. Obviously, you know by now I'm Jason Green, I'm director of technology at Threads Styling. Threads is a fashion tech company pioneering the world of personalized luxury shopping through chat and social media. I also previously worked with Evgeniy as a principal engineer at the MailOnline. I've been a long-time user of data validation with JSON Schema, and in particular of Ajv, which I've witnessed grow and mature so much over the years. I'm an early investor in SimpleX Chat as well.

Yeah, Ajv's growth has indeed been crazy: it went from really nothing in 2015, when it started, to 350 million downloads every month now, with probably every JavaScript application using it. So why do we want to talk about this? In all these projects we've done some really cool things. We built a content creation tool at MailOnline, a very complex in-browser application with hundreds of thousands of lines of JavaScript code that allowed editing; the whole MailOnline website is managed by it. And then, when I was leading engineering at Threads and Jason also joined Threads, we built StoryMaker. Mostly Jason built it, I'm just basking in the glory. It was a content management system for Instagram, and we definitely applied a lot of things learned from the previous project. And in all of those projects we have invariably been hitting the same problems of security, reliability and data validation; whatever project we do, the problems are invariably the same. So, I've done a lot of Haskell, and this 'parse, don't validate' maxim belongs to Alexis King, one of the best Haskell engineers out there, who proposes parsing as an alternative to validation. So rather than just checking that your data is correct, you obtain a proof of validity and carry it around: not just check that your data is correct, but also obtain that proof in the form of a different type and use it across your application. So I'm gonna hand over to Jason. And in this case with JavaScript, what we'll learn is that you should really not be using the native JSON methods in JavaScript; you should be doing some other things. Jason, over to you.

So as we all know, JSON is a widely used format that's generally considered to be flexible and easy to work with. However, it's important to be aware of some of the potential problems and challenges that it has. JSON is particularly wasteful.

2. Challenges with JSON and Importance of Performance

Short description:

Parsing JSON can be wasteful as you need to parse the entire piece of data before checking its validity. JSON has security issues and can exhaust the call stack with deep structures or be used in DDoS attacks. Performance and reliability matter depending on the situation, especially when they affect user experience and satisfaction. Fastify is a library that tackles serialization by defining inputs and outputs in JSON Schema, increasing speed and improving data structure handling.

Now, it's not something you're going to notice in your day-to-day debugging when you're working with it, but parsing JSON can be a very wasteful process, as you need to parse the entire piece of data before you can understand or even begin to check if it's valid or not. Because of the potentially complex and nested nature of JSON, it can be particularly time-consuming to then go on and validate.

Many of us who started in JavaScript have come to love working with TypeScript, but then you go and throw a big blob of unstructured JSON into the mix, and suddenly you're back to square one. None of your types matter, and everything is unknown again. It also has some security issues. These are issues that I actually wasn't very aware of despite working with it for a long time, until looking into it. If you have very, very deep structures, they can actually exhaust your call stack. This can be just because of the data itself, or it can be a deliberate attack with very deeply nested structures being sent to your APIs. You can also suffer from very large blobs of data being sent to your APIs in the form of a DDoS attack. Once again, before you can even understand if it's valid or not, your API will have to dutifully parse those blobs, which once again is very wasteful. It is even possible to do prototype pollution attacks via JSON as well.
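As a rough illustration of those two risks (the nesting depth and the payload below are made up, and the exact behaviour depends on the engine):

```ts
// Deeply nested input: JSON.parse has to consume the whole payload, and in many
// engines very deep nesting exhausts the call stack before any validation can run.
const depth = 1_000_000;
const hostile = "[".repeat(depth) + "]".repeat(depth);
try {
  JSON.parse(hostile);
} catch (e) {
  console.error((e as Error).message); // e.g. "Maximum call stack size exceeded"
}

// Prototype pollution risk: JSON.parse happily produces an own "__proto__" key;
// the danger appears later, when such an object is recursively merged into others.
const parsed = JSON.parse('{"__proto__": {"isAdmin": true}}');
console.log(Object.keys(parsed)); // ["__proto__"]
```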

So before we get concerned about performance and reliability, it's important to think about when performance and reliability are actually important. It does seem like an obvious statement. You know, most people wouldn't go out of their way to argue that they're not important, but they're not going to be important in every situation. It really depends on various factors. Obviously, a slow app is better than no app at all. So if you have an application that's delivering value, you may have much bigger issues that you need to face before worrying about performance and reliability. Particularly in the early stages of app development, you're going to be much more concerned with time to market. If your app isn't even available yet, that's obviously a big issue. You're going to be concerned about budget, the overall user experience of your application, and of course what your users' needs are and what's most pressing to them. However, it is going to be an issue when performance is affecting user experience and satisfaction. That risks losing users, and people who go away from your application or site because it didn't load fast enough may not come back, which is what we refer to as high bounce rates.

Even worse, if reliability is your issue and your customers are losing their work or their data is becoming corrupted, that's a big issue that, in the best case, can result in some apologies. In the worst case, you may actually end up having to pay for it in some way through compensation or discounts to keep people happy. So there is actually a solution to part of this problem, which is tackled by a library called Fastify, which is a replacement for your Express router. It tackles the serialization part of the problem, which is to say that by defining the inputs and outputs and their shape in JSON Schema, this library is able to more quickly serialize the responses, and it can get quite a good increase in speed because it knows the structure of the data it's supposed to be returning. In this way, it can take a lot of what would normally be loops and turn it into straight property access. So if we talk about schemas, for a long time JSON Schema was the only way to define the format of the data, or the type of the data, or whatever you call it. It started in 2009, and since 2020 there is an alternative specification that was created to address its shortcomings.
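Before moving on to schemas, here is a rough sketch of the Fastify idea just described, assuming Fastify v4; the route and payload are illustrative:

```ts
import Fastify from "fastify";

const fastify = Fastify();

// Declaring the response shape lets Fastify compile a specialized serializer
// (via fast-json-stringify) for this route instead of calling generic JSON.stringify.
fastify.get(
  "/character",
  {
    schema: {
      response: {
        200: {
          type: "object",
          properties: {
            name: { type: "string" },
            weight: { type: "number" },
          },
        },
      },
    },
  },
  async () => ({ name: "Mario", weight: 82 })
);

fastify.listen({ port: 3000 });
```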

3. JTD vs JSON Schema: Pros, Cons, and Debugging

Short description:

JTD is better than JSON Schema for most API use cases. In cases where JTD is worse, consider writing code instead of using a schema. JSON Schema can lead to debugging issues due to its interpretation of schemas, causing confusion and errors.

You could spend quite some time comparing the pros and cons of the two specifications; you can pause the video later and study this comparison. Fundamentally, JTD is defined by its simplicity, and one of its important qualities is that it supports discriminated unions. At the same time it has limitations in what it supports, but it's extremely well aligned with data types, unlike JSON Schema.

JSON Schema, on the other hand, has extremely wide adoption today and is part of the OpenAPI specification, but at the same time it doesn't support discriminated unions and it isn't as well aligned with data types. It's a long comparison and a non-trivial trade-off, and I've seen it first-hand with the Ajv library, which supports both of those specifications. So to skip to the recommendation I can give: JTD is really much, much better for the absolute majority of the use cases in an API. So if you are building an API, you should be using JTD, full stop. And for the cases when JTD is actually worse than JSON Schema, it's a big question whether you should be using a schema at all; you should probably be writing code instead of using a schema.

So I'll demonstrate with some examples in a somewhat funny way. Right. So our headline is JSON Type Definition logic versus JSON Schema 'what?'. If you remember those internet memes: what? What is that? So if you look at this schema, what does it define? The majority of software engineers discovering this schema would say that it's obviously an object. What else can it be? And it has a property, and the property obviously must be a string. And that's how JTD treats this schema. JSON Schema has an interesting view. It says: if the data happens to be an object and it has this property, then this property must be a string, and any other data will be valid in JSON Schema, meaning any number, or any string, or array, or any object that doesn't have the property foo, will all be valid. It has caused millions of hours of debugging for various engineers, and in the Ajv library probably 50% of the questions I have been answering are about this kind of gotcha. Or for example this: it's a typo, right? Clearly properties is misspelled here, and JTD responds correctly: this schema is just invalid. What else can it be? Well, JSON Schema has a view. JSON Schema believes this is a valid schema; it just has some property that we don't know the meaning of, and any data is valid according to this schema. Again, millions of hours of debugging spent fixing errors like this in JSON schemas.
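Here is a sketch of that gotcha, running the same schema through Ajv in both modes; the property name foo mirrors the slide, and the commented results reflect my reading of the two specifications:

```ts
import AjvJsonSchema from "ajv";   // JSON Schema mode
import AjvJTD from "ajv/dist/jtd"; // JTD mode

const schema = {
  properties: {
    foo: { type: "string" },
  },
};

// JSON Schema reading: foo is only constrained IF the data is an object that has it
const validateJsonSchema = new AjvJsonSchema().compile(schema);
console.log(validateJsonSchema({ foo: "bar" })); // true
console.log(validateJsonSchema({ foo: 1 }));     // false - foo must be a string
console.log(validateJsonSchema({ bar: 1 }));     // true  - foo is not required
console.log(validateJsonSchema(42));             // true  - nothing says the data must be an object

// JTD reading: the data must be an object with a required string property foo
const validateJTD = new AjvJTD().compile(schema);
console.log(validateJTD({ foo: "bar" })); // true
console.log(validateJTD({ foo: 1 }));     // false
console.log(validateJTD({ bar: 1 }));     // false
console.log(validateJTD(42));             // false
```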

4. JTD Structure and JSON Schema Flexibility

Short description:

JTD has a more strict structure for arrays, requiring objects as elements with specific properties. JSON schema, on the other hand, allows for more flexibility, leading to debugging issues. JSON schema is error-prone and often requires additional annotation or the use of strict mode. JTD is simpler and often a better choice for your code.

Or for example, for arrays, right? So JTD has this structure for an array. So this data must be an array, obviously its elements must be objects, and these objects must have the property foo, and this property must be a string. So lots of hard requirements on the data shape, right?

So if you have a similar schema in JSON Schema, the only difference is the keyword. Well, what does it really mean? Oh well, one can guess: the data doesn't have to be an array. And the data doesn't have to have all its elements be objects. And if they are objects, they don't have to have the property foo. Quite a lot of different data types can be valid according to this schema, which again causes a lot of debugging, right? And then there's a gotcha like this, right?

So what if we put square brackets here? I thought that's an array, why don't you put square brackets? Well, JTD just says it's an invalid schema. In JSON Schema, this is unfortunately a valid thing that only validates the first item and ignores all other items. It's an exceptionally common support question in Ajv, and millions of hours have been spent debugging bugs like that. So fundamentally JSON Schema is an exceptionally error-prone specification that requires lots of additional annotation to express what you actually want to express. Ajv found a solution that turned out to be extremely popular, called strict mode. You effectively make all those cases mistakes, which is an extension of, or deviation from, the JSON Schema specification, and people use it. But fundamentally it just means that JTD is simpler and in many cases better for your code.
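The array schemas from these slides look roughly like this (illustrative, and reflecting the behaviour described above):

```ts
// JTD: the data MUST be an array, EVERY element must be an object,
// and every object must have a string property foo.
const jtdArraySchema = {
  elements: {
    properties: { foo: { type: "string" } },
  },
};

// JSON Schema: "items" only applies IF the data happens to be an array;
// numbers, strings and plain objects are all valid, and array elements that
// are objects don't even need the property foo.
const jsonSchemaArray = {
  items: {
    properties: { foo: { type: "string" } },
  },
};

// The square-bracket gotcha (pre-2020 drafts): still a valid schema, but it is
// tuple validation, so only the FIRST element is checked and the rest are ignored.
const jsonSchemaTupleGotcha = {
  items: [{ properties: { foo: { type: "string" } } }],
};
```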

5. Validating Data with JTD and TypeScript

Short description:

Jason will show you how to do some magic with JTD and TypeScript and do some actual data validation and parsing. With JTD, the process is more intuitive and straightforward. Let's dive into a nested object example: a Mario Kart character. We'll define the schema for the character's data, including properties like name, surname, weight, a createdAt date, and an array of weapons. Using the JTDSchemaType utility, we can ensure that the created schema is valid for this data type.

But over to Jason, who will show you how to do some magic with JTD and TypeScript and actually do some data validation and parsing.

All right. Time to look at some code. So just a remark on a few of the points Evgeny made there. Having worked with a lot of JSON schema, we ended up building a lot of things around these decisions and it takes a long time to work all those things out. With JTD I found that a lot more intuitive and a lot simpler and more straightforward.

So let's have a look. Now there's obviously a lot of different types of data that you can validate. I'm going to jump straight into a nested object with I think enough complexity that it's a good example for us to have a look at how we build a schema for it. So I've been playing a lot of Mario Kart with my family lately. Surprisingly my wife is quite good at it as well and we're both just jumping in and playing a lot of Mario Kart. So I couldn't come up with any other example, but a Mario Kart character.

So if we have our character, this is our data. As I said, this is our type, our interface. It has a name, an optional surname. We all know Mario Kart characters. They have a weight. The heavier guys are the best, obviously. You have a createdAt date, just to give us an example here. And we obviously have an array of weapons. So each weapon has an ID, a name of the weapon, which is an enum, and a damage counter for how much damage this weapon will do. I'm actually going to just clear out all of this, because I want to show the process of writing the schema more so than anything else. Because we have this very interesting utility type which Ajv comes with, called JTDSchemaType. And when we use this type, passing in our Mario Kart character, it will only allow us to create a schema that is valid for this particular data type. So I don't really have to... well, I actually don't know how to write a schema at all yet. So I can just go straight to my editor's autocomplete and trigger the suggestions for the possible properties that I can have here. And we have a properties value. So we're going to need that to start with.

6. Defining Object Properties and Arrays

Short description:

When defining an object, you specify the properties it should have. Start by defining the type, such as string or date. Optional properties are supported, and you can also define numbers with different types. The schema ensures data validity, allowing for nullable values. Arrays can be defined by specifying the elements and nested properties.

That's because when we're defining an object, this is how you do it with the properties, property. That sounds awful, but you get the idea. If I go to trigger again, now we can start putting in the actual different properties that our Mario Kart character, you know, needs to have defined.

You'll notice that surname's missing from here, and we'll get to that in a second. We're just going to start off by putting in, defining a couple of these. So we need a type obviously, and the type of this one is it can only be, well, it can be one of two things actually, but in our case, this is string. In this case, you'll see that when I go to put the createdAt type, it's only giving me the option of date. This time it's no longer allowing me to put in a string, and that's because it knows that from this type of the Mario Kart character, the createdAt is a date. Finally, we have weapons.

I'm going to get to defining that in a second, because I want to first jump to how we define the surname. So surname is an optional property. And if I look at my predictions again, we have optionalProperties. And in here, as soon as I start typing S, we get surname as well. So it just makes everything very intuitive. And then from this point, it's exactly the same as any other property that you're defining. I also forgot we also have the weight of the character, and this is a number. And as you can see, there's quite a bit more variety of different types of numbers that we can support in JTD. Given this is weight, I think uint8 will probably make sense, just a positive integer there. And have I got that right? Ah, yes, so this is the best part about this: it's not valid, it's not correct. This weight is not fulfilling the needs of this type here in the schema, and that's because I've made weight so that it can be a null value as well. So here we have to pass nullable: true, and then that's going to be valid as well.

Weapons. So when we want to define an array, you might remember from the example Evgeniy showed before, it is elements, and then from that point on you're once again just defining more of the same schema, only nested now. In this case we also want properties, because this is an array of objects, and to define an object in JTD you use properties. We can once again go about putting in the rest of these values, starting with the type of the numeric id.

7. Enum Type for Name

Short description:

In this case, we have a new type called name, which is an ad hoc enum. We pass in all the possible values as an array, and the type prediction is based on the defined type.

I'm going to go with int32, and then name. So the name in this case is another new type that we've come across. We've made this an enum. This is actually more of an ad hoc enum rather than using the enum keyword in TypeScript. And so in this case, we just need to pass in all the possible values for this enum, and that is in the form of an array. So as you can see, it already starts predicting what the possible values are. It knows this from the type that we've defined, which is really useful. And red shell.
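Putting the whole walkthrough together, the interface and schema look roughly like this, assuming Ajv v8's JTD mode and its JTDSchemaType utility; the weapon names and exact numeric types are reconstructed from the transcript, so treat them as illustrative:

```ts
import { JTDSchemaType } from "ajv/dist/jtd";

interface MarioKartCharacter {
  name: string;
  surname?: string;      // optional
  weight: number | null; // nullable
  createdAt: Date;
  weapons: {
    id: number;
    name: "red shell" | "green shell" | "banana"; // ad hoc enum of weapon names
    damage: number;
  }[];
}

// JTDSchemaType only accepts a schema that matches MarioKartCharacter,
// so the editor's autocomplete effectively writes the schema for you.
const characterSchema: JTDSchemaType<MarioKartCharacter> = {
  properties: {
    name: { type: "string" },
    weight: { type: "uint8", nullable: true },
    createdAt: { type: "timestamp" },
    weapons: {
      elements: {
        properties: {
          id: { type: "int32" },
          name: { enum: ["red shell", "green shell", "banana"] },
          damage: { type: "float32" },
        },
      },
    },
  },
  optionalProperties: {
    surname: { type: "string" },
  },
};
```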

8. Serialization and Parsing with the MarioKart Schema

Short description:

Finally, we have damage, which is another number. We can serialize data using a type-aligned serializer for the MarioKart schema, which is 10 times faster than using a regular serializer. We can also parse data using a parser generated from the schema, which returns the expected type or undefined if the data is invalid. This approach of parsing JSON directly to the application type and serializing a specific type improves performance and reliability.

Finally, we have damage, which is another number. We're going to make this a float32, so we can have decimal points as well. And that is the full schema now defined.

So what can we do with our MarioKart schema? We can serialize data. So if I have some JavaScript object defined that fulfills this type, I can create a serializer from my MarioKart schema. Apologies, this is from a previous example. So here's our MarioKart serializer; we can now pass in as much data as we like, and it will serialize about 10 times faster than JSON.stringify, by using this type-aligned serializer that we've created. We can also, of course, parse. Here's a JSON string of MarioKart character data. We can compile a parser using our schema as well. And the best part of this is that it knows the type that we're expecting from it. When we use this parser to actually parse this string, the resulting type is going to be either MarioKartCharacter or undefined; undefined is basically going to be the case if the data is in any way invalid. And so if the data isn't invalid and we have a result, then we know that this is Mario Kart data. We have all of our typings set up, we get our type-ahead, we can be sure of the data that's been defined here, and it just makes all of our code from that point on so much easier. And of course, if it's undefined, then we can refer to the error message via the property on the parse function.
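In code, the serializer and parser from this part of the demo look roughly like this, continuing the MarioKartCharacter schema sketched earlier; the sample data and variable names are illustrative:

```ts
import Ajv from "ajv/dist/jtd";

// timestamp: "date" makes the parser produce Date objects for timestamp fields
const ajv = new Ajv({ timestamp: "date" });

// Type-aligned serializer: Ajv generates specialized code for exactly this shape,
// roughly 10x faster than JSON.stringify when reused.
const serializeCharacter = ajv.compileSerializer<MarioKartCharacter>(characterSchema);
const json: string = serializeCharacter({
  name: "Bowser",
  weight: 255,
  createdAt: new Date(),
  weapons: [{ id: 1, name: "red shell", damage: 9.5 }],
});

// Type-aligned parser: parses and validates in a single pass.
const parseCharacter = ajv.compileParser<MarioKartCharacter>(characterSchema);
const data = parseCharacter(json); // MarioKartCharacter | undefined

if (data === undefined) {
  // Invalid input fails fast; the parser reports what went wrong and where
  console.error(parseCharacter.message, "at position", parseCharacter.position);
} else {
  // Fully typed from here on
  console.log(data.name, data.weapons.length);
}
```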

So, yeah, I think that's it for the code example. This is really cool and it's exciting. So I actually didn't create this one: I created a similar type utility for JSON Schema, and the equivalent type utility for JTD that Jason just presented was created by some exceptionally bright JavaScript engineer, who contributed it to the talk on his own. But that gives you huge powers in TypeScript, because this approach of parsing JSON directly to the application type rather than into some generic data structure, and the opposite of serializing a specific type rather than a generic data structure, is used in almost all languages: Haskell, Go, Rust. A generic JSON data structure is usually only used in scripted languages like JavaScript and Python, but it fundamentally undermines performance and reliability, whereas using type-aligned parsers results in high security and high performance at the same time. So as Jason said, compileSerializer actually generates JavaScript code under the hood, so if it's used repeatedly it gives you roughly a 10 times performance boost compared to JSON.stringify. And the parser, if the data is valid, will parse in pretty much the same time as JSON.parse, but it will validate it as it goes. For example, if somebody sends an array instead of an object, it will fail on the first character.

9. Improved Security and Type Alignment

Short description:

The parser fails on the first invalid character in the JSON string, improving security and performance. It has been used in large-scale applications, including Wastestream. For the Storymaker project, JSON Schemas were generated from types for type alignment. JTD and Ajv make your job easier, secure your API, and save you time compared to alternatives.

If somebody sends a property that's not allowed, or of a wrong type, the parser will not parse the whole data structure, which can be huge. It will simply fail on the first invalid character in the JSON string, which gives you a much better defence against any kind of attack, and you can never end up with properties you don't expect in your data that might result in prototype pollution or other issues.

So this has really, really improved security and performance. And by now I know it's been used in large-scale applications, even at Wastestream, where it's actually currently used in their services to return the results. That was my previous company, where I was leading the engineering team; they were kind enough to allow us to mention it. They use this approach as well, and we introduced it when I was there.

So, you know, one more note I had was that for the Storymaker project, all of the state of the individual stories that you create is managed on the server, and there's a lot of back and forth of big pieces of JSON. We did all the validation with JSON Schema. However, because we wanted it to be type aligned, what we ended up doing was kind of similar in a roundabout way, in that we generated JSON Schemas from our types. And we didn't have any JSON Schemas that were doing anything you couldn't do in the TypeScript type anyway, because we didn't want to have to second-guess things and have an inconsistency there. So yeah, I think it's the right way to go, and it may seem more restrictive in what you can do with it, but I agree with your statement from before: those things you probably shouldn't be doing in a schema anyway. So yes, JTD is fun, and Ajv makes it exceptionally powerful. Use them both. That will make your job easier and your APIs more secure, and you'll save yourself lots of hours lost on debugging compared to the alternatives. So that's our recommendation from this talk. Thank you.

Evgeny Poberezkin
Jason Green
26 min
17 Apr, 2023
