Panel Discussion: Node.js in the Cloud


FAQ

What is Node.js, and why is it popular in serverless architectures?
Node.js is a JavaScript runtime used to build scalable network applications on the backend. It is popular in serverless architectures because it is lightweight and efficient, and its non-blocking I/O model is well suited to the asynchronous operations common in serverless environments, where applications scale dynamically.
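To make the non-blocking model concrete, here is a minimal sketch of an async Node.js Lambda handler; it assumes a Node.js 18+ runtime where fetch is available globally, and the URL is a placeholder rather than a real service.

```js
// Minimal async Lambda handler: the awaits yield to the event loop while the
// HTTP request is in flight, so the runtime is not blocked.
// Assumes Node.js 18+ (global fetch); the URL is a placeholder.
exports.handler = async () => {
  const response = await fetch('https://api.example.com/items');
  const items = await response.json();

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ count: items.length }),
  };
};
```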

What are the advantages of serverless architecture, especially for startups?
Serverless architecture offers scalability: applications scale automatically with increased load without manual intervention. It is cost-effective for startups because costs are based on the resources actually consumed rather than on pre-purchased units of capacity. It can also reduce operational complexity and maintenance requirements.

How do microservices interact in a serverless architecture?
In a serverless architecture, microservices interact through APIs. Each service is deployed as one or more separate functions, and services communicate by making API calls to each other. This modularity makes scaling and maintenance easier, since each microservice can be updated independently without affecting the others.
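As a rough illustration of that service-to-service style, here is a sketch of one function calling another service's HTTP API; the URL, payload shape, and field names are invented for the example.

```js
// One microservice (deployed as its own function) calling another service's API.
// The inventory URL and the payload fields are hypothetical.
exports.handler = async (event) => {
  const order = JSON.parse(event.body);

  const res = await fetch('https://inventory.internal.example.com/check', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ sku: order.sku, quantity: order.quantity }),
  });
  const { available } = await res.json();

  return {
    statusCode: available ? 200 : 409,
    body: JSON.stringify({ accepted: available }),
  };
};
```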

What is AWS Amplify, and how does it help with serverless applications?
AWS Amplify is a development platform for building secure, scalable mobile and web applications. It provides a broad range of services and features, including authentication, APIs, storage, and analytics, all designed to work together seamlessly. Amplify facilitates the deployment of serverless applications by handling backend functions and allowing developers to focus on frontend development.
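As a rough sketch of what that looks like from the front end, assuming the pre-v6 aws-amplify JavaScript API and a project initialized with the Amplify CLI (the aws-exports file is generated by the CLI):

```js
// Sketch of wiring Amplify-provisioned auth into a front-end app.
// Assumes the pre-v6 aws-amplify API; aws-exports is generated by the Amplify CLI.
import { Amplify, Auth } from 'aws-amplify';
import awsconfig from './aws-exports';

Amplify.configure(awsconfig);

export async function signIn(email, password) {
  // Backed by the Cognito resources that Amplify provisioned.
  return Auth.signIn(email, password);
}
```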

Should I choose a monolithic or a microservices architecture in a serverless environment?
Choosing between a monolithic and a microservices architecture in serverless environments depends on the project's requirements. Monolithic applications are simpler to develop and deploy but can become difficult to manage as they grow. Microservices offer greater flexibility and scalability, allowing independent deployment and scaling of different parts of an application, but they can introduce complexity in service integration and management.

Can traditional Node.js frameworks like Express be used in serverless environments?
Yes. Traditional Node.js frameworks like Express can be used in serverless environments, but they may require modifications or additional tooling to handle serverless-specific challenges such as statelessness and cold starts. Developers may need to adapt their applications to optimize performance and cost-efficiency in a serverless context.
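One common approach, sketched here with the third-party serverless-http package, is to wrap an existing Express app so it can run as a Lambda handler; cold starts and statelessness still need separate attention.

```js
// Wrapping an existing Express app for AWS Lambda with serverless-http.
const express = require('express');
const serverless = require('serverless-http');

const app = express();
app.get('/hello', (req, res) => res.json({ message: 'hello from Lambda' }));

// Locally you could still call app.listen(3000); in Lambda you export a handler instead.
module.exports.handler = serverless(app);
```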

How does the role of DevOps change with serverless architectures?
In serverless architectures, the role of DevOps shifts from managing physical servers or virtual machines to handling configuration, deployments, and monitoring of cloud resources. DevOps practices remain crucial for ensuring that serverless applications perform well and scale properly, with a focus on automation, continuous integration, and continuous delivery.
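In practice, much of that shifted DevOps work becomes infrastructure as code plus CI/CD pipelines. A hedged example using the AWS CDK in JavaScript follows; the stack, function, and asset names are illustrative only.

```js
// Infrastructure-as-code sketch: one serverless function defined with the AWS CDK.
// Stack, function, and asset directory names are illustrative.
const cdk = require('aws-cdk-lib');
const lambda = require('aws-cdk-lib/aws-lambda');

class ApiStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    new lambda.Function(this, 'HelloFn', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('lambda'), // directory containing index.js
      memorySize: 256,
      timeout: cdk.Duration.seconds(10),
    });
  }
}

const app = new cdk.App();
new ApiStack(app, 'HelloStack');
```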

Ali Spittel
Eran Hammer
Ruben Casas
Alejandro Oviedo
34 min
24 Jun, 2021

Video Summary and Transcription
An open discussion about Node.js in the cloud, with introductions from Ali, Eran, Ruben, Alejandro, and Slobodan, co-founder of Vacation Tracker and AWS Serverless Hero. The panel covers serverless architecture preferences and benefits, microservices within a serverless environment, cost-effectiveness and scalability for startups and large enterprises, the role of developer comfort and productivity in technology adoption, budgeting for servers versus serverless in growing systems, Node.js for IoT in the cloud, framework dilemmas in serverless development, the shift towards modern development practices, the challenges of transitioning to serverless, low-code frameworks like AWS Amplify, future frameworks and vendor lock-in, integrating Express with serverless, business logic isolation, and the evolving role of DevOps in serverless environments.

1. Introduction and Speaker Introductions

Short description:

Welcome to our open discussion about Node.js in the cloud. We have a great lineup of speakers and experts. Ali is a senior developer advocate at AWS on the Amplify team, specializing in making the cloud easier for front-end and mobile developers. Eran is the founder of Sideway, a streaming service that presents stories within a simulated social network. Ruben is a software developer at American Express, working on an open-source micro-frontends framework. Alejandro is also present.

Hello everyone. Welcome to our open discussion about Node.js in the cloud. I'm very glad to be here. And today we have a great lineup of speakers and experts.

Let me ask them to introduce themselves and say where they're from and what they do. So let's start with Ali. Hey, I'm Ali. I live in Chicago right now, but I'm a little bit of a digital nomad, so I don't stay in one place for too, too long. I am a senior developer advocate at AWS on the Amplify team. So I work mostly on making the cloud a little bit easier for front-end and mobile developers. That's my specialization, I guess.

Great, thank you. Eran, please introduce yourself. Well, good morning. I'm Eran Hammer. I'm based in Los Gatos, California, which is in the Bay Area. I'm the founder of Sideway, which is a new streaming service where stories are presented within a simulated social network, so we kind of create fake sandbox social networks to tell fictional stories. And I've actually spent the last four months directing a show for the first time in 22 years, since I left film school.

Great. Ruben, please go ahead. Hello everybody. So yeah, I'm Ruben. I live in Brighton, UK, near London. And I am a software developer at American Express. I mostly work on an open-source micro-frontends framework, and I will be sharing more about that later. But yeah, that's me.

Great. Alejandro. All right. Hey, everyone.


2. Introduction of Alejandro and Slobodan

Short description:

Alejandro from Buenos Aires, Argentina, has experience working with Node.js and is excited to be part of the discussion. Slobodan, co-founder and CTO of Vacation Tracker, is an AWS serverless hero and has extensive experience with serverless applications.

I'm Alejandro from Buenos Aires, Argentina. I've worked mostly on backend technologies, Node.js. I help with a couple of communities, both local and a few international ones, like NodeSchool. And yeah, excited to be here. Great. And our serverless hero, Slobodan. Hello. I'm Slobodan, and I'm co-founder and CTO of Vacation Tracker, a leave tracking management system. I'm also an AWS Serverless Hero. I've worked a lot with serverless and also wrote a book about serverless applications with Node.js with my friend Aleksandar Simović. Great, great.

3. Node.js and Serverless Architecture

Short description:

Node.js has become a popular choice for startups and enterprises, especially in serverless architecture. When building an app, the choice between a regular server approach and a serverless architecture with functions depends on the desired architecture and level of granularity. Starting with a serverless approach can work well, but fallback options like containers or instances may be needed. For Vacation Tracker, a startup on AWS, the serverless approach has been successful with a combination of services and Lambda functions. The cost is low, less than 1% of monthly revenue. Different problems have different solutions, and the serverless approach can alleviate the workload for developers, making it less daunting.

So let's start, guys. Node.js, as you know, has become a very popular choice in startups and even enterprises, and especially in serverless. When building a new app, what architecture would you like to have, and which way would you choose: the regular server approach with routes, or a serverless architecture using functions? And if you are building an application with functions in serverless, what level of granularity would you prefer and suggest to our audience? Is it a monolithic application, or is it services, microservices, or functions? And should I start at the beginning of the project from a monolith and then move to functions? What do you think about this?

So who will start? Slobodan, let's start with you, please, because I'm looking at your picture. Yeah, sorry, so I can just speak about my perspective and what I'm doing for my startup and also my other company, Cloud Horizon. Basically, our startup, Vacation Tracker, is now 100% serverless. It's on AWS. And our approach is basically to start with serverless, because it works really well for us. And then if we're not able to do something with serverless, if we need to hack something, we just fall back on other things such as containers, or maybe instances if needed. But fortunately, for now, we've managed to build everything with serverless without big problems. And so far, it works really well. We have a lot of Lambda functions in production now. We use a monorepo for our application, but basically we have a few services, and each service has a bunch of different Lambda functions and other resources inside it. For example, the leave management system, then the Slack integration, then the Microsoft Teams integration, and things like these. And they interact with one another through APIs. So they are basically some kind of microservices. For me, a Lambda function is a small function, but I don't look at one Lambda function as an application. It's more like one controller inside, let's say, hapi or Express or something like that. So for us, that works really well. And the cost so far is really low. Right now, our infrastructure costs less than 1% of our monthly revenue. So yeah, that works really well for us. But again, this is a different use case, so maybe it will not be the same for everyone.
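A minimal sketch of that "one Lambda per controller" granularity; the handlers, module names, and routes below are hypothetical and not Vacation Tracker's actual code.

```js
// Hypothetical "leave service": each Lambda is roughly one controller,
// and shared business logic lives in a plain module.
const { requestLeave, listLeaves } = require('./leave-service'); // hypothetical module

// POST /leaves -> one Lambda function
exports.createLeave = async (event) => {
  const input = JSON.parse(event.body);
  const leave = await requestLeave(input);
  return { statusCode: 201, body: JSON.stringify(leave) };
};

// GET /leaves -> another Lambda function
exports.getLeaves = async (event) => {
  const userId = event.queryStringParameters?.userId;
  const leaves = await listLeaves(userId);
  return { statusCode: 200, body: JSON.stringify(leaves) };
};
```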

Nice. Ali, what do you think about this? Yeah, I would agree that different problems have different solutions. So depending on your team, their experience, and what you need to prioritize in your product, that's what you should use. That being said, I really appreciate the serverless approach, because it takes a lot of work off of the developers and makes that process a lot less scary. I have worked for startups most of my career and have definitely dealt with a lot of downtime and a lot of server catastrophes, and having that managed by serverless I think is a positive. So that's the direction I would go in, but there are different solutions to different problems.

4. Serverless and Architecture Decision

Short description:

There's been a lot of improvement on CPU bound tasks moving into the function area, but there's still room for improvement. For CPU bound processes, it's easier to use a server with an instance running. For tasks that don't require high performance or granularity, functions are a good choice. Scalability is a key factor for startups, as serverless eliminates the need to worry about scaling and offers cost benefits. For large enterprises with existing infrastructure, the decision to adopt serverless depends on the benefits and fit within the architecture. The choice to use serverless for specific tasks in a large enterprise can be challenging. Developer economics play a role in choosing the technology and tools that make developers the most productive. Serverless is suitable for CPU-intensive tasks and use cases like image and data processing. However, for tasks like login, there is no trivial solution. While serverless is cheaper at the start, it can become more expensive than regular containers as the number of requests and serverless instances increase.

Great. Alejandro, server or serverless functions? I think there's been a lot of improvement on CPU-bound tasks moving into the function area, but I think there's still a lot of room for improvement there. For CPU-bound processes, I would still think it's easier to get by with a server with an instance running than with a function. And for things that maybe don't need that level of performance or granularity, where you don't have to tune the instance size, the CPU, or the number of threads, I would go with functions.

Great. Thank you. Ruben? Yeah, so coming back to the use case, the first question I would have would be scalability. If you want to start small and then go big, then serverless is very useful, because you don't have to worry about scaling things. You don't have to worry about how you are going to manage the load from the start. Apart from that, there are obviously the implications on cost: because you're only paying for what you're using, that's very attractive for startups, since they don't have to worry about scalability. But now think about large enterprises and the question, shall we adopt a serverless architecture when we already have an infrastructure in place? I think the question there would be, okay, what are the benefits? And if we already have something established, where does serverless fit into that architecture? And also, if we are going to use serverless for a specific task, as we were discussing earlier, if we carry on with our infrastructure but want to try some serverless functions for CPU-intensive tasks, or for things that receive a lot of requests or will have a lot of load, that would be the decision we have to make in terms of architecture: deciding where it fits in a large enterprise. Because not everyone has the opportunity to just design from scratch, like in the startup world, where you have everything at your disposal and can just choose the technology. Sometimes when you have something already established, it's harder to make that decision, to adopt something different. But I guess there are several areas that are attractive for large enterprises because of the load as well. So, yeah.

Agree. Thank you. Eran? I think Ali made an excellent point when she basically said you have to start with what you're comfortable with. To me, developer economics are always going to be a key part of decision making. Usually, when I hire people and they ask, oh, should I use this technology or this framework to start building this new service, I ask, well, how experienced are you with whatever you're suggesting? And it's like, well, I've been doing five years of work with this framework, but I've been hearing that this is the future, so I should move there. And I would say, well, start with whatever will make you the most productive. You personally, as a developer, should start with the technology and the architecture and the tools that you feel the most comfortable with. And then as you start building your service, you don't have to wait five years to do this; after a few months, when you know what you're doing and you have a proof of concept that's up and running, then you can start asking yourself the bigger architectural questions. And it's very challenging. There are use cases for serverless that are obvious: image processing, data processing, all these things that Node is not great at when it comes to the event loop, because if anything is CPU bound and takes a long time, then your server is basically doing one thing. And that's where serverless really shines. So if you have to resize images for previews, that's kind of a no-brainer. Anybody starting today will go and reach for that tool, regardless of what they're working in. But then if you're just doing, okay, you log in, should your login be a server or should it be a serverless function? Now there's no trivial solution for that. As has been said before, yes, serverless will be way cheaper for you when you start. But it will probably be a lot more expensive than regular containers as you scale up, because it's unbounded: the more requests you're getting, the more serverless instances are potentially running.

5. Budgeting and Scalability

Short description:

Yes, you can set boundaries and decide on the exact cost of adding servers when running containers. However, budgeting for serverless functions is different, with variable costs depending on the application and user load. For systems with limited budgets, predictable solutions may be preferred, but for SaaS and enterprise applications, scaling infrastructure with more users and revenue is feasible. The cost of bugs and scaling other components should also be considered. While serverless can be cost-effective, enterprise applications may require additional expenses for server maintenance. Controlling the number of functions is possible, but the decision depends on various factors.

Yes, you can set boundaries and stuff like that. But if you're running containers, you say like, look, I can only afford to run five containers, you know, this size, I want to pay this amount of money every month. And yeah, my service quality might degrade over time if I have more load. But at least, you know, I can decide, okay, I'm going to add one more server, I know the exact cost of that server. So it's a lot easier for budgeting than other, you know, more elastic infrastructure components.

Can I jump in just quickly here? So yeah, I don't agree that it's easier to budget; it's different. Yes, it's easier to know that one server costs, let's say, $30 or something like that. But with Lambda functions, you have a completely different pricing model. I personally have zero idea how many servers I need for something. You need to load test your application, you need to see how many users you can serve with one server. And with Lambda functions, it's different. You're basically moving from fixed costs to variable costs. And it really depends on what you're building. That's, I think, the key of everything. If you're building a system where you have a limited budget, then of course it's better to go with something that you can predict easily. But if you're building a SaaS or something like that, or even a big enterprise product where you earn money from your users, it's okay to scale your infrastructure with more users and, of course, more money coming in. And it's really hard to do that in the beginning, but I don't think that functions are the most important part of serverless for us. It's really important that we don't have to think about the scalability of some other components. For example, I know the cost of my bugs. We had a bug that produced 250 million writes in our database in one month, and that cost $300. It's something we could have prevented, but we have alarms on billing and everything else. And everything scaled, my service didn't stop working, and the cost itself is not that big. For us, production costs around $250 a month or something like that, so it's really cheap. It's different for enterprise applications, of course. But again, if you pay for infrastructure servers, you also need to pay the people who maintain those servers. So the cost is not that predictable; it's not just the cost of the server itself. Someone needs to do something about that server. Yeah. And you can control the number of functions, right? Yeah, but it still depends.
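For a rough sense of that variable-cost math, here is a back-of-the-envelope check of the 250-million-writes anecdote; it assumes DynamoDB on-demand pricing of roughly $1.25 per million write request units, which is an assumption about the database and region, not something stated in the discussion.

```js
// Back-of-the-envelope check of the "250 million writes cost ~$300" anecdote,
// assuming on-demand pricing of about $1.25 per million write request units.
const writes = 250_000_000;
const pricePerMillionWrites = 1.25; // USD, assumed rate
const cost = (writes / 1_000_000) * pricePerMillionWrites;
console.log(`~$${cost.toFixed(2)}`); // ~$312.50, in the same ballpark as the $300 mentioned
```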


6. Node.js and IoT in the Cloud

Short description:

If you have a limited number of functions, then some of your users will get an error. With servers, it's different: you can take a higher load, but you'll still get a response. So that point is really good. We have a question from our audience about how Node.js in the cloud helps with IoT. It depends on what you want to do with IoT. Node.js can help process data, and it can be used with WebSockets in both serverless and traditional Node.js servers. The choice between existing frameworks and a pure Node.js server depends on the specific requirements and whether the out-of-the-box SDKs provided by cloud providers are sufficient. Eran highlights the challenges faced by traditional frameworks in the serverless world, where long load times and assumptions about long-running architecture are not suitable. Caching and efficient data processing are examples of functionality that may not work well in a serverless environment.

If you have a limited number of functions, then some of your users will get an error. With servers, it's different. You can take a higher load, but you'll still get a response. So that point is really good.

Of course. I agree. And also, we have a question from our audience. It's exactly about the use case. The question is, how does Node.js in the cloud help with IoT? Well, it depends on what you want to do with IoT. You need to process some data, and as long as you want to process some data, of course, Node.js can help you with that. And most of Node.js is in the cloud anyway, right? Even if you don't run a serverless function, you probably host it on some kind of server that you don't have in your living room. So in the end, that's kind of a cloud anyway. But yeah, I'm not really experienced with IoT. You can use WebSockets with serverless, and you can use WebSockets with a traditional Node.js server, so I don't think there's a big difference. It just depends on what you want to do with your data.

Agreed. And we can also use an IoT hub, let's say in AWS, and process the data via Node.js in a stream. Would anyone like to add something to this question? Okay, let's go to the next question.
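A hedged sketch of the "process the data in a stream" idea, assuming device messages are routed into a Kinesis stream that triggers a Node.js function; the record envelope follows the standard Kinesis event shape, but the fields inside the payload are made up.

```js
// Lambda triggered by a Kinesis stream carrying hypothetical IoT telemetry.
// Kinesis delivers record data base64-encoded, so each record is decoded first.
exports.handler = async (event) => {
  for (const record of event.Records) {
    const payload = Buffer.from(record.kinesis.data, 'base64').toString('utf8');
    const reading = JSON.parse(payload); // e.g. { deviceId, temperature } -- made-up fields

    if (reading.temperature > 80) {
      console.log(`High temperature on ${reading.deviceId}: ${reading.temperature}`);
      // here you might write to a database or publish an alert
    }
  }
};
```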

So we've decided what we will choose, server or serverless, and now we're considering the frameworks, guys. Do you think existing frameworks like hapi, Express, and Fastify can help us in serverless? Or do you prefer a pure Node.js setup without any framework, using the Serverless Framework and the SDKs from popular cloud providers? Is what the SDKs offer out of the box enough, or do you feel there is a gap and you have to structure your code manually? What do you think about this? Let's start with Eran.

Oh, yeah. For me, that was the million-dollar question over the last year and a half. And I reached two conclusions. One is that, yes, we're definitely lacking in tools, and the frameworks we have today are struggling to make the transition to this new world. Me personally, I created hapi and maintained it for almost a decade, and last year I basically decided that the framework is kind of done, and the future is going to be something new. The biggest challenge for traditional frameworks is that they all assume a long-running architecture, which means they do a lot of work upfront, so load time is long. But once they're up, warmed up, and loaded, they provide a lot of functionality that is extremely efficient and productive for developers. For example, they have all kinds of built-in caching. Let's say you have an application that needs to access country data. Country data is stable; we don't have new countries added dynamically on the fly. So it's one of those things you can load into memory: you can process it, cache it, index it, and then you don't have to go to the database every time you have an API call that deals with country data. That makes it very efficient for a regular container architecture, but you can't use this approach for serverless, because every time you bring up one instance, you have to go and load this entire table into memory and process it.
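A sketch of the pattern Eran describes, and of the usual serverless workaround: load the reference data once per function instance at module scope so warm invocations reuse it, and only cold starts pay the price. The loadCountries helper and the route shape are hypothetical.

```js
// Country data cached at module scope: loaded once per function instance
// (on cold start) and reused by every warm invocation afterwards.
// loadCountries() and its data source are hypothetical.
const { loadCountries } = require('./countries-repository');

let countriesPromise; // shared across invocations of the same instance

function getCountries() {
  if (!countriesPromise) {
    countriesPromise = loadCountries(); // hits the database only on cold start
  }
  return countriesPromise;
}

exports.handler = async (event) => {
  const countries = await getCountries();
  const country = countries.find((c) => c.code === event.pathParameters.code);

  return country
    ? { statusCode: 200, body: JSON.stringify(country) }
    : { statusCode: 404, body: JSON.stringify({ message: 'Unknown country code' }) };
};
```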


7. Challenges with Deployment and Performance

Short description:

Developers today are forced to choose between deploying their code as a function or using a traditional framework like hapi or Express. This limits creativity, flexibility, and testing in an integrated environment. We need a new approach that allows us to write middleware and routes in a more abstract way, adapting them to different deployment options. Traditional frameworks that don't support serverless deployment will struggle to survive. Moving to a serverless architecture also requires considering the impact on performance, especially for applications that relied on long-running processes and in-memory state. Cold start is a significant challenge.

And now you're spending, even if it's only 200 milliseconds: first, all your requests are 200 milliseconds longer, but you're also now paying for 200 milliseconds of cloud function time on every request as overhead. So this is the area where I feel that developers today are forced to choose. If you're saying, okay, I need to build a new endpoint, I need to decide upfront where I'm going to deploy it. You can write the abstraction, you can write a function that will take an image, resize it, save it to S3, and then send it back, or something like that. But as soon as you think about deploying it, you have to decide where it's going to go. Is this going to be an Express or hapi or Fastify route, or is it going to be a function? And I don't think that's a good thing to have right now. I think it limits developers' creativity, flexibility, and the ability to really test things in an integrated environment. What I think we're missing right now is a new approach, basically what I call containers for functions. If we can write our middleware and our routes in a more abstract way, as some kind of lightweight standard, and then be able to adapt that to either a framework deployment or a regular serverless deployment, I think that would be a really great thing. And that's kind of what I've been playing with for a year and a half. The problem is that with all open source projects, at some point you kind of run out of funding, and that's when I decided it was time to hand over hapi to a new generation of developers and go play with other things on the side. But I think that's going to be the big thing: to stop thinking about development as either/or and start thinking about how we can be a little more fluid and flexible with how we're developing. So I think that's going to be the biggest challenge. And any traditional framework like hapi or Express that doesn't find ways to allow its business logic to be easily extracted from routes and deployed as serverless is going to die. They're just not going to get any adoption. Whoever is using them today will continue using them, because they're comfortable and productive in that environment. But anyone picking a new framework today is going to look at the landscape and say, I don't know, should I start with hapi or Express or Fastify today if I'm starting from scratch? It's going to basically lock me into an old architecture that's not going to be easy to scale or move around. And then you're going to end up with a probably less mature environment, because it doesn't come with the 10 years of optimizations that the other frameworks come with. So there's definitely going to be a shift here, and it might be generational: developers like me who are older and crustier and have been doing things the same way for 25 years, and then new people in the industry who are coming in, embracing all the new technology, and building the new vision. That's a very interesting idea.
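A rough sketch of the direction Eran describes: the handler is written once against a tiny abstract request shape, and thin adapters map it onto either an Express route or a Lambda event. All names here are hypothetical; this is not an existing "containers for functions" library.

```js
// Framework-agnostic handler plus two thin adapters (a sketch of the idea).

// The handler only knows about a plain { params, body } request shape.
async function resizeImageHandler(request) {
  const width = Number(request.params.width);
  // ...resize request.body to `width`, upload the result, etc. (omitted)
  return { status: 200, body: { resizedTo: width } };
}

// Adapter 1: expose the same handler as an Express route.
function asExpressRoute(handler) {
  return async (req, res) => {
    const result = await handler({ params: req.params, body: req.body });
    res.status(result.status).json(result.body);
  };
}

// Adapter 2: expose the same handler as a Lambda function behind API Gateway.
function asLambdaHandler(handler) {
  return async (event) => {
    const result = await handler({
      params: event.pathParameters || {},
      body: event.body ? JSON.parse(event.body) : null,
    });
    return { statusCode: result.status, body: JSON.stringify(result.body) };
  };
}

// app.post('/images/:width', asExpressRoute(resizeImageHandler));
// exports.handler = asLambdaHandler(resizeImageHandler);
```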

Ruben, what do you think about this? Yeah, so I was going to say, I've been there actually, and vendor lock-in is an issue. I'm pretty sure a lot of people are thinking today, shall we move away from Express, shall we try Fastify? But then there is another case: okay, what if you just do serverless functions? And what Eran mentioned would be really awesome: if we had a way to abstract all the business logic and make sure we can deploy either serverless or traditional without the vendor lock-in, without having to worry about all the internals that come with a framework. So I guess, yeah, I've been there. And also with the long-running process, that is another thing I have experienced before, where we have a long-running server and we have everything in memory. And that's how it works right now. And it's really fast, because everything is in memory and we can serve requests really, really fast. But as we started considering the option of moving towards a serverless architecture, we were like, okay, this is not going to work anymore. We don't have things in memory, there is no long-running process, so we need to assess the impact on performance, because we're used to really fast responses from the server, since we have all the setup and configuration and all the pieces that we need to respond already in memory. So I experienced that as well. And that's something people have to think about if they want to move towards a serverless architecture. I think cold start is probably the curse word here.


8. Extracting Business Logic into Functions

Short description:

I agree with Eran and Ruben's point about the importance of making it easy to extract business logic and put it into a function. It's closely related to ergonomics and developer experience. Authors and maintainers of projects should consider how to integrate this transition. It's a long-term process, but worth exploring.

So that's what I think. Alejandro, what do you think about this? Yeah, I like what Eran and Ruben were mentioning, that whoever does not make it easy to extract business logic and put it into a function will either die or lose adoption. And I think it's closely related to ergonomics and developer experience. I agree with that view, and I think it is something to have in mind if you are an author or maintainer of one of those projects that are out there: how to integrate that. I know it's a long-term transition, it's not something that will happen overnight, but it's something to look at, and to get better at transitioning your existing logic into functions. Yeah.


9. Rise of Low-Code Frameworks

Short description:

Low-code has the most potential when it includes developers as part of the solution. AWS Amplify's admin UI is compatible with the command line interface and generates accessible code. It makes low-code solutions more accessible to developers.

Great, I agree. We have a question about AWS Amplify. Ali, so I guess it goes to you. What do you think about the rise of low-code frameworks like AWS Amplify? I like that question. First off, I think personally that low-code has the most potential when it thinks about developers as part of the solution instead of in opposition to it, right? A lot of the original low-code solutions were just click and drag and then you have an app that's up, and I think that's really more no-code. Something I'm really excited about with low-code is keeping developers in it: making it so that anybody can be a developer, but still thinking about the developer. We launched the Admin UI for Amplify at re:Invent, and something I really like about it is that it's still compatible with the Amplify command line interface, so it really meets developers where they're at as well. And all the code is actually generated, so it's not a black box or anything like that; you can access all of it. So I'm excited about low-code solutions that still think about developers in the process and just make it more accessible to be a developer.

Future Frameworks and Vendor Lock-in

Short description:

Discussing Future Frameworks in Serverless Architecture and Vendor Lock-in Challenges.

Good. Great. We have a lot of questions about serverless and even questions directly to Slabadan. I guess we can proceed with all of them after this discussion because we will have a topic of discussion about the serverless. But let me ask Slabadan about the framework and its future in serverless.

Slobodan, what do you think about this? Might we have some framework in the future that combines some popular services? I really agree with Eran and everything he said. I completely agree with that. I'm a big fan of the hapi framework and I used it a lot at some point; that was definitely my favorite Node.js framework. Right now we don't have anything similar in serverless. We have some new approaches, and I like them. Of course, people are trying to copy some good concepts from traditional frameworks over to serverless, with middlewares and things like that, which is really good.
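Slobodan doesn't name a specific library here, but Middy is one example of that middleware style for AWS Lambda. A minimal sketch, assuming the @middy/core and @middy/http-json-body-parser packages; the handler itself is made up for illustration.

```js
// handler.js — minimal sketch of Express-style middleware around a Lambda handler
const middy = require('@middy/core');
const jsonBodyParser = require('@middy/http-json-body-parser');

// Plain business handler: by the time it runs, the JSON body is already parsed.
const createVacation = async (event) => {
  const { userId, from, to } = event.body;
  return { statusCode: 201, body: JSON.stringify({ userId, from, to }) };
};

// Middlewares wrap the handler much like they would wrap an Express route.
module.exports.handler = middy(createVacation).use(jsonBodyParser());
```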

But on the other hand, we are definitely missing some bigger frameworks; they will come over time, for sure. It's still new. People have only been using serverless for the last few years, so they've just started building big applications with it. I talk a lot about vendor lock-in and things like that, and we can't expect Amazon or Microsoft or any big vendor to solve that problem for us, because, of course, solving vendor lock-in is not on their feature list. What you can do, though, is pick an architecture that works for you.

Express Integration and DevOps Evolution

Short description:

It would be great to drop an Express handler straight into a serverless function; today the practical approach is to isolate the business logic behind adapters, for example with a hexagonal architecture, so the same code can be triggered from a Lambda function or an Express route. DevOps is still needed: a small team may manage without a dedicated role, but at scale someone has to own the infrastructure, environments, and deployments.

For example, it would be great if we could just take part of our, let's say, Express handler and put it into a serverless function. But we can use patterns like hexagonal architecture and basically build that with our own code. So we'll have one input from a Lambda function and another input, let's say, from an Express.js route, and then we'll have a small adapter that translates each request into something our business logic understands, because the business logic should be isolated from everything else. We're using that on Vacation Tracker, and we've migrated a lot of things. We are using GraphQL now; we even started with an Express server and then continued with a lot of Lambda functions. But basically, our business logic doesn't know about the database or about the trigger itself; it only knows about the business logic, and then we have adapters for all the different things. It's a bit more complex, and I'm looking forward to frameworks that will simplify that in the future.
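A minimal sketch of that adapter idea, assuming an API Gateway–style Lambda event on one side and Express on the other; the function and dependency names (requestVacation, makeDeps) are illustrative and not Vacation Tracker's actual code.

```js
// business-logic.js — the domain function knows nothing about Lambda, Express, or the database
async function requestVacation({ userId, from, to }, deps) {
  const remaining = await deps.getRemainingDays(userId);
  if (remaining <= 0) throw new Error('No vacation days left');
  return deps.saveRequest({ userId, from, to });
}

// Stubbed dependencies; a real app would plug in a database adapter here.
function makeDeps() {
  return {
    getRemainingDays: async () => 10,
    saveRequest: async (request) => ({ id: 'req-1', ...request }),
  };
}

// lambda-adapter.js — translates an API Gateway event into the domain call
exports.handler = async (event) => {
  try {
    const result = await requestVacation(JSON.parse(event.body), makeDeps());
    return { statusCode: 201, body: JSON.stringify(result) };
  } catch (err) {
    return { statusCode: 400, body: JSON.stringify({ message: err.message }) };
  }
};

// express-adapter.js — the same business logic behind an Express route
const express = require('express');
const app = express();
app.use(express.json());
app.post('/vacations', async (req, res) => {
  try {
    res.status(201).json(await requestVacation(req.body, makeDeps()));
  } catch (err) {
    res.status(400).json({ message: err.message });
  }
});
app.listen(3000);
```

Only the thin adapters know about the trigger; swapping API Gateway for another event source, or Express for a different HTTP framework, means writing another small adapter rather than touching the domain code.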

That's good. We have time for only one question, so let me take the last question from the chat. We have a question about DevOps: if you're developing serverless applications with Node.js, do we still need DevOps? Another question on this topic: how do you manage a large number of functions, and how do you deploy them? Would you prefer a monorepo, or deploying everything separately? And, again, do we need DevOps in this case? I'll answer really quickly about our case. We don't have anyone doing DevOps in our team, but our team is really small. On a bigger scale, of course, you will need people who take care of the infrastructure, not so much a single Lambda function, but there are a lot of things happening in the application, and someone needs to understand all of it and make sure everything really works and scales properly. So yeah, you need some help. And yes, we are using a monorepo, but there are a lot of other ways to do it, and you can easily deploy all the functions with one command.
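Slobodan doesn't specify the tooling here, but one common way to get "all the functions with one command" from a monorepo is an infrastructure-as-code tool such as AWS CDK (the Serverless Framework, SAM, or Terraform work similarly). A minimal sketch with made-up service names, not Vacation Tracker's actual setup:

```js
// infra/app.js — one CDK stack declares every function; a single deploy ships them all
const { App, Stack } = require('aws-cdk-lib');
const lambda = require('aws-cdk-lib/aws-lambda');

class FunctionsStack extends Stack {
  constructor(scope, id, props) {
    super(scope, id, props);
    // Each entry points at a package in the monorepo, e.g. services/vacations/index.js
    for (const name of ['vacations', 'notifications', 'reports']) {
      new lambda.Function(this, `${name}-fn`, {
        runtime: lambda.Runtime.NODEJS_18_X,
        handler: 'index.handler',
        code: lambda.Code.fromAsset(`services/${name}`),
      });
    }
  }
}

new FunctionsStack(new App(), 'AllFunctions');
```

Running `cdk deploy` (or the equivalent serverless or sam command) then creates or updates every function declared in the stack at once.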

Ruben? Yeah, we definitely need DevOps. There's a use case that I like, where you have all your environments set up from the ground up every day, like testing environments. You definitely need someone who can help you get all those testing environments set up and running, shut them down, and make sure they work. So I think there is, again, depending on the scale, an opportunity to have a team dedicated to this. I don't think serverless will completely get rid of DevOps; I think it will enhance it, because people who become specialists will take care of all these configurations and things, like the talk we had earlier about Terraform and making sure everything works. I mean, someone probably needs to be in charge of that. Ali? Yeah, I'll just plus-one what everybody else said: DevOps is still important, but that role definitely shifts, and I think it probably has over time too. What about you, Alejandro? Yeah, exactly the same. Having worked with Kubernetes environments and those types of workloads, I think the DevOps role will totally transition. If you are the size of a huge enterprise, you will need DevOps to run that infrastructure. Whatever you are running, either servers or functions in the cloud, you will need people to help you build that. Yeah.

DevOps Relevance in Serverless Environments

Short description:

Eran on the persistence of DevOps: serverless removes the need to configure containers or run Kubernetes, but someone still has to watch the logs, make sure the app is up and performing well, and feed that back to the developers.

Yeah. Eran, I guess we are out of time, so thank you so much. Okay, Eran, please. DevOps will never go away. All you're doing is moving complexity from one place to another. Before, DevOps focused more on running a container, running a server, or running bare metal; now they're shifting to focus more on configurations and on higher-level interfaces.

But forget about everything else: somebody has to make sure the application is running well, that there's someone looking at the logs, that you have logs, that you know what's happening. Developers are not going to do that on a daily basis, because they're busy running code. So yes, serverless might take away the need to configure a container, or mean less work with Kubernetes, but it's not going to take away the normal work of someone watching what's happening and making sure that the app is still up, that it's serving users, and that it's performing well.
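As a small, hedged illustration of "having logs" in a Lambda world (not something the panel discussed in detail): anything written to stdout in a Node.js Lambda handler ends up in CloudWatch Logs, so emitting one structured JSON line per invocation gives whoever owns operations something they can query and alarm on.

```js
// handler.js — minimal structured-logging sketch for a Node.js Lambda function
exports.handler = async (event, context) => {
  const started = Date.now();
  try {
    const result = await doWork(event); // doWork is a placeholder for real logic
    return { statusCode: 200, body: JSON.stringify(result) };
  } finally {
    // One JSON line per invocation; stdout goes to CloudWatch Logs automatically.
    console.log(JSON.stringify({
      requestId: context.awsRequestId,
      durationMs: Date.now() - started,
      route: event.rawPath || event.path,
    }));
  }
};

async function doWork() {
  return { ok: true };
}
```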

And also giving feedback to the developers, telling them, hey, look, your app is too slow; users are not getting the performance they're paying for, and you should do something about it. Yeah. Thank you so much, and thanks for your questions. And thank you to our experts and speakers.
