Supercharge Your Debugging With the New Features in Chrome DevTools

Chrome DevTools just got a turbo boost! The Chrome DevTools team has been releasing new features at a furious rate over the past year, including a complete Performance panel overhaul with real-time Live Metrics and AI-powered Insights that'll expose bottlenecks you never knew existed.

Forget tedious documentation searches and forum hunts; now, AI assists you directly within the Elements, Performance, Network, and Sources panels, keeping you in the flow. Use the new alignment and collaboration tools to easily share your observations and debugging wizardry with your team and stakeholders.

We'll unleash the full potential of these game-changing features, revealing hidden gems and shortcuts that'll transform your debugging workflow from a slog to a supercharged sprint. Get ready to code faster, debug smarter, and build experiences that fly.

This talk has been presented at JSNation 2025; check out the latest edition of this JavaScript conference.

FAQ

What is the live metrics screen in Chrome DevTools?

The live metrics screen is a feature in the Chrome DevTools performance panel that provides real-time performance data as you interact with a webpage. It is centered around the three Core Web Vitals: LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), and INP (Interaction to Next Paint).

What does AI assistance in Chrome DevTools do?

AI assistance in Chrome DevTools provides insights and suggestions for improving webpage performance. It uses AI to interpret performance data, suggest fixes, and even generate code patches. The AI features are available from Chrome version 137 and require user opt-in.

What are Core Web Vitals?

Core Web Vitals are a set of metrics introduced by Google to measure and summarize webpage performance. They include LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), and INP (Interaction to Next Paint), and are used to assess the loading, stability, and responsiveness of web pages.

How can developers compare local metrics with real user data?

Developers can use the performance panel in Chrome DevTools to pull in real user data from the Chrome User Experience Report (CrUX) and compare it with local metrics. This allows developers to see how their performance metrics align with what real users experience.

Can AI in Chrome DevTools generate code fixes?

Yes, AI in Chrome DevTools can suggest and generate code patches to fix issues. These changes are temporary unless saved to a local workspace, which can be done if the project is connected to DevTools.

How does the AI assistance panel help with performance issues?

The AI assistance panel helps developers understand and resolve performance issues by providing detailed explanations and suggested actions. It offers context-sensitive help and can interact with the performance trace to provide deeper insights.

How has the performance panel been improved?

The performance panel has been reimagined to make it easier to find and fix performance issues. Improvements include the introduction of the live metrics screen, which shows real-time performance data, and the ability to compare local metrics with real user data from the Chrome User Experience Report (CrUX).

What is the Insights sidebar?

The Insights sidebar combines performance insights from Lighthouse with the detailed data from the performance trace. It prioritizes insights for developers to review and allows for interactive exploration of performance data.

How do developers enable AI features in Chrome DevTools?

To use AI features in Chrome DevTools, developers need to be logged in and explicitly enable the features in the settings. These AI features are designed to assist in debugging and understanding code, and feedback from users is encouraged.

Barry Pollard
Ewa Gasperowicz
29 min
12 Jun, 2025

Video Summary and Transcription
Ewa and Barry introduce the complexity of web development and emphasize the evolving nature of DevTools, discussing productivity gains through new features. The talk covers performance debugging enhancements and user-friendly changes in the performance panel. The live metrics screen offers real-time insights, integrating real user data for performance comparison. Configuring DevTools for accurate user emulation and leveraging performance trace capabilities for debugging. Enhancing user experience with AI insights and visual assistance, and setting up an AI workspace in DevTools. Addressing data privacy concerns and AI usage control. Chrome DevTools API and Gemini model enhancements, AI features specific to Chrome, web socket throttling, and AI agent probing in performance context.

1. Introduction to DevTools Talk

Short description:

Ewa and Barry introduce the complexity of web development, emphasizing the evolving nature of DevTools. They highlight the challenges developers face in adopting new features and the importance of exploring productivity gains through fresh perspectives. The talk covers performance debugging enhancements and user-friendly changes in the performance panel to address usability concerns.

Hello, everyone. I'm Ewa. And I'm Barry. Just in case it wasn't clear who's who. Thank you so much for coming back after the coffee break. I know it's a bit late in the day, but I hope you still have a little bit of energy left for all the DevTools magic we're going to show you today with Barry. Let's get started.

We've seen a lot of interesting talks today. One of the universal conclusions we can draw is that web development is becoming increasingly complex, especially with the rise of AI tools and processes. The developer tooling landscape is developing quickly, and Chrome DevTools are no different. Developers tend to stick to what they know, even missing out on productivity gains. A conference like this is a great opportunity for a fresh perspective on new features.

Today, we'll discuss various improvements in DevTools, particularly in performance debugging and AI insights. The performance panel improvements aim to make it easier to find and fix issues without overwhelming new users. Changes have been made to tackle the intimidation issue while still catering to power users. The focus is on a gentler introduction to performance analysis, ensuring a smoother experience for all users.

2. Exploring Chrome DevTools Updates

Short description:

The talk discusses inspiring users to explore new Chrome DevTools features for productivity gains. It focuses on performance debugging enhancements and AI insights, emphasizing user-friendly changes in the performance panel.

Even when they're there in the UI, straight in front of our eyes, our trained blindness teaches us how to simply go around them. And a conference like this one is a great opportunity for us all to take a step back, take a fresh perspective, and soak in all the newness that was introduced.

In this talk, we will try to inspire you to try out all the new features that were added to Chrome DevTools recently and to get a productivity boost as a result. Today, we're going to be talking to you a little bit about a lot of the various improvements we've been making to DevTools, primarily in the performance debugging space and also some exciting AI insights later on. Because let's be honest, what's a conference talk today if it didn't involve AI?

So let's start with some updates that are near and dear to my own heart, the performance panel improvements. Over the last couple of years, we've completely reimagined the performance panel to make it much easier to find and fix performance issues. We've tackled the problem with a number of changes in the panel that we believe solve that intimidation issue, and we're going to talk through them today. It all starts when you first open the panel.

3. Enhancing Performance Analysis with Live Metrics

Short description:

A performance trace is essential for in-depth issue investigation while being overwhelming for new users. The introduction of the live metrics screen provides real-time performance insights based on core web vitals, enhancing user understanding and identification of performance bottlenecks.

A performance trace includes an awful lot of data, and it's really great for being able to dive into an issue, but it's really overwhelming, particularly for new users, and it takes a long time to really understand how to use it properly. Well, we wanted to fix that, but at the same time not ruin the experience for all those hardcore developers and power users that do depend on this complexity.

This is what you used to see, a big old empty space with not much information on it at all, and just telling you to do a full performance trace. We wanted a gentler introduction to performance analysis, something that didn't require taking a trace immediately, even before you know what you're looking at. We're calling it the live metrics screen, and it's the first improvement we're going to talk about today. Now, when you open the performance panel, this is what you see. It will immediately show you some details about the performance of the page that you're visiting. And it really is live. As you click, type, and interact with the page, and scroll down, and even click on other pages, the panel is continually updating with your performance metrics. This helps you identify exactly what performance bottlenecks you have, whether it's on page load or through particular interactions.

The live metrics screen is centered around the three Core Web Vitals. These are three metrics that Google introduced five years ago to give a quick summary of how your page is performing. We have LCP, or Largest Contentful Paint, which is the loading metric, and shows how long it took from your click until the largest bit of content was shown on the page. CLS, or Cumulative Layout Shift, says how much the content moved and jumped around, which is really quite annoying. Whenever you load a page, and you start reading, and an ad pops in, and you lose your place, and you're like, what's going on? And finally, INP, or Interaction to Next Paint, which says how responsive your website is to interactions. When you click on something, do you immediately get some sort of visual feedback, or does it take a while and you're not even sure if the click's been registered?
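To make the layout-shift metric concrete, here is a hedged sketch (not from the talk) of the session-window logic behind CLS: shifts are grouped into windows that close after a 1-second gap between shifts or a 5-second maximum window duration, and the reported CLS is the worst window. The function and field names here are illustrative, not a real DevTools API.

```javascript
// Sketch of CLS's session-window grouping, assuming `shifts` is a list of
// layout-shift entries ({ startTime in ms, value }) with no recent user input.
function cumulativeLayoutShift(shifts) {
  let worst = 0;        // worst session window seen so far
  let windowStart = 0;  // start time of the current window
  let windowEnd = 0;    // time of the last shift in the current window
  let windowSum = 0;    // accumulated shift value in the current window

  for (const { startTime, value } of shifts) {
    const continuesWindow =
      windowSum > 0 &&
      startTime - windowEnd <= 1000 &&  // gap of at most 1s between shifts
      startTime - windowStart <= 5000;  // window capped at 5s total
    if (continuesWindow) {
      windowSum += value;
    } else {
      windowStart = startTime;          // open a new session window
      windowSum = value;
    }
    windowEnd = startTime;
    worst = Math.max(worst, windowSum);
  }
  return worst;
}
```

For example, two shifts half a second apart accumulate into one window, while shifts more than a second apart are scored as separate windows and only the worst one counts.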

4. Integrating Real User Data in DevTools

Short description:

Developers need to consider real user experiences by utilizing field data from CrUX in DevTools for performance comparison.

It's all green. Well, Barry, it's all green and nice, because maybe you're developing it on your fancy laptop and on a high-speed internet. Are you sure that our users are experiencing the same on their real devices? We've heard today from Alex how it looks in real life, right? We know that there is a disconnect between developers, what they see on their powerful laptops during the development process, and what users see on their devices.

Actually, a huge part of the Core Web Vitals initiative was to convince site owners of the importance of using field data, which is measured by real users on their real devices. Now, we've made it really, really easy to see this real user data directly in DevTools in the performance panel. You can use the button on the top right to pull that field data in. The data comes from CrUX, the Chrome User Experience Report. It's a source of anonymous real user experiences collected by Chrome.

Once you set it up, the field data is pulled from CrUX, and DevTools will automatically display the field metrics next to your local metrics for the URL you're currently viewing. If you develop on localhost and there is no real URL to speak of, you can map your localhost to your production URL to still get those matching numbers next to each other. This allows us to compare the performance metrics we're seeing to what our real users are seeing. Oh, and as you can see, maybe loading's a little bit slower for the real users who aren't on fancy high-end laptops.
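The same CrUX dataset that DevTools pulls in is also available through the public CrUX API, so you can fetch it from a script. This is a hedged sketch: the endpoint and metric names follow the CrUX API documentation, but double-check them there, and the API key is a placeholder you would create in Google Cloud.

```javascript
// Querying the Chrome UX Report (CrUX) API for the field data DevTools shows.
const CRUX_ENDPOINT =
  'https://chromeuxreport.googleapis.com/v1/records:queryRecord';

// Build the query body: page-level data via `url` (origin-level uses `origin`).
function buildCruxQuery(url, formFactor = 'PHONE') {
  return {
    url,
    formFactor, // 'PHONE' | 'DESKTOP' | 'TABLET'
    metrics: [
      'largest_contentful_paint',
      'cumulative_layout_shift',
      'interaction_to_next_paint',
    ],
  };
}

async function fetchFieldData(url, apiKey) {
  const res = await fetch(`${CRUX_ENDPOINT}?key=${apiKey}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildCruxQuery(url)),
  });
  if (!res.ok) throw new Error(`CrUX query failed: ${res.status}`);
  return res.json(); // per-metric histograms and p75 values
}
```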

5. Configuring DevTools for User Emulation

Short description:

Configuring DevTools for accurate user emulation using CrUX data and setting CPU throttling for performance analysis.

And that's the desktop data. But when we switch to mobile, we automatically load the mobile data, and you can see it's even worse. So at this point, we know we've got two issues to solve: our LCP and our INP. Fortunately, you can configure your DevTools to better emulate what your users are actually seeing. Use the settings panel on the right.

If we zoom in, you can see for a start that the CrUX data gives you the exact breakdown of your website visitors. Here, we see most of our visitors are on mobile. And that's pretty normal these days. As Alex said earlier, we're in a mobile-first world. It also tells us that 75% of our users are on slow 3G or slow 4G. So rather than having to guess what to do, you can use your real data to actually set that throttling. So let's set that.

And we're also going to set some CPU throttling here. Now, it recommends a 4x slowdown, but to be honest, that's just a default; it doesn't explain exactly why you're doing that. This data doesn't come from CrUX, because we don't have that data in there. But we do have this nice calibrate option that we've added at the bottom. When you click on that, it will actually calculate some calibrations for your laptop. So you simply run the calibration. It'll run some hardcore processing, see how fast your computer is, and then suggest some custom multipliers for you to use.
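If you automate your tests, you can apply equivalent throttling outside the panel via the Chrome DevTools Protocol. A hedged sketch: Emulation.setCPUThrottlingRate and Network.emulateNetworkConditions are real CDP methods, but the rough "Slow 4G"-style numbers below are illustrative defaults, not values from the talk.

```javascript
// Build CDP commands that mirror the panel's CPU and network throttling.
function throttlingCommands({ cpuRate = 4, latencyMs = 150, downKbps = 1600 } = {}) {
  const bytesPerSec = (downKbps * 1024) / 8;
  return [
    { method: 'Emulation.setCPUThrottlingRate', params: { rate: cpuRate } },
    {
      method: 'Network.emulateNetworkConditions',
      params: {
        offline: false,
        latency: latencyMs,              // added round-trip latency in ms
        downloadThroughput: bytesPerSec, // bytes per second
        uploadThroughput: bytesPerSec,
      },
    },
  ];
}

// Usage with a CDP session (e.g. from Puppeteer, assumed available):
//   const cdp = await page.createCDPSession();
//   for (const { method, params } of throttlingCommands({ cpuRate: 3 })) {
//     await cdp.send(method, params);
//   }
```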

6. Leveraging Performance Trace Capabilities

Short description:

Exploring performance trace capabilities for accurate debugging and utilizing live metrics for quick checks and in-depth analysis.

Here, we see a 3x slowdown is recommended for my laptop to emulate a mid-tier device. So we choose that. And then when we reload the page, we see that our local metrics are more closely matching the field metrics that we saw our users getting. So at this point, we're now in a good place to actually debug this performance issue, because we should be able to replicate exactly what they're seeing.

A performance trace can actually give you much more information than the live metrics screen alone to really help you understand performance issues. It takes longer to record, though, so it is best to use the live metrics I just showed you for quick checks or experimentation, and only record a full trace when you want to dig deeper into a specific problem. You can record the trace using the buttons on the bottom right, and there are also two smaller shortcuts in the top left for your convenience.

You can either record a specific user interaction, like a series of clicks or a scroll, using the record button, or trace a full page load using record and reload. Let's go for record and reload for now. After a few seconds, you'll be presented with this detailed and, let's say, slightly scary view. Now, we've made an awful lot of improvements to try and make this view more useful and easier to navigate around. Some of these are shown in the video.

7. Integrating Performance Insights with Lighthouse

Short description:

Bringing together Lighthouse and Performance panels for enhanced insights and interactive trace views.

We're not going to have time to cover all of these today, but check out the blog post at the bottom and some of the other blog posts in this series to get more details. And the reason we're not going to have time is that I want to talk about one particular improvement we've made: the Performance Insights panel. Many of you may be familiar with Lighthouse, which presents a summary of your page experience, including the well-known performance score. As well as those scores, Lighthouse has a wealth of data in its performance audits. But Lighthouse can't show you the deep performance data that the trace view shows. And flicking back and forth between the Lighthouse panel and the Performance panel is a bit annoying, to be honest.

What if we could bring those two things together? That's exactly what the Insights sidebar does. You get a number of Lighthouse-style insights in this sidebar, and they're listed in rough order of what we think is a priority for you to look at. We didn't just transfer these insights from Lighthouse. We looked at them, we retired some old ones, we merged some together, and we tried to make them more useful. So we did a complete audit of Lighthouse, if you will, and came up with better insights. And we're now bringing those insights back to Lighthouse.

Because we want the same experience whether you're looking at a Lighthouse report or at the trace view, rather than the disjointed experience you used to have. But the really powerful thing is that the insights in the trace view are fully interactive. Clicking on any of the insights automatically zooms the trace to the appropriate time span. It filters out the irrelevant data and highlights the important data. We even overlay additional data, such as the LCP subparts, to really help you understand the trace view. With Performance Insights, you can quickly see render-blocking requests, layout shifts, and network dependency trees.
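The LCP subparts overlaid on the trace follow the standard breakdown documented on web.dev: time to first byte, resource load delay, resource load duration, and element render delay. A hedged sketch of that arithmetic from raw timestamps; the input and function names are illustrative, not a DevTools API.

```javascript
// Split an LCP time into its four standard subparts.
// All inputs are milliseconds relative to navigation start.
function lcpSubparts({ ttfb, resourceStart, resourceEnd, lcpTime }) {
  return {
    ttfb,                                      // time to first byte
    loadDelay: resourceStart - ttfb,           // gap before the LCP resource is requested
    loadDuration: resourceEnd - resourceStart, // time spent fetching the resource
    renderDelay: lcpTime - resourceEnd,        // gap before the element is painted
  };
}
```

The four parts always sum back to the LCP time, which is what makes the overlay useful for spotting where the budget actually goes.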

8. Enhancing User Experience with AI Insights

Short description:

Enhancing user experience with AI assistance for insights and script analysis.

You can use insights to zoom into exactly where a large DOM is causing your page performance issues, or highlight third parties or forced reflow issues. But we aren't done yet. All that information is fantastic, but what if you don't know what it actually means? We want to make the Insights sidebar as user-friendly as possible, independently of your expertise level. So for performance experts, but for newbies alike, we added one more friendly feature to it: the Ask AI button. When you click on the Ask AI button, you're taken directly to the AI assistance panel, and it passes to the AI, as context, the details of the insight you're looking at and the relevant part of the trace. You can have a conversation with Gemini here, based on your page's performance data. It even suggests some example questions, but you're free to type whatever questions you want. If needed, it also has the ability to probe back into the trace for extra information as your conversation progresses. In the end, you should get an exhaustive explanation together with suggested actions that you can try out to fix the issue.

Speaking of AI assistance, you can find it in other places in Chrome too. For example, here I'm looking at one specific insight called Legacy JavaScript. I can see the name of the script that is causing some trouble for my performance, but it's not instantly clear to me what that script is or what it does. By clicking on the script name, I'm taken straight to the Sources panel. There I can see the source code of that script, but it still looks fairly cryptic to me. Fortunately, I can use the little Ask AI icon next to the script to ask a question directly about it. For example, what does it do, or what is it responsible for? Here I learned that it is a core part of my Next.js page setup and is responsible for the routing and rendering of my Next.js app. Very similarly, in the same insight, instead of the script name I can click on the network icon next to it. This time, I'm taken to the Network panel. Here again, in a very similar way, I can use the same AI icon to invoke the AI assistance panel, and now it's my network request that is being passed in as context. So I can ask questions regarding headers, caching, payload, or security, or get ideas for possible network-related improvements. You can use the predefined prompts as a starting point or craft your own queries to drive your investigation further. As you can see, the AI assistance panel is fairly versatile, and you can do a lot with it.

Now, with the new AI capabilities built into DevTools, the AI can even fix code for you straight in DevTools. Let's have a look at that. When you inspect an element in the Elements panel, the little AI icon lets you interact with Gemini and ask for advice regarding this element's layout or styling, with prompts like: can you center this element on the page for me? Or: why is this text overflowing its container? It can not only give you an actionable piece of advice but even produce a patch of code that would fix the issue.

9. Enhancing Development with Visual AI Assistance

Short description:

Using multimodal input for visual assistance and saving AI-suggested patches to local code base.

If you are more visually inclined and prefer to use images instead of text, now, thanks to multimodal input, you can do that too. Here's an example. This is a website, and on it we have a few video thumbnails listed. We immediately notice that one of those thumbnails stands out: it's not cropped like the others. With the camera icon next to the prompt, I can easily grab a snapshot of that part of the page to help the AI understand which thumbnail it should fix for me. Gemini analyzes the page and proposes a patch for me.

With just a few clicks, we get the issue resolved, and now all the thumbnails are nicely trimmed and aligned. Gemini proposed the code patch that we saw briefly on the previous screen. As well as seeing the changes in the AI conversation flow, you can see the code it alters on the page highlighted in the Changes panel. The change AI made on this page is temporary, though, a bit like editing styles by hand in the Elements panel.

Let's say the website I was just developing a moment ago with the help of AI assistance is actually a local project and it runs on my localhost. In my case, the folder containing my project is called DevTools Cinema. Now, we can take the AI assistance to the next level. Let's say I'm happy with the patch AI suggested earlier and I would like to save it to my local code base. All the magic happens in the unsaved changes section at the end of the AI conversation.

10. Setting up AI Workspace in DevTools

Short description:

Setting up workspace for saving AI patches to local code base and exploring AI assistance possibilities in DevTools.

Because my local folder is connected to a DevTools Workspace, I can see the name of the folder, DevTools Cinema, and a nifty apply to workspace button at the bottom of the section. When I click it, thanks to source maps and some more AI magic, the AI agent will find the most appropriate file and the appropriate place in that file to apply my fix, even though my code is compiled from Sass or uses a JS framework like Next.js, for example. This allows me to persist the change to my local files and make my fix permanent.

How can we set up that workspace? Generally, you need to connect the local folder to DevTools and grant appropriate write permissions. You can either do that manually in the workspace tab in the Sources panel, just the same way you would in any modern IDE, or you can rely on the new automatic workspace discovery mechanism. It works like this: you need a JSON file that describes the path to your local folder. If you are on localhost and expose this JSON on the particular URL you can see on the screen right now, it will be automatically detected by DevTools, and an option to connect will be displayed in the Sources panel. You can add such an endpoint to your local server yourself, or you can use one of the integrations. For example, Angular has support already built in, and for Vite-based applications there is a plugin that you can use. Nuxt also showed interest, and we hope to get more frameworks on board soon.
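As a hedged illustration, the discovery file Chrome looks for is typically served from your dev server at the well-known path /.well-known/appspecific/com.chrome.devtools.json; a minimal version might look like the fragment below. The folder path and uuid are placeholders for your own project, so check the DevTools documentation for the exact schema.

```json
{
  "workspace": {
    "root": "/Users/me/projects/devtools-cinema",
    "uuid": "0e8b4f1a-5c2d-4d8e-9a6b-123456789abc"
  }
}
```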

To learn more about what is possible with AI assistance in DevTools, make sure to check out the AI assistance home page in the DevTools section of developer.chrome.com. And the great thing about this is we're not showing you the future. These features are all available in production Chrome now, from Chrome 137, which is the production version that you should all be using. They're available without a Gemini subscription, so we'd really encourage you to have a go and try them out. If you've been afraid to go into performance tracing before, have a play; it should be much more friendly. And the AI features are not just features we're throwing in there for the sake of it; we really find them quite useful for debugging and getting help understanding your code. To use AI assistance, you'll need to be logged in and to explicitly enable it in the settings.

Q&A

DevTools Talk Conclusion

Short description:

Covered topics: performance panel improvements, the Ask AI feature, AI assistance benefits, and productivity boosts in DevTools. Subscribe for updates and connect with Ewa and Barry. Q&A session available for further demonstrations.

The first time you try to use it, it will direct you to the screen to turn these settings on. And if you're in an enterprise, there is a policy you can use to control the feature if you prefer to turn it off. All the AI features are still under active development, so we do want to hear feedback from you. Play with them, have a look. If you discover any bugs or issues, then let us know, either on socials, by contacting Ewa or myself, or by using our public bug tracker.

Well, today we covered a lot of topics. We talked about performance panel improvements, including live metrics, field data, environment settings, and performance insights. We showed you the dedicated Ask AI feature in insights and how AI assistance can help you in the Sources and Network panels. Finally, you've seen how AI can fix some styling bugs and generate code that can be saved to a local workspace. We really hope that you enjoyed the talk and that you get a chance to try out all the new features next time you open DevTools. Give it a try and hopefully you will see your productivity soar as a result.

To stay up to date, please subscribe to our YouTube channel and regularly check out developer.chrome.com for our updates. I've been Ewa, and I'm mostly DevNook on socials. And I've been Barry, and you can find me as TuneTheWeb on most social websites. We hope that was useful for you, and I think we have a little bit of time for questions. I would also encourage you to come to us afterwards in the Q&A section, where we can show this working on your site. Thank you very much.

Data Privacy and AI Usage in DevTools

Short description:

Addressing concerns about sensitive data and AI usage in DevTools. Cloud-based AI sends data to Google with opt-in settings. Public data is mainly transmitted, with detailed privacy policy information available. Proactive enabling in DevTools settings ensures data control. Memory heap panel improvements not immediate, as tracking memory leaks remains a challenge.

There are quite a few questions about sensitive data and AI, as you might have expected, so let's group those together a little bit. One of them is: when using the AI button, how can I be sure that no sensitive data is sent? The AI that we use is cloud-based. We've experimented with the on-device AI that Tom talked about earlier, but it's just not quite powerful enough for many of the features that we're looking at here. It does send data to Google, and that's why you have to opt in and accept those settings.

In most cases, it's the public data you see on the website. Obviously, if you're developing on localhost, it's separate. We do know that some companies have restrictions, which is why we have that enterprise policy, which can also be set by individuals. It will be sending data to Google, and there is a detailed page on our privacy policy explaining exactly what the data is used for.


Improving AI Data Control and Memory Panel

Short description:

Ensuring data control in DevTools settings for AI usage on websites. Memory heap panel improvements not immediate; focus on other features. Encouraging user feedback submission for DevTools bug and enhancement requests.

Thank you so much. Let's see what the next question is. There are a couple more questions, like: how do I make sure that this is not sent to the AI? We try to make it very explicit. You need to be signed in and you need to actually enable it proactively in the DevTools settings, so that you are very well informed about what is being sent and at what moment, and you can adjust it to your preferences. And again, I would remind you: on a website, this is public data anyway. You might not like it being public. Obviously, if you've got your source code mapped, then there might be comments and stuff that you don't want to share there. But for most website tracing, it's public anyway.

Are there plans to make improvements to the memory heap panel as well? Tracking memory leaks is still a bit cumbersome. I think we got this question previously. You get that a lot, Barry. Yeah, exactly. Memory leaks are a difficult thing to track. I'm not aware of any immediate work being done at the moment. It's kind of a, I don't want to say a niche feature because a lot of people do have issues with that. But for most people, what we have there kind of works and it's really hardcore people that really want extra changes to this normally. And yeah, at the minute, that's not being prioritized compared to some of the other features that we're working on.

I've seen that question popping up a few times, though. So for the DevTools bug link we shared, I encourage you to submit this as a question or as a feature request and then upvote it. The team is very responsive to feedback, so if it's important for many developers, it will definitely be looked into eventually. Absolutely. And it particularly helps if you have suggestions of what you want to improve: "just make this better" means that we might try something that doesn't work for you. Saying "hey, this other debugger does this, and it would be fantastic if Chrome DevTools did something similar" is much more actionable feedback.

Chrome DevTools API and Gemini Model

Short description:

Submitting AI changes in a dev container. No Chrome API for Playwright yet; ongoing interest in performance analysis tools. Evolving Gemini model in DevTools; AI features specific to Chrome.

Awesome. Can I submit the AI changes to a project in a dev container? So you can certainly do the changes in the browser; they're kind of like you editing locally. Whether that can map back to the dev container depends on whether the file path is actually mapped and viewable by the browser for the source code, or whether it's completely closed off and you're just running it with a port open. In a lot of cases, Docker uses your local file system, or a particular part of it, in which case, as long as it's got write access, it should be able to write back to it.

Nice one. Is there an API in Chrome that testing frameworks like Playwright can use to analyze performance in automation tests? Not yet. Well, in Chrome, not yet. But that is something we've got active interest in, and we're doing an awful lot of work on it, so I expect we'll have some updates on that later in the year. That said, there are lots of other tools you can use: we have the PageSpeed API, we have the Lighthouse CI tool. But yeah, for performance traces, watch this space.
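Until that lands, the Lighthouse CI tool Barry mentions can already gate automated runs on performance budgets. A minimal sketch of a `lighthouserc.json`, run with `npx @lhci/cli autorun`; the URL and thresholds below are placeholder assumptions, not recommendations:

```json
{
  "ci": {
    "collect": {
      "url": ["http://localhost:3000/"],
      "numberOfRuns": 3
    },
    "assert": {
      "assertions": {
        "categories:performance": ["error", { "minScore": 0.9 }],
        "cumulative-layout-shift": ["warn", { "maxNumericValue": 0.1 }]
      }
    },
    "upload": { "target": "temporary-public-storage" }
  }
}
```

Lighthouse reports lab values for the same Core Web Vitals discussed in the talk, so a regression in, say, CLS can fail the CI run rather than waiting to show up in field data.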

Oh, that's exciting. Maybe next year at JS Nation we'll have some news there. What Gemini model is used in DevTools? Generally, this changes. We talk to Gemini, and Gemini is evolving pretty quickly. It will always be Gemini, but the exact version will differ depending on the browser version and the particular time frame.

Perfect. Which features are part of Chromium and which are Chrome specific? I assume, and this is the person asking the question, not me, the AI insights are only in Chrome? Yes, so the performance improvements I showed you are in the Chromium code base. We have a shared DevTools. Some browsers do slightly different things, Edge likes to use different icons, but most of the code base is the same, so they're available everywhere. And yes, you're correct: the AI features at the minute are specific to Chrome. I think we heard Thomas saying earlier that Edge is looking at those, and I expect they will build upon them using their own AI, and similarly for other browsers.

AI Features and Web Sockets Throttling

Short description:

New AI features in Chrome. Throttling for WebSockets. How the Gemini model uses context rather than training on it.

But yeah, the AI features at present are brand new, and literally just launched in Chrome only at the moment, not Chromium. Thank you so much.

Can I simulate a slow socket connection in the performance tab? There has been, oh, you're throwing me here. There have definitely been some changes to the throttling options for WebSockets. I'm going to say yes, and it's in our documentation; I can't remember exactly, but I'm sure we made changes there. And I'm annoyed, because you can't do it for general things, only for WebSockets, so WebSockets are a little ahead of the game here. But check out our documentation, because there is something about that. I think there was a blog post on it.

Well, it's Gemini then. But what was used for AI training to produce those hints and explanations? What do you train it on? So the context window we pass is whatever you're doing. You click on an element, we pass that element. You click on a bit of the trace, we pass that bit of the trace or the insight data showing there. Similarly, the Sources panel will, if possible, send the whole source file, although we hit context limits for really large files. As for the model itself, that's beyond me; that's the Google AI geniuses who unfortunately aren't represented on stage today. Hey! Okay. Sorry. Generally, the data that is in the context window, which is part of the trace or your prompt or some additional data you're putting in your prompt, is not used for retraining purposes.

AI Agent Probing and Context Usage

Short description:

AI Agent's Probing Ability in Performance Context.

It's not used for training the model. But the performance agent especially has the ability to probe, which means that if it doesn't receive enough information to solve your query from the first batch of context and your prompt, it will go back and request additional data from that particular performance trace. So it has access to different parts of the already-recorded performance trace, and it can request them depending on the use case. Yeah, and the question was about training, but to go sideways from that: it does use whatever you've selected. So if you're clicking on an element and saying "center this element", it's natural to then have a conversation with it and go, "oh, also change that". You can give it much more information and context by clicking on an element and refocusing it towards a different part of the DOM, or a different network request, whatever, because that's what it uses to actually figure out the answers. We actually highlight this in the UI: next to the prompt you see which element is selected, or which insight, or which context, for example a network request, so you know your prompts should stay more or less in the area of expertise of that particular selection. If you ask it who won the last Super Bowl game, it probably won't answer. All right, well, thank you so much again for your talk and thanks for your answers. And if your question wasn't asked, please join Ewa and Barry in the Q&A spot to make sure that your question does get answered. Thanks so much.
Thanks very much for your time.
