Never Have an Unmaintainable Jupyter Notebook Again!


Data visualisation is a fundamental part of Data Science. The talk will start with a practical demonstration (using pandas, scikit-learn, and matplotlib) of how relying on summary statistics and predictions alone can leave you blind to the true nature of your datasets. I will make the point that visualisations are crucial in every step of the Data Science process and therefore that Jupyter Notebooks definitely do belong in Data Science. We will then look at how maintainability is a real challenge for Jupyter Notebooks, especially when trying to keep them under version control with git. Although there exists a plethora of code quality tools for Python scripts (flake8, black, mypy, etc.), most of them don't work on Jupyter Notebooks. To this end I will present nbQA, which allows any standard Python code quality tool to be run on a Jupyter Notebook. Finally, I will demonstrate how to use it within a workflow which lets practitioners keep the interactivity of their Jupyter Notebooks without having to sacrifice their maintainability.

This talk was presented at ML conf EU 2020.

FAQ

What are the main challenges of keeping Jupyter Notebooks maintainable?
One major challenge is version control, as traditional git diff commands produce unclear diffs with Jupyter Notebooks. Another is the lack of integrated code-quality tools of the kind commonly used with Python scripts.

How can version control for Jupyter Notebooks be made easier?
A specialized tool like nbdime can simplify version control by providing a more visually understandable view of the diff between two notebook versions. It can be called from the command line and the result viewed in a web browser.

What is nbQA?
nbQA is a tool that lets you run typical Python code-quality tools on Jupyter Notebooks by temporarily converting them into Python scripts. It supports tools such as Black, isort, pyupgrade, and flake8.

How does pre-commit help?
pre-commit runs the specified code-quality checks automatically before a commit is accepted, ensuring that notebooks maintain consistent code quality. It uses a configuration file to specify the tools, and the versions of those tools, to run.

Are Jupyter Notebooks suitable for writing maintainable code?
Yes. Jupyter Notebooks are excellent for data visualization and can serve as a development environment for writing maintainable code, not just for testing things out. Tools like nbQA and nbdime enhance their functionality and maintainability.

Why are Jupyter Notebooks important for data science?
They allow for interactive data visualization, which is crucial for understanding data beyond summary statistics. This interactive environment supports better insights and storytelling with data.

Can these tools be used with GitHub?
Yes, tools like nbdime offer GitHub integrations for reviewing pull requests, making it easier to manage notebook versions and collaborate on projects hosted on GitHub.
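
As a minimal sketch, assuming nbdime and nbQA are installed and using hypothetical notebook names, the workflow described above comes down to a handful of command-line calls. They are wrapped in Python's subprocess here only to keep the examples in one language; they can equally be typed straight into a terminal.

    import subprocess

    # Open a browser-based, cell-aware diff of two notebook versions (nbdime).
    subprocess.run(["nbdiff-web", "analysis_old.ipynb", "analysis_new.ipynb"], check=True)

    # Run standard Python code-quality tools on a notebook via nbQA.
    subprocess.run(["nbqa", "black", "analysis_new.ipynb"], check=True)
    subprocess.run(["nbqa", "isort", "analysis_new.ipynb"], check=True)
    subprocess.run(["nbqa", "flake8", "analysis_new.ipynb"], check=True)

In a pre-commit setup, the same nbQA-wrapped checks are listed in the configuration file so they run automatically on every commit.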

Marco Gorelli
26 min
02 Jul, 2021


Video Summary and Transcription

The video discusses how to avoid ending up with an unmaintainable Jupyter Notebook by addressing two key challenges: version control and code quality. Traditional git diff commands produce unclear diffs for Jupyter Notebooks, but a specialized tool like nbdime can simplify version control by providing a more visually understandable view of the diff. The lack of integrated code-quality tools can be tackled with nbQA, which lets you run typical Python code-quality tools on Jupyter Notebooks by temporarily converting them into Python scripts. The pre-commit tool can also help maintain code quality by running checks automatically before a commit is accepted. Jupyter Notebooks are crucial for data science, allowing for interactive data visualization and better storytelling with data. The video also mentions tools for integrating Jupyter Notebooks with GitHub, such as nbdime for reviewing pull requests. For long-term reproducibility, it might be better to keep code in a notebook rather than moving it to a Python package.

1. Introduction to Jupyter Notebooks

Short description:

We will discuss the importance of Jupyter Notebooks and the challenges of maintaining them. Then, I will demonstrate a workflow for keeping your Jupyter Notebooks maintainable.

Hello, friends. We are here today to talk about Jupyter Notebooks and how to keep them maintainable. We will start with a motivating example, in which I'll make the case for why you might care about using Jupyter Notebooks in the first place. Then, I'll address a couple of challenges which people often bring up when trying to keep their Jupyter Notebooks maintainable.

The first one has to do with version control, and anyone who's tried to look at the difference between two notebooks using git diff will know what I'm talking about. It's not easy. The second has to do with continuous integration and, more specifically, the lack of code-quality tools which are available to run on Jupyter Notebooks.

So, then, finally, I will demonstrate a workflow for keeping your Jupyter Notebooks maintainable. Let's dive straight in with our motivating example. I've prepared a pretty standard data science workflow here, absolutely standard. We'll go through it in a second. Now, you might be wondering why I'm showing you an absolutely standard data science workflow, and bear with me, there might be a twist at the end, might. So let's go through it.

2. Analyzing Summary Statistics

Short description:

We start by reading in four CSV files using Pandas read CSV. We print out summary statistics for all four data sets, which show that they are pretty similar.

We start by reading in four CSV files using Pandas read CSV, pretty standard. Each of these has two columns, x and y, pretty standard. So then we'll print out some summary statistics, so we'll print out the mean of x, the mean of y, the standard deviation of x, the standard deviation of y, and the correlation between x and y. We will do this for all four data sets, still pretty standard.
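
As a rough sketch of what that part of the notebook might look like (the file names are made up, since the talk only says there are four CSV files, each with columns x and y):

    import pandas as pd

    # Hypothetical file names -- the talk does not give them.
    names = ["one", "two", "three", "four"]
    datasets = {name: pd.read_csv(f"dataset_{name}.csv") for name in names}

    # Mean, standard deviation, and correlation between x and y for each dataset.
    for name, df in datasets.items():
        print(
            name,
            df["x"].mean(), df["y"].mean(),
            df["x"].std(), df["y"].std(),
            df["x"].corr(df["y"]),
        )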

And then, using Scikit-learn, for each of these data sets we will fit a linear regression model, also pretty standard, and we will print out the mean squared error, also absolutely standard.
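
Continuing the sketch, fitting a linear regression per dataset with scikit-learn and printing the mean squared error might look like this (again with the hypothetical file names from above):

    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error

    for name in ["one", "two", "three", "four"]:
        df = pd.read_csv(f"dataset_{name}.csv")  # hypothetical file name
        model = LinearRegression().fit(df[["x"]], df["y"])
        mse = mean_squared_error(df["y"], model.predict(df[["x"]]))
        print(name, mse)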

So where's the twist? Well, let's see what happens if we run this using Python. Right, look at that. If we look at what's been printed on the console, we'll see that the mean of x is the same for all four data sets, but so is the mean of y, the standard deviation of x, the standard deviation of y, the correlation between x and y, and the mean squared error from having fit a linear regression model is also almost identical. So if we look at this, we can tell that the four data sets must be pretty similar. That's what these summary statistics are telling us.
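
As the abstract points out, it is the plots, not the summary statistics, that reveal how different the datasets really are. A minimal matplotlib sketch to look at them directly (same hypothetical file names) could be:

    import matplotlib.pyplot as plt
    import pandas as pd

    fig, axes = plt.subplots(2, 2, sharex=True, sharey=True)
    for ax, name in zip(axes.ravel(), ["one", "two", "three", "four"]):
        df = pd.read_csv(f"dataset_{name}.csv")  # hypothetical file name
        ax.scatter(df["x"], df["y"])
        ax.set_title(name)
    plt.show()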
