Docker 101 - Intro to Containers


Software containers are quickly becoming an essential tool in every developer's toolbelt. They make it easy to share, run, and scale code. In this workshop, Shy Ruparel, Sr. Developer Advocate at Docker, will walk you through getting started with Docker. He'll cover setting up Docker, running your first container, creating a basic web application with Python and Docker, and pushing the Docker image to Docker Hub. He'll also share why you'd want to use containers in the first place and how they enable developers to write better, more shareable software.

This workshop was presented at JSNation 2022; check out the latest edition of this JavaScript conference.

FAQ

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of its parts, such as libraries and other dependencies, and ship it as one package.

A Docker image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries, and settings. A Docker container is a runtime instance of a Docker image.
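The image-as-package idea above is usually expressed in a Dockerfile. Here is a minimal, hypothetical sketch for a Node.js app — the base image, paths, and start command are illustrative, not taken from the workshop:

```dockerfile
# Start from an existing image on Docker Hub (the official Node.js image)
FROM node:18

# Work inside /app and copy the application code into the image
WORKDIR /app
COPY . .

# Install dependencies at build time so they ship inside the image
RUN npm install

# Default command executed when a container is created from this image
CMD ["node", "index.js"]
```

Building this with `docker build -t my-app .` produces an image; each subsequent `docker run my-app` creates a fresh container from that same starting point.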

To run a Docker container, you use the command 'docker run'. You can specify various options such as the container name, ports to expose, and the base image to use. For example, 'docker run -d -p 80:80 nginx' runs an Nginx server detached in the background, mapping port 80 of the container to port 80 on the host.
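As a slightly expanded sketch of the same idea (container name and host port are illustrative, and the commands assume a local Docker daemon is running):

```shell
# Run the official Nginx image detached (-d), mapping host port 8080
# to container port 80 (-p host:container), with a readable name
docker run -d --name web -p 8080:80 nginx

# List running containers to confirm it's up
docker ps

# Stop and remove the container when done
docker stop web
docker rm web
```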

Docker Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application's services, networks, and volumes. Then, with a single command, you create and start all the services defined in your configuration.
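A minimal sketch of such a Compose file, assuming a hypothetical web service built from a local Dockerfile plus an off-the-shelf Redis image (service names and ports are illustrative):

```yaml
services:
  web:
    build: .          # build the image from the Dockerfile in this directory
    ports:
      - "8080:80"     # host:container port mapping
  redis:
    image: redis:7    # pull a prebuilt image from Docker Hub
```

With this file in place, `docker compose up` creates and starts both services, and `docker compose down` tears them down again.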

Docker containers and images can be shared via Docker Hub, which is a cloud-based registry service that allows you to link code repositories, build your images, and test them. You can push your Docker images to Docker Hub using the 'docker push' command and pull them using 'docker pull', making them accessible to others.
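A typical push/pull flow looks roughly like this — "your-username" and the image name are placeholders, and the commands assume you have a Docker Hub account:

```shell
# Log in to Docker Hub (interactive)
docker login

# Tag a local image under your Docker Hub namespace
docker tag my-app your-username/my-app:1.0

# Push it to the registry so others can use it
docker push your-username/my-app:1.0

# Anyone can now pull (and run) the published image
docker pull your-username/my-app:1.0
```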

Shy Ruparel
116 min
04 Jul, 2022

Video Summary and Transcription
Software containers provide a consistent running experience, making it easy to package code and dependencies. Docker Hub is a repository of existing images that can be used to build containers. The process of creating containers from images and managing them is explained. The usage of Dockerfiles to build images and the concepts of ENTRYPOINT and CMD are discussed. The process of moving files and running code in Python within a container is explained. Finally, the topics of pushing images to Docker Hub, building for different platforms, and networking and running containers are covered.

1. Introduction to Software Containers and Docker

Short description:

In this part, we will discuss software containers and their relevance in solving common problems faced by developers. The analogy of shipping containers is used to explain the concept of software containers. Containers provide a consistent running experience regardless of the environment, making it easy to package code and dependencies. They can share resources and are more modular compared to virtual machines. Docker Hub is introduced as a repository of existing images that can be used to build containers. The official node image is mentioned as an example. Docker commands and the process of creating containers from images are explained.

So, let's get started. Let's talk about software containers and kinda do the Docker... the Docker 101, right? So, I'm Shy, I'm a senior developer advocate at Docker, I'm on Twitter, if you all want to follow me, I tweet nothing of value but, you know, it's there if you want it. And I'm really excited to get to share things with you about using Docker. Full disclosure, I'm primarily a Python developer, so I will be doing my best to teach you all in JavaScript today, but I might struggle a little bit, and I might need some help. My Python-fu is much better than my JavaScript-fu. So, that's my disclaimer for this workshop, and let's go ahead and get started.

So I have a question. I want you to give me a hand-raised emoji or raise your hand in the chat if you've had any of the problems that I'm about to outline. Or, you know, share your favorite stern emoji, the emoji you send your coworkers when you're mad at them, if you've had any of these problems. So, first off, you've built something, you've made something, and it works on your machine, but it doesn't work like that on the cloud or when you share it with a co-worker. Maybe it's because you've got OS inconsistencies: you're running Mac on your local machine, you've got a Linux instance in the cloud, and that's causing problems. Maybe you've got to upgrade the machine or the server. Maybe you're having issues with incompatible dependencies. I've had this problem a few times myself. And some of these problems don't even take into account what happens if you want to change your cloud. You know, if you are on AWS and you decide, alright, I'm fed up with this, the billing isn't working for me, and you want to move clouds, or maybe you just want to explore and check out Azure or Google Cloud, or you're fed up with Azure and Google Cloud and you want to move to AWS. How do you keep all that secure, how do you keep it all maintained, how do you keep it consistent, how do you make it easy for yourself to move all that stuff around? Because it can be kind of a pain in the butt. Now I want to talk about how we can use containers to solve those problems. But before we get there, I want to take a brief tangent and talk to you about shipping. And also, I have some cool gifs of trains and this is my excuse to share them with you. But yeah, let's take a brief tangent and talk about how shipping works.

So back in the olden days, before they invented color... color television, they would do shipping kind of inconsistently, right? So like products would be in all sorts of different containers, you know, you'd have your bags of beans and spices, you'd have your boxes full of non-perishables, you'd have your barrels of rum, and it made storing things really inconsistent, you'd have to move individual units around. And then there was no standard format of transit, you know, so we had different types of boats back in the day, we'd have horse-drawn wagons, you know, eventually cars got invented. Eventually trains got invented. It was a huge pain in the butt to sort and organize and keep things maintainable, you know, to have a consistent idea of how much stuff you could get from point A to point B. And this is actually kind of a solved problem now. So these days, modern shipping containers have been standardized, you can load them with a bunch of stuff and you can efficiently transport these boxes, these containers, between multiple different modes of transit. And because we standardized around this format, we're able to build all sorts of different vehicles, assuming that this is going to be the format that we use. So we have things like trucks that can just take them. We have trains that can take them. We have boats that can take them. And when people are done with them, they'll recycle them and put them outside my apartment and turn them into COVID-19 testing locations, which I think is pretty cool. I love the reused shipping container aesthetic, especially when it gets turned into like boutique stores. There's some really nice ones of those across the country, which I think are really fun as well. And so I think shipping containers are pretty cool. And like 90 percent of cargo transit today is done in this format. And you might be thinking, Shy, this is an excellent tangent.
But how is this relevant? And I'm going to tell you, I think this is actually a really good analogy for how software containers are set up. And I'm assuming that's why they decided to name them software containers. So with a software container, you kind of take that same aesthetic. You have this box, right? And you know the box is going to be able to run regardless of where you put it. You're going to have this thing that can run on your local machine, on an ARMv7 architecture, on Linux, on cloud, on Windows. You have confidence that this thing is going to have a consistent running experience everywhere. And so when you create a container, when you build a container, basically what you're doing is you're putting everything into that container that you need for it to run. You're throwing all your dependencies in there. You're throwing all your low-level libraries in there, and you're pushing it out with confidence that it'll be consistent regardless of where it runs. It's easy to make for us as developers. There's a lot of reusable components, so there's this standard ecosystem that's being built on, and you get all that dependency stuff coming with you. So they're great. They let you package up code and your dependencies so things run quickly and consistently. I kind of talked about this already, but this graph is kind of nice. It shows how containers get built. You can actually share. So if you've got multiple containers running, they'll actually share resources between each other too. So unlike a VM, which needs a discrete amount of memory and CPU time allocated to it that's stuck while the VM is up, containers are a little more modular, I guess; they're able to talk to each other, able to share resources, and able to take advantage of being able to talk to each other as well. So that's really fun.
There's Docker Hub, which I'll talk about, and I think that's enough PowerPoint for now. I'm going to go ahead and start doing some coding, so hopefully everyone has downloaded Docker and has had a chance to poke around this tool as well. I'm going to do a bit of coding myself, and please feel free to follow along and tell me to slow down if I'm going too fast. But I'm going to get started by pulling up my terminal. I am a hardcore terminal fan. I am old school, I guess. Let me make this bigger so we can all read it. Give me a shout if this is too small and I will keep making it bigger. So, here we just have a plain command line, right? And I've got Docker installed. And let's go ahead and just do a quick help and see how we can figure out how to use this thing. So, here we go. I've typed docker, I've done my help command, and there's all sorts of commands and options that we can use. And I'm going to start by running a new command. So, the way Docker works is when you do run — let's do the help so we can see the documentation here — what you do is you're going to pass it some options and you're going to give it an image. Now, an image is like a recipe or a template. And that's basically saying, this is the thing that I want you to run. Whenever we do run, we take an image and we build a new container out of it. So, the difference between an image and a container is that a container is an actual process that is running, and containers themselves can be changed and have state. They can be on or off, and you can use them over again or not use them. Whereas an image is kind of like a set of instructions to make a container. And every time you create a container from an image, it's going to be the same starting point, and then the containers themselves can change. So, let's go ahead and take a look at Docker Hub. So, Docker Hub is really nice because it's this kind of repository of existing images.
So, you don't necessarily have to build them from scratch. One of my favorite things about working with containers is that you're able to kind of always build on top of it. So, maybe you want to use a Python environment or a Node.js environment, right? So, we can type Node.js. Let's just type node and we can see that we've got this official node image. So, this means it's being curated. It's a curated open source repo that like the folks at Docker have looked at and we trust the people making it. We've done our due diligence on it and so we can be sure that this is something you can trust. And you can see here we've got like a verified publisher one. So, again, we know that this is being done by the real people at CircleCI. They're putting it together and so you can trust that this is going to be an image that's in good state and good quality. And so, when you pull it and run it, you're going to feel confident about it.
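The image-versus-container distinction above can be sketched with the CLI — assuming the official node image; the container names are illustrative, and a local Docker daemon is assumed:

```shell
# "node" is an image: a template pulled down from Docker Hub
docker pull node

# Each "docker run" creates a NEW container from that same image
docker run -it --name first node
docker run -it --name second node

# Containers have their own state and lifecycle; the image is unchanged
docker ps -a          # both containers are listed
docker start first    # restart an existing (stopped) container
```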

2. Pulling and Running Official Node.js Docker Images

Short description:

When you pull the official Node.js Docker image, you can choose from different versions. The Node.js Docker team maintains and updates the image regularly to ensure it's up to date and secure. By default, the latest version is pulled, but you can specify a specific version if needed.

So, if we click into this node one, we can see that there's a whole bunch of information here and we can take a look at it. So, all we need to get this running is to pull it locally, kind of similar to how GitHub works. And we can see that it's maintained by the Node.js Docker team. So, this is the official Node.js folks, and they're making commits to this fairly regularly to make sure it's up to date, to deal with any security vulnerabilities that might get discovered, and to try to make it the best experience for you. And they have a bunch of different versions that they support. So, when you do that docker pull node, unless you say which version you want, you're going to get the one that is tagged as latest. So in this instance, it's Node 18.3, but you have the capacity to select whichever version you want. So, if you want to do node 14 or some of the upcoming Node releases, node 18.3.15, you can set that level of granularity as well, which is really useful.
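Pinning a version is just a matter of adding a tag after the image name. The exact tag values below are illustrative — check the image's Tags page on Docker Hub for what's actually published:

```shell
# No tag: Docker pulls the image tagged "latest"
docker pull node

# Pin to a major version line
docker pull node:14

# Pin to an exact release for full reproducibility
docker pull node:18.3.0
```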
