Docker 101 - Intro to Containers


Software containers are quickly becoming an essential tool in every developer's toolbelt. They make it easy to share, run, and scale code. In this talk you'll learn how to use Docker to write better, more shareable software. In this workshop Sr. Developer Advocate at Docker, Shy Ruparel, will walk you through getting started with Docker. He'll cover setting up Docker, running your first container, creating a basic web application with Python and Docker, and how to push the Docker image to Docker Hub. He'll share why you'd even want to use containers in the first place and how they enable a developer to write better, more shareable software.

This workshop has been presented at JSNation 2022, check out the latest edition of this JavaScript Conference.

FAQ

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of its parts, such as libraries and other dependencies, and ship it as one package.

A Docker image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries, and settings. A Docker container is a runtime instance of a Docker image.

To run a Docker container, you use the command 'docker run'. You can specify various options such as the container name, ports to expose, and the base image to use. For example, 'docker run -d -p 80:80 nginx' runs an Nginx server detached in the background, mapping port 80 of the container to port 80 on the host.
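The example above can be tried end to end. This is a minimal sketch assuming a local Docker daemon is running; the container name `my-nginx` is an arbitrary choice for illustration:

```shell
# Start an Nginx container detached (-d), mapping host port 80 to container port 80
docker run -d --name my-nginx -p 80:80 nginx

# Confirm the container is running
docker ps

# Clean up: stop and remove the container when done
docker stop my-nginx
docker rm my-nginx
```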

Docker Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application's services, networks, and volumes. Then, with a single command, you create and start all the services defined in your configuration.
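As a sketch, a minimal Compose file might look like this; the service names `web` and `redis`, the port mapping, and the `redis:alpine` tag are illustrative choices, not taken from the talk:

```yaml
# docker-compose.yml — a hypothetical two-service application
services:
  web:
    build: .            # build the web image from the Dockerfile in this directory
    ports:
      - "8000:8000"     # map host port 8000 to container port 8000
  redis:
    image: redis:alpine # pull a prebuilt Redis image from Docker Hub
```

With this file in place, `docker compose up` creates and starts both services, and `docker compose down` stops and removes them.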

Docker containers and images can be shared via Docker Hub, which is a cloud-based registry service that allows you to link code repositories, build your images, and test them. You can push your Docker images to Docker Hub using the 'docker push' command and pull them using 'docker pull', making them accessible to others.
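The push/pull round trip might look like the following sketch; `yourname` and `my-app` are placeholder names for illustration, and the commands assume you have a Docker Hub account:

```shell
# Authenticate against Docker Hub
docker login

# Tag a local image under your Docker Hub namespace
docker tag my-app yourname/my-app:1.0

# Publish the image
docker push yourname/my-app:1.0

# Anyone can now fetch it by the same name
docker pull yourname/my-app:1.0
```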

Shy Ruparel
116 min
04 Jul, 2022


Video Summary and Transcription

Software containers provide a consistent running experience, making it easy to package code and dependencies. Docker Hub is a repository of existing images that can be used to build containers. The process of creating containers from images and managing them is explained. The usage of Docker files to build images and the concepts of entry point and CMD are discussed. The process of moving files and running code in Python within a container is explained. Finally, the topics of pushing images to Docker Hub, building for different platforms, and networking and running containers are covered.

1. Introduction to Software Containers and Docker

Short description:

In this part, we will discuss software containers and their relevance in solving common problems faced by developers. The analogy of shipping containers is used to explain the concept of software containers. Containers provide a consistent running experience regardless of the environment, making it easy to package code and dependencies. They can share resources and are more modular compared to virtual machines. Docker Hub is introduced as a repository of existing images that can be used to build containers. The official node image is mentioned as an example. Docker commands and the process of creating containers from images are explained.

So, let's get started. Let's talk about software containers and kind of do the Docker 101, right? So, I'm Shai, I'm a senior developer advocate at Docker, I'm on Twitter, if you all want to follow me, I tweet nothing of value but, you know, it's there if you want it. And I'm really excited to get to share things with you about using Docker. Full disclosure, I'm primarily a Python developer, so I will be doing my best to teach you all in JavaScript today, but I might struggle a little bit, and I might need some help. My Python-fu is much better than my JavaScript-fu. So, that's my disclaimer for this workshop, and let's go ahead and get started.

So I have a question. I want you to give me a hand-raised emoji or raise your hand in the chat if you've had this problem. If you've had any of the problems that I'm about to outline or, you know, share your favorite stern emoji, you know, the emoji you send your coworkers when you're mad at them or something, if you've had any of these problems. So, first off, you've built something, you've made something, it works on your machine, but it doesn't work like that on the cloud or when you share it with a co-worker, you know, maybe it's because you've got OS inconsistencies. You know, you're running Mac on your local machine. You've got a Linux instance in the cloud, and that's causing problems. Maybe you've got to upgrade the machine or the server. Maybe you're having issues with incompatible dependencies. I've had this problem a few times myself. You know, and some of these problems aren't even taken into account if you want to change your cloud, you know, if you are on AWS, right, and you decide, alright, I'm fed up with this, the billing isn't working for me, you want to move clouds, or maybe I just want to explore and check out Azure or Google Cloud, or I'm fed up with Azure and Google Cloud and I want to move to AWS. How do you keep all that secure, how do you keep it all maintained, how do you keep it consistent, how do you make it easy for yourself to move all that stuff around? Because it can be kind of a pain in the butt. Now I want to talk about how we can use containers to solve those problems. But before we get there, I want to take a brief tangent and talk to you about shipping. And also, I have some cool gifs of trains and this is my excuse to share them with you. But yeah, let's take a brief tangent and talk about how shipping works.

So back in the olden days, before they invented color, color television, they would do shipping kind of inconsistently, right? So like products would be in all sorts of different containers, you know, you'd have your bags of beans and spices, you'd have your boxes full of non-perishables, you'd have your barrels of rum, and it made storing things really inconsistent, you'd have to move individual units around. And then there was no standard format of transit, you know, so we had different types of boats back in the day, we'd have horse-drawn wagons, you know, eventually cars got invented. Eventually trains got invented. It was a huge pain in the butt to kind of sort and organize and kind of keep things maintainable, you know, have a consistent idea of how much stuff you could get around from point A to point B. And this is actually kind of a solved problem now. So these days, containers like modern shipping containers have been standardized, you can load them with a bunch of stuff and you can efficiently transport these boxes, these containers between multiple different modes of transit. And because we standardized around this format, we're able to build all sorts of different vehicles, assuming that this is going to be the format that we use. So we have things like trucks that can just take them. We have trains that can take them. We have boats that can take them. And when people are done with them, they'll recycle them and put them outside my apartment and turn them into COVID-19 testing locations, which I think is pretty cool. I love the reused shipping container aesthetic, especially when it gets turned into like boutique stores. There's some really nice ones of those across the country, which I think are really fun as well. And so I think shipping containers are pretty cool. And something like 90 percent of cargo transit today is done in this format. And you might be thinking, Shai, this is an excellent tangent.
But how is this relevant? And I'm going to tell you, I think this is actually a really good analogy for how software containers are set up. And I'm assuming that's why they decided to name them software containers. So with a software container, you kind of take that same aesthetic. You have this box, right? And you know, the box is going to be able to be run regardless of where you put it. You're going to have this thing that can run on your local machine, on an ARMv7 architecture, on Linux, on cloud, on Windows. You have confidence that this thing is going to have a consistent running experience everywhere. And so when you create a container, when you build a container, basically what you're doing is you're putting everything you need into that container that you need for it to run. You're throwing all your dependencies in there. You're throwing all your low-level libraries in there and you're pushing it out with confidence that it'll be consistent regardless of where it runs. It's easy to make for us as developers. There's a lot of reusable components that are happening, so there's this standard ecosystem that's kind of being built on and you get all that dependency stuff coming with you. So they're great. They're able to let you package up code and your dependencies so things run quickly and consistently, and it's great. I kind of talked about this already, but this graph is kind of nice. It kind of shows how containers get built. You can actually share. So if you've got multiple containers running, they'll actually share resources between each other too. So unlike a VM that needs a discrete amount of memory and CPU time allocated to it for as long as the VM is up, containers are a little more modular, I guess, they're able to talk to each other, able to share resources, they're able to take advantage of being able to talk to each other as well. So that's really fun.
There's Docker Hub, which I'll talk about, and I think that's enough PowerPoint for now. I'm going to go ahead and start doing some coding, so hopefully everyone has downloaded Docker and has had a chance to poke around this tool as well. I'm going to do a bit of coding myself and please feel free to follow along and tell me to slow down if I'm going too fast. But I'm going to get started by pulling up my terminal. I am a hardcore terminal fan. I am old school, I guess. Let me make this bigger so we can all read it. Give me a shout if this is too small and I will keep making it bigger. So, here we just have a plain command line, right? And I've got Docker installed. And let's go ahead and just do a quick help and see how we can figure out how to use this thing. So, here we go. I've got Docker, I've done my help command and there's all sorts of commands and options that we can use. And I'm going to start by running a new command. So, the way Docker works is when you do run, let's do the help so we can see the documentation here, what you do is you're going to pass it some options and you're going to give it an image. Now, an image is like a recipe or like a template. And that's basically saying, this is the thing that I want you to run. Whenever we do run, we take an image and we build a new container out of it. So, the difference between an image and a container is that a container is an actual process that is running, and containers themselves can be changed and have state. They can be on or off, and you can use them over again or not. Whereas an image is kind of like a set of instructions to make a container. And every time you create a container from an image, it's going to be the same starting point, and then the containers themselves can change. So, let's go ahead and take a look at Docker Hub. So, Docker Hub is really nice because it's this kind of repository of existing images.
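The image-versus-container distinction described above can be seen directly from the CLI. A minimal sketch, assuming a running Docker daemon; the container names `web1` and `web2` are arbitrary:

```shell
docker run --help          # documentation for creating containers from images

# Two independent containers built from the same nginx image --
# identical starting point, but each has its own state and lifecycle
docker run -d --name web1 nginx
docker run -d --name web2 nginx

docker ps                  # lists both running containers
docker stop web1 web2      # containers can be turned off...
docker start web1          # ...and started again later, keeping their state
```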
So, you don't necessarily have to build them from scratch. One of my favorite things about working with containers is that you're able to kind of always build on top of existing images. So, maybe you want to use a Python environment or a Node.js environment, right? So, we can type Node.js. Let's just type node and we can see that we've got this official node image. So, this means it's curated. It's a curated open source repo that the folks at Docker have looked at and we trust the people making it. We've done our due diligence on it and so we can be sure that this is something you can trust. And you can see here we've got like a verified publisher one. So, again, we know that this is being done by the real people at CircleCI. They're putting it together and so you can trust that this is going to be an image that's in good state and good quality. And so, when you pull it and run it, you're going to feel confident about it.

2. Pulling and Running Official Node.js Docker Images

Short description:

When you pull the official Node.js Docker image, you can choose from different versions. The Node.js Docker team maintains and updates the image regularly to ensure it's up to date and secure. By default, the latest version is pulled, but you can specify a specific version if needed.

And so, when you pull it and run it, you're going to feel confident about it. So, if we click into this node one, we can see that there's a whole bunch of information here and we can take a look at it. So, all we need to get this running is we just need to pull it locally, kind of similar to how GitHub works. And we can see that it's maintained by the Node.js Docker team. So, this is the official Node.js folks and they're making commits to this fairly regularly to make sure it's up to date, to deal with any security vulnerabilities that might get discovered, and to try to make it the best experience for you. And they have a bunch of different versions that they support. So, when you do that docker pull node, unless you say which version you want, you're going to get the one that is tagged as latest. So in this instance, it's node 18.3, but you have the capacity to select whichever version you want. So, if you want to do node 14 or some of the upcoming Node releases, node 18.3.15, you can set that level of granularity as well, which is really useful.
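Pulling by tag, as described above, looks like this. A sketch assuming a running Docker daemon; the specific tags shown are examples of the granularity available, provided they exist on Docker Hub:

```shell
# No tag given: Docker pulls the image tagged "latest"
docker pull node

# Pin a major version for reproducibility
docker pull node:14

# Run a throwaway container (--rm removes it on exit) to check the Node version inside
docker run --rm node:14 node --version
```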