Docker Tutorials for Beginners: Learn How to Use Docker
You are on the right page if you are a DevOps Engineer looking to learn and understand Docker from the basics to the intermediate level. You may be starting out in the DevOps niche, and one of the major tools that would help you thrive is Docker.
As previously mentioned in articles on our blog page, Docker is a containerization platform that lets you package and run applications in isolated environments without the overhead of a full virtual machine. For a comparison of related tools, take a look at Docker vs Kubernetes.
A DevOps Engineer might get by in the cloud computing niche for a while, but without a solid understanding of Docker, they will struggle to succeed in the wider world of cloud computing.
If you do not know the functions of Docker yet, check out our blog article explaining What is a Docker Container.
Let’s walk through the Docker basics before we move to how to use Docker.
Docker Tutorial: A Beginner's Guide
We have covered what Docker is and what it does for cloud computing engineers, and why it is needed to run applications consistently on both local and online servers.
One thing you need in order to use Docker is some programming experience. You do not have to be a great programmer, but you should be comfortable typing commands into the command line, and you should have a basic knowledge of Git.
The journey to mastering Docker begins with understanding the basics, and as we move forward you will build on that foundation.
Docker is a platform for building, running, and shipping applications in containers. It helps you test applications to ensure they work correctly on other machines, both local and online. Application code can fail on another machine for different reasons – a software version mismatch, one or more files missing or altered, or different configuration settings.
If your application needs particular versions of MongoDB, Node.js, and React.js/Next.js, all of these dependencies can be included in your image, and you can run it on any machine that runs Docker. If your application works in the development stage on Docker, it will work in the testing and production stages.
If a new developer needs to work on your code, they do not need to downgrade or upgrade anything installed locally. All they need to do is install Docker, “git pull” your development, testing, or production repository, make their edits or updates to the code, and “git push”, and the project is live again with the updates.
Docker Containers vs Virtual Machines
Containers are isolated environments for running applications, while a virtual machine, as the name implies, is an abstraction of a physical machine. Several virtual machines can run on one physical machine; for instance, you can run virtual machines with different operating systems, such as Ubuntu, macOS, and Windows, on a single host with the help of a hypervisor. The hypervisor is the software used to create virtual machines and manage their images.
There are numerous hypervisors – such as VirtualBox and VMware, which run on most operating systems, and Hyper-V, which was created specifically for Windows.
Benefits of Virtual Machines
- Virtual machines help you run applications in isolation, which means you can run multiple virtual machines on the same physical machine, each with its own set of installed packages.
- You can use multiple operating systems without the fear of one colliding with the other.
Benefits of Containers
- Containers allow you to run multiple applications in isolation
- They are lightweight and do not need a whole operating system.
- All containers share the host machine’s operating system, which means you only need to license, patch, and monitor a single operating system.
- Because containers are lightweight, they start up quickly.
- Containers need fewer hardware resources. They do not need dedicated CPU cores, RAM, or storage, so a host can run hundreds of containers side by side.
Docker Tutorial: Architecture
Let’s look at the architecture of Docker to help you understand how it works. Docker uses a client-server architecture: the client component sends requests to the server, also known as the Docker Engine, over a REST API. The Docker Engine takes care of building and running Docker containers. A Docker container is a process like the other processes running on your PC, but with a few unique properties, which we will touch on below.
As mentioned above, containers do not contain a full-blown OS; they depend on the host machine’s OS. More precisely, they share the host’s kernel, which is the core of an operating system. The kernel manages all application and hardware resources, such as memory and CPU.
Every operating system has its own kernel, and these kernels expose different APIs. We cannot natively run a Linux application on Windows because that application expects to talk to a Linux kernel. Let’s move on to the installation of Docker.
Getting Started with Using Docker
Installation
Docker can be installed on Windows, Mac, and Linux, and it’s available in several editions, including Community Edition (CE) and Enterprise Edition (EE). To get started, visit the Docker website and download the appropriate version for your platform, or simply search for “Install Docker” or “Download Docker”. Follow the installation prompts and you are good to go. On Windows, you need to enable the Hyper-V and Containers features before you click install. Once you’ve installed Docker, you should be able to run the `docker` command from your terminal or command prompt. Read up on how to install Docker on Ubuntu.
Basic Docker Commands
Now that the installation is done, let’s look at some basic commands used when working with Docker. Open your command prompt or code editor terminal; you can run the commands below from any directory.
Checking Docker version
docker version
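If you only want a quick summary, `docker --version` prints a single line with the client version, and `docker info` shows details about the Docker Engine, such as how many containers and images are on your system:
docker --version
docker info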
Images
A Docker image is a read-only template containing instructions for creating a Docker container. You can think of an image as a snapshot of an application and its dependencies, ready to be run in a Docker container. To list the Docker images on your system, run the following command:
docker images
To download a Docker image from a registry, use the docker pull command followed by the image name and tag (if applicable).
docker image pull [OPTIONS] NAME[:TAG|@DIGEST]
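For example, to pull a specific version of the official nginx image from Docker Hub (the default registry), you could run something like the following; the tag shown here is only an illustration, so check the registry for the tags that are actually available:
# Pull the nginx image with a specific tag
docker image pull nginx:1.25
# With no tag, Docker pulls the image tagged "latest"
docker image pull nginx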
Containers
To create a Docker container, you need to start with an image and customize it as needed. To create a new Docker container from an existing image, use the docker create command followed by the image name.
docker create [OPTIONS] IMAGE [COMMAND] [ARG...]
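As a quick illustration (the container name, port mapping, and image are only examples), you could create a container from the nginx image and then start it with docker start:
# Create (but do not start) a container named my-web, mapping host port 8080 to container port 80
docker create --name my-web -p 8080:80 nginx
# Start the container that was just created
docker start my-web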
Note that docker create only prepares the container from the specified image (with the image’s default command, if any); it does not start it until you run docker start. To list the running Docker containers, use the following command.
docker container ls or docker ps
To stop a running container, use the docker stop command followed by the container ID or name.
docker stop [OPTIONS] CONTAINER [CONTAINER...]
To push a Docker image to a registry, use the docker push command followed by the image name and tag:
docker push [OPTIONS] NAME[:TAG]
To pull a Docker image from a Docker registry, use the docker pull command followed by the image name and tag:
docker pull [OPTIONS] NAME[:TAG|@DIGEST]
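Before pushing, an image is usually tagged with your registry account and repository name. As a sketch (the account and image names below are placeholders), the round trip looks like this:
# Tag a local image with a Docker Hub username and a version tag
docker tag my-app your-dockerhub-user/my-app:1.0
# Log in to the registry, then push the tagged image
docker login
docker push your-dockerhub-user/my-app:1.0
# On another machine, pull the same image back down
docker pull your-dockerhub-user/my-app:1.0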
Now that we’ve covered the basics of Docker architecture and command-line tools, let’s move on to building Docker images.
How to Use Docker Tutorial: Development Workflow
Looking at the development workflow, you start by adding a Dockerfile to the application you want to run with Docker.
Dockerfile/Docker Image
A Dockerfile is a plain text document that Docker uses to package an application and all of its components into an image. The image contains everything needed for the application to run on your machine. Typically, the image includes a cut-down operating system, a runtime environment, the application files, third-party libraries, and environment variables. We create a Dockerfile and hand it to Docker, which packages our application into an image. Once the image is built, you can start a container from it.
Docker Container
A Docker container is essentially a process: it has its own file system, provided by the image, and your application is loaded and run inside that container, or process. To run your application on your development machine, we use the docker run command.
docker run
Docker in Action
Let’s look at a basic example of a Dockerfile for a Node.js application:
# Use an official Node.js runtime as a parent image
FROM node:14-alpine
# Set the working directory to /app
WORKDIR /app
# Copy the package.json and package-lock.json files to the working directory
COPY package*.json ./
# Install the dependencies
RUN npm install
# Copy the rest of the application files to the working directory
COPY . .
# Expose port 3000 for the application
EXPOSE 3000
# Define the command to run the application
CMD [ "npm", "start" ]
To build a Docker image from a Dockerfile, use the docker build command followed by the path to the Dockerfile:
docker build -t image-name .
The Dockerfile in the current directory (.) will be used to create a Docker image with the supplied name and tag (-t option). Once the new image has been built, list it using the docker image ls command.
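Putting this together with the Node.js Dockerfile above, the commands might look like this (the image name, tag, and port mapping are illustrative):
# Build an image from the Dockerfile in the current directory
docker build -t my-node-app:1.0 .
# Confirm the image was created
docker image ls
# Run a container from the image, mapping host port 3000 to the exposed container port 3000
docker run -p 3000:3000 my-node-app:1.0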
Best Practices for Building Docker Images
When building Docker images, there are noteworthy practices that help keep your images efficient, secure, and maintainable. Here are some of these best practices, followed by a short Dockerfile sketch that illustrates a few of them:
- Use a small base image: Start with a minimal base image to reduce the image size and minimize the attack surface.
- Use caching: Order your Dockerfile instructions so Docker’s build cache can be reused; this speeds up image builds and avoids repeated downloads and installs.
- Minimize layers: Minimize the number of layers in your Dockerfile to reduce the size of the image and make it easier to maintain.
- Remove unnecessary files: Reduce the image’s size and security concerns by removing superfluous junk files.
- Use environment variables: Configure the program with environment variables to make it more versatile and portable.
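As a rough sketch of a few of these practices (the build script, output folder, and stage names are assumptions for illustration), a multi-stage build keeps the final image small by copying only the built application out of a larger build stage, and copying package*.json before the rest of the source lets Docker reuse the cached npm install layer when only application code changes:
# Build stage: install all dependencies and build the application
FROM node:14-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Final stage: small base image with only production dependencies and the built output
FROM node:14-alpine
WORKDIR /app
ENV NODE_ENV=production
COPY package*.json ./
RUN npm install --production
# Assumes "npm run build" writes its output to /app/dist and "npm start" serves it
COPY --from=build /app/dist ./dist
EXPOSE 3000
CMD [ "npm", "start" ]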
Basic Docker Containers Commands
Once you have created a Docker container, you can manage it using various commands. Here are some common Docker commands for using containers:
#List all running containers
docker ps
#Stop a running container
docker stop container-id
#Start a stopped container
docker start container-id
#Restart a running container
docker restart container-id
#Remove a stopped container
docker rm container-id
You can also use the docker container command to manage containers. For example, you can use the docker container ls command to list all running containers.
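As a short walkthrough (the image and the container name web-test are just examples), a container’s life cycle might look like this:
# Run an nginx container in the background with a friendly name
docker run -d --name web-test nginx
# Confirm it is running
docker ps
# Stop it, start it again, then stop and remove it
docker stop web-test
docker start web-test
docker stop web-test
docker rm web-test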
Container Networking
Container networking allows Docker containers to communicate with one another. Each container has its own network namespace and IP address by default. The docker network create command can be used to create a new network:
docker network create network-name
This command will create a new Docker network with the specified name.
When starting a Docker container, use the --network option to connect it to a network:
docker run --network network-name image-name
This command will start the container and connect it to the specified Docker network.
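For example, two containers attached to the same user-defined network can reach each other by container name. The names below (app-net, db) and the POSTGRES_PASSWORD value are placeholders for illustration:
# Create a user-defined bridge network
docker network create app-net
# Start a database container attached to that network
docker run -d --name db --network app-net -e POSTGRES_PASSWORD=example postgres
# Start a second container on the same network; it can reach the database at the hostname "db"
docker run -it --rm --network app-net alpine ping -c 3 db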
Container Volumes
Docker volumes allow you to persist data between container runs and share data among multiple containers. The docker volume create command can be used to create a Docker volume:
docker volume create volume-name
This command will create a new Docker volume with your chosen name.
When running a Docker container, use the -v option to mount a volume:
docker run -v VOLUME_NAME:/path/to/mount IMAGE_NAME
This command launches the container and mounts the provided Docker volume to the specified path within the container.
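As an illustration (the volume name pg-data and the password are placeholders), you could persist a PostgreSQL database’s data across container restarts like this:
# Create a named volume for the database files
docker volume create pg-data
# Run postgres with the volume mounted at its data directory
docker run -d --name pg -e POSTGRES_PASSWORD=example -v pg-data:/var/lib/postgresql/data postgres
# Even if the container is removed, the data remains in the volume
docker rm -f pg
docker volume ls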
Installing Docker Compose
Docker Compose is included with Docker Desktop for Windows and macOS. If you are running Linux, you can install it using the following commands:
sudo curl -L "https://github.com/docker/compose/releases/download/v2.29.2/docker-compose-$(uname -s | tr '[:upper:]' '[:lower:]')-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
This will download Docker Compose (version 2.29.2 in this example) and install it in
`/usr/local/bin/docker-compose`
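You can confirm the installation by checking the version; on newer Docker installations, Compose is also available as a plugin through the docker compose command (without the hyphen):
docker-compose --version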
Creating a Docker Compose File
A Docker Compose file is a YAML file that defines the services that make up your Docker application. Here’s an example Docker Compose file for a simple web application:
version: '3'
services:
  web:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - .:/code
    environment:
      FLASK_APP: app.py
      FLASK_ENV: development
  db:
    image: postgres
This file specifies two services: a web server and a database. The web service is built from the current directory (.), exposes port 5000, mounts the current directory to /code inside the container, and sets two Flask environment variables. The database service uses the Postgres Docker image.
Starting and Stopping Docker Compose
To start the services defined in a Docker Compose file, use the docker-compose up command:
docker-compose up
This command will start all the services defined in the Docker Compose file and display their output in the console. You can stop the services by pressing Ctrl+C.
Use the -d option to start the services in the background, detached from the console:
docker-compose up -d
This command will start the services in the background and return control to your terminal.
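While the services are running in the background, you can check on them and follow their logs; the service name web here comes from the example Compose file above:
# List the containers managed by this Compose project
docker-compose ps
# Follow the logs of the web service
docker-compose logs -f web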
Use the command below to stop the services:
docker-compose down
This command will stop and remove the containers and networks (and, with the -v option, the volumes) created by
docker-compose up
Conclusion
Docker is an incredible tool for developing, distributing, and executing applications. Using Docker, you can quickly bundle your application and its dependencies into a container that can be executed anywhere, independent of the underlying infrastructure. We went through the fundamentals of Docker: how to install it, how its architecture works, and how to build Docker images, run Docker containers, and use Docker Compose.
With Docker, you may improve application scalability and resilience and optimize your web development workflow. Docker is an essential tool in your toolbox whether you’re a developer, a DevOps engineer, or a system administrator. So, why not try Docker out and host Docker applications on the ServerMania cloud hosting platform today and see how it can improve your workflow and make your life easier?