Develop and Deploy a Python API with Kubernetes and Docker

Docker is one of the most popular containerization technologies. It is a simple-to-use, developer-friendly tool, and has advantages over other similar technologies that make using it smooth and easy. Since its first open-source release in March 2013, Docker has gained attention from developers and ops engineers. According to Docker Inc., Docker users have downloaded over 105 billion containers and ‘dockerized’ 5.8 million containers on Docker Hub. The project has over 32K stars on GitHub.

Docker has since become mainstream. More than 100K 3rd-party projects are using this technology, and developers with containerization skills are in increasing demand.

This article is the first of a two-part series. In this blog post, you will discover how to use Docker to containerize an application, then how to run it in development environments using Docker Compose. We are going to use a Python API as our main app. The second part will focus on orchestration, more specifically on how to deploy the same app to Kubernetes.

Before starting, we are going to install some requirements. We will use a mini Python API developed in Flask. Flask is a Python framework and an excellent choice for rapidly prototyping an API. If you are not familiar with Python, you can follow the steps to create this API below.

Start by creating a Python virtual environment to keep our dependencies isolated from the rest of the system dependencies. Before this, we will need PIP, a popular Python package manager.

The installation is quite easy — you need to execute the following two commands:
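On a Debian or Ubuntu system, for example, PIP for Python 3 can be installed like this:

```bash
sudo apt-get update
sudo apt-get install python3-pip
```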

For your information, you should have Python 3 installed. You can verify this by typing:
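```bash
python3 --version
```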

After installing PIP, use the following command to install virtualenv:
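```bash
pip3 install virtualenv
```

Then create and activate a virtual environment for the project (the name “venv” below is just a choice):

```bash
virtualenv venv
source venv/bin/activate
```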

We are going to build a simple API that shows the weather for a given city. For example, if we want to show the weather in London, we should request it using the route:
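```
/london/uk
```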

You need to install the Python dependencies called “flask” and “requests” using PIP. We are going to use them later on:
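```bash
pip install flask requests
```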

Don’t forget to “freeze” your dependencies in a file called requirements.txt. This file will be used later to install our app dependencies in the container:
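```bash
pip freeze > requirements.txt
```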

This is what the requirements file looks like:
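The exact versions are pinned by pip freeze to whatever is installed at the time, so yours will differ; the package list itself should contain Flask, requests, and their transitive dependencies, roughly like this:

```
certifi
chardet
click
Flask
idna
itsdangerous
Jinja2
MarkupSafe
requests
urllib3
Werkzeug
```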

This is the initial code for the API:
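A minimal first version could look like this; the index route and its message are only illustrative placeholders:

```python
from flask import Flask

app = Flask(__name__)


@app.route("/")
def index():
    return "App is up and running"


if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=5000)
```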

Now, we need to add the code that makes the API return weather data for a given city:
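Here is a sketch of that route. It assumes the free OpenWeatherMap API, and the API key below is a hypothetical placeholder you would replace with your own:

```python
import requests
from flask import jsonify

# Hypothetical placeholder: replace with your own OpenWeatherMap API key
API_KEY = "<your-openweathermap-api-key>"


@app.route("/<string:city>/<string:country>")
def weather_by_city(city, country):
    # Query the upstream weather API and relay its JSON response
    url = "https://api.openweathermap.org/data/2.5/weather"
    params = {"q": f"{city},{country}", "appid": API_KEY}
    response = requests.get(url, params=params)
    return jsonify(response.json())
```

The route captures the city and the country from the URL, queries the upstream API, and returns the JSON it receives.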

The overall code looks like this:
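Putting the skeleton and the weather route together, still under the OpenWeatherMap assumption, app.py becomes:

```python
from flask import Flask, jsonify
import requests

app = Flask(__name__)

# Hypothetical placeholder: replace with your own OpenWeatherMap API key
API_KEY = "<your-openweathermap-api-key>"


@app.route("/")
def index():
    return "App is up and running"


@app.route("/<string:city>/<string:country>")
def weather_by_city(city, country):
    # Query the upstream weather API and relay its JSON response
    url = "https://api.openweathermap.org/data/2.5/weather"
    params = {"q": f"{city},{country}", "appid": API_KEY}
    response = requests.get(url, params=params)
    return jsonify(response.json())


if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=5000)
```

Save this file as app.py and start it with `python app.py`; the development server will listen on port 5000.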

Now if you visit 127.0.0.1:5000/london/uk, you should be able to see a JSON similar to the following one:
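Under the OpenWeatherMap assumption, the response has roughly this shape; the values below are illustrative:

```json
{
  "coord": {"lon": -0.13, "lat": 51.51},
  "weather": [{"main": "Clouds", "description": "overcast clouds"}],
  "main": {"temp": 288.15, "pressure": 1012, "humidity": 81},
  "wind": {"speed": 4.1},
  "name": "London",
  "cod": 200
}
```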

Our mini API is working. Let’s containerize it using Docker.

Let’s create a container for our API; the first step is creating a Dockerfile. A Dockerfile is a text file containing the different steps and instructions that the Docker daemon should follow to build an image. After building the image, we will be able to run the container.

A Dockerfile always starts with a FROM instruction:
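A Dockerfile matching the steps of this tutorial could look like the following; the base image tag and the file layout are choices, not requirements:

```dockerfile
# Start from an official Python base image
FROM python:3.8-slim

# Set the working directory inside the image
WORKDIR /app

# Install the dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application code
COPY app.py .

# The Flask app listens on port 5000
EXPOSE 5000

# Start the API
CMD ["python", "app.py"]
```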

In the above file, we start from an official Python base image, set /app as the working directory, install the dependencies listed in requirements.txt, copy the application code, expose port 5000, and define the command that starts the API.

After creating the Dockerfile, we need to build it using an image name and a tag of our choice. In our case, we will use “weather” as a name and “v1” as a tag:
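```bash
docker build -t weather:v1 .
```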

Make sure that you are building from inside the folder containing the Dockerfile and the app.py file.

After building the image, you can run the container using:
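```bash
docker run -d --name weather -p 5000:5000 weather:v1
```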

The container will run in the background since we use the -d option. The container is called “weather” (--name weather). It’s also reachable on port 5000 since we mapped the host port 5000 to the exposed container port 5000.

If we want to confirm the creation of the container, we can use:
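```bash
docker ps
```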

You should see output very similar to the following.
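The container ID, command, and timing columns below are placeholders; the points to check are the image name, the port mapping, and the container name:

```
CONTAINER ID   IMAGE        COMMAND           CREATED          STATUS         PORTS                    NAMES
0a1b2c3d4e5f   weather:v1   "python app.py"   10 seconds ago   Up 9 seconds   0.0.0.0:5000->5000/tcp   weather
```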

You should be able to query the API now. Let’s test it using CURL.
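```bash
curl http://127.0.0.1:5000/london/uk
```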

The last command should return a JSON response similar to the one we saw earlier in the browser.

Docker Compose is an open-source tool developed by Docker Inc. for defining and running multi-container Docker applications. It is also meant to be used in development environments, since it lets your container reload automatically when you update your code, without manually restarting containers or rebuilding the image after each change. Without Compose, developing with only Docker containers would be frustrating.

For the implementation part, we are going to use a “docker-compose.yaml” file.

This is the “docker-compose.yaml” file we are using with our API:
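A minimal compose file matching that description looks like this; the top-level version key is one common choice:

```yaml
version: "3"
services:
  weather:
    image: weather:v1
    ports:
      - "5000:5000"
    volumes:
      - .:/app
```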

You can see in the above file that the service “weather” is configured to use the image “weather:v1”, to map the host port 5000 to the container port 5000, and to mount the current folder to the “/app” folder inside the container.

It is also possible to use a Dockerfile instead of an image. This would be recommended in our case since we already have the Dockerfile.
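For example, pointing the service at the build context instead of a prebuilt image:

```yaml
version: "3"
services:
  weather:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - .:/app
```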

Now simply run “docker-compose up” to start the service, or “docker-compose up --build” to build the image and then run it.

In the second part of this tutorial, we are going to discover more details about Docker and Docker Compose, but we are mainly going to see how to use Kubernetes and deploy our API to GKE (Google Kubernetes Engine).

