In this post, we would like to go over how to run Kafka with Docker. Before starting, make sure you have Docker installed on your computer (available from Docker Hub).
Step 1. Docker Image Setup
Okay, first, let's create a directory to store the docker-compose.yml file. Note that the docker-compose.yml file does not run your code itself; it only defines the services to launch.
# shell
mkdir ~/starter && cd ~/starter
[Option]
You can pull the kafka and zookeeper images individually using the docker pull command; more detailed explanations can be found in the following links on Docker Hub - kafka and zookeeper.
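As a sketch, this is what the pull commands could look like. The post does not name specific images, so the image names below are an assumption (the widely used wurstmeister images); substitute whichever Kafka and ZooKeeper images you prefer from Docker Hub.

```shell
# shell
# Pull a ZooKeeper image (image name is an assumption)
docker pull wurstmeister/zookeeper

# Pull a Kafka image (image name is an assumption)
docker pull wurstmeister/kafka
```

Both commands require the Docker daemon to be running.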
Step 2. Create docker-compose.yml file
Instead of pulling the images separately, you can write a docker-compose.yml file to pull and start them simultaneously. What is a docker-compose.yml file? It is basically a config file for Docker Compose: it allows you to deploy, combine, and configure multiple Docker containers at the same time. Is there a difference between a Dockerfile and Docker Compose?
Yes!
"A Dockerfile is a simple text file that contains the commands a user could call to assemble an image whereas Docker Compose is a tool for defining and running multi-container Docker applications" (dockerlab)
Docker-compose.yml file
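A minimal sketch of such a compose file, assuming the wurstmeister images and default ports (all image names, ports, and environment values here are assumptions; adjust them to your setup):

```yaml
version: "3"
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      # Hostname advertised to clients (assumption: local development)
      KAFKA_ADVERTISED_HOST_NAME: localhost
      # Kafka finds ZooKeeper via the compose service name
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
```

The `depends_on` entry ensures ZooKeeper is started before Kafka, which needs it to register brokers.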
Step 3. Run docker-compose
Make sure you run the following command in the directory where the docker-compose.yml file is located.
# shell
docker-compose up -d
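After starting, you can check that both containers are up (a small addition to the post's steps, using standard Compose commands):

```shell
# shell
# List the services defined in docker-compose.yml and their state
docker-compose ps

# Tail the logs if something looks wrong (service name "kafka" assumes
# the compose file uses that name)
docker-compose logs kafka
```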
Step 4. Run Kafka 😸
There are two options to run Kafka from here.
[Option 1] Execute a shell inside the Kafka docker container (bash)
[Option 2] Access Kafka directly through the command line
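The two options above could look like the following. The container name `kafka` and the script names are assumptions (they depend on the image you chose; check `docker ps` and the image's documentation for your setup):

```shell
# shell
# [Option 1] Open a bash shell inside the Kafka container
# (container name "kafka" is an assumption; check `docker ps` for yours)
docker exec -it kafka bash

# Inside the container, create a topic and start a console producer
# (script locations vary by image; many put them on the PATH)
kafka-topics.sh --create --topic test --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1
kafka-console-producer.sh --topic test --bootstrap-server localhost:9092

# [Option 2] From the host, run the same tools through docker exec,
# or use a locally installed Kafka CLI pointed at localhost:9092
docker exec -it kafka kafka-console-consumer.sh \
  --topic test --bootstrap-server localhost:9092 --from-beginning
```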
Stopping and Cleaning Up Docker Compose
When you are ready to stop Docker Compose and you’d like to clean up the containers to reclaim disk space, as well as the volumes containing your data, run the following commands:
# stop docker-compose
docker-compose stop

# remove stopped containers and their anonymous volumes
docker-compose rm -v