Docker

For local development it is common practice to spin up test containers on the developer's machine and keep them running throughout development.
An embedded broker/server would be another option, but RabbitMQ and Postgres don't offer embedded versions out of the box, and building a custom one would take too much time.
What you can easily do instead is spawn short-lived local Docker containers running these technologies.
A good and tested solution to this problem is the Testcontainers test dependency.
With the provided RabbitMQ and Postgres images, the containers are launched when the test suite starts and are automatically destroyed on shutdown.
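A minimal sketch of such a test setup, assuming JUnit 5 and the Testcontainers postgresql and rabbitmq modules are on the test classpath (the class name and image tags below are illustrative), could look like this:

import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.containers.RabbitMQContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

@Testcontainers
class MessagingIntegrationTest {

    // Declared static, so the containers are started once for the test class
    // and destroyed automatically after the tests finish.
    @Container
    static final PostgreSQLContainer<?> postgres =
            new PostgreSQLContainer<>(DockerImageName.parse("postgres:15-alpine"));

    @Container
    static final RabbitMQContainer rabbit =
            new RabbitMQContainer(DockerImageName.parse("rabbitmq:3-management"));

    @Test
    void containersAreAvailable() {
        // Connection details are generated per run, e.g. postgres.getJdbcUrl()
        // and rabbit.getAmqpUrl(), and can be handed to the application under test.
        assertTrue(postgres.isRunning());
        assertTrue(rabbit.isRunning());
    }
}

Because the containers are static fields, all tests in the class share the same instances, which keeps the suite reasonably fast.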

Setting up a CI/CD pipeline with centralized logging

An application developed by two engineers began to grow, and more developers had to be onboarded. What had worked for two people (simple manual deployment on a web server and searching through text-file logs when problems arose) would not work for a bigger team and multiple instances of the product deployed in the cloud.

As a new developer on the team, I started by simplifying the deployment and packaging the application together with a web server into a Docker container. Because we were already using Amazon Web Services, I could use its infrastructure to clusterize our servers. After writing so-called task definitions, I could tell the cluster to pull our uploaded Docker image and deploy it as a service so that it is reachable on the web. To make deploying new code easier, I used a feature of our web-based Git repository manager, GitLab: a CI/CD pipeline configured through a script that automatically builds committed code, uploads it as tagged images, and restarts the AWS services via the AWS command line interface, as sketched below.

The only thing still missing was a centralized logging solution that would let us analyze logs from all of our systems. Elasticsearch proved to be the answer. After routing all the Docker containers' stdout to the Elasticsearch engine via Fluentd (a data collector and preprocessor), we could search and analyze our logs easily through a neat web interface called Kibana.
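The exact pipeline depends on the project, but a hedged sketch of such a GitLab configuration gives the idea. The cluster and service names are placeholders, and registry authentication is omitted; $CI_REGISTRY_IMAGE and $CI_COMMIT_SHORT_SHA are GitLab's predefined variables.

.gitlab-ci.yml (sketch):

stages:
  - build
  - deploy

build_image:
  stage: build
  script:
    # Build the application image and push it to the registry, tagged with the commit.
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA

deploy_service:
  stage: deploy
  script:
    # Tell the ECS cluster to pull the freshly pushed image and restart the service.
    - aws ecs update-service --cluster my-cluster --service my-service --force-new-deployment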

Serving multiple websites from a single container goes against the principles of Docker and microservice architecture. Sure, you can do it, but what is stopping you from separating them into multiple Docker containers? That way, if there is a problem, not all of the sites go down. By separating the hosts you also make it easier to diagnose problems and minimize downtime.

The problem can be solved by using an Nginx reverse proxy. Each application is then exposed through a corresponding sub-domain.

Dockerfile:

FROM nginx:alpine
COPY nginx.conf /etc/nginx/conf.d/default.conf
COPY proxy.conf /etc/nginx/includes/proxy.conf

proxy.conf:

proxy_redirect off;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_buffering off;
proxy_request_buffering off;
proxy_http_version 1.1;
proxy_intercept_errors off;

nginx.conf:

# site1 config.
server {
    listen 80;
    server_name site1.myproject.com;

    location / {
        include /etc/nginx/includes/proxy.conf;
        proxy_pass http://site1_webserver_1;
    }

    access_log off;
    error_log /var/log/nginx/error.log error;
}

# site2 config.
server {
    listen 80;
    server_name site2.myproject.com;

    location / {
        include /etc/nginx/includes/proxy.conf;
        proxy_pass http://site2_webserver_1;
    }

    access_log off;
    error_log /var/log/nginx/error.log error;
}

# Default
server {
    listen 80 default_server;
    server_name _;

    root /var/www/html;
    charset UTF-8;

    access_log off;
    log_not_found off;
    error_log /var/log/nginx/error.log error;
}

The proxy_pass target is the name of the application's Docker container; Nginx resolves it through Docker's internal DNS, so the proxy and the application containers must be attached to the same Docker network.
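A hedged docker-compose sketch shows one way to wire this up; the image names and the build path for the proxy are placeholders.

docker-compose.yml (sketch):

version: "3"
services:
  proxy:
    build: ./proxy              # directory containing the Dockerfile and conf files above
    ports:
      - "80:80"                 # only the proxy publishes port 80 on the host
  site1_webserver_1:
    image: site1-app:latest     # placeholder image for the first site
  site2_webserver_1:
    image: site2-app:latest     # placeholder image for the second site

All three services join the same default network, so Nginx can reach the applications by the service names used in proxy_pass above.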


Multiple websites on a single host with Docker

It is not possible to expose multiple web applications in Docker on the same host port at the same time: since all of them use port 80, only one container can bind it and be reachable. The reverse-proxy setup above solves this.

If Hyper-V was manually uninstalled or virtualization was disabled, Docker for Windows will not start.
A solution is to enable virtualization on your machine. To do that, access the BIOS, select Intel Virtualization Technology, and enable it.
Save the changes and restart the computer; Docker can then be started and used for deployment.

Starting Docker on Windows

In order to start Docker on Windows, Hyper-V as well as the Hyper-V Module for Windows PowerShell must be installed and enabled. Otherwise you get an error message.