A full CI environment with Docker 1/2

In my recent blog posts I talked about using Docker to build an image and deploy it to the Azure cloud. Later I encouraged you to check out WSL2 as your new Docker environment. Today I want to show you one of the major advantages of Docker: you can get your hands on some nice tools very quickly. That is a real advantage when you need to build a complete development environment from scratch, be it for an evaluation, the start of a new project, or because you need a tool you never considered before due to its complicated installation process. With Docker you don’t install everything on your host system but run it as a set of containers. With Docker Compose you can start, stop, update or dump the whole environment with a single command. This also comes in handy when onboarding new team members, which becomes almost as easy as running git clone and docker-compose up.

Getting the containers and starting them is quite fast; however, some configuration is required to allow flawless communication between the different tools. In this post I want to show you how to set up a continuous integration (CI) environment solely using Docker. I will use the following tools: GitLab, Nexus, SonarQube, Postgres, Maven and, in part 2, Jenkins.

This first part only covers setting up the individual tools. Integrating them into an automated continuous integration pipeline with Jenkins, which is mostly a networking topic, will be covered in part 2.

In this post I will present the work done by a colleague of mine. I basically tried to run his installation and configuration procedure (for his Linux VM) within the WSL2 environment. Check out his other posts about Docker Best Practices!

As I intend to describe the installation and configuration as a working example, I will not go into much detail about the individual tools; that would exceed the limits of a single blog post by far. Instead, this guide should leave you with a running environment even if you are new to each of the tools.

To connect to my last post, I run Docker Desktop using WSL2. This has the advantage that you don’t need to set up a Linux VM first when working on a Windows machine. Following my own best practices for WSL2, I keep my code and everything else inside WSL2 instead of working from the Windows file system. All commands should be run from within a WSL2 terminal.

In the end, the complete tool chain can be started with this docker-compose file (clone the repo). For the first use, however, we will start the tools one by one for configuration purposes. I will also show how to work with each tool from your host; in part 2 we will delegate most of this to Jenkins. After everything is set up correctly, you can start the whole environment by simply running

docker-compose up -d

But that is still a way to go, so let’s get started…
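To give you an idea of what that compose file looks like, here is a minimal sketch of its structure. The service names match the ones used throughout this post; the image tags, port mappings and volume names are my assumptions, the cloned repo is authoritative:

```yaml
version: "3"
services:
  nexus:
    image: sonatype/nexus3           # assumption: official Nexus 3 image
    ports:
      - "8081:8081"
    volumes:
      - nexus-data:/nexus-data       # Nexus keeps its state (and admin.password) here
  sonardb:
    image: postgres:12               # assumption: any recent Postgres works
  sonarqube:
    image: sonarqube
    ports:
      - "9000:9000"
    depends_on:
      - sonardb
  gitlab:
    image: gitlab/gitlab-ce
    ports:
      - "80:80"
      - "10022:22"                   # SSH mapped to 10022, as used later in this post
volumes:
  nexus-data:
```

With this in place, `docker-compose up -d <service>` starts a single tool, and plain `docker-compose up -d` starts them all.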

Sonatype NEXUS

Nexus is a repository manager to store and manage your build artifacts. Instead of always relying on a remote repository like Maven Central, you have full control over your dependencies. Nexus can act as a proxy to Maven Central and lets you share your jars with your team. Furthermore, it can also be used as a registry for your Docker images or npm artifacts.

Start the Nexus container (it may take some time)

docker-compose up -d nexus

For the first login to Nexus you need to fetch the initial admin password:

docker-compose exec nexus cat /nexus-data/admin.password

Change the password to your liking and allow anonymous access.

Now we will add a new repository, for which we also need a user with the correct permissions. Go to Administration > Repositories and create a maven2 (hosted) repository; we will use it to publish the build artifacts. Use a mixed version policy and allow redeploy. I assume you chose something like dockerCI-repo as the repository name (id). Add it to your maven-public repository group.

To manage access to this new repo of yours, you have to create a role that can access it. Create the role dockerCI-users and assign it the privilege that grants access to your dockerCI-repo.
In addition, to allow read access for anonymous users, add the new role to the anonymous user as well.
Create an active user (Username: dockerCI-user; Password: user) and assign the newly created role. To check that everything worked, try to log in with the new user. You should be able to see your empty repo.

Use Nexus from host

Now we have to configure Maven to use your Nexus repository manager instead of the default connection to the central repository, and to provide the necessary credentials to allow publishing artifacts to your dockerCI-repo. This can be done in the global settings as well as in the project POM. To learn more about it, check here.

To test if everything works out you need to have an example Maven project to work with.

<!--filepath (~/.m2/settings.xml)-->
     <name>Nexus repo manager</name>
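The snippet above survived only partially; a settings.xml that wires Maven to this Nexus instance could look roughly like the following sketch. The server id dockerCI-repo and the credentials come from the Nexus setup above; the port 8081 and the maven-public URL are assumptions based on a default Nexus installation:

```xml
<!-- ~/.m2/settings.xml -->
<settings>
  <servers>
    <server>
      <!-- must match the repository id used in the project POM -->
      <id>dockerCI-repo</id>
      <username>dockerCI-user</username>
      <password>user</password>
    </server>
  </servers>
  <mirrors>
    <mirror>
      <id>nexus</id>
      <name>Nexus repo manager</name>
      <!-- route all dependency downloads through the Nexus public group -->
      <mirrorOf>*</mirrorOf>
      <url>http://localhost:8081/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```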

In the project POM add the following

<!-- <local project directory>/pom.xml -->
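The POM fragment above was also cut off. For mvn deploy to work, the POM needs a distributionManagement section along these lines; the URL is an assumption based on the default Nexus port and the repository name chosen above:

```xml
<!-- <local project directory>/pom.xml -->
<distributionManagement>
  <repository>
    <!-- must match the <server> id in ~/.m2/settings.xml -->
    <id>dockerCI-repo</id>
    <url>http://localhost:8081/repository/dockerCI-repo/</url>
  </repository>
</distributionManagement>
```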

Now you can publish your artifacts to Nexus by deploying any Maven project (make sure the IDs match!):

mvn deploy

If you get a BUILD SUCCESS you should now also see the artifact in the browser. If you get an “Unauthorized” error message, check that your changes in the config files match your repo’s parameters and that you don’t have any typos!

These settings, however, only apply when accessing Nexus from your local machine. Later, when everything is handled automatically by Jenkins, we must adapt the settings to the container context.

SonarQube and Postgres

SonarQube is a tool to continuously review the quality of your source code. You can follow predefined best practices or define your own set of rules. This way you can add a quality gate for your build artifacts. In addition, it helps to detect bugs and security vulnerabilities and highlights metrics like test coverage vs. complexity. This helps to raise the quality standard of all committed code. More information about Sonar with Maven.

For storing the data, instead of using the built-in H2, a Postgres database is added to the environment.
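As a sketch of how this wiring might look in the compose file: recent official SonarQube images read the database connection from SONAR_JDBC_* environment variables, while the concrete user, password and database names here are my assumptions:

```yaml
sonardb:
  image: postgres:12
  environment:
    POSTGRES_USER: sonar
    POSTGRES_PASSWORD: sonar
    POSTGRES_DB: sonar
sonarqube:
  image: sonarqube
  depends_on:
    - sonardb
  environment:
    # "sonardb" resolves via the compose network's internal DNS
    SONAR_JDBC_URL: jdbc:postgresql://sonardb:5432/sonar
    SONAR_JDBC_USERNAME: sonar
    SONAR_JDBC_PASSWORD: sonar
```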

As SonarQube uses Elasticsearch, the kernel parameter vm.max_map_count of the VM has to be increased, otherwise the container will not start.

# only for the current session
sudo sysctl -w vm.max_map_count=262144
# persist the setting across restarts
sudo sh -c "echo vm.max_map_count=262144 >> /etc/sysctl.conf"

Start both containers

docker-compose up -d sonardb sonarqube

If it worked, you should be able to access SonarQube with the default credentials (admin/admin).

Use SonarQube from host

To access SonarQube with your local Maven installation, you need to generate a token and store it in a shell variable.

SONAR_TOKEN=<paste_your_token_here>  # no whitespace around "="

Go to your project directory and run

mvn sonar:sonar -Dsonar.host.url=http://localhost:9000/sonarqube/ \
                -Dsonar.login=${SONAR_TOKEN}

You can directly check if your source code failed or passed all tests within the browser.

Sonar of course also works with other languages and tools.


GitLab

Naturally you also want to have a source code management tool. In this case we use GitLab. Start it with:

docker-compose up -d gitlab

Give it some time to start; check with docker ps whether the container has finished starting.

Check if you can access GitLab. If you run into problems (GitLab keeps restarting or runs unhealthily), try the following commands:

host# docker-compose exec -it gitlab bash 
gitlab# usermod -aG git gitlab-www
gitlab# usermod -aG gitlab-www www-data
gitlab# update-permissions
gitlab# exit
host# docker-compose restart gitlab

If you run into an ERROR 404, you may have an issue with your relative path.

You need to set a password (length >= 8) and afterwards log in as the root (not admin!) user. Create a new user (user) and add a project (default-project) for them. Don’t forget to directly assign a password when you use a non-existing email address. Try to log in with your new user credentials.

Use GitLab from host

To access your new GitLab repository with your local Git, you need to configure an SSH connection for the user. Generate a new SSH key pair:

ssh-keygen -t rsa -b 4096 -f ~/.ssh/dockerCI_key

Add the following to your SSH configuration (~/.ssh/config) to let Git use port 10022 instead of the default port 22:

Host gitlab
    Hostname localhost
    IdentityFile ~/.ssh/dockerCI_key
    User git
    Port 10022

Add your public key to the user’s profile settings in GitLab:

cat ~/.ssh/dockerCI_key.pub

Now you need to add a new remote to your local project.

git remote add gitlab git@gitlab:user/default-project.git

Push your project to the new remote.

git push gitlab master

Check out your new project repo in the browser. You can now also easily clone it again if you removed it from your local machine:

git clone ssh://git@gitlab/user/default-project.git

Now you are able to use Nexus, SonarQube and GitLab as local development tools with the complete freedom of an administrator. In the next part I will show you how to integrate these tools with Jenkins into an automated continuous integration pipeline.

So stay tuned and feel free to ask questions or leave a comment.


