
//Cloudogu EcoSystem Docs

Developing the SonarQube dogu

Build the SonarQube dogu inside a running CES instance by changing into this repository. Then call cesapp to install/upgrade and start the dogu:

cd /your/workspace/sonar
cesapp build .
cesapp start sonar

Shell testing with BATS

You can create and amend bash tests in the unitTests directory. The make target unit-test-shell will support you with a generalized bash test environment.

make unit-test-shell

BATS is configured to leave JUnit compatible reports in target/shell_test_reports/.
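As a sketch of what such a test can look like (the file name, util.sh, and the greet function are made up for illustration), a BATS file in the unitTests directory might read:

```shell
#!/usr/bin/env bats
# unitTests/util.bats -- hypothetical example; util.sh and greet are made up.

setup() {
  # Point STARTUP_DIR at the checked-out scripts instead of the container's /.
  export STARTUP_DIR="${BATS_TEST_DIRNAME}/.."
}

@test "greet prints a greeting" {
  # shellcheck disable=SC1090
  source "${STARTUP_DIR}/util.sh"
  run greet
  [[ "${status}" -eq 0 ]]
  [[ "${output}" == hello* ]]
}
```

BATS runs setup() before every test case; run captures the exit code and output of the called function in the status and output variables.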

In order to write testable shell scripts these aspects should be respected:

Global environment variable STARTUP_DIR

The global environment variable STARTUP_DIR will point to the directory where the production scripts (aka: scripts-under-test) reside. Inside the dogu container this is usually /. But during testing it is easier to put it somewhere else for permission reasons.

A second reason is that the scripts-under-test source other scripts. Absolute paths would make testing quite hard. Source new scripts like this so that the tests run smoothly:

# shellcheck disable=SC1090
source "${STARTUP_DIR}"/util.sh

Please note the shellcheck disablement comment in the above example. Because STARTUP_DIR is wired into the Dockerfile, it is considered a global environment variable that will never be unset (an unset variable would quickly lead to errors).

Currently, sourcing scripts in a static manner (that is, without a dynamic variable in the path) makes shell testing impossible (unless you find a better way to construct the test container).
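To illustrate why the dynamic path matters, here is a minimal, self-contained sketch of a test redirecting STARTUP_DIR to a temporary directory before sourcing (util.sh and greet are made-up stand-ins for the production files):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Create a throwaway directory that stands in for the container's / directory.
STARTUP_DIR="$(mktemp -d)"
export STARTUP_DIR

# Place a fake util.sh there (a real test would copy the production file).
cat > "${STARTUP_DIR}/util.sh" <<'EOF'
function greet() {
  echo "hello from util"
}
EOF

# The script-under-test can now source its dependency via the dynamic path.
# shellcheck disable=SC1090
source "${STARTUP_DIR}/util.sh"
greet

# Clean up the temporary directory.
rm -rf "${STARTUP_DIR}"
```

Because the path is a variable, the test decides where the sourced files live; with a hard-coded absolute path this redirection would be impossible.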

General structure of scripts-under-test

It is rather uncommon to run a script-under-test like startup.sh all on its own. Effective unit testing will most probably turn into a nightmare if no proper script structure is put in place. Because these scripts source each other AND execute code, everything must be set up beforehand: global variables, mocks of every single binary being called, and so on. In the end the tests would sit at the end-to-end test level rather than the unit test level.

The good news is that testing single functions is possible with these little parts:

  1. Use sourcing execution guards
  2. Run binaries and logic code only inside functions
  3. Source with (dynamic yet fixed-up) environment variables
Use sourcing execution guards

Make sourcing possible with sourcing execution guards, like this:

# yourscript.sh
function runTheThing() {
    echo "hello world"
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  runTheThing
fi

The if-condition above will be executed when the script is run via the shell, but not when it is sourced:

$ ./yourscript.sh
hello world
$ source yourscript.sh
$ runTheThing
hello world
$

Execution guards also work with parameters:

# yourscript.sh
function runTheThing() {
    echo "${1} ${2}"
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  runTheThing "$@"
fi

Note the proper argument passing with "$@" which allows for arguments that contain whitespace and such.

$ ./yourscript.sh hello world
hello world
$ source yourscript.sh
$ runTheThing hello bash
hello bash
$
Run binaries and logic code only inside functions

Environment variables and constants are okay, but once logic runs outside a function it will be executed during script sourcing.
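A small made-up sketch of this rule (the constant and function names are for illustration only):

```shell
#!/usr/bin/env bash
# Safe at top level -- sourcing this file has no side effects:
DEFAULT_SONAR_PORT=9000

function waitForSonar() {
  # Binary calls and logic belong in functions, so a test can source the
  # script and mock external binaries before anything runs.
  echo "waiting for SonarQube on port ${DEFAULT_SONAR_PORT}"
}

# Bad: a top-level call like the following would already execute during
# sourcing, before the test had any chance to set up mocks:
# curl --fail "http://localhost:${DEFAULT_SONAR_PORT}/api/system/status"
```

A test can now source this file, replace curl with a mock, and call waitForSonar in isolation.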

Source with (dynamic yet fixed-up) environment variables

Shellcheck basically flags this as a no-no (SC1090). Anyhow, unless the test container allows for appropriate script paths, there is hardly a way around it:

sourcingExitCode=0
# shellcheck disable=SC1090
source "${STARTUP_DIR}"/util.sh || sourcingExitCode=$?
if [[ ${sourcingExitCode} -ne 0 ]]; then
  echo "ERROR: An error occurred while sourcing /util.sh."
fi

At least make sure that the variables are properly set in both the production environment (f. i. the Dockerfile) and the test environment (set up an env var in your test).
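The pairing could look like this sketch (the relative path is made up for illustration): the Dockerfile sets the production value, while the test's setup function overrides it.

```shell
# Production side (Dockerfile):   ENV STARTUP_DIR="/"
#
# Test side -- a setup() function as BATS would call it before each test.
setup() {
  # BATS_TEST_DIRNAME is set by BATS to the directory of the test file;
  # fall back to the current directory when run outside BATS.
  export STARTUP_DIR="${BATS_TEST_DIRNAME:-$PWD}/../resources"
}
```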

Test SonarQube Dogu

Due to communication problems caused by self-signed SSL certificates in a development CES instance, it is a good idea to run SonarScanner via Jenkins in the same instance. The following procedure has proven successful:

  1. Install SCM Manager, Jenkins and Nexus in CES:

    cesapp install official/scm
    cesapp start scm
    
    cesapp install official/jenkins
    cesapp start jenkins
    
    cesapp install official/nexus
    cesapp start nexus 
  2. SCMM:

  3. SonarQube

    1. Create a SonarQube token

      1. Navigate as admin to the Security page
      2. Generate a token

        • Name: admin_token
        • Type: Global Analysis Token
        • Expires in: 30 Days
      3. Copy the generated token
    2. Create a webhook

      1. Navigate as admin to the global Webhooks page
      2. Create a new webhook with the [ Create ] button

        • Name: ces
        • URL: https://192.168.56.2/jenkins/sonarqube-webhook
        • Secret: leave empty
  4. Jenkins

    1. Install Sonar-Plugin

    2. Create sonar scanner if necessary

      • Navigate to Dashboard/Manage Jenkins/Tools
      • In the "SonarQube Scanner Installations" section, create an entry via Maven Central
      • name: sonar-scanner
      • The latest version can be used
    3. Configure SonarServer if necessary

      • Navigate to Dashboard/Manage Jenkins/System
      • Configure the following in the "SonarQube servers" section

        • Environment variables: yes/check
        • Name: sonar
        • Server URL: http://sonar:9000/sonar
        • Server authentication token: Press add

          • Create credential under the ID sonarAnalyzeToken of type "Secret Text" with the token generated in SonarQube
    4. Insert credentials for SCMM and SonarQube in the Jenkins Credential Manager

      • Store admin credentials under the ID scmCredentials

        • SCMM and SonarQube share admin credentials (SCMM in the build configuration, SonarQube in the Jenkinsfile)
      • Pay attention to the credential type for SonarQube!

        • Username/Password for Basic Authentication
    5. Create build job

      1. Create a new item -> Select Multibranch Pipeline -> Configure job

        • Select Branch Sources/Add source: "SCM-Manager (git, hg)"
        • Server URL: https://192.168.56.2/scm/
        • Credentials for SCM Manager: select the credential scmCredentials configured above
      2. Save the job

        • The Jenkinsfile will be found automatically
      3. If necessary, cancel surplus or non-functioning jobs
      4. Adapt and build the master branch with regard to changed credentials or unwanted job stages