Developing the SonarQube dogu

To build the SonarQube dogu inside a running CES instance, change into this repository's directory. Then call cesapp to build (install/upgrade) and start the dogu:

cd /your/workspace/sonar
cesapp build .
cesapp start sonar

Integrating and Testing the Sonar CAS Plugin within the Dogu

There are two alternatives for testing development versions of the Sonar CAS Plugin (the compile instructions can be found in the plugin's repository):

  1. Replace the plugin version in an already running SonarQube

    • rm /var/lib/ces/sonar/volumes/extensions/plugins/sonar-cas-plugin-2.0.1.jar
    • cp your-sonar-cas-plugin.jar /var/lib/ces/sonar/volumes/extensions/plugins/
    • sudo docker restart sonar
  2. Modify the Dockerfile and build another image with your local plugin version

    • comment out the lines that deal with the sonar-cas-plugin
    • add a new line that COPYs your plugin, like so:

      • COPY --chown=1000:1000 sonar-cas-plugin-3.0.0-SNAPSHOT.jar ${SONARQUBE_HOME}/sonar-cas-plugin-3.0.0-SNAPSHOT.jar

Shell testing with BATS

You can create and amend bash tests in the unitTests directory. The make target unit-test-shell will support you with a generalized bash test environment.

make unit-test-shell

BATS is configured to leave JUnit compatible reports in target/shell_test_reports/.
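
A minimal test file could look like the following sketch. File and function names are placeholders only; adjust them to the actual script-under-test:

#!/usr/bin/env bats
# unitTests/example.bats -- hypothetical file name

setup() {
  # assumption: the scripts-under-test are reachable relative to this test file
  export STARTUP_DIR="${BATS_TEST_DIRNAME}/.."
  # shellcheck disable=SC1090
  source "${STARTUP_DIR}/yourscript.sh"
}

@test "runTheThing prints a greeting" {
  run runTheThing

  [ "${status}" -eq 0 ]
  [ "${output}" = "hello world" ]
}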

In order to write testable shell scripts these aspects should be respected:

Global environment variable STARTUP_DIR

The global environment variable STARTUP_DIR will point to the directory where the production scripts (aka: scripts-under-test) reside. Inside the dogu container this is usually /. But during testing it is easier to put it somewhere else for permission reasons.

A second reason is that the scripts-under-test source other scripts. Absolute paths would make testing quite hard. Source new scripts like this so that the tests run smoothly:

source "${STARTUP_DIR}"/util.sh

Please note the shellcheck disablement comment in the above example. Because STARTUP_DIR is wired into the Dockerfile, it is considered a global environment variable that will never be unset (an unset value would quickly lead to errors).

Currently, sourcing scripts in a static manner (that is: without a dynamic variable in the path) makes shell testing impossible (unless you find a better way to construct the test container).
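
To illustrate the difference, the first form below cannot be redirected from a test, while the second can (the file name is just an example):

# hard to test: the path is fixed, a test cannot redirect it
source /util.sh

# testable: the test can point STARTUP_DIR to wherever the scripts were copied
# shellcheck disable=SC1090
source "${STARTUP_DIR}"/util.sh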

General structure of scripts-under-test

It is rather uncommon to run a script-under-test like startup.sh all on its own. Effective unit testing will most probably turn into a nightmare if no proper script structure is put in place. Because these scripts source each other AND execute code, everything must be set up beforehand: global variables, mocks of every single binary being called, and so on. In the end the tests would be on an end-to-end test level rather than on a unit test level.

The good news is that testing single functions is possible with these little parts:

  1. Use sourcing execution guards
  2. Run binaries and logic code only inside functions
  3. Source with (dynamic yet fixed-up) environment variables

Use sourcing execution guards

Make sourcing possible with sourcing execution guards, like this:

# yourscript.sh
function runTheThing() {
    echo "hello world"
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  runTheThing
fi

The if-condition above is executed when the script is called via the shell, but not when it is sourced:

$ ./yourscript.sh
hello world
$ source yourscript.sh
$ runTheThing
hello world
$

Execution guards also work with parameters:

# yourscript.sh
function runTheThing() {
    echo "${1} ${2}"
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  runTheThing "$@"
fi

Note the proper argument passing with "$@" which allows for arguments that contain whitespace and such.

$ ./yourscript.sh hello world
hello world
$ source yourscript.sh
$ runTheThing hello bash
hello bash
$

Run binaries and logic code only inside functions

Environment variables and constants are okay, but any logic that runs outside a function will be executed while the script is being sourced.
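
As a sketch (doguctl serves here only as an example of a called binary): the first variant runs as soon as the script is sourced, while the second merely defines a function that a test can call or mock:

# problematic: runs the moment the script is sourced
doguctl state "installing"

# testable: only defined during sourcing, executed on demand
function setInstallingState() {
  doguctl state "installing"
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  setInstallingState
fi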

Source with (dynamic yet fixed-up) environment variables

Shellcheck basically says this is a no-no. However, unless the test container allows for appropriate script paths, there is hardly a way around it:

sourcingExitCode=0
# shellcheck disable=SC1090
source "${STARTUP_DIR}"/util.sh || sourcingExitCode=$?
if [[ ${sourcingExitCode} -ne 0 ]]; then
  echo "ERROR: An error occurred while sourcing /util.sh."
fi

At least make sure that the variables are properly set in both the production environment (for instance, the Dockerfile) and the test environment (set up an env var in your test).
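
In a BATS test this can be done in the setup() function, for example like this (the directory layout and the mocks/ directory are assumptions):

setup() {
  # point sourcing to the copied scripts-under-test (same idea as in the sketch further above)
  export STARTUP_DIR="${BATS_TEST_DIRNAME}/.."
  # assumption: a mocks/ directory with fake binaries (e.g. a doguctl stub) shipped with the tests
  export PATH="${BATS_TEST_DIRNAME}/mocks:${PATH}"
}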

Testing the SonarQube Dogu

Due to communication problems caused by self-signed SSL certificates in a development CES instance, it is a good idea to run SonarScanner via Jenkins in the same instance. The following procedure has proven successful:

  1. Install SCM Manager and Jenkins in CES

    • cesapp install official/jenkins; cesapp install official/scm; cesapp start scm; cesapp start jenkins
  2. SCMM: make sure the repository to be scanned (spring-petclinic is used in the steps below) is available in SCM-Manager

  3. SonarQube

    1. Create a SonarQube token (alternatively, see the curl sketch after this list)

      • Navigate as admin to the Security page
      • Generate a token

        • Name: admin_token
        • Type: Global Analysis Token
        • Expires in: 30 Days
      • Copy the generated token
    2. Create a webhook

      • Navigate as admin to the global Webhooks page
      • Create a new webhook with the [ Create ] button

        • Name: ces
        • URL: https://192.168.56.2/jenkins/sonarqube-webhook
        • Secret: leave empty
  4. Jenkins

    1. Create a SonarQube Scanner installation if necessary

      • Navigate to Dashboard/Manage Jenkins/Tools
      • In the "SonarQube Scanner Installations" section, create an entry via Maven Central

        • Name: sonar-scanner
        • Version: 4.8.1 (maximum Java 11)
    2. Configure the SonarQube server if necessary

      • Navigate to Dashboard/Manage Jenkins/System
      • Configure the following in the "SonarQube servers" section:

        • Environment variables: yes/check
        • Name: sonar
        • Server URL: http://sonar:9000/sonar
        • Server authentication token: press "Add"

          • Create a credential of type "Secret Text" with the token generated in SonarQube
          • Do not configure a secret for the webhook
    3. Insert credentials for SCMM and SonarQube in the Jenkins credential manager

      • Store the admin credentials under the ID scmCredentials

        • SCMM and SonarQube share the admin credentials (SCMM in the build configuration, SonarQube in the Jenkinsfile)
      • Pay attention to the credential type for SonarQube: Username/Password for Basic Authentication
    4. Create the build job

      • Create a new item -> select "Multibranch Pipeline" -> configure the job

        • Select Branch Sources/Add source: "SCM-Manager (git, hg)"
        • Server URL: https://192.168.56.2/scm/
        • Credentials for SCM Manager: select the credential scmCredentials configured above
      • Save the job; the Jenkinsfile will be found automatically
      • If necessary, cancel surplus/non-functioning jobs
    5. Adapt and build the master branch with regard to changed credentials or unwanted job days

      • An old version (ces-build-lib@1.35.1) of the ces-build-lib is important; newer versions lead to authentication errors
      • Exchanging it for a newer build-lib is not relevant in the context of SonarQube smoke tests
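
If you prefer the command line over the web UI, the token and the webhook from the SonarQube step can also be created via SonarQube's web API. This is only a sketch; the admin credentials and the FQDN 192.168.56.2 are assumptions for a local development instance, and --insecure is needed because of its self-signed certificate:

# generate a global analysis token (the token is printed once; copy it for Jenkins)
curl --insecure -u admin:yourpassword -X POST \
  "https://192.168.56.2/sonar/api/user_tokens/generate" \
  -d "name=admin_token" -d "type=GLOBAL_ANALYSIS_TOKEN"

# create the webhook that notifies Jenkins about finished scans
curl --insecure -u admin:yourpassword -X POST \
  "https://192.168.56.2/sonar/api/webhooks/create" \
  -d "name=ces" -d "url=https://192.168.56.2/jenkins/sonarqube-webhook"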

Testing the SonarQube Community Plugin

  1. in SonarQube: set up the community branch plugin

    1. as CES shell administrator: download the community branch plugin JAR that matches the installed SonarQube version and move it to /var/lib/ces/sonar/volumes/extensions/plugins/ (see the command-line sketch after this list)
    2. restart SonarQube
  2. in the SCM Manager: install the editor and review plugins

    • This makes it possible to edit source files without git clone ... ; git commit ...
  3. edit the master branch of spring-petclinic

    • create a sonar-project.properties in the SCM Manager (if not already available)

      • see below for an example file
      • this ensures that SonarQube finds the built .class files
    • enrich the Jenkinsfile in the SCM Manager so that stage("build") and stage("integration test") are also available

      • see below for an example file
      • This ensures that SonarQube also scans PR branches and informs Jenkins about the status
  4. in SonarQube: redeclare the main branch (only necessary if a wrong branch has been scanned)

    1. navigate to Projects
    2. rename the project marked as main to the desired branch, e.g. master
    3. delete the remaining projects
  5. test PR branch recognition

    1. create a new branch based on master in the SCM Manager
    2. minimally modify and commit any file (so a PR can be created)
    3. create PR from new branch on master
  6. after PR creation, check SonarQube and Jenkins job for the scan result
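
The download and restart from step 1 could look like the following sketch on the CES host. The plugin version and the GitHub release URL pattern are assumptions and have to be chosen to match the installed SonarQube version:

# as CES shell administrator on the CES host (the version number is only an example)
PLUGIN_VERSION=1.14.0
sudo wget -O "/var/lib/ces/sonar/volumes/extensions/plugins/sonarqube-community-branch-plugin-${PLUGIN_VERSION}.jar" \
  "https://github.com/mc1arke/sonarqube-community-branch-plugin/releases/download/${PLUGIN_VERSION}/sonarqube-community-branch-plugin-${PLUGIN_VERSION}.jar"
sudo chown 1000:1000 "/var/lib/ces/sonar/volumes/extensions/plugins/sonarqube-community-branch-plugin-${PLUGIN_VERSION}.jar"
sudo docker restart sonar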

sonar-project.properties

sonar.projectKey=spring-petclinic

sonar.sources=./src/main/java
sonar.tests=./src/test/java
sonar.java.binaries=./target/classes

sonar.junit.reportPaths=./target/surefire-reports
sonar.coverage.jacoco.xmlReportPaths=./target/site/jacoco/jacoco.xml

Jenkinsfile

This is just a template! Please see the comments for the necessary changes.

#!groovy
@Library('github.com/cloudogu/ces-build-lib@2.2.1')
import com.cloudogu.ces.cesbuildlib.*

node {

    Git git = new Git(this, "admin")
    git.committerName = 'admin'
    git.committerEmail = 'admin@admin.de'
    projectName="spring-petclinic"
    branch = "${env.BRANCH_NAME}"
    Maven mvn = new MavenWrapper(this)

    String credentialsId = 'scmCredentials'

    catchError {
        // Add the usual Checkout, Build, Test, Integration Test stages here
        stage("...") {}
        
        stage('SonarQube') {
            def scannerHome = tool name: 'sonar-scanner', type: 'hudson.plugins.sonar.SonarRunnerInstallation'
            env.JAVA_HOME="${tool 'OpenJDK-11'}"
            withSonarQubeEnv {
                gitWithCredentials("fetch --all", credentialsId)

                if (branch == "master") {
                    echo "This branch has been detected as the master branch."
                    sh "${scannerHome}/bin/sonar-scanner -Dsonar.projectKey=${projectName} -Dsonar.projectName=${projectName}"
                } else if (branch == "develop") {
                    echo "This branch has been detected as the develop branch."
                    sh "${scannerHome}/bin/sonar-scanner -Dsonar.projectKey=${projectName} -Dsonar.projectName=${projectName} -Dsonar.branch.name=${env.BRANCH_NAME} -Dsonar.branch.target=master  "
                } else if (env.CHANGE_TARGET) {
                    echo "This branch has been detected as a pull request."
                    sh "${scannerHome}/bin/sonar-scanner -Dsonar.projectKey=${projectName} -Dsonar.projectName=${projectName} -Dsonar.pullrequest.key=${env.CHANGE_ID} -Dsonar.pullrequest.branch=${env.CHANGE_BRANCH} -Dsonar.pullrequest.base=develop    "
                } else {
                    echo "This branch has been detected as a feature branch."
                    sh "${scannerHome}/bin/sonar-scanner -Dsonar.projectKey=${projectName} -Dsonar.projectName=${projectName} -Dsonar.branch.name=${env.BRANCH_NAME} -Dsonar.branch.target=develop"
                } // add more to your liking
            }

            timeout(time: 60, unit: 'SECONDS') { // Works best with Webhooks, otherwise it needs a sleep which may not work for the async SQ scan
                def qGate = waitForQualityGate()
                if (qGate.status != 'OK') {
                    unstable("Pipeline unstable due to SonarQube quality gate failure")
                }
            }
        }
    }

    junit allowEmptyResults: true, testResults: '**/target/failsafe-reports/TEST-*.xml,**/target/surefire-reports/TEST-*.xml'
}

void gitWithCredentials(String command, String credentialsId) {
    withCredentials([usernamePassword(credentialsId: credentialsId, usernameVariable: 'GIT_AUTH_USR', passwordVariable: 'GIT_AUTH_PSW')]) {
        sh(
                script: "git -c credential.helper=\"!f() { echo username='\$GIT_AUTH_USR'; echo password='\$GIT_AUTH_PSW'; }; f\" " + command,
                returnStdout: true
        )
    }
}