
Synchronizing Multiple SOAtest Servers using Jenkins and GIT


This post is associated with the following video on the Parasoft customer portal:
https://customerportal.parasoft.com/lightningportal/s/videodetail?id=a3A4P0000047JbVUAU

For a demo of this infrastructure, follow the link to the video above.

Intro:
In this forum post I will discuss the requirements and process for creating a CI pipeline that keeps remote SOAtest and Virtualize Server workspace(s) up to date with a source control repository.
First, an image of the infrastructure discussed in this post:

Requirements for the above scenario:
Source control with webhook capabilities
CI pipeline (Jenkins for this example)
Multiple SOAtest Servers
Continuous Testing Platform

For the integration, we will look at the major portions of the scenario and break down the main steps:
1. User checks in test artifacts to Source Control
2. Source Control passes a set of parameters to Jenkins, triggered by a post-receive webhook
3. Jenkins Job #1 consumes these parameters, sets a location for the artifacts to be deployed to, and sends the parameters to Jenkins Job #2
4. Jenkins Job #2 checks out files from the repository to all SOAtest servers

Now to deconstruct these steps:
(Step 1. User checks in test artifacts to Source Control)
This step should not change from your organization's usual source control process. The general idea is that a member of the team pushes updated test cases into the SC repository that represents a portion of the workspace.

(Step 2. Source Control passes a set of parameters to Jenkins)
This step is where this infrastructure becomes interesting. We need a method to send information about the latest push to the CI pipeline (Jenkins) in order to trigger a source control pull on the remote SOAtest Servers.
To do this we can use source control webhooks to send a request to the Jenkins server with details on the repository that was updated.
Depending on your source control/source control manager, there are different ways to send this data to the Jenkins server. One is a post-receive script, which should be available in most source control systems; the other is a webhook-with-parameters plugin, if one is available for your source control manager.
Attached is a post-receive shell script that sends the following parameters to Jenkins:
Repository name, branch information, and the reference ID (includes the branch)
This script was created for the following infrastructure:
OS: Linux
Source Control: Git
Source Control Manager: Bitbucket
Continuous Integration Tool: Jenkins
Depending on your needs, you may not need all the parameters included in the script.
For example, I only use repoID, since I only pull when the master branch is updated; test cases that have not been validated for the master branch should not be placed on the automation servers. You may need additional parameters to pull from specific branches and to validate the hash for verification. The script is a starting point for those who do not have access to a plugin that supports parameters, or whose plugin does not support the parameters required for their scenario.
Let's take a look at a snippet from the script and see exactly what is being sent to the Jenkins server.
cmd=(
curl -v -X POST --data ""
--user "$jenkinsUser:$jenkinsToken"
"$jenkinsJob/buildWithParameters?token=$jenkinsToken"
--data-urlencode "REPO_ID=$repoId"
--data-urlencode "NEW_HASH=$newHash"
--data-urlencode "REF_ID=$refId"
)
Here the script builds a Jenkins REST API request that triggers a Jenkins job and passes the previously mentioned information to Jenkins. For more information on Jenkins job execution with parameters, see the following link:
https://wiki.jenkins.io/display/JENKINS/Parameterized+Build
We build a URL with the following structure:
http://server/job/myJobName/buildWithParameters?token=JenkinsToken&REPO_ID=Value1&NEW_HASH=Value2&REF_ID=Value3
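If you do not have access to the attached script, here is a minimal sketch of what a post-receive hook along these lines might look like. It is not the attached script; the Jenkins URL, credentials, and repository name below are placeholders, and the master-only check mirrors the behavior described above (Git supplies the old revision, new revision, and ref name on stdin for each pushed ref):

#!/bin/bash
# Minimal post-receive hook sketch -- placeholder values, not the attached script
jenkinsUser="jenkins-user"                          # placeholder credentials
jenkinsToken="jenkins-api-token"                    # placeholder token
jenkinsJob="http://jenkins.example.com/job/Job1"    # placeholder Jenkins job URL
repoId="PROS/Repo1"                                 # placeholder; typically derived from the repo path on disk

# Git passes "<old-rev> <new-rev> <ref-name>" on stdin for each updated ref
while read oldHash newHash refId; do
    # Only notify Jenkins for pushes to master in this example
    if [ "$refId" = "refs/heads/master" ]; then
        curl -v -X POST --data "" \
            --user "$jenkinsUser:$jenkinsToken" \
            "$jenkinsJob/buildWithParameters?token=$jenkinsToken" \
            --data-urlencode "REPO_ID=$repoId" \
            --data-urlencode "NEW_HASH=$newHash" \
            --data-urlencode "REF_ID=$refId"
    fi
done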
(Step 3. Jenkins Job #1 consumes these parameters, sets a location for the artifacts to be deployed to, and sends the parameters to Jenkins Job #2)
The API request is sent to the Jenkins installation, to the job set in the path; this job parses the parameters and prepares them for the next step.
This step depends on your organization's source control structure in relation to the workspace. Do you have a single repo that is updated with all test cases, or multiple repositories that each represent a different folder in the workspace?
There are advantages to each, and it is up to the team/organization to decide how they would like to organize their repositories.

Configuration for multiple repositories (if you use a single repository, skip to step 4)
In order to keep the SOAtest workspace structure when you have multiple repositories, a process is necessary to track which workspace folder needs to be updated when a specific SC repository receives a push. Otherwise you place everything in the same folder and have a mess on your hands. If you only have 10-20 TSTs or PVAs this can be acceptable, but these infrastructures are for enterprise-level solutions, where it is not unusual to have hundreds of TSTs/PVAs, which makes organization critical.

Method 1: To keep the organization in your SOAtest workspace, we recommend creating two different Jenkins jobs. The first Jenkins job is initialized by the post-receive webhook from Step 2; the job then determines the correct location for the repository pull.
To do this, we recommend using a short script with two arrays: the first array has all the repository IDs and the second has the folder locations, and the two arrays need to be kept in sync between repo IDs and locations. Compare the repository ID from the parameters to the list and get the matching array value; now you have the location in the workspace, so you can trigger a repository pull in the next Jenkins job.
E.g:
#!/bin/bash

echo $REPO_ID
echo $NEW_HASH
echo $REF_ID
#Initialize repositories
RepoNames=("PROS/Repo1" "PROS/Repo2")
RepoDir=(" Workspace1_Location" " Workspace2_Location")
FoundDir=
#Loop repositories until you find a match with the incoming RepoID
for i in "${!RepoNames[@]}"; do
if [[ "${RepoNames[${i}]}" = "$REPO_ID" ]]; then
echo "${i}";
FoundDir=${i};
fi
done

#Store the workspace location into a file, so it can be sent to the next Jenkins job
echo ${RepoDir[$FoundDir]}
WorkspaceDir=${RepoDir[$FoundDir]}
echo WorkspaceDir_Agent=$WorkspaceDir > variables.properties

To pass the workspace location from the script to Jenkins job #2, a plugin is necessary, since the value is not otherwise visible to the Jenkins job. The EnvInject Plugin fills this role well:
https://wiki.jenkins.io/display/JENKINS/EnvInject+Plugin
Now we suggest using a Post-Build Action to trigger the second Jenkins job; be sure to add all the previous parameters.
E.g:

variables.properties contains the workspace location that was found in the script.
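For clarity, the properties file written by the first job's script is just a key=value file; with the example arrays above it would contain something like the following, and the downstream job can then reference the value as $WorkspaceDir_Agent:

# variables.properties (written by the script in Jenkins Job #1)
WorkspaceDir_Agent=Workspace1_Location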

This leads to step 4 of the configuration, but before that, I want to mention an alternative that may be a fit for teams that have multiple repositories, but where the number of repos is in the low single digits.
Method 2: If you would like to avoid the two-Jenkins-job setup, you can create a Jenkins job for each specific repository and skip the above Jenkins job entirely.
This is not the default recommendation for the following reasons:
a. With the two-job configuration, the number of repositories can grow into the hundreds without the need to create additional Jenkins jobs.
b. With method 2, the number of Jenkins jobs equals the number of repositories, which can get cluttered.
c. Maintenance: if there is a major change to your source control configuration, with the two-job setup you only need to change a single Jenkins job.

(Step 4. Jenkins Job #2 checks out files from the repository to all SOAtest servers)
This is the final step in the diagram, where the new and updated TSTs/PVAs are placed into the correct locations in the workspace. We are working under the assumption that you have multiple SOAtest server installations that all need to be updated. This will all be handled with a single Jenkins job and two Jenkins plugins.
This step requires the installation of the following Jenkins Plugins:
https://wiki.jenkins.io/display/JENKINS/NodeLabel+Parameter+Plugin (If you are not using Jenkins then a process for running a job over all agents is necessary)
https://wiki.jenkins.io/display/JENKINS/Git+Plugin (Jenkins[CI tool] plugin for your source control)

The Node and Label Parameter plugin gives access to additional parameter types. One of these is the Node parameter, which represents a Jenkins agent machine. With node parameters we can select which agent(s) the job executes on during an automated run, which is exactly how we intend to execute this job. One of the options is to select all nodes, so when this job is triggered non-manually it runs on each of the Jenkins nodes.
E.g:

Note: Select the “Run next build regardless of build result” radio button
Now that we have the job running on every Jenkins node, we can focus on pulling the TSTs/PVAs from the source control repository.
This step is a bit more fluid depending on what exactly your team is looking to pull from source control. In my infrastructure, I only update TSTs and PVAs on my remote SOAtest servers after they have been vetted and approved for the master branch, so when Jenkins triggers the pull, I simply pull from master. As a result, I do not need to worry about the branch or the commit hash, since I am always pulling from the mainline.
Now we simply add the parameters we have been gathering from steps 2 and 3 and place them in the correct locations.
The repoID is placed at the end of the repository URL, so depending on the REPO_ID parameter we can pull from any repository on the version control system that the user is authorized for. To reference a Jenkins parameter, use the syntax $ParameterName.
If you are pulling from different branches, the branch details can be found in the reference ID (REF_ID).
E.g:
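For reference, here is a plain-text sketch of the Git plugin fields described above (the Bitbucket-style clone URL and host name are assumptions; adjust them for your source control manager):

Repository URL:    https://bitbucket.example.com/scm/$REPO_ID.git
Branches to build: */master    (or reference $REF_ID if you pull from the pushed branch)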

Since we are placing the data into different folders throughout the workspace, we need to add two additional behaviors: “Check out to a sub-directory” and “Advanced clone behaviors”.
In the “Check out to a sub-directory” option, add the path to the TestAssets (for TSTs) or VirtualAssets (for PVAs) folder, then append the directory that we found in the first Jenkins job.
In “Advanced clone behaviors”, enable “Fetch tags” and “Shallow clone” so that only a limited history is fetched rather than the full repository, which reduces the time needed to update the workspaces.
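For example, assuming the variable injected by Jenkins Job #1 is $WorkspaceDir_Agent, the sub-directory field for test assets might look like this (the exact folder names depend on your workspace layout):

Check out to a sub-directory: TestAssets/$WorkspaceDir_Agent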
