Load Testing Atlassian Jira, Confluence, Bitbucket Part 1

Hello everyone!

I want to share my experience of load testing Atlassian Jira, Confluence and Bitbucket with the Atlassian dc-app-performance-toolkit.

This will be a series of articles. I will not cover load testing methodology here; instead, I will focus on the technical aspects of using the tool.

If you want to learn more about load testing methodology in the context of Atlassian products, I talked about it at Highload 2018.

So, let's move on to the dc-app-performance-toolkit.

This tool lets you load test Atlassian applications such as Jira, Confluence and Bitbucket. I learned about it when I needed to test an Atlassian Jira plugin for Data Center certification, and I immediately liked it because I did not have to spend hours on setup. Everything worked out of the box.

The tool uses Taurus, JMeter and Selenium.

You can use this tool for the following purposes:

  • You are developing plugins for the Atlassian Marketplace. In this case, you can use this tool to certify your plugin for Data Center.
  • You administer Atlassian Jira, Confluence or Bitbucket. In this case, you can use the tool to load test your own instance and see how it behaves under load.

The steps that you need to perform to test Atlassian Jira, Confluence, and Bitbucket are the same in most cases, so I will do all the work on Atlassian Jira. If there are any specific steps for the product, I will mention this.

Installation


The first thing to do is get the tool from GitHub:

git clone https://github.com/atlassian/dc-app-performance-toolkit.git

And go to the created folder:

cd dc-app-performance-toolkit

Next, you need to install all the dependencies for the tool. You can read how to do this in path_to_dc_app_performance_toolkit/README.md.

Configuration files


Before using the tool, please read its documentation in the path_to_dc_app_performance_toolkit/doc folder.

Here is a brief description of the contents of this folder:

  • Three md files (one each for Jira, Confluence and Bitbucket). They describe how to run performance testing against the corresponding Data Center product.
  • Folders named jira, confluence and bitbucket with product-specific materials.


jira.yml


Before starting load testing, you need to edit the jira.yml, confluence.yml or bitbucket.yml file located in the path_to_dc_app_performance_toolkit/app folder, depending on which product you are going to test. These are Taurus configuration files. You can read about Taurus here.
I will comment on jira.yml; the files for the other products are built on the same principle.

settings:

settings is a section of the Taurus configuration file. It contains top-level options for Taurus. You can find more information on this section here.

  artifacts-dir: results/jira/%Y-%m-%d_%H-%M-%S

artifacts-dir - path template that will be used to store load testing artifacts. Here is a partial list of artifacts:

  • bzt.log - the log of the bzt command, which starts the load test.
  • error_artifacts - a folder with screenshots and HTML source for Selenium tests that failed.
  • jmeter.err - the JMeter log.
  • kpi.jtl - JMeter test data.
  • pytest.out - the Selenium test execution log.
  • selenium.jtl - Selenium test data.
  • results.csv - aggregated test results.
  • jira.yml - a copy of the jira.yml that was used for the run.
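The date placeholders in artifacts-dir are strftime-style patterns, so every run gets its own timestamped folder. A quick illustration:

```python
# The artifacts-dir value is a strftime-style template: it is expanded
# with the test start time, giving each run a separate results folder.
from datetime import datetime

template = "results/jira/%Y-%m-%d_%H-%M-%S"
print(datetime(2020, 3, 1, 17, 3, 26).strftime(template))
# results/jira/2020-03-01_17-03-26
```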


  aggregator: consolidator

aggregator - the aggregator that collects test results and passes them to the reporting module. You can read more about aggregators here.

  verbose: false

verbose - runs Taurus in debug mode. In our case, we turn this mode off.

  env:

env allows you to set environment variables. You can read more here.

    application_hostname: localhost   # Jira DC hostname without protocol and port e.g. test-jira.atlassian.com or localhost
    application_protocol: http      # http or https
    application_port: 2990            # 80, 443, 8080, 2990, etc
    application_postfix: /jira           # e.g. /jira in case of url like http://localhost:2990/jira
    admin_login: admin
    admin_password: admin

The parameters above contain information about your instance that you are going to test. I specified the parameters for the Jira instance, which is located on my computer. These parameters will be used in JMeter, Selenium and scripts.
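For illustration, here is how these four parameters combine into the base URL that the tests will hit. build_base_url is a hypothetical helper written for this article, not a function from the toolkit:

```python
# Hypothetical helper (illustration only, not toolkit code): shows how
# application_protocol, application_hostname, application_port and
# application_postfix combine into the base URL of the tested instance.
def build_base_url(protocol: str, hostname: str, port: int, postfix: str) -> str:
    return f"{protocol}://{hostname}:{port}{postfix}"

# The values from the env section above:
print(build_base_url("http", "localhost", 2990, "/jira"))
# http://localhost:2990/jira
```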

    concurrency: 200
    test_duration: 45m

These parameters are passed to the execution section. I will explain their meaning when we get to that section.

    WEBDRIVER_VISIBLE: false

WEBDRIVER_VISIBLE sets the visibility of the Chrome browser during the execution of Selenium tests. We are hiding Chrome.

    JMETER_VERSION: 5.2.1

JMETER_VERSION is the version of JMeter that we will use for testing.

    allow_analytics: Yes            # whether to send anonymized run statistics to Atlassian. Atlassian uses these statistics to improve the tool.

services:

services - a section of the Taurus configuration file. It specifies which scripts to run before, during and after the test. You can read more about this section here.

  - module: shellexec

The shell executor (shellexec) is used to run the scripts.

    prepare:
      - python util/environment_checker.py
      - python util/data_preparation/jira/prepare-data.py
    shutdown:
      - python util/jmeter_post_check.py
      - python util/jtl_convertor/jtls-to-csv.py kpi.jtl selenium.jtl
    post-process:
      - python util/analytics.py jira
      - python util/cleanup_results_dir.py

Prepare, shutdown and post-process are stages of the Taurus life cycle. You can read more about the Taurus life cycle here. Scripts are run at each stage. These scripts are:

  • util/environment_checker.py - checks the Python version and stops the run if the version is not supported.
  • util/data_preparation/jira/prepare-data.py - prepares test data. We will discuss this script in detail in the next part.
  • util/jmeter_post_check.py - checks that the kpi.jtl file exists. If it does not, something went wrong with JMeter.
  • util/jtl_convertor/jtls-to-csv.py kpi.jtl selenium.jtl - builds results.csv from kpi.jtl and selenium.jtl. results.csv contains aggregated results for each action: the number of executions, errors, the 90th percentile of response time, and so on.
  • util/analytics.py jira - sends run statistics to Atlassian. The data is sent only if allow_analytics is enabled.
  • util/cleanup_results_dir.py - removes files that are no longer needed from the artifacts directory.
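As a rough sketch (hypothetical code, not the toolkit's actual implementation), a post-run check in the spirit of jmeter_post_check.py boils down to something like this:

```python
# Hypothetical sketch, NOT the toolkit's real jmeter_post_check.py:
# fail the run with a clear message if JMeter never produced kpi.jtl.
import sys
from pathlib import Path

def check_kpi(artifacts_dir: str) -> None:
    # kpi.jtl is written by JMeter during the test; its absence usually
    # means JMeter failed to start at all.
    kpi = Path(artifacts_dir) / "kpi.jtl"
    if not kpi.is_file():
        sys.exit(f"{kpi} not found: JMeter most likely failed to start")
```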


execution:

execution - a section of the Taurus configuration file. It contains the scenarios that will be executed during testing. You can find more information here.

  - scenario: jmeter
    concurrency: ${concurrency}
    hold-for: ${test_duration}
    ramp-up: 3m

These are the execution options for the JMeter scenario. You can find more information here.

concurrency is the number of virtual users. This parameter means how many concurrent users JMeter will simulate. In our case, we simulate 200 simultaneously working users.

ramp-up - the ramp-up time of the test. We reach 200 simultaneously working users gradually; in our case, within 3 minutes.

hold-for - how long the test runs after reaching the number of concurrent users given in the concurrency parameter.
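A quick back-of-the-envelope calculation for these parameters (illustrative arithmetic only, not toolkit code):

```python
# With concurrency=200, ramp-up=3m and hold-for=45m, Taurus adds users
# at a steady rate during ramp-up and then holds the full load.
concurrency = 200
ramp_up_s = 3 * 60        # ramp-up: 3 minutes
hold_for_s = 45 * 60      # hold-for: 45 minutes

users_per_second = concurrency / ramp_up_s        # rate of adding users
total_duration_min = (ramp_up_s + hold_for_s) / 60

print(f"{users_per_second:.2f} users started per second")    # 1.11
print(f"total test duration: {total_duration_min:.0f} min")  # 48
```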

  - scenario: selenium
    executor: selenium
    runner: pytest
    hold-for: ${test_duration}

Selenium test execution options. You can find more information here.

executor - the Selenium executor is used.
runner - the tests are run with pytest.
hold-for - the duration of testing.

scenarios:


scenarios - a section of the Taurus configuration file. It contains the configuration for each scenario from the execution section.

  selenium:
    script: selenium_ui/jira_ui.py

script - the path to the Selenium tests.

  jmeter:
# provides path to the jmeter project file
    script: jmeter/jira.jmx
    properties:
      application_hostname: ${application_hostname}
      application_protocol: ${application_protocol}
      application_port: ${application_port}
      application_postfix: ${application_postfix}
      # Workload model
# the number of actions for an hour. 
      total_actions_per_hr: 54500
# actions and the % of execution within one hour. The sum of all parameters must equal to 100%
      perc_create_issue: 4
      perc_search_jql: 13
      perc_view_issue: 43
      perc_view_project_summary: 4
      perc_view_dashboard: 12
      perc_edit_issue: 4
      perc_add_comment: 2
      perc_browse_projects: 4
      perc_view_scrum_board: 3
      perc_view_kanban_board: 3
      perc_view_backlog: 6
      perc_browse_boards: 2
      perc_standalone_extension: 0 # By default disabled

script - the path to the JMeter project, which will be executed during testing.

total_actions_per_hr - the number of actions that will be performed per hour. One action is one execution of a test. Which tests will be run can be seen from the perc_ parameters.
perc_ parameters - the percentage share of each test. The sum of all perc_ parameters must equal 100%.
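You can sanity-check the workload model with a few lines of Python. The values below are copied from the jira.yml above; this checker is my own illustration, not part of the toolkit:

```python
# Verify the workload model: perc_ values must sum to 100, and each
# test's hourly action count is its share of total_actions_per_hr.
total_actions_per_hr = 54500
percentages = {
    "create_issue": 4, "search_jql": 13, "view_issue": 43,
    "view_project_summary": 4, "view_dashboard": 12, "edit_issue": 4,
    "add_comment": 2, "browse_projects": 4, "view_scrum_board": 3,
    "view_kanban_board": 3, "view_backlog": 6, "browse_boards": 2,
    "standalone_extension": 0,
}
assert sum(percentages.values()) == 100, "perc_ values must sum to 100"
for action, perc in percentages.items():
    print(f"{action}: {total_actions_per_hr * perc // 100} actions per hour")
# e.g. view_issue: 23435 actions per hour
```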

modules:
  consolidator:
    rtimes-len: 0 # CONFSRVDEV-7631 reduce sampling
    percentiles: [] # CONFSRVDEV-7631 disable all percentiles due to Taurus's excessive memory usage


modules - section of the Taurus configuration file. This section contains a description of all the modules that will be used during testing.

  jmeter:
    version: ${JMETER_VERSION}
    detect-plugins: true
    memory-xmx: 8G  # allow JMeter to use up to 8G of memory
    plugins:
      - bzm-parallel=0.4
      - bzm-random-csv=0.6
      - jpgc-casutg=2.5
      - jpgc-dummy=0.2
      - jpgc-ffw=2.0
      - jpgc-fifo=0.2
      - jpgc-functions=2.1
      - jpgc-json=2.6
      - jpgc-perfmon=2.1
      - jpgc-prmctl=0.4
      - jpgc-tst=2.4
      - jpgc-wsc=0.3
      - tilln-sshmon=1.0
      - jpgc-cmd=2.2
      - jpgc-synthesis=2.2
    system-properties:
      server.rmi.ssl.disable: true
      java.rmi.server.hostname: localhost
      httpsampler.ignore_failed_embedded_resources: "true"

jmeter - parameters of the JMeter module. You can read about the JMeter module here.
detect-plugins - the JMeter Plugins Manager can install JMeter plugins automatically. We set this parameter to true so that Taurus tells JMeter to install all the necessary plugins automatically.
plugins - the list of JMeter plugins used when writing the tests.
system-properties - additional properties for JMeter.

  selenium:
# version of the chrome driver
    chromedriver:
      version: "80.0.3987.106" # Supports Chrome version 80. You can refer to http://chromedriver.chromium.org/downloads

selenium - parameters for Selenium.
chromedriver is the version of the Chrome driver that we will use for testing.

reporting:
- data-source: sample-labels
  module: junit-xml

reporting - a section of the Taurus configuration file. It configures the analysis and reporting modules. We use the JUnit XML reporting module. You can find more information on this module here.

Let's summarize.

We have 6 sections in the jira.yml file:
settings - top-level settings and parameters.
You need to change the following parameters in this section:

application_hostname: localhost   # Jira DC hostname without protocol and port e.g. test-jira.atlassian.com or localhost
    application_protocol: http      # http or https
    application_port: 2990            # 80, 443, 8080, 2990, etc
    application_postfix: /jira           # e.g. /jira in case of url like http://localhost:2990/jira
    admin_login: admin
    admin_password: admin

You may also need to change the following parameters:

concurrency: 5
test_duration: 5m

I changed these parameters to small values so that the tests finish quickly.
If you change test_duration and concurrency, be sure to check the ramp-up parameter in the execution section. You may need to change it too.
services - scripts that will be executed at the stages of the Taurus test life cycle.
execution - execution parameters of test scripts. First, run JMeter, and then Selenium.
scenarios - test scenario parameters. You may want to change the number of actions per hour and the percentage share of each test. I will talk about how to add or remove tests in one of the following parts.
modules - modules used for the tests. We use consolidator, JMeter and Selenium. Taurus makes sure these modules are available during testing.
reporting - parameters of the reporting module.
Good. We have changed the configuration file and are ready to run the tests.

Run test


First, activate the Python virtual environment (you can find instructions in the README.md file), go to the app folder and run the test:

bzt jira.yml

In my case, the test failed with an error:

17:03:26 WARNING: Errors for python util/data_preparation/jira/prepare-data.py:
There are no software projects in Jira

That is correct: my Jira instance is empty and needs to be filled with test data. I will explain how to do that in Part 2.
