How we import-substituted our outsourced testing. A step-by-step guide

When the dollar soared in 2014 and the country set a course for import substitution, I had to replace our old team of external testers, paid in dollars, with a new one paid in rubles. In this article I will describe in detail how we organized this transition, what difficulties we ran into, how we solved them, and what economic effect we achieved.


How did the need for outsourced testing arise?


I have been working at LANIT for 13 years. At first the testing volumes in my department were small, and I could handle testing on two or three small projects at once. Then new customers and larger projects arrived, and I could no longer keep up on my own.

Just at that moment, a company approached us offering outsourced testing. Testing was its main specialization, and its people were relatively inexpensive compared to experienced Moscow testers, but payments had to be made in dollars. We started working together, grew together as project tasks became more complex, shared experience, and I learned a lot from our partners.

But in 2014 the crisis hit: in a very short time the dollar soared from 32 to 50 rubles, then to 70, and at its peak reached about 90 rubles. Such a rise directly affected the cost of the outsourced staff.

We are looking for "ruble" testers


At the same time, the country's leadership set a course for import substitution of foreign software. Import substitution was also required in testing, since the outsourced teams were not Russian, and settlements in dollars were quickly eating up the entire testing budget.

As the number of projects in the department grew, the question of expanding the testing teams arose. Our dollar contractor could no longer fully cover the department's need for testing specialists, so we had to look for cooperation with other companies. The rising dollar only pushed us to search for partners on the Russian market all the faster.

Import substitution brought new and challenging tasks. We had to find companies on the Russian market offering outsourced testing with high-quality work and the ability to scale quickly, since our department's testing needs kept growing, the project portfolio was expanding, and we needed "ruble" testers who could meet our requests as quickly as possible.

We began to research the market, interviewed colleagues (including those working at other companies), asked for recommendations, and held a huge number of meetings with different vendors. In the end we found companies on the Russian market whose competencies matched our requests and with whom we were ready to work. Most importantly, they were ready to sell us not just testers, but a range of services.

Then we made preliminary estimates of the cost and timeline of import substitution and drew up a work plan. After the department director and the project manager approved the plan, we started working with the Russian teams and their leadership, discussing the terms of cooperation and my requirements for the team and the quality of work.

Growing our own people


A service that was very important to us was the ability to quickly grow testers from interns. Of course, our old dollar partners provided this too, and we had no complaints about the quality of their services; the issue was their cost and our need to cut expenses. Our new partners had internship programs, competent coaches, a psychologist and a capable HR specialist, and they quickly found, trained and brought people into the project almost seamlessly.

When these people finished the internal internship, they were brought to me for an interview. If their immersion in the project, knowledge of testing theory and performance on practical project tasks were "excellent", we took them into our project team.

And so the team was assembled, and I was convinced that the new contractor was reliable: it independently covered many HR activities (training, onboarding onto the project (by agreement, at least two weeks of training at the contractor's expense), motivation, help with problem solving, professional development programs), contractually guaranteed stable ruble rates over the long term, and demonstrated loyalty to the customer. Now the hardest task remained: to replace more expensive specialists with cheaper ones without losing quality.

We form a set of metrics


So the goal was clear: replace the team while preserving quality, the existing process, and good relations with the previous contractor, with whom we had worked for about five years at that point. Business realities demanded changes, concessions and flexibility from both sides for the sake of long-term cooperation. While the department's management and our long-standing partners were looking for ways to continue cooperating under the new economic conditions, I faced a question of my own: how would we measure quality in order to understand whether it was getting better, getting worse, or holding steady?

In parallel with import substitution, the production process in the department was growing and maturing on large projects, so regular quality control was especially relevant for us during this period.

To begin with, we needed to take the overall "temperature" of the project: what we had for a given period and what we could measure. We took metrics for the previous six months, since that is a fairly long period during which there had been no major changes in the composition of the analytics, development and testing teams, so the metrics were as relevant as possible. We were able to measure both the current team's performance and project-level indicators.

Team performance was measured monthly using the Jira bug tracker. The main metrics were:

  • the number of registered defects,
  • how many of them were closed with a "not a bug" resolution,
  • the priority distribution of defects,
  • how many of them were duplicates,
  • feedback from analysts via a questionnaire in which they rated each tester's work against several criteria,
  • similar feedback from the developers,
  • the percentage of defects missed into the production environment per month / release,
  • the number of tasks tested,
  • the average time to register a defect,
  • the average time to validate a defect,
  • the average time to process an incident coming from the support team,
  • the percentage of defects left unfixed in the release,
  • the regression coefficient, and many others.

All the data was consolidated into tables, which the project manager then analyzed. If some metrics deviated from the targets, corrective measures were taken.
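
Just to illustrate, most of these counts can be pulled from Jira automatically via its REST API. Below is a minimal sketch in Python; the Jira address, credentials, project key, resolution names and JQL filters are illustrative assumptions, not our actual setup.

    # A minimal sketch of pulling monthly defect counts from the Jira REST API.
    # The Jira address, credentials, project key, resolution names and JQL
    # filters below are illustrative assumptions, not our actual queries.
    import requests

    JIRA_URL = "https://jira.example.com"   # hypothetical Jira instance
    AUTH = ("test_manager", "secret")       # hypothetical credentials
    PROJECT = "GIS"                         # hypothetical project key

    def count(jql: str) -> int:
        """Return the number of issues matching a JQL query (maxResults=0 fetches only the total)."""
        resp = requests.get(
            f"{JIRA_URL}/rest/api/2/search",
            params={"jql": jql, "maxResults": 0},
            auth=AUTH,
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["total"]

    # Defects created in the previous calendar month.
    period = "created >= startOfMonth(-1) AND created < startOfMonth()"

    registered = count(f"project = {PROJECT} AND issuetype = Bug AND {period}")
    not_a_bug = count(f"project = {PROJECT} AND issuetype = Bug AND resolution = 'Cannot Reproduce' AND {period}")
    duplicates = count(f"project = {PROJECT} AND issuetype = Bug AND resolution = Duplicate AND {period}")

    print(f"registered: {registered}, not a bug: {not_a_bug}, duplicates: {duplicates}")

A script like this can be run on a schedule and its output appended to the same summary tables, so the monthly collection does not depend on anyone remembering to do it by hand.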

This is what the regular collection of metrics looked like on one of our projects.

Metrics in the context of the project

And this is what the collection of metrics on the testing team's quality of work looked like; for comparison, it covered both the dollar and the ruble contractor teams.

Team metrics

We collected statistics on the quality of the current team's work and defined quality criteria and satisfaction levels for the project's analysts, developers and project manager. This gave us a clear picture of the indicators with which the import substitution process began: the very set of metrics against which the result could later be compared.

Immersing new people


Next we could prepare materials for immersing new people in the project and gradually transfer competencies and tasks to the new team.

To make the process go smoothly and painlessly, a schedule was drawn up. The key point was that the core testing competencies were concentrated with me: I was not just a dispatcher of tasks to testing resources, but the test manager on the projects where import substitution was required. The testing strategy, test plans, criteria for starting and finishing testing, and business priorities all lay within my knowledge and competence, so all that remained was to build a sensible plan for the phased replacement of people.

We decided to compile a knowledge matrix for the project. To do this, all of the project's subsystems / modules / functions were written out, and everyone in the old team's testing group had to give themselves a rating on a five-point scale for how deeply they were immersed in each subsystem / module / function. You can read more about the knowledge matrix in my article "What the testing of the state information system taught us", where you will also find a knowledge matrix template to use in your own work.
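
To give an idea of its shape, a knowledge matrix is essentially a "subsystem x tester" table of self-ratings from 1 to 5. Here is a minimal sketch (the subsystem names, people and scores are invented purely for illustration) showing how such a matrix immediately highlights the subsystems that need knowledge-transfer sessions first.

    # Sketch of a project knowledge matrix: self-ratings from 1 (barely familiar)
    # to 5 (deep immersion). Subsystem names, testers and scores are invented
    # purely for illustration.
    knowledge_matrix = {
        "Application registry":  {"Anna": 5, "Pavel": 4, "Olga": 1},
        "Reporting module":      {"Anna": 2, "Pavel": 5, "Olga": 1},
        "Integration services":  {"Anna": 1, "Pavel": 2, "Olga": 1},
    }

    # Subsystems with low average immersion are the first candidates
    # for knowledge-transfer sessions with the old team.
    for subsystem, ratings in knowledge_matrix.items():
        avg = sum(ratings.values()) / len(ratings)
        if avg < 3:
            print(f"{subsystem}: average immersion {avg:.1f} - schedule a knowledge-transfer session")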

Then we updated the project wiki: we asked the test leads in the old team to supplement or create training materials and onboarding instructions for the subsystems / modules / functions that we had identified as the most critical for knowledge transfer. These manuals and instructions were the first things that had to be handed over to the new team.

The transferred project knowledge base looked something like this:


In parallel with this, Skype meetings were scheduled with the lead analysts for each subsystem / module, attended by the testers responsible for it from both the old and the new team.

The purpose of the Skype meetings was to explain the business process of the subsystem / module, to highlight the main business functions from the analyst's point of view, to note what to pay attention to when testing and what matters most to the customer, and to discuss new improvements along the way.

What the Skype meetings gave us:

  • team building between the old and the new team,
  • recordings of Skype lectures with the lead analysts of all subsystems / modules, which could later be reused to immerse new people in the project,
  • improved analyst-tester communication,
  • a knowledge matrix for the joint team of old and new testers, which let us track month by month how deeply the new people were immersed in the project.

By the end of the first stage of introducing new people, the key tasks had been completed:

  • training materials for immersion in the project had been collected,
  • the teams had gotten to know each other and shared knowledge,
  • new people had initially been placed "under" the more experienced members of the old team, who were tasked with monitoring the quality of the newcomers' testing work,
  • knowledge matrices for the new and old testers had been compiled and were updated monthly as people became more immersed in the project.

At the second stage, the acquired knowledge of the process and the project had to be consolidated, so interviews were planned and carried out in several iterations.

First, the person responsible for a subsystem / module in the old testing team interviewed everyone who had been immersed in that subsystem / module on its business logic, assessed the quality of the test designs they had produced and of the regression test cases for new improvements, and collected statistics on defects missed after testing (someone more deeply immersed always selectively re-checked new improvements to catch defects missed by the new testers).

Then the lead analysts evaluated the new testers in 30-minute interviews with questions about their subsystem. Often an interview was not even needed: if the analyst had interacted a lot with a new tester during the work, they could judge the tester's immersion in the functionality without one.

When members of the old team left and had to be rotated out, we no longer arranged replacements with their expensive specialists; instead we handled the rotation ourselves, expanding the team with new people from Russia.

Another important task was not to offend the testers of the old team, but to make it clear that we were not dropping them: we were expanding the team by bringing in other specialists, there were no questions about anyone's competence, and there was simply a production need to broaden the geography of the distributed team, which we would do together.

After half a year of such rotation the team had, of course, grown considerably, as planned. But as soon as the new testers' performance indicators matched those of the old ones and the satisfaction scores from the project's analysts and developers stabilized, the opposite process began: the team was reduced by gradually withdrawing the more expensive testers and replacing them with Russian specialists.

In parallel, we analyzed the production process and collected feedback, highlighting the most critical problems to be solved with each new release.
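
For a sense of how that feedback was used, the questionnaire scores can simply be averaged per tester and tracked from release to release. A minimal sketch follows; the criteria, names and scores are assumptions used only for illustration.

    # Sketch: averaging analyst questionnaire scores per tester (1-5 scale).
    # Criteria, names and scores are assumptions used only for illustration.
    from collections import defaultdict

    feedback = [
        {"tester": "Anna",  "criterion": "clarity of defect reports", "score": 5},
        {"tester": "Anna",  "criterion": "responsiveness",            "score": 4},
        {"tester": "Pavel", "criterion": "clarity of defect reports", "score": 3},
        {"tester": "Pavel", "criterion": "responsiveness",            "score": 4},
    ]

    totals, counts = defaultdict(int), defaultdict(int)
    for row in feedback:
        totals[row["tester"]] += row["score"]
        counts[row["tester"]] += 1

    for tester in sorted(totals):
        print(f"{tester}: average score {totals[tester] / counts[tester]:.2f}")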

An example of collecting feedback from analysts on the work of testers:


An example of a consolidated list of identified testing-process problems, the strategy for solving them, and the monitoring of the fixes, aimed at keeping testing quality from degrading on a project undergoing resource import substitution:


Project management and the department director regularly received reports on the progress of import substitution.

The reports contained summary information comparing the work of the old and new testers, demonstrated progress in knowledge transfer, and analyzed the teams' quality against the predefined assessment criteria.

The import substitution process took about a year, since the teams were very large at the start and the projects were significant for the department. That created certain difficulties, and rushing the substitution could have led to a sharp drop in quality.

Economic effect


Finally, let's calculate how much budget we saved through the import substitution process.

At the start of import substitution we had about 60 dollar-paid testers. During the process the team grew roughly 1.5 times at its peak and then shrank back to 50 people. The team even ended up slightly smaller, thanks to optimized testing and growing competence, which eventually ended up entirely on LANIT's side.

The economic benefit from the work done was 35%, but the real advantage is that the import substitution task spurred us to open our own resource centers for the department's needs.
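
For anyone who wants to reproduce this kind of estimate, the calculation itself is simple: compare the monthly cost of the old dollar team with that of the new ruble team. The sketch below uses the headcounts mentioned above, but the per-person rates and the exchange rate are hypothetical placeholders rather than our real figures, so the result only lands in the same ballpark as the 35% we actually measured.

    # Back-of-the-envelope sketch of the savings estimate. The headcounts come
    # from the article; the monthly rates and the exchange rate are hypothetical
    # placeholders, not our real figures.
    old_team = 60          # dollar-paid testers at the start
    new_team = 50          # ruble-paid testers after optimization
    usd_rate = 3000        # hypothetical monthly cost per tester, USD
    rub_rate = 150_000     # hypothetical monthly cost per tester, RUB
    usd_to_rub = 65        # hypothetical exchange rate

    old_cost = old_team * usd_rate * usd_to_rub   # monthly cost before, RUB
    new_cost = new_team * rub_rate                # monthly cost after, RUB

    savings = 1 - new_cost / old_cost
    print(f"monthly savings: {savings:.0%}")      # ~36% with these placeholders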

Initially the competence centers were conceived to solve the shortage of testing staff caused by the active growth of testing in the department, but over time specialists in other areas appeared there as well: support, DevOps, development, analytics.

At the moment our department has almost completely switched to working with Russian testers. Importantly, most of them are department employees working in regional resource centers. We have opened our own testing resource centers in Chelyabinsk, Izhevsk and Perm, and on many projects most of the work is now covered entirely by our own test teams. By the way, we have an open vacancy in automated test development.

Today the development of the testing practice in the department is focused on expanding its own competence centers. Bringing in outsourced teams has moved into the background; we use them for rapid scaling when a project's needs grow. The principles of working with large distributed teams in our department remain unchanged, and the testing process for GIS projects involving large teams is described in detail in my article. The accumulated experience lets us quickly bring large teams onto a project, train them, build up competencies, and quickly regain testing speed when rotating teams and / or employees within the department, without losing product quality.
