How We Create Load Tests For Our Customers In Loadero

Loadero is a SaaS (software as a service) for testing web applications, and our main goal is to provide a powerful tool our users can rely on to create load tests, performance tests, and WebRTC tests for their needs. But sometimes a proper tool is not enough to execute tests: some companies don’t have enough manpower, others might not have enough expertise in the QA field, yet testing is required for every web application. That’s why we also offer test script creation services: Loadero’s engineers create load tests and performance tests so our customers can leverage our experience in the industry and our skills in using Loadero as a testing tool. We create load tests for video communication, e-learning, e-commerce, online events, and many other types of applications. In this blog post we describe the process of working on such projects.

Usually, our customers are able to run load tests and analyze results themselves but lack the skills to create a proper test. So in many cases when we are asked to work on a test, we create it as per the requirements, hand it over to the customer, and provide instructions for running and editing it, usually for scaling it up in further testing. But sometimes our customers also want to use our experience in the field and ask us to actually run the tests and help with analyzing the results. This makes the engagement a little different and adds an extra stage to it, but generally, the work consists of the following steps:

  • Test creation effort estimation: gathering information about the test flow and the website, estimating the test script creation time frame and price.
  • Offer preparation: quoting estimated engineering work, discussing the approach with the customer, and signing an agreement.
  • Test creation: creating a test script, ongoing communication with the client, configuring test participants, delivering the test, and creating instructions for using it.
  • The test handoff and running the test: creating a project for the client, setting up the test, and running tests if needed.
  • Continued support: making sure the work we’ve done serves the customer’s needs, helping to use the test effectively, and editing it for changing requirements.

Let’s take a closer look at what happens during each of those engagement phases.

Test creation effort estimation

First of all, we need to get an understanding of the application the customer plans to test with Loadero and for what purpose the script will be used. We ask our customers to provide a description of the app and of the user journeys to simulate, as they know the usual user flows better than anyone. As the first step of gathering the information, we usually send out a basic questionnaire about the test script flow and the website that will be used for testing. Another important part is to find out which environment will be used in the test, to make sure we explore the correct one. We gather this information for our engineers to keep in mind while exploring the application.

Test scripts in Loadero simulate what real users do when they use the application, and those user journeys differ a lot. For example, to create a load test for a WebRTC video communication service with 1-on-1 calls, we’d simulate users joining a call, spending some time in it to gather WebRTC metrics, and maybe add actions that use some features, such as muting, switching the camera on and off, using screen sharing functionality, etc. But when we are asked to test an e-commerce website, the user flows are totally different: browsing products, adding them to the cart, making orders. That is why a proper understanding of the application to be tested is essential for creating a relevant test that brings valuable data to the results reports.
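To give a rough idea of what such a script looks like, below is a simplified sketch of a 1-on-1 call journey in the shape of a Loadero JavaScript (Nightwatch.js) test. The URL and CSS selectors are hypothetical and would come from exploring the actual application:

```javascript
// A simplified, hypothetical sketch of a 1-on-1 call user journey.
// The URL and selectors are made up for illustration.
client => {
  client
    .url("https://meet.example.com/test-room")           // open the call page
    .waitForElementVisible(".join-button", 30 * 1000)    // wait for the UI to load
    .click(".join-button")                                // join the call
    .waitForElementVisible(".remote-video", 30 * 1000)   // other participant appears
    .pause(60 * 1000)                                     // stay in the call to gather WebRTC metrics
    .click(".mute-button")                                // exercise a feature: mute
    .pause(10 * 1000)
    .click(".mute-button")                                // unmute
    .pause(30 * 1000);                                    // gather a bit more data before leaving
};
```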

After gathering the information about the application and user flows, one of our engineers explores the application and creates initial notes on its performance, the selectors to be used in the test script, and possible difficulties that could make test creation take longer.

Offer preparation

Once we have checked out the website and the expected user flow, we get back to the client with the gathered information, ask for clarifications of the scenario if needed, discuss how the test could also be used in the future, and adjust the estimate. We may suggest changes to the planned test so it provides more valuable data, for example, measuring the execution time of some action, taking screenshots at important stages of the test, or adjusting the strategy for ramping up the test. After that, we are ready to provide an approximate price for our services, which depends on how many hours of our engineers’ work will be required and how many test participants will be used in the planned test runs.
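As an illustration of the first two suggestions, here is a hypothetical Nightwatch.js-style snippet that times an action and captures a screenshot at an important stage; the selectors and file name are made up:

```javascript
// Hypothetical sketch: measuring how long an action takes and capturing
// a screenshot at an important stage. Selectors and file name are made up.
client => {
  let start;
  client
    .perform(() => { start = Date.now(); })                   // note the start time
    .click(".checkout-button")                                // the action being measured
    .waitForElementVisible(".order-confirmation", 30 * 1000)  // wait for it to complete
    .perform(() => console.log(`Checkout took ${Date.now() - start} ms`))
    .saveScreenshot("order-confirmation.png");                // visual evidence for the report
};
```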

Usually, the plans can still be adjusted at this stage, so we discuss with the customer what would provide them the most valuable information about the application’s performance. Additionally, many customers want us to create a load test and jump into load testing their application straight away, even though the application has never really been performance tested before. We usually suggest a strategy of starting with small-scale tests and scaling up gradually; this can save our customers a significant amount of money if the application’s scalability doesn’t meet expectations and the smaller tests fail. It is quite typical for a customer to state that they want a load test for thousands of concurrent users, but when actual testing begins, we discover that the application can’t handle the load of 100 users. Starting out by running a 1000-participant test would provide roughly the same result: we’d find out that the application can’t handle the load, but a test with 100 participants costs 10 times less. Overestimating the application’s capabilities can lead to confusion and an incorrect testing flow, so we always make sure to provide our advice on the load test scaling-up strategy.

If the customer is ready to continue with the script creation services, we discuss the final details and sign the quote agreement. For many customers we create the load test and hand it over, while running the test and analyzing the results is done by the customer’s team. But if our engineers will also be running tests and analyzing the results, the time to be spent on that has to be estimated as well. We spend a lot of time on exploration and preparation in order to give a detailed offer, but as the number of planned test runs may depend on the results of the tests, the offer usually includes a price range. Once the project is finished, we bill our customers for the actual number of hours spent working on it.

At this stage, we also decide on the communication platform. In the process of working on the test configuration, questions may arise, and effective communication can save a lot of time. We use email with some of our customers, but if the test is complex and a lot of collaboration is required, joining the customer’s Slack or Discord can be a much better choice.

Test creation

Finally, when all the information has been gathered and the documents have been signed by both sides, we are ready to start writing the script and configuring the test. Even though we have gathered all the necessary information regarding the user journey, we are always open to changes if the customer has an idea of how the planned test could be improved.

In a perfect world, we’d be able to just work on the test on our own, but in real life, blockers might appear along the way, and we work with our customer to remove them. Sometimes we cannot continue the work because of an issue or a bug on the client’s side, for example, the API rate limiter not being disabled for our IP addresses, an incorrectly set up environment for the application under test, etc. These issues can become huge blockers, especially when we are working with teams in different parts of the world. For example, one of our recent clients was from Southeast Asia; with an 8-hour time difference, an issue we encountered in the middle of our working day would only be discussed and resolved the next day, prolonging the scriptwriting process. We had to automate screen sharing functionality in the user journey, but it was not working, which left us with 2 options: change the user flow or wait for the client’s development team to fix it. For this particular client, we decided to wait for the fix, since this was crucial functionality and it was a must to make sure it worked for end users.

But sometimes blockers can come from our side as well. For instance, recently one of our clients wanted to dynamically generate a JWT (JSON Web Token) to access the URLs used for testing. As we did not have a JWT library added to Loadero at the time, we needed to add it and test it before we could guarantee that it works properly, all of that in quite a short timeframe, more precisely, in only 1 working day. While adding a library just for JavaScript, which was used for that customer’s test, would have been sufficient, we decided to add such libraries for all the languages Loadero supports to offer that feature to all our users. Occasionally we get requests from our customers to add features they are looking for, which not only helps them but also pushes us to develop a better product. We always try to come up with solutions that benefit the customer and don’t block the script creation any further.
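For reference, this is roughly what such token generation looks like in JavaScript with the widely used jsonwebtoken package. It is a minimal sketch, not the client’s actual setup; the secret, claim, and query parameter are hypothetical:

```javascript
// Minimal sketch of generating a short-lived JWT and appending it to the
// URL under test. Uses the jsonwebtoken npm package; the secret, claims,
// and "token" query parameter are hypothetical.
const jwt = require("jsonwebtoken");

function buildTestUrl(baseUrl, participantId) {
  const token = jwt.sign(
    { sub: `participant-${participantId}` }, // hypothetical claim identifying the simulated user
    process.env.JWT_SECRET,                  // shared secret agreed with the client
    { algorithm: "HS256", expiresIn: "15m" } // short-lived token for the test run
  );
  return `${baseUrl}?token=${token}`;
}

console.log(buildTestUrl("https://app.example.com/room/42", 7));
```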

The test handoff and running the test

Once the test script is done and working stably, we prepare to hand the project over to the client. The final stage of working on a test configuration includes configuring all test parameters and setting specific asserts for different metrics like video FPS, audio bitrate, CPU, and RAM usage. Defining asserts also helps to quickly assess how the application performs under the load and whether something went wrong for a participant. For example, it can be quickly validated whether a participant failed the test because it didn’t have incoming and outgoing audio/video connections, instead of checking logs or screenshots for every participant.
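Conceptually, each assert is a condition over an aggregated metric. The sketch below shows the general shape of such definitions; the metric paths and thresholds here are illustrative examples, not copied from Loadero’s documentation:

```javascript
// Illustrative assert definitions; metric paths and thresholds are
// examples only, the exact paths are listed in Loadero's documentation.
const asserts = [
  { path: "webrtc/video/fps/out/25th", operator: ">=", expected: "10" }, // outgoing video stays smooth
  { path: "webrtc/audio/bitrate/in/avg", operator: ">", expected: "0" }, // audio is actually received
  { path: "machine/cpu/percent/95th", operator: "<", expected: "90" },   // participant machine isn't saturated
];
```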

Usually, when our team is working on a test for a client, we create a new project for it and add the client’s Loadero accounts to it when everything is ready. Also, to ensure that everything is clear, we always write a summary of the script’s logic and how to configure it if needed. Additionally, we do onboarding calls to explain how to use Loadero’s web application if necessary.

If our client’s team will be running the test and analyzing the results, our work on the test finishes here. But as mentioned before, sometimes a client also wants our engineers to run the test, monitor it, and analyze the results reports Loadero generates. In that case, we schedule the test run times, decide on the communication, and execute the tests.

Continued support

Even though the official agreement has ended, we keep in touch with our clients to make sure the test we created serves them well and they can effectively use Loadero to achieve what was planned. If load tests have unveiled any bugs in the test script, fixing them gets the highest priority so that the client can continue load testing as soon as possible. We are there for them whenever they need additional technical support, recommendations on choosing a cost-effective pricing plan, new work agreements for other test cases, or anything else. This part is very important for us, as we get their feedback and do our best to deliver improvements to our tool for them and other users.

Would you like our team to create a load test or performance test for your application too? Let’s see how we can help you with testing. Feel free to fill out the test creation request or contact us and let us know about your needs.