Performance Test Reporting

What is Performance Test Reporting?

The Performance Test Reporting phase provides the overall test result, test analysis and recommendations to the project team or client from the application’s performance perspective. The outcome of this phase, i.e. the Performance Test Report document, helps the team take the GO / NO-GO decision for the whole application or a specific business flow.

Purpose of the Performance Test Report:

The Performance Test Report comprises:

  1. A detailed description of performance test results
  2. GO/NO-GO decision
  3. Observations and findings
  4. Recommendations
  5. Identified Defects (Detected/Closed/Open)

The Performance Test Report is an important document from the project delivery perspective, and the project delivery closure document must include it. If the performance test report gives a NO-GO decision, which signals that the application is unfit for production, then the application cannot go live.

Accountability:

The Performance Test Lead or Manager is responsible for preparing the Final Performance Test Report with the help of the Performance Test Analyst or Engineer. The engineer’s input helps to prepare the report quickly and in detail: a Performance Test Engineer works at ground level and knows the major and minor points observed during the test, and these points help to produce an accurate performance test report document.

The Performance Test Lead or Manager also has the responsibility to walk through the report with the project team/client and obtain all the required sign-offs from the project stakeholders.

Approach:

Once all the performance test cycles are completed, the performance tester collects all the results and prepares the final performance test report. Keep the following points in mind while preparing the Final Performance Test Report:

  1. Use Simple (layman) language in the report
  2. Provide a summary of the overall test cycle
  3. Mention the GO / NO-GO status
  4. Justify the reason for either of the cases (in point 3)
  5. Check whether all the related NFRs are met or not
  6. Mark Pass/Fail for each individual test (a minimal sketch follows this list)
  7. Give a detailed description of defects along with their current status
  8. Provide proper and accurate recommendations
  9. Attach all the relevant artefacts for each individual test
  10. Highlight the performance risk (if any)
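
For illustration, the sketch below shows one way to automate points 5 and 6 by comparing each test’s measured metrics against its NFR targets and marking the test Pass or Fail. The NFR thresholds, metric names and result figures are assumptions made up for the example, not values from any real project:

```python
# A minimal sketch of automating Pass/Fail marking against NFRs.
# All thresholds and result figures below are illustrative assumptions.

NFRS = {
    "avg_response_time_sec": 2.0,   # must be <= 2.0 seconds
    "error_rate_pct": 1.0,          # must be <= 1 %
    "throughput_tps": 50.0,         # must be >= 50 transactions/second
}

test_results = {
    "Load Test (500 users)":   {"avg_response_time_sec": 1.6, "error_rate_pct": 0.4, "throughput_tps": 62.0},
    "Stress Test (800 users)": {"avg_response_time_sec": 2.9, "error_rate_pct": 2.1, "throughput_tps": 48.0},
}

def verdict(result):
    """Return ('Pass'/'Fail', list of breached NFRs) for one test."""
    breached = []
    if result["avg_response_time_sec"] > NFRS["avg_response_time_sec"]:
        breached.append("avg_response_time_sec")
    if result["error_rate_pct"] > NFRS["error_rate_pct"]:
        breached.append("error_rate_pct")
    if result["throughput_tps"] < NFRS["throughput_tps"]:
        breached.append("throughput_tps")
    return ("Pass" if not breached else "Fail"), breached

for name, result in test_results.items():
    status, breached = verdict(result)
    note = f" (breached: {', '.join(breached)})" if breached else ""
    print(f"{name}: {status}{note}")
```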

GO / NO-GO Decision:

The GO/NO-GO status refers to the decision on whether an application can go live or not. It indicates whether the performance of the application meets the defined NFRs. The following points help to decide the GO or NO-GO status (a simplified classification sketch follows the list):

  1. GREEN: When all the tests meet the defined NFRs, the overall test result is marked as GREEN, which signals GO. It means that the application/project is good from the performance point of view and can go live.
  2. AMBER: When some of the tests do not meet the defined NFRs, the overall test result is marked as AMBER. In this situation, the Performance Test Manager must:
      1. Analyse the criticality of the functionality
      2. Calculate the deviation of test results from the defined NFRs
      3. Understand the nature of defects
      4. Calculate the percentage of breached NFRs
      5. Investigate the cause of errors
      6. Identify the associated risk (in case of Go-live)

    The Performance Test Manager/Lead must also schedule a meeting with the project stakeholders and take a combined decision on the GO or NO-GO of the application.

  3. RED: When none of the tests meet the defined NFRs, the overall test result is marked as RED, which signals NO-GO. It means that the application/project is not fit for production from the performance point of view.
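
As a rough illustration of the GREEN/AMBER/RED logic above, the sketch below derives an overall status from individual test verdicts and computes the deviation of a metric from its NFR. The helper names and sample figures are assumptions for demonstration only; the real decision, especially for AMBER, rests with the stakeholders:

```python
# A simplified sketch of mapping test verdicts to GREEN/AMBER/RED.
# The deviation helper and sample figures are illustrative assumptions;
# an AMBER result still needs a combined stakeholder decision.

def deviation_pct(actual, target):
    """Percentage deviation of a measured value from its NFR target."""
    return (actual - target) / target * 100.0

def overall_status(test_verdicts):
    """test_verdicts maps test name -> True (meets NFRs) / False (breaches NFRs)."""
    failed = [name for name, ok in test_verdicts.items() if not ok]
    if not failed:
        return "GREEN - GO"
    if len(failed) == len(test_verdicts):
        return "RED - NO-GO"
    breached_pct = len(failed) / len(test_verdicts) * 100.0
    return f"AMBER - {breached_pct:.0f}% of tests breached NFRs; stakeholder decision required"

print(overall_status({"Load Test": True, "Stress Test": True, "Soak Test": True}))
print(overall_status({"Load Test": True, "Stress Test": False, "Soak Test": True}))
# e.g. a 2.9 s response time against a 2.0 s NFR deviates by +45%
print(f"Deviation: {deviation_pct(actual=2.9, target=2.0):.0f}%")
```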

The last step is to present the final performance test report to the project stakeholders. The Performance Test Lead/Manager should walk through the test report in detail, justifying the GO or NO-GO decision.

Deliverable:

The final Performance Test Report is the deliverable of the performance test reporting phase, and it also marks the closure of Performance Testing. Download the template of the Performance Test Report.

Example:

PerfMate is happy to start the reporting phase of PerfProject. He has conducted all the tests agreed in the performance test plan, and the results are favourable and meet all the NFRs. Following is a summary of the test results:

PerfMate starts preparing the Final Performance Test Report. He begins with the overall summary of the tests and provides a GO sign-off from the performance testing side. After that, he lists all the individual test results with his findings and observations. PerfMate attaches the relevant proofs, i.e. interim test reports, heap dump analysis reports, AWR reports etc., for each individual test and gives a Pass or Fail status as per the analysis.

PerfMate presents the Performance Test Report to the stakeholders of PerfProject and gets the sign-off from them. After getting all the sign-offs, he closes the Performance Testing ticket (assigned before starting the risk assessment at the initial stage) and attaches the final report to the ticket.

That’s the end of the Performance Testing Life Cycle.


7 thoughts on “Performance Test Reporting”

  1. Hi Team,

    We would like your help in determining how we can provide a rationale to our clients for the hardware requirement we propose, based on our performance tests, and we also have some related queries.

    1) Is it necessary to feed data into the DB before load test execution?
    If it is necessary,
    1a) How can I feed data for 500, 1000 and 2000 user load tests?
    1b) What should we ask for from the client side?
    2) Can we say, “This server (e.g. 8 GB RAM, 4 CPU) infrastructure is valid for 5000 concurrent users”?
    If yes,
    2a) Is there any extrapolation technique to identify the user load effectively?
    2b) What information should be collected from the client side?

    I would be very grateful if you answer these questions.

    Thanks,
    Karthik

    • Hi Karthik,

      To understand the hardware requirement, you need to contact your system architect, because the hardware requirement depends on the software architecture, application type, modules, business logic, expected user load, expected user data, data size per user and many other things. You cannot calculate it directly.

      1) If the test scenario reads data from the DB then it is a must, because the necessary test data should be in the DB. If the test scenario has only write operations then it is optional, but it is better to have some dummy data before the test.
      1a) Get the DB query from the DBA and prepare a script (you can also prepare it in LR/JMeter) to add the test data; a rough sketch is given at the end of this reply. If you have production data then you can copy it into the test environment, but make sure you do not violate the security policy of user data.
      1b) Your question is not clear. Are you referring to NFRs?
      2) As suggested, you cannot claim that blindly.
      2a) Refer to https://www.perfmatrix.com/extrapolation-method/
      2b) https://www.perfmatrix.com/non-functional-requirement-gathering/

      In addition: You can also refer to https://www.perfmatrix.com/performance-testing-life-cycle/
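
      A rough data-seeding sketch (illustrative only; the table, columns and counts are hypothetical, and sqlite3 is used just to keep the example self-contained and runnable):

      ```python
      # Rough, hypothetical data-seeding sketch (not an official recommendation).
      # In a real project the connection, table and columns come from your DBA;
      # sqlite3 is used here only to keep the example self-contained.
      import sqlite3

      USER_COUNT = 500  # change to 1000 / 2000 for the larger load tests

      conn = sqlite3.connect("perf_test_data.db")
      conn.execute(
          "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, username TEXT, email TEXT)"
      )
      conn.executemany(
          "INSERT INTO users (username, email) VALUES (?, ?)",
          [(f"vuser_{i:05d}", f"vuser_{i:05d}@example.com") for i in range(1, USER_COUNT + 1)],
      )
      conn.commit()
      conn.close()
      print(f"Seeded {USER_COUNT} dummy users")
      ```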

  2. Hi Team,
    I have some questions. Please have a look at the below.
    1. Without approaching anyone else like the Dev team, Business Analysts or NFT architects, how do you get the NFRs and how do you decide which are the high-priority requirements?
    2. While executing a 1-hour load test, if you see any errors, what steps do you take?
    3. How do you detect memory leaks when executing the test for 1 hour? If there is a leak, what error will you see?
    4. If you run the test for 1000 users in the test environment, will you apply the same/more user load in the prod environment?
    5. For 50000 users, what DB size does the organisation maintain?
    Please explain the above queries. Thanks

    • Hi Aswini,

      1. In that case, you should have at least basic knowledge of the application/system. Also, you must know the business flow. For that purpose, you can refer to the project documents. If even those are not available, then you have to follow the step-up approach for that application/system: https://www.perfmatrix.com/step-up-performance-test/

      2. As a first step, you need to analyse the nature of the error. Secondly, correlate the different graphs and try to understand the cause, e.g. whether it is due to the load or the prolonged test period. If you have access to server stats/graphs/logs, then find out the root cause of the error.

      3. You may or may not be able to confirm a memory leak in a 1-hour test; it is better to perform a soak test: https://www.perfmatrix.com/what-is-endurance-test/

      4. Testing is never done on the production environment. The performance test environment should be a replica of the production environment, and if it is 100% of the production environment, then the real-time, real-world load can be applied in the test environment.

      5. It depends on many factors like the business logic, the number of tables in the DB, the amount of data in each table, the complexity of the system etc.

  3. Many thanks for your response.
    Could you please clarify the below:
    1. How to integrate Jenkins with the LR tool?
    2. What are Network Virtualization and Service Virtualization?
    Thanks.

  4. Hi Team,

    I need to provide a granularity report; I ran the test in non-GUI mode.
    I need to generate the granularity report from the HTML report. Could you please help me with this?

