Without knowing the correct definitions of performance testing terms, it is difficult to understand the basics. This article covers the important terms most commonly used in performance testing activities. The terminologies and their definitions follow:
Virtual User
A performance testing tool generates virtual users that simulate real users' activity. These are programming threads that act on the application/website on behalf of a real user and generate load on the server. Some performance testing tools use a different name, such as VUser, User, Thread, or User Load, to represent a virtual user.
Response Time
Response Time represents the time from the first byte of the request sent to the server until the last byte of the response received by the client. In simple terms, when a user wants to log in to a website, he enters his credentials and presses the "Login" button. Assume the next page loads in 5 seconds; then 5 seconds is the response time. The calculation starts the moment he presses the "Login" button and ends when he receives the complete page.
Sometimes response time is also called "Page Response Time" or "Transaction Response Time". Note that response time is the sum of network latency and server processing time.
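As a small illustration of the note above, the relationship Response Time = Network Latency + Server Processing Time can be sketched in Python (the helper name and the timing split are hypothetical, chosen to match the 5-second login example):

```python
def response_time(network_latency_s: float, server_processing_s: float) -> float:
    """Overall response time is the sum of network latency
    and server processing time (per the definition above)."""
    return network_latency_s + server_processing_s

# Hypothetical split of the 5-second login example:
# 0.8 s on the network plus 4.2 s of server processing.
print(response_time(0.8, 4.2))  # prints 5.0
```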
Throughput
As per Micro Focus LoadRunner, throughput is the amount of data (in bytes or MB) that the VUsers receive from the server in any given second.
As per Apache JMeter, throughput is the number of requests JMeter sends to the server per unit of time.
Example: 30 requests per minute.
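The two definitions differ only in what is counted (bytes vs. requests); the rate calculation itself is the same. A minimal sketch of the JMeter-style request rate, with a hypothetical helper name:

```python
def throughput(total_requests: int, duration_s: float) -> float:
    """Requests per unit of time, JMeter-style (counting requests, not bytes)."""
    return total_requests / duration_s

# The example above: 30 requests per minute = 0.5 requests per second.
print(throughput(30, 60))  # prints 0.5
```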
Transaction
A transaction is a group of requests that belong to a particular page. Ideally, each user action has its own transaction so that response time can be measured individually, although one transaction can also contain multiple user actions. It depends entirely on what you want to measure.
Examples: Login Transaction, Search Transaction, Order Submit Transaction, etc.
Transactions per second (TPS)
Transactions per second or TPS shows the number of transactions sent by the users in one second. TPS is one of the key non-functional requirement metrics and helps to set the expected load on the server. The larger unit of TPS is transactions per hour (TPH), which represents the transaction rate on an hourly basis.
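The TPS/TPH relationship described above can be sketched as follows (hypothetical helper names; a simple rate calculation, not any tool's exact formula):

```python
def tps(total_transactions: int, duration_s: float) -> float:
    """Transactions per second over a test window."""
    return total_transactions / duration_s

def tph(tps_value: float) -> float:
    """Transactions per hour: the same rate scaled to an hourly basis."""
    return tps_value * 3600

# 600 transactions in a 60-second window -> 10 TPS -> 36,000 TPH.
print(tps(600, 60))       # prints 10.0
print(tph(tps(600, 60)))  # prints 36000.0
```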
Iteration
An iteration defines the complete journey of a user as described in the test case. It is a group of transactions that denotes the end-to-end flow of user actions.
Example: An iteration can contain the transactions below:
Home Page -> Login -> Search Item -> Select Item -> Order Item -> Logout
The above iteration represents a user journey from Home Page to Logout.
Iterations per second (IPS)
Iterations per second or IPS shows the number of iterations completed by the users in one second. Iterations per hour (IPH) represents the hourly iteration rate.
Example: If a performance tester tests order submission on an e-commerce website and wants to know the total number of orders submitted during the test, he can simply check the number of iterations; provided that one order is submitted per iteration.
Think Time
Think Time denotes the delay between two transactions. A real-world user takes time to read the content of a web page or to fill in details on a web form; such activities create a gap between two user actions. Think time simulates this gap by adding a delay between two transactions.
Think time also helps to achieve user concurrency.
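In script form, think time is usually just a randomized delay between transactions. A minimal sketch (the function name and the 2–5 second default range are illustrative assumptions, not a tool default):

```python
import random
import time

def think_time(min_s: float = 2.0, max_s: float = 5.0) -> float:
    """Pause for a random interval to mimic a real user's reading/typing gap."""
    delay = random.uniform(min_s, max_s)  # assumed illustrative range
    time.sleep(delay)
    return delay

# Usage between two transactions (hypothetical transaction functions):
# do_login(); think_time(); do_search()
```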
Pacing
Just as think time defines a delay between two transactions, pacing defines a delay between two iterations. Pacing helps to achieve the required TPS (transactions per second) in a performance test.
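A common back-of-the-envelope pacing calculation can be sketched as below. This is a sketch under stated assumptions (one fixed pacing interval per user, and each iteration completing within that interval); the helper name is hypothetical:

```python
def pacing_seconds(virtual_users: int, target_tps: float,
                   transactions_per_iteration: int = 1) -> float:
    """Interval between iteration starts, per user, needed to hit a target TPS.
    Assumes each iteration completes within the pacing interval."""
    iterations_per_second = target_tps / transactions_per_iteration
    return virtual_users / iterations_per_second

# 100 users aiming for 10 TPS with 1 transaction per iteration:
# each user should start a new iteration every 10 seconds.
print(pacing_seconds(100, 10))  # prints 10.0
```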
Network Latency
Network Latency represents the time taken by the network to transfer data from one end to the other. A channel or network adds some delay while transferring data between client and server. The faster the network, the lower the network latency. Therefore, network latency plays an important role in performance testing.
Server Response Time
Once a request reaches the server from the client, the server takes some time to process it and respond to the client. Server response time denotes the time the server takes to process the request.
The sum of network latency and server response time provides the overall response time.
Non-Functional Requirement (NFR)
NFRs are the goals and expectations set for an application that undergoes performance testing. Non-functional requirements cover the client's expectations about the performance of the application under various conditions. At the end of a test, the performance tester compares the results against the defined NFRs.
Examples:
- The ABC application should handle a load of 1000 users.
- The ABC application's response time should not exceed 5 seconds.
Performance Test Script
A performance test script is program code that automates real-world user activities. Such a script is developed with a performance testing tool.
Protocol
A protocol is the method of communication between a client and the server. Not all performance testing tools offer a protocol option.
The choice of protocol depends on the language/technology used by the application.
Example: Web HTTP/HTML, Ajax TruClient, etc.
Scenario
A scenario is a collection of business processes based on the performance requirements. It is the window where a performance tester defines the number of users, test duration, VUser running pattern, etc., along with the business processes.
Workload Modelling
Workload modelling is the process of distributing the load across the identified test cases so that the test simulates the real-world situation in the performance test environment and fulfils the purpose of the test.
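For example, a workload model often splits the total user load across business processes by percentage. A hedged sketch (the process names and percentages below are made up for illustration):

```python
def workload(total_users: int, mix_percent: dict) -> dict:
    """Distribute the total user load across business processes by percentage."""
    assert abs(sum(mix_percent.values()) - 100) < 1e-9, "mix must total 100%"
    return {name: round(total_users * pct / 100)
            for name, pct in mix_percent.items()}

# 1000 users split 50/30/20 across three hypothetical processes:
print(workload(1000, {"Browse": 50, "Search": 30, "Order": 20}))
# prints {'Browse': 500, 'Search': 300, 'Order': 200}
```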
Load Generator
Load generators are the physical machines used to generate virtual users. The number of load generators required depends on the user load and the memory footprint of each virtual user.
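A rough sizing sketch based on that memory-footprint idea (the figures are hypothetical; real sizing also considers the CPU and network limits of each machine):

```python
import math

def load_generators_needed(total_vusers: int,
                           mem_per_vuser_mb: float,
                           usable_mem_per_lg_mb: float) -> int:
    """Number of load generator machines, sized by memory footprint alone."""
    vusers_per_lg = usable_mem_per_lg_mb // mem_per_vuser_mb
    return math.ceil(total_vusers / vusers_per_lg)

# 1000 virtual users at ~10 MB each, with 4096 MB usable per machine:
print(load_generators_needed(1000, 10, 4096))  # prints 3
```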
These are the important performance testing terminologies. I hope this article has provided a basic understanding of them in an easy and simple way.