How to Retry failed tests in TestNG – IRetryAnalyzer

TestNG is a well-thought-out test framework. It provides many different features that make a tester's life a little easier. Sometimes a test execution fails, but the failure is not a product bug; there can be other reasons, such as the environment being down, a third-party web service being unavailable, or the browser becoming unresponsive. Imagine a scenario where we need to run a suite of 100 tests and a few of them fail because of a known intermittent environment issue. We know that these tests can pass if rerun a couple of times. In such cases, the retry functionality of TestNG comes to the rescue; it is one of the best and most frequently used features.

In this tutorial, let us study how to retry failed tests in TestNG. To achieve this, we first have to understand the org.testng.IRetryAnalyzer interface.

To start with, please add the below dependencies to the Maven Project.

<dependencies>
  
      <dependency>
          <groupId>org.seleniumhq.selenium</groupId>
          <artifactId>selenium-java</artifactId>
          <version>3.141.59</version>
      </dependency>
      
      <dependency>
          <groupId>io.github.bonigarcia</groupId>
          <artifactId>webdrivermanager</artifactId>
          <version>5.1.0</version>
       </dependency>

      <dependency>
           <groupId>org.testng</groupId>
           <artifactId>testng</artifactId>
           <version>7.5</version>
           <scope>test</scope>
      </dependency>

  </dependencies>

IRetryAnalyzer – This is the interface to implement to get a chance to retry a failed test. The definition of this interface is

public interface IRetryAnalyzer {

  /**
   * Returns true if the test method has to be retried, false otherwise.
   *
   * @param result The result of the test method that just ran.
   * @return true if the test method has to be retried, false otherwise.
   */
  boolean retry(ITestResult result);
}

Your implementation of this method returns true if the failed test should be re-executed and false otherwise.

When you bind a retry analyzer to a test, TestNG automatically invokes it to determine whether the test can be retried, in an attempt to see if the test that just failed now passes. Here is how you use a retry analyzer:

  1. Build an implementation of the interface org.testng.IRetryAnalyzer
  2. Bind this implementation to the @Test annotation, e.g., @Test(retryAnalyzer = Retry.class)

1. Add IRetryAnalyzer to the @Test Annotation

First of all, you need to create a class that implements IRetryAnalyzer, like the example below:

import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class Retry implements IRetryAnalyzer {

    int retryCount = 0;
    int maxRetryCount = 2;

    public boolean retry(ITestResult result) {

        if (!result.isSuccess()) {                      // Check if the test failed

            if (retryCount < maxRetryCount) {           // Check if the maximum number of retries is reached
                System.out.println("Retrying Test : Re-running " + result.getName()
                        + " for " + (retryCount + 1) + " time(s).");   // Print the retry attempt

                retryCount++;                           // Increase retryCount by 1

                result.setStatus(ITestResult.FAILURE);  // Mark this run as failed
                return true;                            // Rerun the failed test
            } else {
                result.setStatus(ITestResult.FAILURE);  // Maximum retries reached; the last run stays failed
            }
        } else {
            result.setStatus(ITestResult.SUCCESS);      // TestNG marks the test as passed when it passes
        }

        return false;
    }
}

With this implementation, a failed test is retried up to 2 times (the value of maxRetryCount), so it can execute 3 times in total before it passes. If it still fails on the third execution, test execution stops and TestNG marks the test as failed. We can change the number of retries by changing the value of maxRetryCount.
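If you want to change the retry limit without recompiling, the count can also be read from a JVM system property. Below is a minimal sketch of this variation; the property name retry.max and the class name ConfigurableRetry are my own choices for illustration, not part of TestNG:

import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class ConfigurableRetry implements IRetryAnalyzer {

    int retryCount = 0;
    // Falls back to 2 retries when -Dretry.max is not set on the JVM
    int maxRetryCount = Integer.getInteger("retry.max", 2);

    public boolean retry(ITestResult result) {
        if (!result.isSuccess() && retryCount < maxRetryCount) {
            retryCount++;
            return true;      // Ask TestNG to re-run the failed test
        }
        return false;         // Give up; the last result stands
    }
}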

Using retryAnalyzer attribute in the @Test annotation

The next step is to associate your test cases with IRetryAnalyzer. To do this, use the retryAnalyzer attribute of the @Test annotation, as shown below.

@Test(retryAnalyzer = Retry.class)
public void verifyLoginPage() {
}

Let us see the complete implementation with the help of the below example.

import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.testng.Assert;
import org.testng.annotations.AfterTest;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;
import io.github.bonigarcia.wdm.WebDriverManager;

public class RetryFailedTests {

    WebDriver driver;

    @BeforeTest
    public void setUp() {

        WebDriverManager.chromedriver().setup();

        ChromeOptions chromeOptions = new ChromeOptions();

        driver = new ChromeDriver(chromeOptions);
        driver.get("https://opensource-demo.orangehrmlive.com/");

        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
    }
 
    @Test(retryAnalyzer = Retry.class)
    public void verifyLoginPage() {
 
        String expectedTitle = driver.findElement(By.xpath("//*[@id='logInPanelHeading']")).getText();
 
        System.out.println("Title :" + expectedTitle);
        Assert.assertTrue(expectedTitle.equalsIgnoreCase("LOGIN Panel !!"));
    }
 
    @Test(retryAnalyzer = Retry.class)
    public void verifyHomePage() {
 
        System.out.println("Username Entered");
        driver.findElement(By.name("txtUsername")).sendKeys("Admin");
 
        System.out.println("Password Entered");
        driver.findElement(By.name("txtPassword")).sendKeys("admin123");
 
        driver.findElement(By.id("btnLogin")).submit();
 
        String newPageText = driver.findElement(By.id("welcome")).getText();
        System.out.println("newPageText :" + newPageText);
        Assert.assertTrue(newPageText.contains("Welcome"));
    }
 
    @AfterTest
    public void teardown() {
 
        driver.quit();
    }
 
}

In the above example, the test verifyLoginPage() will be retried up to 2 times (3 executions in total) if it keeps failing. To run the tests, right-click on the class and select Run As -> TestNG Test.

The output of the above program is:

2. Implement the IAnnotationTransformer Interface to Retry Failed Tests

In this case, you would need to implement the IAnnotationTransformer interface. The definition of this interface is

public interface IAnnotationTransformer extends ITestNGListener {

  /**
   * This method will be invoked by TestNG to give you a chance to modify a TestNG annotation read
   * from your test classes. You can change the values you need by calling any of the setters on the
   * ITest interface.
   *
   * <p>Note that only one of the three parameters testClass, testConstructor and testMethod will be
   * non-null.
   *
   * @param annotation The annotation that was read from your test class.
   * @param testClass If the annotation was found on a class, this parameter represents this class
   *     (null otherwise).
   * @param testConstructor If the annotation was found on a constructor, this parameter represents
   *     this constructor (null otherwise).
   * @param testMethod If the annotation was found on a method, this parameter represents this
   *     method (null otherwise).
   */
  default void transform(
      ITestAnnotation annotation, Class testClass, Constructor testConstructor, Method testMethod) {
    // not implemented
  }
}

The transform method is called for every test during the test run. We can use this listener for our retry analyzer as shown below:

import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;

public class RetryListener implements IAnnotationTransformer {

    @Override
    public void transform(ITestAnnotation annotation, Class testClass,
                          Constructor testConstructor, Method testMethod) {
        // Attach the retry analyzer to every @Test annotation
        annotation.setRetryAnalyzer(Retry.class);
    }
}

Now let us create a class that contains all the tests.

import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.testng.Assert;
import org.testng.annotations.AfterTest;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;
import io.github.bonigarcia.wdm.WebDriverManager;

public class RetryTests {

    WebDriver driver;

    @BeforeTest
    public void setUp() {

        WebDriverManager.chromedriver().setup();

        ChromeOptions chromeOptions = new ChromeOptions();

        driver = new ChromeDriver(chromeOptions);
        driver.get("https://opensource-demo.orangehrmlive.com/");

        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
    }
 
    @Test(description = "This test validates title of login functionality")
    public void verifyLoginPage() {
 
        String expectedTitle = driver.findElement(By.xpath("//*[@id='logInPanelHeading']")).getText();
 
        System.out.println("Title :" + expectedTitle);
        Assert.assertTrue(expectedTitle.equalsIgnoreCase("LOGIN Panel !!"));
    }
 
    @Test(description = "This test validates  successful login to Home page")
    public void verifyHomePage() {
 
        System.out.println("Username Entered");
        driver.findElement(By.name("txtUsername")).sendKeys("Admin");
 
        System.out.println("Password Entered");
        driver.findElement(By.name("txtPassword")).sendKeys("admin123");
 
        driver.findElement(By.id("btnLogin")).submit();
 
        String newPageText = driver.findElement(By.id("welcome")).getText();
        System.out.println("newPageText :" + newPageText);
        Assert.assertTrue(newPageText.contains("Welcome"));
    }
 
    @AfterTest
    public void teardown() {
 
        driver.quit();
    }
 
}

Once we have the implementation of IAnnotationTransformer, we just need to add it as a listener in the testng.xml, like this. Note that an annotation transformer cannot be registered with the @Listeners annotation; it has to be declared in testng.xml or on the command line, because TestNG must know about it before it parses the annotations.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Suite">

  <listeners>
    <listener class-name="com.example.retrydemo.RetryListener"/>
  </listeners>

  <test name="Test">
    <classes>
      <class name="com.example.retrydemo.RetryTests"/>
    </classes>
  </test> <!-- Test -->
</suite> <!-- Suite -->

Now let us run the tests. Right-click on testng.xml and select Run As -> TestNG Suite.

The output of the above program is:

This is pretty much it on this topic. Congratulations on making it through this tutorial and hope you found it useful! Happy Learning!! Cheers!!

TestNG Listeners in Selenium


A listener is defined as an interface that modifies TestNG's default behavior. There are several interfaces that allow you to modify TestNG's behavior, and they are called "TestNG Listeners". They allow customizing TestNG reports or logs. Many types of TestNG listeners are available; here are a few of them:

  • IAnnotationTransformer 
  • IAnnotationTransformer2 
  • IHookable 
  • IInvokedMethodListener 
  • IMethodInterceptor 
  • IReporter 
  • ISuiteListener 
  • ITestListener 

When you implement one of these interfaces, you can let TestNG know about it in either of the following ways:

  • Using <listeners> in your testng.xml file.
  • Using the @Listeners annotation on any of your test classes.

ITestListener has the following methods:

  • onTestStart – Invoked each time before a test method is invoked. The ITestResult is only partially filled with the references to class, method, start millis and status.
  • onTestSuccess – Called on the success of any test.
  • onTestFailure – Called on the failure of any test.
  • onTestSkipped – Called when any test is skipped.
  • onTestFailedButWithinSuccessPercentage – Invoked each time a test method fails but has been annotated with successPercentage and this failure still keeps it within the requested success percentage.
  • onStart – Invoked before running all the test methods belonging to the classes inside the <test> tag and before calling all their configuration methods.
  • onFinish – Invoked after all the test methods belonging to the classes inside the <test> tag have run and all their configuration methods have been called.
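Note that onStart and onFinish receive an ITestContext rather than an ITestResult. Below is a minimal sketch of a listener that overrides just these two methods; the class name SuiteLifecycleLogger is a hypothetical choice, and with TestNG 7 the remaining ITestListener methods have default implementations, so they do not need to be overridden:

import org.testng.ITestContext;
import org.testng.ITestListener;

public class SuiteLifecycleLogger implements ITestListener {

    // Invoked before any test method inside the <test> tag runs
    @Override
    public void onStart(ITestContext context) {
        System.out.println("Starting test: " + context.getName());
    }

    // Invoked after all test methods inside the <test> tag have run
    @Override
    public void onFinish(ITestContext context) {
        System.out.println("Finished test: " + context.getName());
    }
}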

Here, I explain the use of the ITestListener listener in the program mentioned below.

Step 1) Create a class "ListenerDemo" that implements ITestListener. Add methods like onTestFailure, onTestSkipped, onTestStart, and onTestSuccess to this class.

Step 2) Create another class "ListenerTestCases" for the login process automation. Selenium will execute these test cases to log in automatically.

Step 3) Next, connect this listener to our regular project class, i.e. "ListenerTestCases". There are two different ways to connect the class and the interface.

The first way is to use Listeners annotation (@Listeners) as shown below:

@Listeners(com.selenium.testng.TestNGDemo.ListenerDemo.class)

Step 4) Execute the "ListenerTestCases" class. The listener methods are called automatically according to the behavior of the methods annotated with @Test.

Step 5) Verify the output logs displayed in the console.

import org.testng.ITestListener;
import org.testng.ITestResult;

public class ListenerDemo implements ITestListener {

    // Called when a test case fails
    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("The name of the testcase failed is :" + result.getName());
    }

    // Called when a test case is skipped
    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("The name of the testcase Skipped is :" + result.getName());
    }

    // Called when a test case starts
    @Override
    public void onTestStart(ITestResult result) {
        System.out.println(result.getName() + " test case started");
    }

    // Called when a test case passes
    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("The name of the testcase passed is :" + result.getName());
    }
}

In the below test, there are 2 test cases: one test passes and the other fails. When we execute ListenerTestCases, it will call ListenerDemo internally.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

@Listeners(com.selenium.testng.TestNGDemo.ListenerDemo.class)
public class ListenerTestCases {

    static WebDriver driver;

    @Test
    public void TestPass() {
        System.setProperty("webdriver.chrome.driver",
                "C:\\Users\\Vibha\\Desktop\\SeleniumKT\\chromedriver_win32\\chromedriver.exe");
        driver = new ChromeDriver();
        driver.get("https://opensource-demo.orangehrmlive.com/");
        driver.findElement(By.name("txtUsername")).sendKeys("Admin");
        driver.findElement(By.name("txtPassword")).sendKeys("admin123");
        driver.findElement(By.id("btnLogin")).submit();
        String dashboardTitle = driver.findElement(By.id("welcome")).getText();
        Assert.assertTrue(dashboardTitle.contains("Welcome"));
    }

    @Test
    public void TestFail() {
        System.setProperty("webdriver.chrome.driver",
                "C:\\Users\\SingVi04\\Desktop\\SeleniumKT\\chromedriver_win32\\chromedriver.exe");
        driver = new ChromeDriver();
        driver.get("https://opensource-demo.orangehrmlive.com/");
        driver.findElement(By.name("txtUsername")).sendKeys("Admin");
        driver.findElement(By.name("txtPassword")).sendKeys("admin123");
        driver.findElement(By.id("btnLogin")).submit();
        String dashboardTitle = driver.findElement(By.id("welcome")).getText();
        // The welcome text does not contain "Hello", so this assertion fails on purpose
        Assert.assertTrue(dashboardTitle.contains("Hello"));
    }
}

Output
TestFail test case started
The name of the testcase failed is :TestFail
TestPass test case started
The name of the testcase passed is :TestPass
PASSED: TestPass
FAILED: TestFail
java.lang.AssertionError: expected [true] but found [false]

To execute this program, right-click the class and select Run As -> TestNG Test.

There is another way to execute the listener class, which is using testng.xml. There is a listener class named "ListenerDemo" where the implementation of the various listener methods is present. If we want to run the tests using testng.xml, then there is no need to mention @Listeners in the ListenerTestCases class.

<?xml version = "1.0"encoding = "UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name = "TestSuite">
<listeners>
<listener class-name ="com.selenium.testng.TestNGDemo.ListenerDemo"/>
</listeners>
 
<test name ="Test">
<classes>
<class name="com.selenium.testng.TestNGDemo.ListenerTestCases"/>
</classes>
</test>
</suite>

The test execution result will look something like what is shown below.

TestNG generates various types of reports under the test-output folder. Open "emailable-report.html" with a browser, as it is an HTML report. It will look something like the one below.

TestNG also produces an "index.html" report, which resides under the test-output folder.

Note that ITestResult, which appears throughout these methods, is not itself a listener; it is the result object that TestNG passes into the listener methods.

We are done! Congratulations on making it through this tutorial and hope you found it useful! Happy Learning!!

How to generate Random Variables in JMeter


This tutorial explains how a random variable can be generated and passed as part of a request in JMeter. Suppose a load test requires passing random or different values to a specific parameter in the requests; how can this be achieved? One way is to read different values from a .csv file and use them in the requests. The second way is to add a Random Variable config element in JMeter, which generates random values for the requests at every run.


Implementation

The Random Variable config element has the following fields:

Variable Name: The name we are going to use to invoke the variable; "ReqCacheKey" in this example.

Output Format: The format for the variable. You can set the desired length of the number; I have set 0000000000 in order to work with a ten-digit number. You can also use a format like USER_000.

Minimum and Maximum: The range we want to set for the variable.

Seed: The seed for the random number generator. A seed is the first input that the number-generation function receives to start the random generation. Here, it is ${__time()}, so values like 1256078934 or 9863457201 are generated randomly on each run.

Per Thread: It is important to consider this option. If you set it to True, each thread gets its own generator seeded the same way, so different threads can end up with the same values; this can cause problems if the variable must be different each time. If you want every value across all threads to be drawn from one shared generator, and thus always be different, set it to False.

Create a Test Plan in JMeter by following the steps below.

Step 1 –  Add Thread Group

Select Test Plan on the tree

Add Thread Group

To add Thread Group: Right-click on the "Test Plan" and add a new thread group: Add -> Threads (Users) -> Thread Group

In the Thread Group control panel, enter Thread Properties as follows:

Number of Threads: 1 – Number of users connecting to the target website
Loop Count: Infinite – Number of times to execute the test
Ramp-Up Period:
Duration: 5 sec

Step 2 –  Adding JMeter elements  

The JMeter element used here is the HTTP Request Sampler. In the HTTP Request control panel, the Path field indicates which URL request you want to send.

Add HTTP Request Sampler

To add: Right-click on Thread Group and select: Add -> Sampler -> HTTP Request

The following values are used in the HTTP Request to perform the test:

Name – HTTP Request 
Server Name or IP – localhost
Port – 8010
Method – POST
Path – /demo/helloworld

Step 3 –  Add a Random Variable

To add: Right-click on Thread Group and select: Add -> Config Element -> Random Variable

The sample request is shown below. The cacheKey variable is parameterized: ReqCacheKey in the Random Variable element generates the random values, which are passed to the cacheKey parameter present in the request body.

The Seed is ${__time()}, which generates values randomly, like 1256078934 or 9863457201. If the Seed is kept blank, then when the tests run multiple times they produce the same set of random values (repetitive values); we don't want the same repetitive values, so the Seed is not left blank.
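The effect of the seed can be illustrated in plain Java. This sketch is only an analogy for the concept, not JMeter's internal implementation:

import java.util.Random;

public class SeedDemo {
    public static void main(String[] args) {
        // Two generators created with the same fixed seed produce identical
        // sequences; this is why a constant seed gives repetitive values
        Random fixedA = new Random(42);
        Random fixedB = new Random(42);
        System.out.println(fixedA.nextInt(100000) == fixedB.nextInt(100000)); // true

        // Seeding from the current time, like ${__time()}, changes the
        // sequence on every run
        Random timeSeeded = new Random(System.currentTimeMillis());
        System.out.println(timeSeeded.nextInt(100000));
    }
}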

Step 4 – Adding Listeners to Test Plan

Listeners

Listeners show the results of the test execution. They can show results in different formats such as a tree, table, graph, or log file.

We are adding the View Result Tree listener.

View Result Tree – shows the results of the user request in basic HTML format.

To add: Right-click Test Plan, Add -> Listener -> View Result Tree

Step 5 – Save the Test Plan

To save: Click File -> Save Test Plan As -> give the name of the Test Plan. It will be saved in .jmx format.

Step 6  – Run the Test Plan

Click on Green Triangle as shown at the top to run the test

Step 7 – View the Execution Status

Click on View Result Tree to see the status of the run. A successful request will be Green in the Text section.

In the below image, we can see that the cacheKey value is 1917449705, which was generated by the Random Variable.

Congratulations on making it through this tutorial and hope you found it useful! Happy Learning!! Cheers!!

JMeter Authorization with access token


Authorization with a dynamic access token is used to pass dynamic response content to subsequent requests. This is used to validate API authorization.

In this post, we will discuss fetching an access token (dynamic response) with the help of JSON Extractor and passing it as a parameter in the subsequent request using BeanShell Assertion.

To achieve this, we need to create 2 Thread Groups:

Thread Group 1 – To generate Access Token  
Thread Group 2 – To pass Access Token to Request 

How to set up JMeter to perform the above test

Step 1 – Add Thread Group 1: Thread Group – Authorization Token Generation

1. Add Thread Group

We should provide the name of the Thread Group. In this case, the thread group is used to generate the token, so it is named Token Generation. We want to generate only one token, so the Number of Threads, Ramp-Up Period, and Loop Count are all 1.

2. Add HTTP Request Sampler

In the HTTP Request control panel, the Path field indicates which URL request you want to send.

 To add: Right-click on Thread Group and select: Add -> Sampler -> HTTP Request

Add valid credentials in the parameters section.

3. Add HTTP Header Manager

The Header Manager lets you add or override HTTP request headers; for example, you can add Accept-Encoding, Accept, and Cache-Control.

To add: Right-click on Thread Group and select: Add -> Config Element -> HTTP Header Manager

Add Authorization as a header in the Header Manager.

4. Add JSON Extractor

To extract the authentication token from the response, we are going to use the JMeter JSON Extractor. The process of extracting a variable from a response works as mentioned below:

First, the server sends back a response; then a post-processor, like the JSON Extractor, is executed, which extracts part of the response and puts it into a variable such as ${token}.

To add: Right-click on Thread Group and select: Add -> Post Processors -> JSON Extractor

The JSON Extractor requires us to fill in a few fields so that we can process the JSON correctly:

1) Name – JSON Extractor
2) Apply to – we will use the default Main Sample Only, i.e. the extractor applies to the main sample only
3) Names of created variables – BEARER
4) JSON Path Expressions – $.access_token
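For example, assuming the token endpoint returns a body shaped like {"access_token": "eyJhbGci...", "token_type": "Bearer"} (a hypothetical response; check your actual response in View Result Tree), the expression $.access_token stores the token value in the BEARER variable.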

5. Add BeanShell Assertion

This is an advanced assertion with full access to the JMeter API. Java conditional logic can be used to set the assertion result.

To add: Right-click on Thread Group and select: Add -> Assertions -> BeanShell Assertion

Add the below-mentioned script in the Script section of the BeanShell Assertion:

${__setProperty(BEARER, ${BEARER})};
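The same hand-off can also be written with the vars and props objects that JMeter exposes to BeanShell scripts. A minimal equivalent sketch of the script above:

// Copy the thread-local ${BEARER} variable into a global JMeter property
// so that other thread groups can read it
props.put("BEARER", vars.get("BEARER"));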

Step 2 – Add Thread Group 2: Thread Group – Main Request

1. Add Thread Group

Provide a name to this Thread Group. I have also provided the number of threads, ramp-up, and duration in the thread group as shown in the image.

We can also parameterize the values of the number of threads, ramp-up period, and duration using the JMeter property function __P. You may ask why we use the property function in JMeter: it makes the JMeter script configurable, so we can pass any value through the command line without making any changes in the script.

__P – This is a simplified property function that is intended for use with properties defined on the command line.

If no default value is supplied, it is assumed to be 1. The value of 1 was chosen because it is valid for common test variables such as loops, thread count, ramp-up, etc.

${__P(group1.threads)} – returns the value of group1.threads

${__P(THREADS,1)} – the THREADS value will be passed through the command line. If no value is passed, it will default to 1.

Similarly, ramp-up and duration are parameterized.

${__P(THREADS,1)}
${__P(RAMPUP,1)}
${__P(DURATION,1)}
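Assuming the plan is saved as plan.jmx (a hypothetical file name), these values can then be supplied with the -J option when running JMeter in non-GUI mode:

jmeter -n -t plan.jmx -JTHREADS=10 -JRAMPUP=5 -JDURATION=60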

2. Add HTTP Request Sampler

The following values are used in the HTTP Request to perform the test:

Add a valid request body in the Body Data section (if the request is POST).

3. Add HTTP Header Manager

We have previously extracted the token from the Token Generation request. Now, it's time to reuse it in the header section of the HTTP Header Manager.

JMeter variables such as ${BEARER} are local to the thread that created them, which is why the token was promoted to a global JMeter property in Thread Group 1; here it is read back with the __property function:

Authorization = Bearer ${__property(BEARER)}

Step 3 – Adding Listeners to Test Plan

Listeners

Listeners show the results of the test execution. They can show results in different formats such as a tree, table, graph, or log file.

We have added the View Result Tree listener.

View Result Tree – shows the results of the user request in basic HTML format.

To add: Right-click Test Plan, Add -> Listener -> View Result Tree

Step 4 – Save the Test Plan

To save: Click File -> Save Test Plan As -> give the name of the Test Plan. It will be saved in .jmx format.

Step 5  – Run the Test Plan

Click on Green Triangle as shown at the top to run the test.

Step 6 – View the Execution Status

Click on View Result Tree to see the status of the run. A successful request will be Green in the Text section.

Here, we can see that the Token Generation request is successfully processed.

The below image shows that the Main Request is successfully executed too.

Congratulations!! We generated an authorization token with one request, added it to another request, and processed that request using JMeter.

How to send POST requests in JMeter

We can perform GET as well as POST operations in JMeter. In this tutorial, we will explain only how to send POST HTTP requests in JMeter. In the previous tutorial, I explained how to send a GET request in JMeter.

Create a Test Plan in JMeter by following the steps below.


Step 1 –  Add Thread Group

  • Select Test Plan on the tree
  • Add Thread Group

To add Thread Group: Right-click on the "Test Plan" and add a new thread group: Add -> Threads (Users) -> Thread Group

In the Thread Group control panel, enter Thread Properties as follows: We will take an example of row no 5

Number of Threads: 5 – Number of users connecting to the target website
Loop Count: 5 – Number of times to execute the test
Ramp-Up Period: 5 – Tells JMeter how long to delay before starting the next user. For example, if we have 5 users and a 5-second ramp-up period, the delay between starting users is 1 second (5 seconds / 5 users).

Step 2 –  Adding JMeter elements  

The JMeter element used here is the HTTP Request Sampler. In the HTTP Request control panel, the Path field indicates which URL request you want to send.


Add HTTP Request Sampler
To add: Right-click on Thread Group and select: Add -> Sampler -> HTTP Request

The following values are used in the HTTP Request to perform the test:

  • Name – HTTP Request 
  • Server Name or IP – localhost
  • Port – 8000
  • Method – POST
  • Path – /demo/helloworld

Add HTTP Header Manager

The Header Manager lets you add or override HTTP request headers; for example, you can add Accept-Encoding, Accept, and Cache-Control.

To add: Right-click on Thread Group and select: Add -> Config Element -> HTTP Header Manager

The following header values are used in the HTTP Request to perform the test:
Content-Type = application/json
Accept = application/json

Step 3 – Adding Listeners to Test Plan

Listeners

Listeners show the results of the test execution. They can show results in different formats such as a tree, table, graph, or log file.
We are adding the View Result Tree listener.

View Result Tree – shows the results of the user request in basic HTML format.
To add: Right-click on Test Plan, Add -> Listener -> View Result Tree

Aggregate Report

It is almost the same as the Summary Report, except that the Aggregate Report gives a few more parameters such as "Median", "90% Line", "95% Line", and "99% Line".

 To add: Right Click on Thread Group > Add > Listener > Aggregate Report

Step 4 – Save the Test Plan

To save: Click File -> Save Test Plan As -> give the name of the Test Plan. It will be saved in .jmx format.

Step 5  – Run the Test Plan

Click on the Green Triangle as shown at the top to run the test.

Step 6 – View the Execution Status

Click on View Result Tree to see the status of the run. A successful request will be Green in the Text section.

Click on Response data and Response Header to view other information about Response.

 

Click on Aggregate Report Result to see the aggregated status of Run.

How to send GET Request in JMeter

What is Apache JMeter?

The Apache JMeter™ application is open-source software, a 100% pure Java application designed to load-test functional behavior and measure performance. It was designed for testing web applications but has since expanded to other test functions.
It can be used to simulate a heavy load on a server, group of servers, network, or object to test its strength or to analyze overall performance under different load types.

How to send a GET HTTP Request in JMeter?
We can perform GET as well as POST operations in JMeter. In this tutorial, we will explain only how to perform the GET operation.
 
Create a Test Plan in JMeter by following the steps below.
Step 1 – Add Thread Group

  • Select Test Plan on the tree
  • Add Thread Group

To add Thread Group: Right-click on the "Test Plan" and add a new thread group: Add -> Threads (Users) -> Thread Group

In the Thread Group control panel, enter Thread Properties as follows: We will take an example of row no 5.

  • Number of Threads: 5 – Number of users connecting to the target website
  • Loop Count: 5 – Number of times to execute the test
  • Ramp-Up Period: 5 – Tells JMeter how long to delay before starting the next user. For example, if we have 5 users and a 5-second ramp-up period, the delay between starting users is 1 second (5 seconds / 5 users)

Step 2 –  Adding JMeter elements

The JMeter element used here is the HTTP Request Sampler. In the HTTP Request control panel, the Path field indicates which URL request you want to send.

Add HTTP Request Sampler

To add: Right-click on Thread Group and select: Add -> Sampler -> HTTP Request.

The following values are used in the HTTP Request to perform the test:

  • Name – HTTP Request 
  • Server Name or IP – localhost
  • Port – 8010
  • Method – GET
  • Path – /demo/helloworld/demo

Step 3 – Adding Listeners to Test Plan

Listeners – They show the results of the test execution. They can show results in different formats such as a tree, table, graph, or log file.

We are adding the View Result Tree listener.

View Result Tree – shows the results of the user request in basic HTML format.

To add: Right-click Test Plan, Add -> Listener -> View Result Tree

The complete Test Plan will look as shown below.

Step 4 – Save the Test Plan

To save: Click File -> Save Test Plan As -> give the name of the Test Plan. It will be saved in .jmx format.

Sample .jmx File

Step 5  – Run the Test Plan

Click on the Green Triangle as shown below to run the test.

Step 6 – View the Execution Status

Click on View Result Tree to see the status of the run. A successful request will be Green in the Text section.

Sample of a failed request: a failed request will be Red in the View Result Tree under the Text option. This screen shows the reason for the failure of the request, such as Connection refused here.