
Welcome to Machers Blog

Blogging the world of technology and testing to help people build their careers.

Thursday, October 23, 2008

Functional Testing

Emily H. Halili

JMeter is very useful and convenient for functional testing. Although JMeter is known primarily as a performance-testing tool, functional-testing elements can be integrated within the Test Plan, which was originally designed to support load testing. Many other load-testing tools provide little or no support for this, restricting themselves to performance testing. Besides integrating functional-testing elements alongside load-testing elements in the Test Plan, you can also create a Test Plan that runs functional tests exclusively. In other words, aside from a Load Test Plan, JMeter also allows you to create a Functional Test Plan. This flexibility is certainly resource-efficient for the testing project.

This chapter walks through creating a Test Plan as we incorporate and configure JMeter elements to support functional testing. It assumes that you have successfully worked through Chapter 5 and created a Test Plan for a specific target web server. We will begin with a quick overview to set a few expectations about JMeter. Later, we will create a new Test Plan similar to the one in Chapter 5, only smaller. The Test Plan we create and run at the end of this chapter will incorporate elements that support functional testing exclusively.

Preparing for Functional Testing
First, I need to highlight that JMeter does not have a built-in browser, unlike many functional-testing tools. It tests at the protocol layer, not the client layer (e.g. JavaScript, applets, etc.), and it does not render the page for viewing. Although embedded resources can be downloaded by default, rendering them in the View Results Tree Listener may not yield a 100% browser-like rendering; in fact, it may not be able to render large HTML files at all. This makes it difficult to test the GUI of an application under test.

However, to compensate for these shortcomings, JMeter allows the tester to create assertions based on the tags and text of the page as the HTML file is received by the client. With some knowledge of HTML tags, you can test and verify any elements as you would expect them in the browser.

Unlike for a load-testing Test Plan, it is unnecessary to select a specific workload time to perform a functional test. In fact, the application you want to test may even reside locally, with your own machine acting as the "localhost" server for your web application. For this chapter, we will limit ourselves to selected functional aspects of the page that we seek to verify or assert.

Using JMeter Components
We will create a Test Plan to demonstrate how to configure it to include functional-testing capabilities. The modified Test Plan will include these scenarios:

Create Account—New Visitor creating an Account
Log in User—User logging in to an Account

Following these scenarios, we will simulate various entries and form submissions as requests to pages are made, while checking that the pages respond correctly to these user entries. We will add assertions to the samplers in these scenarios to verify the 'correctness' of a requested page. In this manner, we can see whether the pages respond correctly to invalid data. For example, we would like to check that the page responds with the correct warning message when a user enters an invalid password, or that a request returns the correct page.

First of all, we will create a series of test cases following the various user actions in each scenario. The test cases may be designed as follows:

With the exception of the Configuration elements, Listeners, and Assertions, which we will add later, our Test Plan will take the form that you see in the following screenshot (See Figure 1).

Using HTTP Proxy Server to Record Page Requests
As in recording requests in Chapter 5, you will need to include the HTTP Proxy Server element in the WorkBench. Some configuration will be required, as shown in the following snapshot (See Figure 2).

Configuring the Proxy Server
Simulating the Create Account and Login User scenarios will require JMeter to make requests for the registration and login pages, which are exposed via HTTPS. By default, the HTTP Proxy Server is unable to record HTTPS requests. However, we can override this by selecting (checking) the Attempt HTTPS Spoofing checkbox.

Selecting Add Assertions will be especially useful as we add the specific page patterns that we want to evaluate later in this exercise. The Capture HTTP Headers option is selected so that header information is captured as we begin recording; once we have a default Header Manager, we will deselect it again to keep the recording neater.

In addition, since we do not require images in our testing, add these patterns to the URL Patterns to Exclude section: .*\.jpg, .*\.js, .*\.png, .*\.gif, .*\.ico, .*\.css. Otherwise, these resource files, which are not necessary for our testing, will be recorded and cause unnecessary clutter in our recording.
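The proxy treats each entry as a regular expression that must match the whole URL. The following Python sketch is illustrative only (JMeter itself applies Java regexes to the sampler path, and the sample URLs here are made up), but it shows how these patterns filter requests out of the recording:

```python
import re

# The exclude patterns from the proxy configuration. These simple Java-style
# regexes behave the same under Python's re module.
EXCLUDE_PATTERNS = [r".*\.jpg", r".*\.js", r".*\.png",
                    r".*\.gif", r".*\.ico", r".*\.css"]

def is_excluded(url: str) -> bool:
    """Return True if the URL fully matches any exclude pattern,
    mirroring JMeter's whole-string matching for include/exclude lists."""
    return any(re.fullmatch(p, url) for p in EXCLUDE_PATTERNS)

print(is_excluded("/images/logo.gif"))   # True  - filtered from the recording
print(is_excluded("/account/create"))    # False - recorded as a sampler
```

Note that the patterns must match the entire URL, which is why each begins with `.*`.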

Adding HTTP Request Defaults
A useful addition to the plan is the HTTP Request Defaults element, a type of Configuration element. Since this Test Plan will employ multiple HTTP Request elements targeting the same server and port, this element will be very useful: the web server name need not be captured for each HTTP Request sampler, since the HTTP Request Defaults element retains this information. With a small configuration change in this element, the Test Plan can run even when the application is deployed to a different server and/or port. The following snapshot shows the HTTP Request Defaults element that we will use for this exercise. (See Figure 3).
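Conceptually, any field left blank in an individual sampler falls back to the value set in HTTP Request Defaults. This rough Python sketch is not JMeter's actual implementation, and the server and port values are made up, but it captures the merge behaviour:

```python
# Values configured once in the HTTP Request Defaults element (hypothetical).
DEFAULTS = {"server": "localhost", "port": 8080, "protocol": "http"}

def effective_request(sampler: dict) -> dict:
    """Merge a sampler's fields over the defaults: blank fields fall back."""
    merged = dict(DEFAULTS)
    merged.update({k: v for k, v in sampler.items() if v})
    return merged

# A recorded sampler that left the server field blank.
login = {"server": "", "path": "/login", "method": "POST"}
print(effective_request(login)["server"])  # localhost
```

Changing `DEFAULTS` in one place retargets every such sampler, which is exactly why the element makes the Test Plan portable across servers.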

As we use this defaults element, our subsequent recording never needs to repeat the server name. The result of our recording of the first page is shown in the following snapshot (See Figure 4).

Adding HTTP Header Manager
Another very useful default element is the HTTP Header Manager Configuration element. This element can either be added to the Test Plan and configured manually afterwards, or we can simply reuse the browser-derived headers element included in the recording. For convenience, we will choose the latter option. Once the Proxy Server records the homepage request, stop the recording. You will find that a Header Manager for this page has been captured as a browser-derived header. Simply click and drag this element to the top of the current scope of the HTTP Proxy Server. Notice that I have removed the Referer, since we want to create a default for the remaining HTTP Requests. Following is a snapshot of this change. (See Figure 5).
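The effect of removing the Referer is simply that the remaining recorded headers become safe to reuse as a default for every request. A tiny illustrative sketch (the header values here are hypothetical, not taken from an actual recording):

```python
# Headers as the proxy might capture them for the homepage (hypothetical).
recorded_headers = {
    "User-Agent": "Mozilla/5.0",
    "Accept": "text/html",
    "Referer": "http://localhost:8080/",
}

# Drop Referer, since it is page-specific and would be wrong as a default.
default_headers = {k: v for k, v in recorded_headers.items() if k != "Referer"}
print("Referer" in default_headers)  # False
```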

Now you may de-select the Capture HTTP Headers option in the Proxy Server element, since we have the default header.

Let the Recording Begin...
Let us proceed with the recording, following the test cases in the previous table as our guide. As you record each page, select the specific tags or page elements whose correctness you want to validate, and add them to the Patterns to Test section of the Response Assertion element for each sampler. This may take most of your recording time, since as you record you need to decide carefully which page element(s) would be the most effective measure of correctness. There are plenty of developer tools available to help with this possibly tedious task; my favorite is the Inspect Element feature in Firebug, a Firefox browser add-on. You may choose patterns that you expect to see, or that you expect not to see, by deselecting or selecting the Not option in the Pattern Matching Rules section.
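The Response Assertion logic can be pictured roughly as follows: with "Contains" matching, every pattern must be found in the response body, and the Not option inverts the outcome. This Python sketch is an approximation of that behaviour, not JMeter's implementation, and the sample HTML is made up:

```python
import re

def response_assertion(body: str, patterns: list, negate: bool = False) -> bool:
    """Sketch of a 'Contains' Response Assertion: pass if every pattern is
    found in the body; the Not option (negate) inverts the result."""
    found_all = all(re.search(p, body) for p in patterns)
    return not found_all if negate else found_all

html = "<h1>Welcome back, jdoe</h1>"
print(response_assertion(html, [r"Welcome back"]))                  # True - passes
print(response_assertion(html, [r"Invalid password"], negate=True)) # True - passes
```

The second call shows the typical use of Not: asserting that an error message does not appear on a page that should have accepted the input.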

After recording is completed, you may rename and organize your samplers as you move them to the Test Plan (refer to the following figure). You may want to add a few more Configuration elements to your Test Plan, as in my sample shown in the following snapshot (See Figure 6).

I have added User Defined Variables, two more Listeners, and a Constant Timer with a constant delay of 2 seconds after the request for each page was completed. The Assertion Results listener is used with the Response Assertion elements, to summarize the success or failure of a page in meeting the validation criteria defined in each Response Assertion.

Adding User Defined Variables
The User Defined Variables (UDV) element shown in the following snapshot is particularly interesting with regard to the test-case design we drafted earlier in the table. It allows you to plug values into variables used in various locations in the Test Plan. The JMeter Test Plan we have created will substitute the exact values assigned to the different variables. Following is a snapshot of the UDV I have set up for our Test Plan. (See Figure 7)

How do we use these variables in the Test Plan? Simply use the format ${Variablename} anywhere in the Test Plan that we want to use the value of a Variable.
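The expansion works roughly like the following sketch. This is illustrative Python with made-up variable values; JMeter performs the substitution internally wherever a `${...}` reference appears:

```python
import re

# Hypothetical User Defined Variables, as set in the UDV element.
UDV = {"VALID_PWD": "secret123", "URL": "http://localhost:8080/app"}

def substitute(text: str, variables: dict) -> str:
    """Replace ${name} references with their UDV values; unknown
    references are left untouched, as JMeter leaves them literal."""
    return re.sub(r"\$\{(\w+)\}",
                  lambda m: variables.get(m.group(1), m.group(0)),
                  text)

print(substitute("password=${VALID_PWD}", UDV))  # password=secret123
```

Changing a value once in the UDV element then updates every place the variable is referenced.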

For example, in the HTTP Request Sampler following CREATE ACCOUNT | Test Step#6: Register Valid User, as you can see below, the parameter password has value ${VALID_PWD}, referring to the corresponding variable assigned in UDV. (See Figure 8)

We may also use the variables set in UDV in other elements, namely Response Assertions. This feature is particularly useful when an assertion depends on varying values, such as when we want to verify URLs, user names, account numbers, etc., depending on the values we want to use throughout the entire test. The following snapshot gives a clear idea of how a UDV can be used in an Assertion element: the URL variable defined in UDV is used in the Patterns to Test section of this Assertion, as part of a complete page element that we want to verify in the page sampler. (See Figure 9)
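For instance, an assertion pattern built from a UDV value behaves roughly like this sketch (the URL value and page snippet are hypothetical, and the matching is a simplification of JMeter's):

```python
import re

# Hypothetical UDV value reused inside an assertion pattern.
UDV = {"URL": "http://localhost:8080/app"}

# Same idea as putting href="${URL}/login" in the Patterns to Test section.
pattern = 'href="%s/login"' % UDV["URL"]
body = '<a href="http://localhost:8080/app/login">Sign in</a>'

# re.escape keeps the URL's dots and slashes literal in the regex.
print(bool(re.search(re.escape(pattern), body)))  # True - assertion would pass
```

If the application later moves to another host, updating the URL variable in the UDV element updates this assertion along with every sampler that uses it.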

Running the Test
Once the assertions are properly completed, we expect a run of our Test Plan to pass all the assertions. Passed assertions will not show any error in the Assertion Results Listener installed within the same scope. As with all Listeners, the results they capture can be saved and reviewed at a later time. Following is a sample showing what passed assertions reveal as the test is executed. (See Figure 10)

On the other hand, a failed Assertion would show an error message in the same Listener as the following snapshot illustrates. (See Figure 11)

Since a page error or Page not found error is a real risk in web applications, a failure may originate from such an error, and not just because of a failed Assertion. We can view more information about the sampler that contains the failed Assertion to investigate the origins of a failure. A View Results Tree Listener records the details of requests and logs all errors (indicated by the red warning sign and red fonts).

The following figure shows that the page was available and the page request was successful; however, the assertion failed. (See Figure 12)

This chapter provided visual means for you to understand the capabilities of the JMeter tools that support functional testing, as we directly wrote and ran a JMeter script. We demonstrated building a Test Plan containing functional validations (or assertions) by incorporating various essential JMeter components, particularly the 'Response Assertion' element and the 'Assertion Results' Listener. By using the 'User Defined Variables' Configuration element, we also parameterized several values to give our Test Plan better flexibility. In addition, we observed the results of these assertions in a 'live' run against the application under test. An HTTP Request sampler may need to be modified if there are any changes to the parameter(s) that the sampler sends with each request. Once created, a JMeter Test Plan that contains assertions can be reused and modified in subsequent regression tests of the application. The next chapter shows various ways a JMeter script can be further configured and tweaked for better portability and testability; Chapter 7 will describe various methods and tools available in JMeter that support more advanced and complex testing requirements.
