
Tuesday, September 8, 2009

The A-B-C's of software testing models

Summary:
This article provides a brief overview of the testing methodologies used across various software development models.
Theme:
One doesn't have to spend much time in the software industry to become familiar with several software development models. Some of the most commonly known include waterfall, iterative, test-first or test-driven development (TFD or TDD), and Extreme Programming (XP). Interestingly, it takes a rather diverse set of software development experiences, and rather close attention to those experiences, to realize that there are just as many models for testing software as there are for developing software -- and that the testing model a particular project follows need not be dictated by the software development model.
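For readers unfamiliar with the test-first idea mentioned above, here is a minimal sketch in Python using the standard unittest module. Everything in it (the file name, the discount_price function and its 10% rule) is invented purely for illustration; the point is only that the test is written and run first, fails, and then drives the smallest implementation that makes it pass.

    # tdd_sketch.py -- a single-file illustration of one test-first cycle
    import unittest

    # Step 2: written only after the test below existed and had failed ("red" to "green").
    def discount_price(price, rate):
        """Return the price reduced by the given fractional discount rate."""
        return price * (1 - rate)

    # Step 1: the test comes first; it initially fails because discount_price
    # does not exist yet, and that failure drives the implementation above.
    class TestDiscountPrice(unittest.TestCase):
        def test_ten_percent_discount(self):
            self.assertAlmostEqual(discount_price(100.0, 0.10), 90.0)

    if __name__ == "__main__":
        unittest.main()

In XP and TDD this short red-green cycle repeats constantly, which is one reason testing activity in those models cannot simply be pushed to the end of the schedule.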
Categories of testing activities
To aid in this discussion, let's agree to think about software testing in terms of five general categories of activities:
1. Researching information to improve or enhance testing -- This information may come from specifications, use cases, technical design documentation, contracts, industry standards, competing applications, or almost anything else that is likely to improve a tester's ability to test the software deeper, faster or better.
2. Planning and/or designing tests -- This category would encompass such activities as writing test cases, developing test strategies, writing test plans, creating manual test scripts and preparing test data.
3. Scripting and/or executing tests -- Here is where tests are actually executed and/or automated. This is what most non-testers think of when they hear someone talk about software testing.
4. Analyzing test results and new information -- Not all tests produce results that clearly pass or fail. Many tests produce data that can only be understood through human judgment and analysis. Additionally, changing specifications, deadlines or project environments can cause a test that had been clearly passing to fail without anything in the software changing. This category is where that type of analysis occurs.
5. Reporting relevant information -- Reporting defects and preparing compliance reports are what come to mind first for most people, but a tester may need to report all kinds of additional information.
Again, these five categories are intended to be simple in order to make our discussion about testing models easier. They aren't intended to supplant your current terminology.
Testing, waterfall-style
Just like developing software using the waterfall model, testing waterfall-style is a fundamentally linear process, except for a minimal feedback loop created by the need to fix some of the problems in the software that failing tests reveal. Visually, that feedback loop is like the small eddy at the base of a real waterfall.
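As a rough, purely illustrative sketch (nothing here is a real tool or a prescribed process), that waterfall flow through the five activity categories might be pictured in Python like this; every function and value is a placeholder invented for the example:

    # waterfall_sketch.py -- a toy, runnable picture of waterfall-style testing:
    # one linear pass through the five activity categories, with the only feedback
    # being the small "eddy" of re-running tests after reported defects are fixed.

    def gather_information():                 # 1. research specs, standards, competitors...
        return {"spec": "v1"}

    def design_tests(research):               # 2. plan and design tests from that research
        return ["test_login", "test_checkout"]

    def execute_tests(plan, fixed=()):        # 3. script and/or execute the tests
        return {name: ("pass" if name in fixed or name != "test_checkout" else "fail")
                for name in plan}

    def analyze(results):                     # 4. analyze results and new information
        return [name for name, outcome in results.items() if outcome != "pass"]

    def report(failures):                     # 5. report relevant information
        print("defects to report:", failures or "none")
        return failures

    if __name__ == "__main__":
        plan = design_tests(gather_information())
        defects = report(analyze(execute_tests(plan)))
        if defects:                           # the eddy at the base of the waterfall:
            report(analyze(execute_tests(plan, fixed=tuple(defects))))   # retest after fixes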

Waterfall-style testing is rarely chosen voluntarily anymore. It is commonly a side effect of some logistical challenge that kept the testers from being able to interact with the application or the developers prior to the first -- and what they hope will be the only -- build of the software. Waterfall testing is occasionally appropriate for situations where it is reasonable to hope the software will "just work," such as applying a service release or a patch to a production application.
Testing, iterative-style
Iterative testing is similar to iterative development in that many of the test iterations coincide with development releases. In that regard, it looks like a series of waterfall testing cycles strung end to end. Testing iterations differ from development iterations in that there can be iterations before the first software build, and there can be multiple test iterations during a single software build. Another difference is that, unlike a development iteration, a test iteration can seamlessly abort at any point and return to a research mode. A development iteration can also abort and restart at any time, but doing so is quite likely to jeopardize the project schedule.
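A correspondingly small, hypothetical sketch of the iterative style: the same pass through the activities simply repeats, an iteration can run before any build exists, and several iterations can share one build. Again, every name and value is invented for illustration.

    # iterative_sketch.py -- a toy, runnable picture of iterative-style testing.
    def run_test_iteration(build):
        research = {"build": build, "spec": "v1"}          # 1. research
        tests = ["smoke", "regression"]                    # 2. plan/design
        if build is None:                                  # no build yet: this iteration can
            return "pre-build iteration: stayed in research and planning"
        results = {t: "pass" for t in tests}               # 3. execute against the build
        failures = [t for t, r in results.items() if r != "pass"]    # 4. analyze
        return f"{build}: reported {len(failures)} defects"          # 5. report

    if __name__ == "__main__":
        # Iterations before the first build, then multiple iterations per build.
        for build in [None, "build-1", "build-1", "build-2"]:
            print(run_test_iteration(build))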
Iterative software testing is extremely common in the commercial market, though it has many variants. The V-Model, the spiral model, and Rational Unified Process (RUP)-based testing are all derivatives of an iterative testing approach. Iterative testing generally works well on projects where software is being developed in pre-planned, predictable increments, and on projects where the software is being developed and released in cycles so rapid or unpredictable that it is counterproductive for testers to plan around scheduled releases.
Testing, agile-style
Agile-style testing more or less eliminates the element of predetermined flow from the test cycle in favor of shifting among the five basic activities whenever doing so adds value to the project. For example, while analyzing the results of a test, a tester may realize that the test was flawed and move directly back to planning and designing tests. In a waterfall or iterative flow, that test redesign would wait until the current results had been reported and preparations were being made for the next test iteration.
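One last hypothetical sketch, this time of the agile style, where there is no fixed order at all: after each activity the tester simply picks whichever of the five activities looks most valuable next. The scoring below is a deliberately crude stand-in for the tester's judgment, and the situations are made up for the example.

    # agile_sketch.py -- a toy, runnable picture of agile-style testing: activities are
    # chosen on the fly by perceived value, not by a predetermined flow.
    ACTIVITIES = ["research", "plan/design", "execute", "analyze", "report"]

    def most_valuable_next(value_by_activity):
        # In reality this judgment lives in the tester's head; here it is just a max().
        return max(ACTIVITIES, key=lambda a: value_by_activity.get(a, 0))

    if __name__ == "__main__":
        situations = [
            {"execute": 0.9},                      # a build just arrived: run tests
            {"analyze": 0.8},                      # odd results: dig into them
            {"plan/design": 0.9, "report": 0.3},   # the test itself was flawed: redesign it
            {"report": 0.7},                       # now the findings are worth reporting
        ]
        for value in situations:
            print("next activity:", most_valuable_next(value))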
Agile-style testing can be implemented as an overall approach or as a complement to any other testing approach. For example, within an iterative test approach, a tester could be encouraged to enter a period of agile testing, side-by-side with a developer, while tracking down and resolving defects in a particular feature.
Agile-style testing is significantly more common than most people realize. As it turns out, this model is what is going on in the heads of many testers all the time, regardless of the external process they follow. Even so, this approach isn't very popular with managers and process-improvement specialists, because it is misunderstood by many non-testers, and few testers following it can express what they are doing in a way that gives stakeholders confidence that they are actually doing organized, thoughtful testing.
