
Wednesday, October 1, 2008

Virtualization and the impact on testing and software quality

There are substantial strategic opportunities in leveraging virtual infrastructure to improve the delivery and management of environments for software development and testing.

By Julian Brook, SQS Software


There seems to be no escaping the “virtual” buzzword at the moment, where everything that was once real now appears to be virtual; software running on virtual machines with virtual memory and virtual storage is used by virtual teams in virtual meeting rooms on virtual LANs. This is largely due to the current large-scale enterprise uptake of virtual infrastructure as a result of its proven benefits, perhaps aided and abetted by the proliferation of articles on the subject.

This trend towards virtual infrastructure has important benefits, implications and opportunities for our IT projects, strategy, quality assurance and ultimately our business competitiveness. The benefits of reduced hardware, power, cooling and administration costs, the associated green credentials and increased flexibility and continuity have been discussed widely.

Disadvantages have also been widely presented, including virtual server sprawl (exactly the kind of proliferation we were trying to reduce), security implications, and new performance and management challenges.

In this article we look at the substantial strategic opportunity presented by leveraging virtual infrastructure to improve the delivery and management of environments for software development and testing.

What is virtual infrastructure?

Virtual infrastructure falls into three broad categories:

Server or platform virtualization: Basically, this enables one physical server to appear as many servers. Ultimately, it has resulted in a paradigm shift from considering servers as a physical commodity to a logical commodity or service. This is the type most commonly meant when virtualization is discussed today. Common implementations include Microsoft Hyper-V and Virtual Server, VMware Server, ESX Server and Citrix XenServer, among many others.
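
As a concrete illustration, the sketch below uses the open-source libvirt Python bindings to ask one physical host which guest servers it is currently presenting. This is a minimal sketch, assuming a local KVM/QEMU host; commercial platforms such as VMware and Hyper-V expose their own equivalent APIs.

    # Minimal sketch using the libvirt Python bindings (pip install libvirt-python).
    # The URI "qemu:///system" assumes a local KVM/QEMU host; adjust for your hypervisor.
    import libvirt

    conn = libvirt.open("qemu:///system")      # connect to the local hypervisor
    for domain in conn.listAllDomains():       # each "domain" is one virtual server
        state, _ = domain.state()
        running = state == libvirt.VIR_DOMAIN_RUNNING
        print(domain.name(), "running" if running else "stopped")
    conn.close()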

Storage virtualization: This enables the presentation of a single logical abstraction of a pooled set of underlying physical storage. This single view of data enables enterprise-wide data de-duplication, backup, restore and management. Examples include IBM System Storage SAN Volume Controller, EMC Invista, Veritas Storage Foundation and NetApp Data ONTAP.

Network virtualization: The combination or division of one or more local networks into a virtual network to provide a single logical view of a related set of resources and functionality. This can be done across separate physical resources spanning a wide corporate network, or by grouping together a set of resources within a virtualized server or platform.

The importance of an environment to software quality

At its most abstract, testing comprises defining a set of inputs that will produce a set of expected outputs according to the requirements and specifications of the system under test. In addition to the inputs and expected outputs, testing requires an environment containing the system under test to process those inputs into outputs. Without this environment, testing cannot take place.
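
In code terms, even the smallest automated check embodies this pairing of inputs and expected outputs. The sketch below is purely illustrative: calculate_discount is a hypothetical function standing in for the system under test, and its result is only meaningful when the environment around it is in a known state.

    # Illustrative only: calculate_discount is a hypothetical system under test.
    def calculate_discount(order_total):
        return order_total * 0.1 if order_total >= 100 else 0.0

    def test_discount_applies_at_threshold():
        assert calculate_discount(100.0) == 10.0  # input -> expected output
        assert calculate_discount(99.99) == 0.0   # boundary case below the threshold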

The environment comprises a number of elements that define the state of the system under test and enable testing to take place. This state comprises:

Application version(s): This includes versions and patch levels across all application layers from operating system to application frameworks and business applications.

Application(s) configuration: This defines how the applications communicate and operate and can significantly affect the behaviour under test.

Data: The data used by the system under test also affects the validity of the testing. If we are considering performance testing, do we have a representative volume? Does the data have the variance and distribution we require? Do we have sufficient data to exercise the functionality we wish to test?

Infrastructure: The network, processor, peripherals, storage and their interfaces can all affect the validity of testing. For example, do we require a 64-bit processor or 32-bit processor to ensure our compatibility testing is valid?

It is critical that this environment has a defined state that aligns with the test objectives. The environment definition then becomes another test asset alongside test documentation, test data and test scripts. Without a well-defined environment, testing is unrepeatable and cannot be audited.
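
One lightweight way to treat the environment definition as a first-class test asset is to record it in a structured, version-controlled form. The sketch below shows one possible shape; it is not a standard, and every field name and value is illustrative.

    # Sketch: an environment definition captured as data so it can be version-
    # controlled alongside test scripts. All names and values are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class EnvironmentDefinition:
        name: str
        application_versions: dict = field(default_factory=dict)  # layer -> version/patch
        configuration: dict = field(default_factory=dict)         # how applications interoperate
        dataset: str = ""                                         # reference to a known data snapshot
        infrastructure: dict = field(default_factory=dict)        # CPU, storage, network

    uat_env = EnvironmentDefinition(
        name="UAT-2008-10",
        application_versions={"os": "Windows Server 2003 SP2", "app": "Orders 4.1.2"},
        configuration={"app_db_host": "db01", "queue": "orders.inbound"},
        dataset="orders_volume_test_v3",
        infrastructure={"cpu": "x64", "ram_gb": 8},
    )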

Environment management is concerned with the delivery, as a service, of environments in a defined state to support the teams and activities across the software development lifecycle.

Environment management challenges and virtualization

Many IT projects have encountered environment-related issues and defects that impact the test and development teams and cause significant delays. It has been estimated that up to 40% of effort during the software development lifecycle is wasted on these types of issues. The examples below outline how virtualization can reduce their impact.

Server virtualization “snapshots” provide a full audit capability, because the exact test environment used during testing can be re-instantiated at any time. This is important to organizations in regulated industries such as life sciences, where audit compliance is critical to the business.
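
As a sketch of how such an audit trail might be scripted, the fragment below drives VMware's vmrun command-line tool, which provides snapshot and revertToSnapshot commands; the -T ws flag targets VMware Workstation, and the .vmx path and snapshot name are placeholders.

    # Sketch: snapshot an environment before a test cycle, then re-instantiate
    # the exact same state later for audit. The .vmx path is a placeholder.
    import subprocess

    VMX = "/vms/uat-env/uat-env.vmx"  # placeholder path to the virtual machine

    def snapshot(name):
        subprocess.run(["vmrun", "-T", "ws", "snapshot", VMX, name], check=True)

    def revert(name):
        subprocess.run(["vmrun", "-T", "ws", "revertToSnapshot", VMX, name], check=True)

    snapshot("test-cycle-12-baseline")  # capture the environment under test
    # ... run the test cycle ...
    revert("test-cycle-12-baseline")    # re-instantiate it exactly for audit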

Other organizations use storage virtualization and clone differencing technology to provide complete copies of very large datasets to more test environments while consuming less storage. Clone differencing delivers multiple instances of a dataset by referring to a static base copy of the data; each instance maintains only the changes made to the original dataset.
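
Copy-on-write disk images show the same idea at the image level. The sketch below assumes the QCOW2 format and the qemu-img tool: each clone references the read-only base image and stores only its own differences (older qemu-img versions omit the -F backing-format flag).

    # Sketch: clone differencing via QCOW2 copy-on-write overlays. Each clone
    # refers to the static base copy and stores only the changes made to it.
    import subprocess

    BASE = "gold-dataset.qcow2"  # placeholder: the static base copy of the data

    def make_clone(clone_path):
        subprocess.run(
            ["qemu-img", "create", "-f", "qcow2",
             "-b", BASE, "-F", "qcow2", clone_path],
            check=True,
        )

    for i in range(1, 4):
        make_clone(f"test-env-{i}.qcow2")  # three full-size views, minimal extra storage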

New challenges

Where virtualization has been introduced, new challenges have inevitably arisen. These include:

· The need to manage and contain the proliferation of image templates. Cloning an image also tends to require more than a simple file copy: how do we deal with IP address and naming conflicts? If we change IP addresses, how do we reconfigure the applications in the templates to point to the systems they depend on? (A sketch of this provision-and-repersonalize cycle follows this list.)

· Virtualization has sometimes been introduced by the data centre and infrastructure teams purely to deliver data centre cost savings, without including the users and customers of the infrastructure in the design of the solution. This can lead to an ineffective implementation and new issues. For example, with server virtualization, “guest” servers run on shared underlying infrastructure; a fault in a single infrastructure component that previously would have affected only one environment can now impact every environment.

· Considering whether virtualized environments are suitable for the testing being conducted. In performance testing, for example, how does the workload of other environments on the same shared physical host affect an environment? Does the virtualized infrastructure guarantee computing resources to the environment? How does it compare to the go-live scenario? Increasingly, the go-live scenario is itself a virtualized environment.

· A virtualized environment involves simulating the underlying hardware to a greater or lesser extent. Does it provide a realistic environment for the testing being conducted?

· What are the licensing implications for the applications and systems within the virtualized environments?

These challenges have created a market for new tools. The biggest challenge is the management of virtualized environments, and there are now many solutions in this space: some are geared towards data centre management goals, others towards environment management service delivery. Examples of virtualized lab management tools for environment management include Surgient’s VQMS and VMware’s Lab Manager, which provide functionality such as:

· Managing and organizing a library of application configuration and versions

· Overcoming the challenges around cloning application configurations (i.e. IP address and naming conflicts, network isolation)

· Providing environment scheduling capabilities and guaranteed resource availability for reservations

· Allowing end users to self-serve their environment needs, further reducing environment management support overhead

· Facilitating the sharing of environments between users

· Allowing test automation tools to schedule and provision environments at the point of execution, then tear down and release resources once testing has completed

· Controlling user access to application configurations
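
Tying several of these threads together, the sketch below shows the provision, re-personalize and tear-down cycle referred to above. It is purely illustrative: LabManagerClient and its methods are hypothetical stand-ins for whatever API a lab management product actually exposes, and the fresh hostname and IP address given to each deployment are how clone conflicts would be avoided.

    # Purely illustrative: LabManagerClient and its methods are hypothetical,
    # standing in for the API of a real lab management tool.
    import contextlib

    class LabManagerClient:
        def deploy(self, template, hostname, ip):
            print(f"deploying {template} as {hostname} ({ip})")
            return hostname  # a real API would return an environment handle

        def destroy(self, env):
            print(f"releasing resources for {env}")

    @contextlib.contextmanager
    def test_environment(template, hostname, ip):
        client = LabManagerClient()
        env = client.deploy(template, hostname, ip)  # fresh name/IP avoids clone conflicts
        try:
            yield env            # run the scheduled tests against the environment
        finally:
            client.destroy(env)  # tear down and release resources after testing

    with test_environment("orders-uat-template", "uat-run-42", "10.0.5.42") as env:
        print(f"running tests against {env}")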

A magic bullet?

It is clear, however, that virtualization is not a magic bullet. There are challenges it does not address.

One issue that repeatedly arises with environment management is the difficulty and effort required to install and configure software within environments. More often than not this is a manual process that has grown organically as systems and applications have been developed over time; the pace of development is fast and no value is attached to capturing this information or process. In some cases the knowledge of how a system is built no longer exists in the organization; it has disappeared with staff who have long since left. The short-term speed gain does not yield long-term or even medium-term results, and quality in production soon suffers.
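
One mitigation, sketched below on the assumption that the install steps are still known, is to capture the build as an executable, version-controlled script rather than as undocumented manual effort. Every package name, path and command here is a placeholder.

    # Sketch: an environment build captured as an executable script instead of
    # undocumented manual steps. All packages and paths are placeholders.
    import subprocess

    BUILD_STEPS = [
        ["apt-get", "install", "-y", "example-app-server"],        # placeholder package
        ["cp", "configs/app.conf", "/etc/example-app/app.conf"],   # known-good configuration
        ["systemctl", "restart", "example-app"],                   # apply the configuration
    ]

    for step in BUILD_STEPS:
        subprocess.run(step, check=True)  # fail fast so gaps in the recipe surface early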

Access control can be enhanced by virtualization and virtualization lab management tools, but these are no substitute for policy and full audit capability. The number of problems that result from quick fixes introduced to environments through the wrong route is substantial, and such fixes yield no long-term gain.

Finally, virtualization does not remove the need to define an environment management process that aligns with an organization’s software development lifecycle, business objectives and teams. This process needs to integrate coherently with change control, configuration management and overall release management.

Without good process, just as with test automation tools, we simply end up operating the same bad process, only quicker.

Conclusion

Virtualization addresses many of the challenges encountered in providing and managing development and test environments. Theresa Lanowitz of Voke Research states that development and QA time can be reduced by as much as fifty percent. However, virtualization is only one component of an overall test environment strategy, and there remain challenges it does not address.

Further, virtualization does not by itself help the environment management team integrate with other release management activities, such as change control and configuration management, or with the other teams on which the environment management team depends.

An environment management strategy must consider the tools available, including virtualization, and ensure that process is adjusted to maximize the benefits and minimize the impact of the new challenges those tools bring.

This will ensure that environments of a known state are provided when they are required by the development and test teams.
