STAG SOFTWARE – In Pursuit of Cleanliness

Date: Sunday, October 03, 2010

There has been significant progress in the field of software testing in recent years. However, one finds it hard to provide a logical means to get closer to perfection. A typical approach to testing based on the activity-based model consists of strategizing, planning, designing, automating, executing, and managing. Over the years, the industry has moved from completing these activities in one go to an agile version in which they are done in short increments, and the field of software testing has been littered with jargon, process models, and tools. Yet the notion of ‘guarantee’ seems elusive.

Is it possible to guarantee quality?

‘Guarantee’ implies that the deployed software will not cause business loss and that the means of validation can be proven to be correct. It is generally understood that testing is a process of uncovering defects, done via a good mix of techniques, tools, and people skills. To make guarantees, it is imperative that the approach to evaluation be sharply goal-focused. Goal-focused evaluation means having clarity as to which potential defects one has to go after. Once the potential defects are discerned by employing a scientific approach, it is possible to arrive at an effective validation strategy, a complete set of test cases, better measures of cleanliness (quality), and appropriate tooling. It is to achieve this quality of guaranteed clean software that STAG Software is striving. Founded in 2000, STAG Software is a boutique test engineering company headquartered in Bangalore. Having pioneered a scientific approach to testing – Hypothesis Based Testing (HBT) – it is attempting to swim against the current by defying the traditional approach to testing. Today, a decade since its humble beginnings, HBT is slowly being accepted as a more result-oriented option by the industry.

HBT – Hypothesis Based Testing

Hypothesis Based Testing (HBT) is a test methodology based on solid scientific principles. It is a Sherlock Holmes-ian approach to testing: hypothesize potential defects and then scientifically construct the strategy, test cases, measures, and tooling. HBT is a personal test methodology consisting of six stages of cleanliness. It is powered by STEM™ (STAG Test Engineering Method), a scientific test engineering method consisting of eight personal thinking disciplines aided by thirty-two core concepts that aid scientific enquiry. Evaluating software or a system with HBT consists of setting up cleanliness criteria, identifying potential defect types, staging them in an optimal order to create a cost-effective staged evaluation model, formulating the various types of tests and corresponding techniques, applying a formal design process to ensure that the test cases are sufficient, and then automating them as needed.
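Purely as an illustration of this flow (not STAG's actual tooling), the mapping from cleanliness criteria to hypothesized defects and staged evaluation could be sketched in Python along these lines; the criteria, defects, and stage numbers here are hypothetical:

from dataclasses import dataclass

@dataclass
class DefectHypothesis:
    description: str        # the potential defect being hypothesized
    impedes_criterion: str  # the cleanliness criterion it would violate
    stage: int              # the evaluation stage expected to catch it

# Hypothetical cleanliness criteria and defect hypotheses for an online payment feature.
hypotheses = [
    DefectHypothesis("wrong rounding of amounts", "correct computation", stage=1),
    DefectHypothesis("unhandled gateway timeout", "graceful failure handling", stage=2),
    DefectHypothesis("latency spike under peak load", "acceptable response time", stage=3),
]

# Build the staged evaluation model: each stage targets only the defects assigned to it.
def staged_model(hyps):
    stages = {}
    for h in hyps:
        stages.setdefault(h.stage, []).append(h)
    return dict(sorted(stages.items()))

for stage, hyps in staged_model(hypotheses).items():
    print(f"Stage {stage}: " + "; ".join(h.description for h in hyps))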

“The core theme of HBT is hypothesizing potential defects and then scientifically constructing the strategy, test cases, measures, and tooling. It is a goal-centered methodology wherein the goal of software cleanliness is set up (i.e. a collection of cleanliness criteria), potential defect types that can impede the cleanliness criteria are identified, and then activities are performed to ensure purposeful testing that is indeed effective and efficient,” explains T. Ashok, CEO, STAG Software. Based on sound engineering principles geared to deliver the promise of guaranteed cleanliness, HBT's core value proposition is hypothesizing potential defects that may be present in the software and then allowing one to engineer a staged detection model to uncover those defects faster and cheaper than typical test methodologies.
“The concept of applying HBT on software is similar to visiting the doctor where he or she hypothesizes potential problems based on one’s symptoms, performs diagnostic tests to confirm the hypothesis, and then prescribes the treatment regimen,” quips Ashok. “It’s a personal test methodology that fits any development methodology and weaves into any organizational test process. The business promise of HBT is to slash test and support costs and accelerate development.”
HBT has been applied by STAG in various domains such as Mobile, Healthcare, ERP, Media, eLearning, and Supply Chain Management over the last nine years. This has resulted in lower defect escapes (up to 10x lower), increased test coverage (at least 3x), better RoI on automation, and lower support costs (by 30 percent) with no increase in effort, time, or cost. HBT has been implemented in various process models including Agile. It has been applied not only to validating products but also to validating requirements and architecture, quite different from the typical review or inspection process, and to assessing test assets to ascertain their effectiveness and improve them.

One such example is a company that provides online banking solutions. The company has three major products catering to over 100 top financial institutions (FIs), including the top five FIs in the world. It has a successful product line, growing rapidly with major releases almost every year that incorporate new features to cater to the various needs of the marketplace. As the code base evolved, the test assets were also modified to reflect the changed product. The challenge was that most of the test cases were not detecting defects and the rate of uncovering new defects was low.
As the product grew large, the company decided to re-architect it in order to enable rapid feature addition with low risk. That is when the company decided to have a re-look at its test assets and re-architect them to increase the test coverage, improve defect-finding ability, and ensure that the test assets were future-proof. It had about 8,000 test cases then.

The company approached STAG to analyze the existing test cases for completeness and modifiability, re-architect them after filling the gaps, and ensure that future test cases would be easily pluggable. Applying STEM, STAG performed a thorough assessment of the existing test assets and discovered holes in them. Using the STEM Test Case Architecture (STEM-TCA), it re-engineered the test cases by first grouping them into features, then by levels of tests, then segregating them into various types of tests, and finally separating them into positive and negative test cases. During this process of fitting the existing test cases into STEM-TCA, STAG uncovered quite a few holes, which were filled by designing about 5,000 additional test cases. Not only did STEM-TCA increase the test coverage by uncovering the missing test cases, it also provided sharper visibility into quality, as the test cases were well organized by specific defect types. This improved the test coverage by about 250 percent, and the technical management staff was confident about the adequacy of the test assets and convinced of their future upgradeability and maintainability.
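To illustrate the kind of grouping described above (a hypothetical sketch, not the actual STEM-TCA schema), test cases could be organized feature-first, then by level, type, and polarity, with empty buckets flagged as coverage gaps:

from collections import defaultdict

# Sample test cases with hypothetical attribute names and values.
test_cases = [
    {"id": "TC-001", "feature": "funds transfer", "level": "unit",   "test_type": "functional", "polarity": "positive"},
    {"id": "TC-002", "feature": "funds transfer", "level": "system", "test_type": "security",   "polarity": "negative"},
    {"id": "TC-003", "feature": "login",          "level": "system", "test_type": "functional", "polarity": "negative"},
]

# Group by feature -> level -> type -> positive/negative.
tree = defaultdict(lambda: defaultdict(lambda: defaultdict(lambda: defaultdict(list))))
for tc in test_cases:
    tree[tc["feature"]][tc["level"]][tc["test_type"]][tc["polarity"]].append(tc["id"])

# A bucket with no entries is a candidate coverage gap to be filled with new test cases.
for feature, levels in tree.items():
    for level, types in levels.items():
        for test_type, polarities in types.items():
            for polarity in ("positive", "negative"):
                ids = polarities.get(polarity, [])
                print(f"{feature} / {level} / {test_type} / {polarity}: "
                      + (", ".join(ids) if ids else "<gap: no test cases>"))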

In another example, an early implementation of HBT yielded significant results for a global chip major. STAG had to set up an effective validation practice for its video decoder software. The product was complex, involving both hardware and software, with system integration on multiple real-time operating systems across various platforms; the challenge the company faced was a high rate of defect escapes, i.e. post-release field defects.

Ashok and his team spent about a month understanding the domain and the associated technologies. A detailed analysis then yielded interesting data – test cases were primarily conformance-oriented, coverage of the test cases was suspect, escaped defects seemed to propagate from early stages, and the process of validation was loose. Having understood the types of defects being found and the post-release defects, the team figured out the various types of probable defects and the various combinatorial aspects that needed to be considered to form a test case. The team then staged the validation into three major levels: the first at the API level, the next at the system level, and the last a customer-centric level that involved using reference applications.
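As a rough illustration of the combinatorial aspects mentioned above (the parameters below are invented for the example, not taken from the engagement), the parameter combinations a decoder test case might need to cover can be enumerated like this:

import itertools

# Hypothetical decoder test parameters.
codecs      = ["H.264", "MPEG-2"]
resolutions = ["720p", "1080p"]
platforms   = ["RTOS-A", "RTOS-B", "RTOS-C"]

# Full cross-product; in practice a pairwise or risk-based subset keeps the count manageable.
combinations = list(itertools.product(codecs, resolutions, platforms))
print(f"{len(combinations)} parameter combinations to consider")
for codec, resolution, platform in combinations[:3]:
    print(f"decode {codec} at {resolution} on {platform}")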

Applying the HBT approach to test design, the test cases were developed, yielding about 6,000 test cases at level one and about 800 at the subsequent levels. Whereas the ratio of positive to negative test cases had earlier been skewed towards the positive side, after the re-design the ratio shifted to about 60:40 at the lower level and about 85:15 at the higher levels. Moreover, the number of test cases increased by about 1,000 percent, casting a larger and deeper net to catch many more serious defects. Over the next nine months, the rate and number of defects detected increased dramatically, with post-release issues reducing by a jaw-dropping 10x.
Once the test effectiveness problem was solved and the yield of defects increased, the focus shifted to streamlining the process: setting up proper gating in the test process, creating a centralized Web-based test repository, and finally setting up a strong defect analysis system based on the Orthogonal Defect Classification (ODC) method. This enabled a strong feedback loop, shifting defect finding to earlier stages of the SDLC and thereby lowering cycle time. Complementing this, STAG set up a custom tooling framework for automating this non-UI software, resulting in a significant cycle time reduction – an entire cycle of tests on a platform took less than 15 hours.
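A minimal sketch of this kind of ODC-based feedback (the defect records and attribute values here are invented for illustration) might simply count defect types per detection stage, to show which classes are escaping to later stages and should be targeted earlier:

from collections import Counter

# Illustrative defect records tagged with ODC-style attributes.
defects = [
    {"id": "D-101", "odc_type": "Assignment", "found_in": "API tests"},
    {"id": "D-102", "odc_type": "Interface",  "found_in": "System tests"},
    {"id": "D-103", "odc_type": "Assignment", "found_in": "System tests"},
]

# Defect types turning up mainly in later stages are candidates for earlier detection.
by_stage_and_type = Counter((d["found_in"], d["odc_type"]) for d in defects)
for (stage, odc_type), count in sorted(by_stage_and_type.items()):
    print(f"{stage}: {odc_type} -> {count}")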

STEM is the basis for intelligent testing that delivers compelling business results: it enables early detection of defects and enhances the ability to find defects by at least three times, resulting in a reduction of cost and time by over 30 percent.

“We establish a clear goal and then perform activities that will indeed get us to the goal. The goal translates to ‘uncovering potential defects’. These potential defects are hypothesized and all the later activities are about proving that the hypothesized defect(s) do or do not exist,” explains Ashok.

One of the USPs of HBT is its simplicity. Since it is a methodology governed by 32 core concepts and disciplined processes, it is easier to learn and defies the common notion that effectiveness in testing depends on the experience of the test staff. It is a general belief that the more experienced the people in the QA team, the better the quality of work. At STAG, teams are not merely assembled based on number of years of experience, but on the key disciplines needed to perform a role and therefore the required competencies. STAG evaluates the fitness of a person to a team based on his or her role and the competencies required to perform it. By measuring the increase in testers' competencies while working with HBT, STAG derived a competency model – CREAM.

CREAM enables STAG to evaluate an individual's test-related competencies in terms of test lifecycle activities, technology aspects, and business domain aspects. Based on this model, competency gaps are analyzed, an appropriate training plan for the individual is formulated, and the person is trained.
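By way of illustration only (the dimensions and scores below are hypothetical, not the actual CREAM scales), a competency gap analysis of this kind boils down to comparing required scores against assessed scores per dimension:

# Hypothetical required and assessed competency scores for one individual.
required = {"test lifecycle": 4, "technology": 3, "business domain": 3}
assessed = {"test lifecycle": 3, "technology": 3, "business domain": 1}

# A positive gap marks a dimension the individual's training plan should address.
gaps = {dim: required[dim] - assessed.get(dim, 0) for dim in required}
training_focus = [dim for dim, gap in gaps.items() if gap > 0]
print("Competency gaps:", gaps)
print("Training focus:", training_focus)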

Engaging the Customers

For a methods-oriented organization, the art of customer engagement is as unique as the company itself. Since the industry is yet to fully understand the company's approach, STAG first educates its customers about the better quality and RoI achievable by deploying STEM and HBT. At present, STEM and HBT are deployed by the company's customers only in certain parts of a project. Ashok has also started licensing the methodology to companies that wish to deploy it internally and customize it to their requirements. These include software development companies and other independent service providers, among which is one of the top five Japanese system integrators, which today deploys STEM as a part of its services.

“We are very serious about the economics of testing and its contribution to the bottom line of our customers. At every instance, we question ourselves on the value being delivered and how it has impacted our customer’s business,” says Ashok. When engaging with a customer, STAG pays a lot of attention to understanding. HBT takes a scientific approach to understanding intentions and expectations: identifying key elements in a requirement or specification and setting up a rapid personal process, powered by scientific concepts, to quickly grasp the intentions and identify missing information. This enables the engineers to come up with intelligent questions leading to a deeper understanding, which results in rapid ramp-up, higher business value delivered, and a rich asset base built by leveraging that understanding.

STAG offers unique solutions and services like test assessments, validation suites, release worthiness assessment, people competency assessment (in testing), test-case re-engineering, custom tooling for test benches, and requirement and architecture validation, in addition to standard offerings like an outsourced QA Lab, managed validation, functional test automation, and performance assessment.

Developing a Methods Company

Software testing, though critical to software development, has for years been considered the most jaded and mundane task in the software development life cycle (SDLC). Though India has carved a niche for itself in the testing industry, most entrepreneurs have shied away from getting into the testing business. Industry watchers attribute two probable reasons for this: one, the general misconception that testing, unlike the software services or products business, is not a lucrative proposition; the other, the feeling that testing figures at the low end of the market.

Neither of these fears is based on facts, says Ashok, whose passion for testing resulted in STAG Software. Today, STAG has firmly positioned itself as a boutique company offering testing solutions and services, an offering he believes is quite different from those of other vendors founded at the same time.

Most testing services players compete on the ‘volume game’: every project is won on the basis of the size of the team or the number of service locations they can offer their customers. “This is the fundamental difference between STAG and other independent testing vendors,” Ashok points out.

STAG is essentially a ‘methods’ company, not a product or services one. The company's core foundation is that it develops methodologies and concepts that can be used in testing irrespective of domain or technology.

This ideology of working on testing methods was influenced by IBM Rational, which had worked for long years to set up methods for software development and, over the years, came up with a standard called UML (Unified Modeling Language). Today, UML is a basic framework around which software is modeled, whatever the technology may be.
Ashok, who holds a Master's in computer science from the Illinois Institute of Technology, has always had a knack for a research-oriented approach to problems. It was while leading the worldwide software test analysis group (WW:STAG) at Verifone India, formerly a division of Hewlett-Packard, that he started dabbling with the idea of finding more effective methods for testing.
“WW:STAG was a large independent test organization in Verifone/HP responsible for certifying client-server and electronic commerce software. This group was built from the ground up into a test organization well recognized within and outside the company. It was the ideal platform for me to experiment with a scientific approach to testing,” says Ashok.

As software becomes more complex by the day, more bugs or defects slip into a program as it is written. Traditionally, software testing, depending on the method employed, can be carried out at any time in the development process; however, most of the test effort occurs after the requirements have been defined and the coding has been completed. As such, the test methodology is governed by the software development methodology adopted. “As software services companies graduate from plain coding work to full development (not necessarily developing shrink-wrapped products), testing becomes crucial. But unfortunately, many programmers are forced to do minimal testing to beat last-minute project deadlines,” he says.
It was this lax approach towards testing and the challenges faced by quality analysts that sparked Ashok's thought process: could there be a methodology for testing that worked like a scanner, so that if one ran it over a piece of code, it would immediately highlight the bugs? Thus, he came up with the concept of Hypothesis Based Testing.

Today, a decade since its beginning, STAG has successfully worked with over 100 customers and conducted around 350 cycles of various types of tests, viz. functional testing, load/performance/stress/volume testing, reliability/scalability testing, test automation, beta testing, installation testing, configuration/compatibility testing, L10N testing, documentation testing, API testing, and more.

Having worked on 170+ projects in the areas of enterprise applications, embedded software, and test automation, with customers spanning industries – Avionics, BFSI, Consumer Electronics, eLearning, ERP, Healthcare, Logistics, Mobile/Wireless and CDMA, Realty, Semiconductor, Shipping, and popular Internet search engines – STAG truly proves that its methodologies are domain-independent. The ultimate recognition for the company is STEM 2.0 being featured in the Unisys Technology Review, a well-known journal in the Japanese software industry and academia, and part of the Japanese National Archive. “Japan is a research- and technology-oriented industry and it is a huge honor to be recognized by them,” Ashok smiles.
With its path-breaking, research-oriented approach to software testing, STAG has managed to disrupt the way testing has been practiced over the decades. “I believe this is quite an achievement, but there is still some time before the industry as a whole adopts it,” says Ashok. To evangelize adoption, STAG has itself taken the initial steps by setting up a full-fledged education division to train people in STEM and HBT and make them industry-ready. The programs offered from its Bangalore and Chennai facilities have already proved quite effective, as a majority of the trainees are being absorbed by the company's clients themselves. He also plans to publish all the research on STEM and HBT and bring out a book soon that could perhaps one day be an academic reference in software testing.

Ashok often quips that software testing is like the ‘ugly duckling’ that not many prefer to step into, but it is just a matter of time before it transforms into a beautiful swan, and hopefully STAG will be the catalyst.