The Test Center inspects five worthy tools for keeping your services squeaky clean
SOAP is the currency of the SOA marketplace – for now, anyway. Though SOAP's significance may diminish as Web services evolve, its importance for the time being is unquestionable. Therefore, a substantial portion of the QA work by Web service providers and consumers must entail verifying the accurate exchange of SOAP messages. Not surprisingly, several SOAP-focused Web service testing tools have appeared.
I had an opportunity to look at five such tools: AdventNet's QEngine, Crosscheck Networks' SOAPSonar, iTKO's LISA, Mindreef's SOAPscope Server, and Parasoft's SOAtest. Readers of my earlier reviews of open source Web service testing apps will recall that those products required a relatively technical command of XML, SOAP, and WSDL (Web Services Description Language). That is less a requirement with these tools; virtually all provide a user-friendly means of manipulating SOAP request-and-response data in ways that insulate the user from hands-on XML work.
Fundamentally, testing a SOAP-based Web service involves three activities: constructing a SOAP request, submitting it, and evaluating the response. As easy as that sounds, it is anything but. An effective SOAP-testing tool cannot simply rely on a user-friendly mechanism for building requests. It must also enable the user to organize and arrange requests in realistic sequences, provide a means of altering request input values, and intelligently tweak requests so as to expose the Web service to a range of good and bad usage scenarios. In short, you want the tool to run the Web service through a reasonable approximation of real-world activity.
In addition, the tool must be equipped with a collection of gadgets for evaluating responses. Such gadgets should include everything from simple string matching to executing an arbitrarily complex XQuery on the SOAP payload.
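To make the three activities concrete, here is a minimal Python sketch of construct, submit, and evaluate. The endpoint, namespace, method, and element names are all invented for illustration; the "evaluate" step is the simplest possible gadget, a string match against one extracted element, and a canned response stands in for the network round trip.

```python
import urllib.request
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(method, params, ns="http://example.com/demo"):
    """Construct: wrap a method call and its parameters in a SOAP 1.1 envelope."""
    env = ET.Element("soap:Envelope", {"xmlns:soap": SOAP_NS, "xmlns:m": ns})
    body = ET.SubElement(env, "soap:Body")
    call = ET.SubElement(body, "m:" + method)
    for name, value in params.items():
        ET.SubElement(call, "m:" + name).text = str(value)
    return ET.tostring(env, encoding="unicode")

def submit(url, envelope):
    """Submit: POST the envelope with the usual SOAP 1.1 headers (not called here)."""
    req = urllib.request.Request(
        url, data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": '""'})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

def evaluate(response_xml, tag, expected, ns="http://example.com/demo"):
    """Evaluate: find one element in the payload and string-match its text."""
    node = ET.fromstring(response_xml).find(".//{%s}%s" % (ns, tag))
    return node is not None and node.text == expected

envelope = build_request("GetQuote", {"symbol": "ACME"})

# A canned response stands in for the real HTTP round trip.
canned = ('<soap:Envelope xmlns:soap="%s"><soap:Body>'
          '<r xmlns="http://example.com/demo"><price>42.50</price></r>'
          '</soap:Body></soap:Envelope>' % SOAP_NS)
price_ok = evaluate(canned, "price", "42.50")
```

Every tool in this roundup automates some or all of these three steps; the differences lie in how much of the XML they let you avoid touching.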
All of the tools reviewed here provide variations on the preceding capabilities. All make valiant attempts to shield the user from direct exposure to XML, and some keep users entirely in a protective GUI so that coding is never necessary. Meanwhile, most of the tools supply "authorized personnel only" doorways into more advanced testing functions that involve scripting, feeding request data from databases, parsing and filtering results, and so on.
Most also provide conformance verification of the format of SOAP messages and Web service WSDL files to the growing list of Web-service related standards and specifications – primarily the profiles from the Web Services Interoperability Organization (WS-I) and the WS-* specifications from the likes of OASIS and others. Some also offer load-testing capabilities so that you can unleash a squad of virtual clients on the Web service and measure its response to the increased traffic. Some take a "holistic" approach to Web service testing, recognizing that SOAP-based Web services are not the only form of service being presented on the Web.
The offerings are complex, and each could support an entire review on its own. I've done my best to cover the distinguishing features of each tool. You should consult the associated comparison matrix for a high-level glimpse of some of the more important characteristics.
AdventNet QEngine 6.8
QEngine's UI is a browser, which means that when you launch the tool, you're really launching an application server. In this case, the application server is Tomcat, which also fires up an instance of MySQL server for the tool's data storage system. QEngine tries to mitigate the inconveniences of running in a browser by installing the QEngine toolbar browser plug-in, which adds buttons that make it easier to control the QEngine system.
Logging into QEngine ushers you into the suite manager screen. From there, you can either create or import a new test suite, or choose to work with an existing suite. If you choose the latter, a dialog materializes, giving you the option to work on Web functionality, Web performance, Web services functionality, or Web services performance. In other words, a single suite can host up to four categories of tests, and these categories are never really aware of one another. So, for example, if you're doing Web functionality test work, you don't see the Web services performance tests within that same suite. This takes some getting used to.
Scripts perform actual test execution. When you add a new Web service to a test suite, QEngine queries the Web service's WSDL and from that generates a set of basic test scripts – one for each Web method on the service. Select a script from a suite's explorer tree, and the generated source code appears in an editor window. QEngine's scripts are written in Jython, the implementation of the Python language that executes in a Java virtual machine.
The prebuilt test scripts are extremely spartan; you have to flesh them out for them to be useful. This is a two-step process. First, you supply the content of the requests using a pair of menu selections: Dataset Configuration and Parameterization. Dataset Configuration lets you set the sources of your input data as either a database (QEngine supports Oracle, SQL Server, or MySQL) or a CSV file. After you've configured your datasets, choose Parameterization and you can set specific input values to be supplied either by the dataset you just configured or by manually entered values.
Second, you add response analysis to the script. This requires coding, but QEngine helps out with a large selection of built-in response-processing functions. A Function Generator dialog simplifies choosing the right method to call. Select a function from the categorized list, and the dialog provides a description and an input parameter list. In effect, it fills out the function call for you and pastes it into the script.
Even with QEngine's hand-holding, validating a response is not particularly easy, unless you choose a simple processing function. To really get inside the response, you have to pretty much peel apart the XML, so you might want to keep an XPath manual on hand. When you execute a script, the results are gathered into a report summarization screen. It's loaded with links that you can click to drill down into the specifics of the success or failure.
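Peeling apart a namespaced SOAP response really is mostly XPath work. A short Python illustration of the kind of drilling involved (the payload and namespaces are invented); the prefix map is what keeps the expressions readable:

```python
import xml.etree.ElementTree as ET

# An invented response payload with two namespaces in play.
response = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ListOrdersResponse xmlns="http://example.com/orders">
      <Order><Id>17</Id><Total>99.95</Total></Order>
      <Order><Id>18</Id><Total>12.00</Total></Order>
    </ListOrdersResponse>
  </soap:Body>
</soap:Envelope>"""

# A prefix map keeps the XPath expressions readable.
NS = {"soap": "http://schemas.xmlsoap.org/soap/envelope/",
      "o": "http://example.com/orders"}

root = ET.fromstring(response)
totals = [float(t.text) for t in root.findall(".//o:Order/o:Total", NS)]
order_ids = [i.text for i in root.findall(".//o:Order/o:Id", NS)]
```

Validations much beyond this (conditional checks, cross-element comparisons) are where QEngine's Jython escape hatch earns its keep.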
QEngine strikes a good balance between reliance on the UI and raw coding. Through the Parameterization dialog, you can enter test data without having to wrestle with XML. But when XML-wrestling is unavoidable, you can drop into Jython code and wreak whatever havoc you deem necessary. However, QEngine's user interface is not easy to navigate; more than once, I found myself unable to backtrack to a known location. In addition, inactivity for a period of time (which I was unable to deduce) logs you out. You have to log back in and crawl back to where you left off.
Crosscheck Networks SOAPSonar 3.0.5
SOAPSonar offers numerous ways to feed data into SOAP requests. Using the tool’s Automation Data Source, you can input data from an SQL database via ODBC (Open Database Connectivity), an Excel spreadsheet, or a raw file. If you want to reuse data in a previous SOAP request, you can configure the request to use a recalled entry, which propagates data from a previous response into subsequent requests. For the ultimate in data-generation flexibility, you can call upon an automatic data function. Such functions are drawn from an extensive library that ranges from manually entered data series to arbitrarily complex user-defined data generation algorithms.
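SOAPSonar wires its data sources up through the GUI, but the principle of data-driven requests is easy to see in miniature. A rough analogue in Python, with the CSV contents inlined and all field names invented:

```python
import csv
import io

# Stand-in for a CSV data source on disk.
csv_data = io.StringIO("symbol,quantity\nACME,100\nGLOBEX,250\n")

# An invented request-body template with named placeholders.
TEMPLATE = "<m:Order><m:Symbol>{symbol}</m:Symbol><m:Qty>{quantity}</m:Qty></m:Order>"

# One request body per data row -- the essence of data-driven testing.
requests = [TEMPLATE.format(**row) for row in csv.DictReader(csv_data)]
```

Swap the CSV reader for an ODBC cursor or a generator function and you have, in outline, SOAPSonar's spectrum of data sources.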
SOAPSonar's fundamental testing unit is a test case. Normally, test cases are organized into suites; however, SOAPSonar's project tree view allows you to work with test cases in a sort of staging area. Test cases appear as nodes in the tree, attached to the parent node of the WSDL-based Web service to which they apply. You can craft as many test cases against a given WSDL as you wish. Once a test case has been verified and deemed ready, it can be moved into a test suite.
SOAPSonar operates in one of several modes, chosen from a menu. Typically, you run SOAPSonar in QA mode, which provides functional testing. However, if you switch to Performance mode, SOAPSonar's load-testing features are enabled. You can define multiple virtual clients and configure them to execute tests against target Web services. SOAPSonar can also exercise a Web service's resilience to security attacks, using a patent-pending XSD Mutation technology. XSD Mutation modifies an outgoing SOAP request in ways that expose the Web service to known assaults such as SQL injection, XML bombs, and so on.
Although SOAPSonar is, in accordance with its name, primarily a SOAP-testing application, it can test REST (Representational State Transfer)-style Web service interfaces as well. For a given REST-style test case, you can enter comma-separated name-value pairs that the tool assembles into the request URL. You can also use SOAPSonar's entire range of input data creation capabilities to generate input values for the request, giving you the ability to craft as rich a set of REST-style tests as SOAP-style tests.
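Assembling name-value pairs into a REST request URL is simple enough to sketch directly; here is the general idea in Python, with an invented endpoint (SOAPSonar's own parsing rules may differ in detail):

```python
from urllib.parse import urlencode

def build_rest_url(base, pairs):
    """Turn comma-separated name=value pairs into a GET request URL."""
    params = dict(p.split("=", 1) for p in pairs.split(","))
    return base + "?" + urlencode(params)

url = build_rest_url("http://example.com/quote", "symbol=ACME,currency=USD")
```

Because the pairs are just input values, any of the data-generation machinery described above can feed them, which is what puts REST tests on equal footing with SOAP tests in the tool.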
Once your army of tests is created, you can automate their execution, provided you purchase the additional APC License Component. This component adds features to the standard SOAPSonar package that include a command-line interface for integrating test scripts with the Windows Task Scheduler.
There is also a free edition of SOAPSonar: Personal Edition. It lacks features such as WS-Security validation, performance testing, and vulnerability testing. A comparison between the Personal Edition and the Enterprise Edition is available at the company's Web site.
SOAPSonar presents itself as the critical tool you need to fulfill Crosscheck Networks' vision of a Web service testing way of life: the "four pillars of SOA deployment diagnostics." There's the functional pillar: verifying that a given request produces the correct response and that a Web service fulfills its design requirements. Second is performance: measuring a Web service's throughput and response times. Third, compliance: verification of adherence to recognized standards. Last is vulnerability: ensuring that the Web service is tolerant of and resistant to malformed requests. This is a fine collection of Web service testing principles, and SOAPSonar does an admirable job of upholding them.
iTKO LISA 3.6e
LISA's learning curve is smooth and easy. The tool imposes a cyclic test development process. Create a new test case, and LISA builds a test structure consisting of skeletal test steps that act like bookends – one is the start step, the other is the end step. Import a WSDL through the Web Service Step Wizard, and you're handed a list of that WSDL's Web methods. Select a method to work on, and the wizard opens an object editor to supply input data, simultaneously adding a new step between the bookends. Enter test data for the request, then submit the request to the Web service and see if what comes back looks right. If not, go back, tweak the step (or correct the Web service method), and try again. There are more details to this, of course – compliance testing, for one – but LISA supports those as well.
Apply the above process repeatedly for the different Web methods on the WSDL, and ultimately you'll have a complete test case for a specific WSDL. Test cases can, in turn, be gathered into a test suite, which is really just a kind of folder in the LISA environment.
LISA provides a healthy collection of test step types, though in most test cases, the majority of steps are of the "Web service execution step" type: Send a request, examine the response, and determine success or failure. Other step types can verify a Web service's compliance with various standards, execute external Java classes, or even call command-line scripts.
In addition, each test step can be adorned with a variety of filters and assertions. Filters parse the content of response messages; for example, you might apply a filter to fetch a specific response value and store it in a property for use in later steps. Assertions manage verification of response data, WSDL, and message conformance. Also, the assertion section of a step specifies whether it has passed or failed, and it identifies whether execution control should proceed to the next step or to some other step.
The full LISA product is very Java-aware. It can generate JUnit tests, functional tests of Java classes, database tests via JDBC, and EJB tests. The free version, WS-Testing, is limited to generating test cases for Web services only. In addition, some test step types are unavailable (it lacks any J2EE-related test step types, for example).
For all the initial ease of learning LISA, navigating the UI is sometimes bumpy. For example, when entering a new value for a field in a test step, there is no obvious way to save that value, nor to cancel the change. I found the only way to cancel input was to select a different node in the explorer, then dismiss the dialog that asked if I really wanted to do that.
The LISA documentation makes a big deal of no-code test development, as if that is the high road to simultaneously simplifying and accelerating test development. Perhaps, but while LISA's pure-UI approach does have the benefit of live interaction and is more accessible to QA engineers inexperienced at coding, it has limitations that a tool with easier access to scripting does not. Some testing nuts can only be cracked by a well-sharpened piece of code.
Mindreef SOAPscope Server 6.0
SOAPscope Server, like QEngine, is a thin-client-based tool. Behind SOAPscope's browser UI is a Tomcat server, girded by an RDBMS (relational database management system) that can be MySQL, Oracle, Microsoft SQL Server, or the embedded Apache Derby database. (Derby is supplied with SOAPscope but not recommended for even moderately large installations.)
SOAPscope Server's service spaces are the overarching containers of testing assets. An administrator will use service spaces to organize users into groups. Within a service space, member users can create one or more workspaces in which to store their, well, work.
Inside a workspace you'll find WSDL contracts, tests, notes, and other ancillary material needed to support actual testing. Typically, a workspace corresponds to a WSDL: When you create a new workspace, the first prompt you encounter is for a WSDL URL. You can, however, add more WSDL contracts to the workspace once it is created. Once you've imported a WSDL into a workspace, you can begin adding messages to that workspace. A message is really a SOAP request/response pair, created when you invoke a Web method on a WSDL. The invocation also optionally creates an "action" within the workspace.
The distinction between "message" and "action" gets a bit tricky. Messages are a kind of journal of your interactions with the Web service; each message is a request/response pair stored in a list each time you invoke a Web method. Actions are also messages, but they are kept in a separate area in the GUI. More importantly, actions can be arranged in an arbitrary sequence and replayed in that sequence. Such a replayed sequence is a test script. SOAPscope lets you fortify scripts with realistic behavior; for example, you can pass values among actions in a script so that a subsequent request is altered based on a preceding response.
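Passing a value from one action's response into a later action's request is the key to realistic scripts. In outline, the mechanism looks like this Python sketch; the service, element names, and session ID are invented, and the "property" store is just a dictionary:

```python
import xml.etree.ElementTree as ET

def extract(response_xml, tag):
    """Filter step: pull one value out of a response to keep as a property."""
    return ET.fromstring(response_xml).find(".//" + tag).text

# Action 1: a create call returns a session ID...
create_response = "<CreateResponse><SessionId>abc123</SessionId></CreateResponse>"
properties = {"session": extract(create_response, "SessionId")}

# Action 2: ...which is spliced into the next request before replay.
next_request = "<Fetch><SessionId>{session}</SessionId></Fetch>".format(**properties)
```

SOAPscope performs the equivalent extraction and substitution through its GUI, without any of the code shown here.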
Messages become actions when you enable a Recording flag (from a menu selection). You tweak a message until you arrive at a request/response pair that you've determined will make an acceptable test instance; then you turn on Recording and invoke the message. The associated action is created in the actions section, where it can be incorporated into a script.
Invoking a Web method to generate messages and actions is fall-off-a-log easy. SOAPscope is a very visual environment; similar to LISA, there is little or no programming involved. Request inputs can be either fixed values (entered manually), global values, or fed in from a data grid (a small spreadsheet built into SOAPscope). Also, SOAPscope can perform load testing, but you must first purchase the Mindreef Load Check module. The workloads employed by a load test are scripts imported from workspaces, and they're configured to expose the target Web service to different quantities of virtual clients executing those scripts.
SOAPscope can test the client side of Web services via the SOAPscope Server. You create a set of requests and responses – a request/response pair in this context is called a reaction – and store these pairs with the SOAPscope Server. Point the client to be tested at the server and submit requests. The server will search through the reactions; when it locates one with a stored request that matches the incoming request, it sends back the associated response. The upshot is a mock Web service – not particularly elaborate, but effective.
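The reaction mechanism amounts to a lookup table keyed on the request. A bare-bones Python version, matching on whitespace-stripped text (which is purely my assumption here – SOAPscope's actual matching rules may be more sophisticated):

```python
def normalize(xml_text):
    """Strip all whitespace so trivially different requests still match.
    (Crude: this also removes spaces inside text content.)"""
    return "".join(xml_text.split())

class MockService:
    def __init__(self):
        self.reactions = {}  # normalized request -> canned response

    def store(self, request, response):
        self.reactions[normalize(request)] = response

    def handle(self, request):
        """Return the stored response for a matching request, or a fault."""
        return self.reactions.get(normalize(request), "<Fault>no match</Fault>")

mock = MockService()
mock.store("<GetQuote><Symbol>ACME</Symbol></GetQuote>", "<Price>42.50</Price>")
reply = mock.handle("<GetQuote>  <Symbol>ACME</Symbol>  </GetQuote>")
```

The client under test never knows it is talking to a lookup table, which is exactly the point.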
SOAPscope's documentation is first-rate, particularly helpful to anyone new to SOAP and Web services. And noncoding QA engineers will welcome the tool's code-free environment.
Parasoft SOAtest 5.1
As with other tools, Parasoft's SOAtest will jump-start a project by automatically creating test cases. Point SOAtest at a WSDL, and the tool will crawl it and build a set of positive and negative tests for each Web method. SOAtest's initial boost thus propels your testing further from the launch pad than the other tools do.
SOAtest's auto-generated tests, though minimal, are instantly executable and are actual tests, not completely empty test skeletons. Each test node in the tool's explorer is accompanied by a “traffic object” subnode that retains the most recent request/response pair generated by that test. So you can execute a test, then click on the traffic object to examine the exact exchange in Literal (unadulterated XML), Tree (a navigable tree explorer), or Element (pseudo code) representation.
All the tests associated with a single Web service are represented as a test suite node, and within that will be yet more test suite nodes, each representing tests on a particular Web method. Only when you get down to single request/response nodes do suites become simply tests.
Although Parasoft follows the standard pattern of representing projects in an explorer tree, nonleaf nodes in the tree are not simply containers. Adjusting attributes on a parent node alters the behavior of all child nodes accordingly. For example, select the test suite node corresponding to a complete Web service, and a tabbed configuration window appears. From that window, you can alter the execution order of all child test suites, setting them to either run sequentially or in parallel, or specifying that, say, Test B execute only if Test A succeeds. Hosts of other attributes – what version of SOAP to use (1.1 or 1.2), message encoding, default timeout, and more – are similarly alterable. This makes for quick and easy manipulation of project-wide behavior.
When you identify a WSDL on which to create a collection of test suites, you must tell SOAtest which messaging pattern – synchronous or asynchronous – the service employs, and SOAtest will create tests suited to that pattern. Also, you can set SOAtest to enforce policy configurations on a project's WSDLs and SOAP messages. A policy configuration is a set of predefined rules, roughly analogous to coding standards. SOAtest will alert you when a WSDL or SOAP message violates one of the rules, so policy configurations provide a means of enforcing best practices across all your organization's Web-service assets.
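A policy configuration boils down to mechanical rules run against WSDLs and messages. One rule, sketched in Python; the rule itself is invented for illustration and is not drawn from SOAtest's actual rule set:

```python
import xml.etree.ElementTree as ET

def check_target_namespace(wsdl_xml):
    """Invented rule: a WSDL's root element must declare a targetNamespace.
    Returns a list of violations, empty when the document passes."""
    root = ET.fromstring(wsdl_xml)
    return [] if root.get("targetNamespace") else ["missing targetNamespace"]

good = '<definitions targetNamespace="http://example.com/svc"/>'
bad = '<definitions/>'
violations = check_target_namespace(bad)
```

Multiply this by dozens of rules – naming conventions, WS-I profile checks, security requirements – and you have the shape of a policy configuration.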
SOAtest lets you turn a set of functional tests (which is how tests start their lives) into a load test with near flick-of-a-switch ease. Simply select the Load Tests item from the menu and choose how the load is applied (that is, whether the number of virtual users increases linearly, is steady, or tracks a more complicated curve). Once the load testing has started, you can view real-time graphs that plot the current load, as well as request-response turnaround times. You can halt the testing at any point, change the subset of suites being used, and restart.
SOAtest is overrun with useful support tools. Examples are an XML beautifier, various XML encryption tools, and a module that manages callbacks for asynchronous HTTP testing. SOAtest also supports client-side testing. Point it to a WSDL, and it will examine the WSDL structure and create "stubs" that will mimic the Web service's behavior. SOAtest handles testing coming and going, and it has an overall feel of maturity missing from several of the other tools.
Peering through the suds
This is a complicated field from which to draw a favorite. If your testing involves more than just Web services, and your development is primarily Java, then tools such as LISA or SOAtest are worth considering, in that they'll let you drive two nails with one hammer. If, however, you are only interested in SOAP-based Web service testing, and your QA staff is relatively new to the technology, SOAPscope is the obvious choice.
On the other hand, your choice may be based more on your philosophy of testing. Specifically, which is better: coding tests or building them visually (in other words, from menu and button selections within a GUI)? Tools such as SOAPSonar, LISA, and SOAPscope have chosen the latter to escape the former. But shielding the user from code necessarily places that user in a constrained environment. Those tools tacitly assume that their GUIs are flexible enough to meet all or most testing demands. I am not so sure. I favor the tools – QEngine and SOAtest – whose scripting capabilities are easily accessed. And of those two, SOAtest does the better overall job.
Finally, you should take advantage of the fact that almost all of these tools – SOAPscope being the notable exception – offer free versions that enable you to sample their capabilities with no time limitations.
Ease of use (20.0%)
Overall Score (100%)

| AdventNet QEngine 6.8 | 8.0 | 7.0 | 7.0 | 8.0 | 7.0 | 7.0 |
| Crosscheck Networks SOAPSonar 3.0.5 | 8.0 | 8.0 | 8.0 | 8.0 | 8.0 | 9.0 |
| iTKO LISA 3.6e | 9.0 | 7.0 | 7.0 | 8.0 | 7.0 | 8.0 |
| Mindreef SOAPscope Server 6.0 | 8.0 | 7.0 | 8.0 | 8.0 | 9.0 | 7.0 |
| Parasoft SOAtest 5.1 | 9.0 | 8.0 | 8.0 | 8.0 | 8.0 | 9.0 |