The Test Center inspects five worthy tools for keeping your services squeaky clean
SOAP is the currency of the SOA marketplace – for now, anyway. Though SOAP's significance may diminish as Web services evolve, its importance for the time being is unquestionable. Therefore, a substantial portion of the QA work by Web service providers and consumers must entail verifying the accurate exchange of SOAP messages. Not surprisingly, several SOAP-focused Web service testing tools have appeared.
I had an opportunity to look at five such tools: AdventNet's QEngine, Crosscheck Networks' SOAPSonar, iTKO's LISA, Mindreef's SOAPscope Server, and Parasoft's SOAtest. Readers of my earlier reviews of open source Web service testing apps will recall that those products required a relatively technical command of XML, SOAP, and WSDL (Web Services Description Language). That is less a requirement with these tools; virtually all provide a user-friendly means of manipulating SOAP request-and-response data in ways that insulate the user from hands-on XML work.
Fundamentally, testing a SOAP-based Web service involves three activities: constructing a SOAP request, submitting it, and evaluating the response. As easy as that sounds, it is anything but. An effective SOAP-testing tool cannot simply rely on a user-friendly mechanism for building requests. It must also enable the user to organize and arrange requests in realistic sequences, provide a means of altering request input values, and intelligently tweak requests so as to expose the Web service to a range of good and bad usage scenarios. In short, you want the tool to run the Web service through a reasonable approximation of real-world activity.
In addition, the tool must be equipped with a collection of gadgets for evaluating responses. Such gadgets should include everything from simple string matching to executing an arbitrarily complex XQuery on the SOAP payload.
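To make the construct-submit-evaluate cycle concrete, here is a minimal sketch in Python (standard library only) of building a SOAP 1.1 request and checking a response. The GetQuote method, the example.com namespace, and the canned response are hypothetical stand-ins, not taken from any reviewed product:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(method, params, service_ns="http://example.com/stock"):
    """Construct a minimal SOAP 1.1 request envelope for a hypothetical method."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{service_ns}}}{method}")
    for name, value in params.items():
        ET.SubElement(call, f"{{{service_ns}}}{name}").text = str(value)
    return ET.tostring(envelope, encoding="unicode")

def check_response(xml_text, expected):
    """Evaluate a response: a structural check plus a simple string match."""
    root = ET.fromstring(xml_text)
    body = root.find(f"{{{SOAP_NS}}}Body")
    return body is not None and expected in xml_text

request = build_request("GetQuote", {"symbol": "ACME"})

# A canned response stands in for the wire round-trip here.
response = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    '<soap:Body><GetQuoteResponse xmlns="http://example.com/stock">'
    '<price>42.50</price></GetQuoteResponse></soap:Body></soap:Envelope>'
)
print(check_response(response, "42.50"))  # True
```

The string match here sits at the simple end of the spectrum; the tools reviewed layer XPath and XQuery checks on top of the same basic cycle.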
All of the tools reviewed here provide variations on the preceding capabilities. All make valiant attempts to shield the user from direct exposure to XML, and some keep users entirely in a protective GUI so that coding is never necessary. Meanwhile, most of the tools supply "authorized personnel only" doorways into more advanced testing functions that involve scripting, feeding request data from databases, parsing and filtering results, and so on.
Most also provide conformance verification of the format of SOAP messages and Web service WSDL files to the growing list of Web-service related standards and specifications – primarily the profiles from the Web Services Interoperability Organization (WS-I) and the WS-* specifications from the likes of OASIS and others. Some also offer load-testing capabilities so that you can unleash a squad of virtual clients on the Web service and measure its response to the increased traffic. Some take a "holistic" approach to Web service testing, recognizing that SOAP-based Web services are not the only form of service being presented on the Web.
The offerings are complex, and each could support an entire review on its own. I've done my best to cover the distinguishing features of each tool. You should consult the associated comparison matrix for a high-level glimpse of some of the more important characteristics.
AdventNet QEngine 6.8
QEngine's UI is a browser, which means that when you launch the tool, you're really launching an application server. In this case, the application server is Tomcat, which also fires up an instance of MySQL server for the tool's data storage system. QEngine tries to mitigate the inconveniences of running in a browser by installing the QEngine toolbar browser plug-in, which adds buttons that make it easier to control the QEngine system.
Logging into QEngine ushers you into the suite manager screen. From there, you can either create or import a new test suite, or choose to work with an existing suite. If you choose the latter, a dialog materializes, giving you the option to work on Web functionality, Web performance, Web services functionality, or Web services performance. In other words, a single suite can host up to four categories of tests, and these categories are never really aware of one another. So, for example, if you're doing Web functionality test work, you don't see the Web services performance tests within that same suite. This takes some getting used to.
Scripts perform actual test execution. When you add a new Web service to a test suite, QEngine queries the Web service's WSDL and from that generates a set of basic test scripts – one for each Web method on the service. Select a script from a suite's explorer tree, and the generated source code appears in an editor window. QEngine's scripts are written in Jython, the implementation of the Python language that executes in a Java virtual machine.
The prebuilt test scripts are extremely spartan; you have to flesh them out for them to be useful. This is a two-step process. First, you supply the content of the requests using a pair of menu selections: Dataset Configuration and Parameterization. Dataset Configuration lets you set the sources of your input data as either a database (QEngine supports Oracle, SQL Server, or MySQL) or a CSV file. After you've configured your datasets, choose Parameterization and you can set specific input values to be supplied either by the dataset you just configured or by manually entered values.
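QEngine hides the mechanics behind its dialogs, but CSV-driven parameterization amounts to something like the following sketch, which assumes a hypothetical BuyStock operation and an in-memory dataset:

```python
import csv
import io

# Hypothetical dataset: each row supplies the input values for one request.
dataset = io.StringIO("symbol,shares\nACME,100\nGLOBEX,250\n")

def parameterize(csv_file):
    """Yield one dict of input values per dataset row, much as a
    parameterization step feeds rows into a generated test script."""
    for row in csv.DictReader(csv_file):
        yield row

requests = [
    f"<BuyStock><symbol>{p['symbol']}</symbol><shares>{p['shares']}</shares></BuyStock>"
    for p in parameterize(dataset)
]
print(len(requests))  # 2
```

Swapping the CSV source for a database cursor changes only the iteration, which is why the tools treat databases and flat files as interchangeable dataset backends.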
Second, you add response analysis to the script. This requires coding, but QEngine helps out with a large selection of built-in response-processing functions. A Function Generator dialog simplifies choosing the right method to call. Select a function from the categorized list, and the dialog provides a description and an input parameter list. In effect, it fills out the function call for you and pastes it into the script.
Even with QEngine's hand-holding, validating a response is not particularly easy, unless you choose a simple processing function. To really get inside the response, you have to pretty much peel apart the XML, so you might want to keep an XPath manual on hand. When you execute a script, the results are gathered into a report summarization screen. It's loaded with links that you can click to drill down into the specifics of the success or failure.
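For readers who do end up peeling apart the XML, the drill-down typically reduces to a namespace-aware XPath search into the SOAP body. Here is a sketch using Python's standard library (which supports a limited XPath subset); the response, namespaces, and element names are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A canned response; in a real test run this comes back from the service.
response = """\
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuoteResponse xmlns="http://example.com/stock">
      <price currency="USD">42.50</price>
    </GetQuoteResponse>
  </soap:Body>
</soap:Envelope>"""

ns = {
    "soap": "http://schemas.xmlsoap.org/soap/envelope/",
    "s": "http://example.com/stock",
}
root = ET.fromstring(response)
price = root.find(".//s:price", ns)  # XPath-style search into the payload
assert price is not None
print(price.text, price.get("currency"))  # 42.50 USD
```

Getting the namespace map right is usually the hard part, which is exactly why an XPath manual comes in handy.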
QEngine strikes a good balance between reliance on the UI and raw coding. Through the Parameterization dialog, you can enter test data without having to wrestle with XML. But when XML-wrestling is unavoidable, you can drop into Jython code and wreak whatever havoc you deem necessary. However, QEngine's user interface is not easy to navigate; more than once, I found myself unable to backtrack to a known location. In addition, after a period of inactivity (how long, I was unable to deduce), QEngine logs you out. You have to log back in and crawl back to where you left off.
Crosscheck Networks SOAPSonar 3.0.5
SOAPSonar offers numerous ways to feed data into SOAP requests. Using the tool’s Automation Data Source, you can input data from an SQL database via ODBC (Open Database Connectivity), an Excel spreadsheet, or a raw file. If you want to reuse data in a previous SOAP request, you can configure the request to use a recalled entry, which propagates data from a previous response into subsequent requests. For the ultimate in data-generation flexibility, you can call upon an automatic data function. Such functions are drawn from an extensive library that ranges from manually entered data series to arbitrarily complex user-defined data generation algorithms.
SOAPSonar's fundamental testing unit is a test case. Normally, test cases are organized into suites; however, SOAPSonar's project tree view allows you to work with test cases in a sort of staging area. Test cases appear as nodes in the tree, attached to the parent node of the WSDL-based Web service to which they apply. You can craft as many test cases against a given WSDL as you wish. Once a test case has been verified and deemed ready, it can be moved into a test suite.
SOAPSonar operates in one of several modes, chosen from a menu selection. Typically, you run SOAPSonar in QA mode, which provides functional testing. However, if you switch to Performance mode, SOAPSonar's load-testing features are enabled. You can define multiple virtual clients and configure them to execute tests against target Web services. SOAPSonar can also exercise a Web service's resilience to security attacks, using a patent-pending XSD Mutation technology. XSD Mutation modifies an outgoing SOAP request in ways that expose the Web service to known assaults such as SQL injection, XML bombs, and so on.
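XSD Mutation itself is Crosscheck's patent-pending technology, but the underlying idea (substituting hostile values into otherwise well-formed requests) can be sketched generically. The attack strings and request template below are illustrative only:

```python
# Generic illustration only: substitute hostile payloads into a request
# template, in the spirit of (but far simpler than) XSD Mutation.
ATTACK_STRINGS = [
    "' OR '1'='1",                        # classic SQL injection probe
    "<!DOCTYPE x [<!ENTITY a 'aaaa'>]>",  # entity-expansion (XML bomb) seed
    "A" * 10000,                          # oversized field value
]

TEMPLATE = "<GetQuote><symbol>{value}</symbol></GetQuote>"

def mutate(template):
    """Yield one malformed request per attack string."""
    for attack in ATTACK_STRINGS:
        yield template.format(value=attack)

mutants = list(mutate(TEMPLATE))
print(len(mutants))  # 3
```

A vulnerability test then submits each mutant and verifies that the service fails gracefully rather than leaking errors or hanging.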
Although SOAPSonar is, in accordance with its name, primarily a SOAP-testing application, it can test REST (Representational State Transfer)-style Web service interfaces as well. For a given REST-style test case, you can enter comma-separated name-value pairs that the tool assembles into the request URL. You can also use SOAPSonar's entire range of input data creation capabilities to generate input values for the request, giving you the ability to craft as rich a set of REST-style tests as SOAP-style tests.
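Assembling name-value pairs into a REST request URL is easy to picture; this sketch assumes a hypothetical quote endpoint, and the exact comma-separated input syntax is my own assumption rather than SOAPSonar's actual dialog format:

```python
from urllib.parse import urlencode

def build_rest_url(base, pairs):
    """Assemble comma-separated name-value pairs into a request URL,
    roughly as a REST-style test case would."""
    params = dict(p.split("=", 1) for p in pairs.split(","))
    return f"{base}?{urlencode(params)}"

url = build_rest_url("http://example.com/quote", "symbol=ACME,format=json")
print(url)  # http://example.com/quote?symbol=ACME&format=json
```

The same data-generation machinery that fills SOAP fields can feed these pairs, which is what puts REST tests on equal footing with SOAP tests.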
Once your army of tests is created, you can automate their execution, provided you purchase the additional APC License Component. This component adds features to the standard SOAPSonar package that include a command-line interface for integrating test scripts with the Windows Task Scheduler.
There is also a free edition of SOAPSonar: Personal Edition. It lacks features such as WS-Security validation, performance testing, and vulnerability testing. A comparison between the Personal Edition and the Enterprise Edition is available at the company's Web site.
SOAPSonar presents itself as the critical tool you need to fulfill Crosscheck Networks' vision of a Web service testing way of life: the "four pillars of SOA deployment diagnostics." First is the functional pillar: verifying that a given request produces the correct response and that a Web service fulfills its design requirements. Second is performance: measuring a Web service's throughput and response times. Third is compliance: verification of adherence to recognized standards. Last is vulnerability: ensuring that the Web service is tolerant of and resistant to malformed requests. This is a fine collection of Web service testing principles, and SOAPSonar does an admirable job of upholding them.
iTKO LISA 3.6e
LISA's learning curve is smooth and easy. The tool imposes a cyclic test development process. Create a new test case, and LISA builds a test structure consisting of skeletal test steps that act like bookends – one is the start step, the other is the end step. Import a WSDL through the Web Service Step Wizard, and you're handed a list of that WSDL's Web methods. Select a method to work on, and the wizard opens an object editor to supply input data, simultaneously adding a new step between the bookends. Enter test data for the request, then submit the request to the Web service and see if what comes back looks right. If not, go back, tweak the step (or correct the Web service method), and try again. There are more details to this, of course (compliance testing, for one), but LISA supports those as well.
Apply the above process repeatedly for the different Web methods on the WSDL, and ultimately you'll have a complete test case for a specific WSDL. Test cases can, in turn, be gathered into a test suite, which is really just a kind of folder in the LISA environment.
LISA provides a healthy collection of test step types, though in most test cases, the majority of steps are of the "Web service execution step" type: Send a request, examine the response, and determine success or failure. Other step types can verify a Web service's compliance with various standards, execute external Java classes, or even call command-line scripts.
In addition, each test step can be adorned with a variety of filters and assertions. Filters parse the content of response messages; for example, you might apply a filter to fetch a specific response value and store it in a property for use in later steps. Assertions manage verification of response data, WSDL, and message conformance. The assertion section of a step also determines whether the step has passed or failed, and whether execution control should proceed to the next step or to some other step.
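The filter-then-assert pattern is easy to sketch. This generic Python illustration (not LISA's actual API) shows a filter storing a response value as a property and an assertion later judging it; the order document and namespace are invented:

```python
import xml.etree.ElementTree as ET

properties = {}  # shared state carried between test steps

def filter_step(response_xml, xpath, ns, prop_name):
    """Filter: pull a value out of a response and store it as a property."""
    node = ET.fromstring(response_xml).find(xpath, ns)
    properties[prop_name] = node.text if node is not None else None

def assert_step(prop_name, expected):
    """Assertion: decide pass/fail from a previously stored property."""
    return properties.get(prop_name) == expected

ns = {"s": "http://example.com/orders"}
response = '<s:order xmlns:s="http://example.com/orders"><s:id>17</s:id></s:order>'
filter_step(response, ".//s:id", ns, "order_id")
print(assert_step("order_id", "17"))  # True
```

Carrying values forward through properties is what lets a later step reuse, say, an order ID returned by an earlier one, so steps chain into realistic scenarios.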
The full LISA product is very Java-aware. It can generate JUnit tests, functional tests of Java classes, database tests via JDBC, and EJB tests. The free version, WS-Testing, is limited to generating test cases, and for Web services only. In addition, some test step types are unavailable (it lacks any J2EE-related test step types, for example).
For all the initial ease of learning LISA, navigating the UI is sometimes bumpy. For example, when entering a new value for a field in a test step, there is no obvious way to save that value, nor to cancel the change. I found the only way to cancel input was to select a different node in the explorer, then dismiss the dialog that asked if I really wanted to do that.