Opensphere Blog

Regression Tests on Messaging and Web Services
A base for integration projects and SOA initiatives. Simple and with minimal overhead.

Opensphere and testing in agile

Software development has changed a lot since the very beginnings of the IT industry. Software companies tend to move away from the waterfall development model, in which each stage was started only after the previous stage had finished. This doesn’t really work in today’s world. To remain competitive, software companies have to be more flexible, and the waterfall model didn’t allow for much flexibility.
Agile does, but it’s not for everyone: with short development sprints, it requires fast reactions to a changing situation. In agile, everything happens almost simultaneously: development, testing and documenting. And everything starts with...
Planning

Planning is about estimating efforts and committing to implementing certain user stories in the current sprint. Effort estimates cover both implementation and testing. Once a team commits to a certain user story, it defines the test cases that will cover the acceptance criteria. Let’s try using Opensphere for that...

File -> New project...
Let’s start by creating a new project where we will store our work for this sprint...


Now let’s click the icon to add a new test suite.

Since the plan is to have the test cases related to a given user story grouped under the same test suite, we will name the test suite after the user story and add a brief description of that user story alongside it (Opensphere has some more advanced editing options here, so you’re not limited to plain text). Now we can move on to creating test cases. We can do that either by right-clicking the newly created test suite node and selecting the ‘Add Test Case’ option from the context menu, or by clicking the icon on the toolbar.

The test case name should also be descriptive, so those who don’t feel like reading the description can get the idea right away. The description isn’t mandatory, but it should be there, so it’s clear what the test covers and how it should behave. Furthermore, such a description will be visible to other parties once the executed tests are published in HTML format. Now we can move on to modeling the test case logic. Since success requires a proper entry in the database, we will need to query the DB for the given entry and compare it with the reference data. The SQL Query Compare component should do the trick.
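Conceptually, the SQL Query Compare step boils down to querying for the expected entry and comparing it field by field. Here is a minimal sketch of that idea; the table, columns and values are hypothetical, and an in-memory SQLite database stands in for the project’s real one:

```python
import sqlite3

# Reference data we expect the service under test to have written
# (hypothetical schema and values).
expected = {"order_id": "ORD-1001", "status": "CREATED", "amount": 49.99}

# In a real test this would be the project's actual database connection;
# an in-memory SQLite DB stands in for it here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, status TEXT, amount REAL)")
conn.execute("INSERT INTO orders VALUES (?, ?, ?)",
             ("ORD-1001", "CREATED", 49.99))

# Retrieve the actual data for the entry we care about...
row = conn.execute(
    "SELECT order_id, status, amount FROM orders WHERE order_id = ?",
    (expected["order_id"],),
).fetchone()

# ...and compare it against the reference data.
actual = dict(zip(expected.keys(), row))
assert actual == expected, f"Mismatch: {actual} != {expected}"
print("DB entry matches reference data")
```

In Opensphere itself this comparison is configured in the component’s properties rather than written by hand, but the underlying logic is the same.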

We have to provide the reference data...


…and specify how to retrieve the actual data for comparison.


Let’s put in a dummy WS Client component and connect everything to reflect the test case workflow.

The dummy component can be replaced as soon as the WSDL is ready. The plan for the WS Client is to send a SOAP request message to the upcoming web service, which will – amongst other things – create an entry in the database and notify the WS Client about the success of the operation. Updating our test case will then only be a matter of providing the WSDL file and defining the triggering messages. This is done in the Operation Invocation Messages tab of the WS Client component properties window. Defining the message is as easy as clicking the icon and selecting the message defined in the WSDL. The only thing missing now is the actual web service which implements the user story. But since it’s only been 5 minutes since we started, I seriously doubt that it’s already available...
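Under the hood, the triggering message the WS Client will send is just a SOAP envelope. Since the real WSDL isn’t available yet, here is a sketch of what such a message might look like, built by hand with the standard library; the service namespace, operation name and fields are all hypothetical:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace; the real one would come from the WSDL.
SVC_NS = "http://example.com/orders"

ET.register_namespace("soapenv", SOAP_NS)
ET.register_namespace("ord", SVC_NS)

# Standard SOAP 1.1 structure: Envelope containing Header and Body.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
ET.SubElement(envelope, f"{{{SOAP_NS}}}Header")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")

# Hypothetical operation that creates a DB entry and reports success.
request = ET.SubElement(body, f"{{{SVC_NS}}}createOrder")
ET.SubElement(request, f"{{{SVC_NS}}}orderId").text = "ORD-1001"
ET.SubElement(request, f"{{{SVC_NS}}}amount").text = "49.99"

message = ET.tostring(envelope, encoding="unicode")
print(message)
```

Once the WSDL exists, Opensphere generates messages like this from it directly, so none of this plumbing has to be written by hand.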
