In my previous posting I described how to get started with the freely downloadable IBM Mobile Technology Preview. I took one of the samples provided and ran it in an Android emulation environment. That worked nicely for testing some features of the samples, but emulation is quite slow, which doesn’t make for a productive code/test cycle. In this article I want to describe an alternative approach using a Chrome plugin known as Ripple. Ripple was created by the gloriously named Tiny Hippos, who were acquired by Research in Motion this year.

Hybrid Applications

One key aspect of the IBM Mobile Technology Preview is the use of PhoneGap to enable building Hybrid applications: applications that run in a browser and yet exploit native device capabilities such as GeoLocation and the Accelerometer. In effect PhoneGap provides a device-independent JavaScript API hooking into a device-specific native implementation on each platform. The expectation is that the bulk of a Hybrid application is implemented in JavaScript that is portable across a wide range of devices. The Dojo mobile APIs will detect the particular device and hence allow suitable style-sheets to be selected, giving a visual appearance that fits the device standards. Hence we have a vision (to use an old phrase) of “write once, run anywhere”. In my team’s experience, with careful design, this vision can be realised.

When developing such Hybrid applications we can greatly accelerate the development/unit test cycle if we can simply run the JavaScript in a browser. This is where Ripple comes in: it provides an emulation of different device types and an implementation of the major PhoneGap APIs running directly in Chrome. I am going to show how to use Ripple to run the Mysurance sample, which uses GeoLocation.

One note: some aspects of the Technology Preview, notably the Notification framework, depend upon Java code. We cannot test those aspects using Ripple as it currently stands. I believe that Mysurance is typical of most Hybrid applications in having very significant portions of its code in JavaScript and hence benefiting from testing in Ripple.


The application assets (web pages, JavaScript, CSS and images) are held in an assets directory. We need those to be accessible to the browser. It simplifies some of my other testing if I serve these files from a web server such as Apache or IBM HTTP Server.

I add these entries to my httpd.conf:

Alias /mysurance "C:/IBM/MobileTechPreview/samples/mysurance/eclipse-project/assets/www"

<Directory "C:/IBM/MobileTechPreview/samples/mysurance/eclipse-project/assets/www">
    AllowOverride None
    Options None
    Order allow,deny
    Allow from all
</Directory>

and restart Apache. I can now point my browser at


and see the application.

Ripple Installation

You can install Ripple into a running Chrome instance from this Ripple download site. On completion the Ripple icon is visible in the toolbar.


The first time you access an application with Ripple installed you see this


which offers a number of possible runtimes. I choose PhoneGap, and can then select the Ripple icon and choose Enable Ripple.


The Mysurance application now shows in a device emulator, along with various control capabilities.



If you explore the Ripple control capabilities you will see that you can select different devices, emulate the accelerometer and send events. Here I’m going to focus on GeoLocation. Expanding that section I see


a default location of Waterloo Station. We can change this if we know the latitude and longitude of our desired location. I will choose a place in Yorkshire, finding its coordinates from this site.


Ripple now shows the new location


Back in the application I pick Accident Toolkit and from this menu


select Call the Police. This brings up a map centred on the location we specified to Ripple, with the nearest police station identified.


Selecting the station gives us the contact details.



This does seem to be a very promising approach to testing some aspects of Hybrid applications.


In my last couple of postings I was considering various ways of Testing RESTful services, and specifically described using JUnit to test the service, exploiting the Apache Wink Client libraries. I was interested to see a comment from “Lior” who wanted to run tests as part of an automated build process, and hence wanted to avoid the need to start a server in order to host the service under test. “Ben F” then suggested using a Mock Library to emulate the HTTP Request and Response objects. This reminded me of a few Unit Testing principles and that took me to a whole different way of Unit Testing the service implementation classes. I believe that this approach, which I’ve implemented using JMock, is the key to creating a robust set of true Unit Tests.

So before getting to the explanation of the testing approach, I’d like to look briefly at the meaning of the tests I was previously writing. If, as I now believe, they were not “true” Unit Tests, what were they? And why does that matter?

Integration Tests and Unit Tests

The essence of the previous testing was that we were testing fully functional code, deployed to WebSphere Application Server (WAS). In fact I was cheating to some degree, in that I was not using a real persistence store but a very simple in-memory collection; in a real development I would have been testing my real, functioning code, making database calls etc. I have been doing something much closer to Integration Testing, in that I’m exercising the code I write after it is integrated with much infrastructure code (and my persistence layer).

There’s no doubt in my mind that such Integration Tests are valuable, and I am very comfortable using JUnit for this purpose. Having a battery of such tests is extremely useful for testing in later stages of development; for example we can use them for Soak testing and Performance testing and, by submitting multiple instances simultaneously, Stress testing and Concurrency testing too. However, there are several costs to developing Integration Tests. First and foremost, as Lior observed, there is a dependency upon a significant stack of software: we can’t run the tests without starting WAS and a database, and even in the RAD environment this takes some time. Furthermore the tests tend to be brittle, as their correctness depends upon many components. There is much material elsewhere on this topic, for example here (in this reference, the testing we have been doing is termed Functional Testing), so I won’t labour the point and instead cut to the chase:

I want to write Unit tests. That is, tests which explicitly test the code I write to implement my service. While designing these Unit tests I will trust my dependencies such as Apache Wink, WAS and the database and make no attempt to test those dependencies.

We can then see that our Integration Tests serve a significant purpose in verifying that our trust in our dependency components is justified. However, the ratio of Unit Tests to Integration Tests can be quite heavily biased towards Unit Tests, provided that we can trust our dependencies – and if our dependencies themselves have good Unit Tests then we can indeed trust them. [This approach is something of an over-simplification. It is remarkable how many defects are discovered when we first integrate pieces of an application. Hence I am a strong believer in early Integration Testing.]

My challenge then is to write Unit tests for the service code, and the rest of this post will explain how. The beauty of the JAX-RS programming model is that it makes this testing very easy.

Unit Testing POJOs

The key feature of programming models such as JAX-RS (and EJB3 and JPA) is that we have simple Plain Old Java Objects (POJOs) that are annotated to enable the application server container runtime to provide infrastructure services. This service implementation code is executable stand-alone, and hence we can test it directly, without using any application server.

    public class BookEditionResources {

        @GET
        @Produces( MediaType.APPLICATION_JSON)
        public Response getEditionByIsbn(
                         @PathParam("isbn") String isbn ) {
            BookEdition edition
                    = EditionStore.getInstance().getEdition(isbn);
            if ( edition != null){
                return Response.ok(edition).build();
            } else {
                return Response.status(Status.NOT_FOUND).build();
            }
        }
    }

Specifically we can call

        getEditionByIsbn( /* a test value */ )

from our tests. However, things are not quite right for true unit testing; there’s a little work to do:

  • The use of Response objects ties us to the Wink APIs. Although we can use these APIs in Unit Tests, the use of Exceptions as discussed in this posting would make for a cleaner, more easily tested implementation.
  • The service classes are currently packaged in a WAR file, and hence are not publicly callable by our tests. Some repackaging of the code is needed.
  • There is a dependency on EditionStore; as things stand we’ll not be able to test without whatever persistence mechanism it uses – this leads to the use of Mock objects.

I’ll address each of these in turn. First I’ll adjust the implementation, delegating the detail of the Response creation to the Wink runtime.

POJO Business Methods

The goal here is to write natural Plain Old Java code, with no special logic relating to its role in implementing a RESTful service – all the RESTful characteristics are added by annotation. So first I adjust my service method:

    @GET
    @Produces( MediaType.APPLICATION_JSON)
    public BookEdition getEditionByIsbn (
             @PathParam("isbn") String isbn
             ) throws NotFoundException {
        BookEdition edition
            = getEditionStore().getEdition(isbn);
        if ( edition != null){
            return edition;
        } else {
            throw new NotFoundException("No such isbn:" + isbn);
        }
    }

The important changes here are:

  • The return type is BookEdition, not Response; we are delegating Response construction to Wink.
  • I throw a NotFoundException when the supplied isbn cannot be found.

Hence I create a NotFoundException class, extending WebApplicationException, and also create a Mapper class:

public class NotFoundExceptionMapper
        implements ExceptionMapper<NotFoundException> {

    public Response toResponse(NotFoundException ex) {
        System.out.println("Not found Mapping " + ex);
        return Response.status(Status.NOT_FOUND)
                       .build();
    }
}

Which I register in my WEB-INF/application configuration class.


This combination of implementation, exception and mapper can be deployed and executed. However, it’s not quite ready to test, as the packaging needs adjusting. Before we get to that, let’s first look at the Unit Tests we’d like to write.

Unit Testing a Service Method

Stripping away all annotations, we have this service method:

     public BookEdition getEditionByIsbn (
             String isbn ) throws NotFoundException {
        BookEdition edition
            = getEditionStore().getEdition(isbn);
        if ( edition != null){
            return edition;
        } else {
            throw new NotFoundException("No such isbn:" + isbn);
        }
    }

We can see here that the method has the following responsibilities:

  • To pass the String isbn parameter to the getEdition() method. Note that this responsibility of the method is very simple: just pass the parameter. [This does make us a little suspicious: should there be some validation of that input? In more realistic implementations inputs might well be validated.]
  • If a non-null value is returned from getEdition(), return that value.
  • If getEdition() returns null then generate a NotFoundException, which includes a diagnostic indicating which isbn was not found. [This is the stateless service design principle: the client can interpret this response without having retained any record of the original request.]

These responsibilities indicate the major tests that we want to write, each of which will require control over the behaviour of getEdition(), which I achieve by using JMock, as I will explain later:

  1. Verify that the getEdition() method received the expected value. Testing this will require mocking.
  2. Verify that a non-null value returned from getEdition() is indeed returned. Again, testing this will require mocking.
  3. Verify the correct production of a NotFoundException.

To give a flavour of the kind of test I am writing, here is a test of the second case, with one crucial detail, the mocking, elided:

        final String isbn = "0863699936";
        final BookEdition expectedEdition
              = new BookEdition("a", "b", isbn, new Date());

        // TODO: Initialise a mock to return that expectedEdition

        // invoke the service method
        BookEdition retrievedEdition = manager.getEditionByIsbn(isbn);
        assertEquals(expectedEdition, retrievedEdition);

The last two lines are the meat of the test: we directly invoke the service method, as a simple Java call with no HTTP involved, and verify that the expected value is obtained.

Next I want to explain how the Unit Test code here is able to compile and execute against the service code, which will eventually be deployed to WAS as a Web Application. To enable that I needed to do a little restructuring of my projects in RAD.

EAR, WAR, JAR and Project Structure

We have three executable pieces to consider: The Service EAR file that will be deployed to WAS, the Client code and the Tests. I need to enable the Client and Test code to use some of the classes contained in the EAR. I achieve this by packaging the service classes in JAR files, which are included in the EAR and also referenced stand-alone by Clients and Tests. Note that the EAR and each JAR file correspond to separate projects in RAD – when we determine the JAR files we are building we also determine the RAD project structure.

This diagram shows the relationships:


The major features are:

  • The service implementation classes, including the Resource Manager, are moved from the WAR file into their own JAR file, which is included in the EAR as a Utility JAR. Utility JARs in the EAR file are visible to the WAR and hence can still be referenced from the application configuration file.
  • The Wink JAR files previously included in the WAR file are now moved to the EAR file, also as Utility JARs. [WAS classloaders are organised such that Utility JARs can “see” each other, but they cannot “see” inside WAR files.]
  • The Entity classes – the JAXB-annotated service payloads – are in their own JAR file. This separation allows us to distribute the payload definitions to development teams working on Client code. Such teams should not need to see any details of the service implementation.
  • Hence, Clients need only the Entity JAR, while Test code needs both Entity and Implementation JARs.
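To make the packaging concrete, the resulting EAR layout would look something like this (all project and JAR names here are illustrative, not taken from the article):

```
LibraryEAR.ear
 ├── LibraryWeb.war            – web.xml and application config; no classes of its own
 ├── LibraryImplementation.jar – resource manager and mappers   (Utility JAR)
 ├── LibraryEntities.jar       – JAXB-annotated payload classes (Utility JAR)
 └── wink-*.jar, json-20080701.jar, …  – Wink runtime           (Utility JARs)
```

A client build would pull in only the Entities JAR; the test build would pull in both the Entities and Implementation JARs.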

Setting-up these relationships in RAD requires a few adjustments.

Utility JARs in the EAR

The objective is to set up the EAR with Utility JARs as described above.

EAR Structure

My steps were:

  1. Drag the 8 Wink JARs (json-20080701.jar etc.) from the WEB-INF/lib directory of the Web Application project to the EAR project.
  2. Create Implementation and Entity projects and move the source code from the Web Application project to those projects. At the moment the Entity project contains just the payload class.
  3. Add the Implementation and Entity projects as Utility JARs in the EAR. Do this in the EAR project’s JEE Module Dependencies.

WAR file accessing EAR Utility JAR

The Web Application has no classes of its own and hence no compile-time dependencies; however, it does need the Utility JARs on its runtime classpath, specified in the Manifest. This is most easily specified via the JEE Module Dependencies of the Web project.
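The tooling expresses these dependencies as a Class-Path entry in the WAR’s MANIFEST.MF. With hypothetical JAR names it would read something like this (note that manifest continuation lines begin with a single space):

```
Manifest-Version: 1.0
Class-Path: LibraryImplementation.jar LibraryEntities.jar
 wink-common.jar wink-server.jar json-20080701.jar
```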


I add all the Utility JARs.

Unit Test Project – ClassPath

The unit test project needs compile-time and runtime access to the Service implementation and entities, and also the JUnit and JMock libraries. For simplicity of launching, I construct a Build Path that contains everything needed at runtime, even though some of these items are not needed at compile-time.

First the references to the service components.

Project Dependencies

The service components are created from projects in my RAD workspace, so it’s just a matter of adding the Entity and Implementation projects to the Build Path.


Next the libraries containing JUnit and JMock.

Test Libraries

I am using JMock 2.5.1 (download), and this in turn requires JUnit 4.4 (download), which is not the version of JUnit delivered with RAD 7.5. Hence I downloaded both JMock and JUnit.

I then added the following JARs to the Test project’s library path. From JUnit


And from JMock


Giving this:


Now to use JMock to remove the dependency on the persistence layer.

It’s a Mockery

In order to test without the full persistence layer we need to find a way to “inject” a mock in its place, and for that we need some cooperation from the service implementation. As things currently stand, our service implementation was not designed for testing and so we have no easy way to inject the mock.

So first, a touch of refactoring …

Enable Injection with Dependency Getter

Currently we have this code, using a Singleton pattern to get at the persistence layer. [It was quick demo code; all criticisms may be taken as accepted.]


There are several possible injection techniques we could use (Martin Fowler lists a few). I’m going to use a technique that can be introduced without impacting any existing users of the service.

Rather than directly use the Singleton, I introduce an accessor function:

    protected EditionStore getEditionStore() {
        return EditionStore.getInstance();
    }

The service code is modified to:


You will notice that the Getter is protected; this allows the Unit test to …

Inject a Mock by Overriding the Getter

I create an anonymous class, derived from the class we are testing (the BookEditionResources resource manager), and override the getter to inject the mock:

manager = new BookEditionResources(){
            protected EditionStore getEditionStore() {
                return mockStore;
            }
        };

The derived class’s implementation methods now all use the mockStore because they use the getter. To complete the picture I need code to create the mockStore object and code to control and verify the mock’s behaviour.
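As an aside, the override-the-getter pattern works even without a mocking library. Here is a self-contained sketch of the pattern, with deliberately simplified stand-in class names and a hand-rolled stub in place of JMock:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-ins for the article's BookEdition / EditionStore / resource manager.
class Edition {
    final String isbn;
    Edition(String isbn) { this.isbn = isbn; }
}

class Store {
    private static final Store INSTANCE = new Store();
    static Store getInstance() { return INSTANCE; }
    Edition getEdition(String isbn) {
        // The "real" persistence layer, unavailable in a unit test.
        throw new IllegalStateException("real persistence not available in tests");
    }
}

class Resources {
    // Production path: the protected getter hides the Singleton.
    protected Store getStore() { return Store.getInstance(); }

    Edition getEditionByIsbn(String isbn) {
        return getStore().getEdition(isbn);
    }
}

public class InjectionDemo {
    public static void main(String[] args) {
        final Map<String, Edition> data = new HashMap<String, Edition>();
        data.put("0863699936", new Edition("0863699936"));

        // Test path: an anonymous subclass overrides the getter,
        // injecting a stub instead of the real Singleton.
        Resources manager = new Resources() {
            @Override
            protected Store getStore() {
                return new Store() {
                    @Override
                    Edition getEdition(String isbn) { return data.get(isbn); }
                };
            }
        };

        // The service method now runs entirely against the stub.
        System.out.println(manager.getEditionByIsbn("0863699936").isbn);
    }
}
```

JMock adds, on top of this pattern, the recording and verification of expectations, which is what the following sections set up.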

Setting up the Mock

The full initialisation for the tests looks like this. I’ll explain the details in a moment, but first just note that the bulk of the work is done in the setup() method, which is annotated with @Before, specifying that it is initialisation code run before each test.

@RunWith(JMock.class)
public class InjectingTest  {

     Mockery mockery;
     EditionStore mockStore;
     BookEditionResources manager;

    @Before
    public void setup() {
        mockery = new JUnit4Mockery() {{
            setImposteriser(ClassImposteriser.INSTANCE);
        }};

        mockStore = mockery.mock(EditionStore.class);

        manager = new BookEditionResources(){
            protected EditionStore getEditionStore() {
                return mockStore;
            }
        };
    }

The major features here are:

  • The @RunWith(JMock.class) annotation instructs JUnit to use a JMock test executor.
  • The Mockery mockery is the factory for the mock objects that the test will use. It is an instance of JUnit4Mockery, which is initialised with a (don’t you love the term?) Imposteriser.
  • The role of the imposteriser ClassImposteriser.INSTANCE is to enable the mocking of concrete classes – by default JMock would only enable the mocking of Interfaces.
  • The mockery is then used to mock the EditionStore: mockStore = mockery.mock(EditionStore.class);
  • That mockStore is the object I use in the overridden getEditionStore() method.

Setting Expectations

I can now define the tests, whose execution depends upon particular mock object behaviours. To illustrate, here first is a test of a successful find.

Expectation of a Successful Retrieval

    @Test
    public void testRest() throws Exception {
        final String isbn = "0863699936";
        final BookEdition edition
              = new BookEdition("a", "b", isbn,
                                new Date());

        mockery.checking(new Expectations() {{
            oneOf (mockStore).getEdition(isbn);
            will(returnValue(edition));
        }});

        BookEdition retrievedEdition
                   = manager.getEditionByIsbn(isbn);
        assertEquals(edition, retrievedEdition);
    }

The test creates an example edition object with a known isbn, and then configures the mock object. Our test is in two parts:

  1. We want to verify that the resource manager’s getEditionByIsbn() method correctly invokes the persistent store with the isbn parameter (yes, this is a trivial case, but imagine a more realistic service implementation with serious transformations or complex conditionality).
  2. We want the mock object to return a known, specific value back to getEditionByIsbn(); later we will test that the expected value was passed back.

Hence I specify two mock expectations. In the mockery.checking(new Expectations() {{  …  }}); initialiser I write:

            oneOf (mockStore).getEdition(isbn);

which specifies that the mock should expect exactly one invocation of the getEdition() method; furthermore I specify the exact expected value of the parameter. Then I also specify what the mock should return with:

            will(returnValue(edition));

We now have a fully specified mock for this test case and can code the test:

        BookEdition retrievedEdition
                   = manager.getEditionByIsbn(isbn);
        assertEquals(edition, retrievedEdition);

This is conventional JUnit code, and in the context of our JMock environment should exercise exactly the Unit under test, with no external dependencies. This test includes the assertion checking that the expected value was returned. One final thing remains: did the mock receive exactly the expected calls? That is, were the mockery’s expectations satisfied?

Verifying That the Expectations are Satisfied

Checking the mockery expectations is something I need to do after every test, and so it is best done in a JUnit @After method:

    @After
    public void verify(){
        mockery.assertIsSatisfied();
    }

Expectations of Not Finding an ISBN

In a similar way I can set up the mock to emulate the case of a lookup of an unknown ISBN

    @Test
    public void testNotFound() {
        final String isbn = "noSuch";

        mockery.checking(new Expectations() {{
            oneOf (mockStore).getEdition(isbn);
            will(returnValue(null));
        }});

        try {
           BookEdition retrievedEdition
                      = manager.getEditionByIsbn(isbn);
           fail("unexpected success");
        } catch (NotFoundException n){
            assertTrue(n.toString().indexOf(isbn) >= 0);
        }
    }

The mockery is simply configured to return null, which is the defined value in the case of an unknown ISBN. The test itself is written to expect a NotFoundException to be thrown, and hence to fail if the lookup should succeed. The test also checks that there is a meaningful error in the exception – the expectation is that the exception should contain the unfound isbn.


That’s a pretty long explanation of something that in practice takes a very few minutes to set up. If right at the start we plan for testing, create projects in the kind of structure shown here, and design the implementation for dependency injection, then creating Mocks and Unit Tests is very little effort.

In the previous post I described using the Apache Wink Client facilities to create a JUnit test for an operation provided by my example Library RESTful Service. The heart of the test was

        String resourceUri = …; // elided: the URI for the edition, built from the ISBN
        Resource editionResource
                   = libraryClient.resource(resourceUri);

        BookEdition edition = editionResource.get(
                      new EntityType<BookEdition>(){});
        assertEquals("0863699936", edition.getIsbn());

where we create a URI for a particular Edition of a Book from its ISBN and attempt to GET the representation of that Edition’s state, eventually verifying that the expected data is retrieved.

One of the beneficial effects of writing Unit Tests is that it focuses the mind on edge cases and error conditions. On seeing this test, we wonder “What should happen if the ISBN is not found?”. That question leads us to exploit several additional features of the Apache Wink programming model in both service implementation and test client.

Service Implementation Errors (1) – HTTP Response Codes

The service can deal with the case of an unknown ISBN by returning an HTTP Response Code 404, which is defined in the HTTP protocol as Not Found.  This is simple and unambiguous, consistent with the spirit of the RESTful use of Web principles, and the client needs no additional information. Other errors require different handling as I’ll explain later.

One other point, there is an alternative Response Code that we might use in some situations: 410, Gone. This might be useful in cases where a record has been in the system and has been marked as deleted.

So, how does a JAX-RS service return a Response Code such as 404 or 410? Again, the JAX-RS programming model makes implementation easy, but we do need to adjust the service method’s return type to accommodate the two possibilities: success (in which case a representation of an Edition is returned) and a Not Found response.

  @GET
  @Produces( MediaType.APPLICATION_JSON)
  public Response getEditionByIsbn(
         @PathParam("isbn") String isbn) {
     BookEdition edition
          = EditionStore.getInstance().getEdition(isbn);
     if ( edition != null){
        return Response.ok(edition).build();
     } else {
        return Response.status(404).build();
     }
  }

Hence I specify a return type of Response. Then in the implementation I use either the Response.ok() or Response.status() factory methods to create a ResponseBuilder object. This builder can be used to specify various attributes of the response and finally is used to build() the Response. Note the “chaining” approach to using the ResponseBuilder.

(For brevity I used the literal 404; the Status enum would make for more readable code.)

That now enables the service to send a 404 response when an unknown ISBN is requested. So, back to the test client to check for that response.

Client Error Handling (1) – HTTP Responses

The Wink APIs offer a ClientResponse object, used like this

    @Test
    public void testFindBad(){
        Resource editionResource = makeEditionResource("SILLY");
        ClientResponse response = editionResource.get();
        assertEquals(404, response.getStatusCode());
    }

If the service returns a success HTTP Response Code, such as 200 OK or 201 Created, then the get() method gives access to the entity data

    BookEdition edition = editionResource.get(
                     new EntityType<BookEdition>(){});

Summary so far: the Response class gives the service provider the capability of returning meaningful response codes, and its client counterpart ClientResponse enables clients to detect these codes. Now let’s move on to another class of problem, which I call Transient Errors.

Service Implementation Errors (2) – Transient Errors

Service implementations are likely to depend upon other software components, which may fail – for example, a dependency on a database whose server might be unavailable. I term such error scenarios “Transient Errors”, the implication being that the same request by the client, if resubmitted in the future, will succeed. As service designers we must consider how to deal with such error conditions.

Just one aside about this concept: I would also treat some errors detected by the service’s own implementation as transient. For example, suppose that the implementation had some defensive programming, such as a default case in a switch statement.

  switch ( getRole() ) {
    case Tinker:  // code
    case Tailor:  // code
    default:      // unexpected, happens if code is defective
  }

What to do in the default case? I treat this as a Transient Error: it will be resolved by installing a corrected version of the service implementation, and if the client then retries their request it will succeed. From the client’s perspective this is equivalent to a database being unavailable and then restarted – eventually a retry succeeds.
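To make the defensive default concrete, here is a runnable sketch; the Role values and the choice of status codes are illustrative, not from the article:

```java
// Sketch of defensive programming in a switch over an enum.
public class DefensiveSwitch {
    enum Role { Tinker, Tailor, Soldier }

    // Returns an HTTP-style status code for the handled role.
    static int handle(Role role) {
        switch (role) {
            case Tinker: return 200;   // normal handling
            case Tailor: return 200;   // normal handling
            default:
                // Unexpected: reachable only if the code is defective,
                // e.g. a new Role was added without a matching case.
                // Treat as a Transient Error; a corrected deployment fixes it.
                return 503;
        }
    }

    public static void main(String[] args) {
        System.out.println(handle(Role.Tinker));   // handled case
        System.out.println(handle(Role.Soldier));  // falls into the default
    }
}
```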

So, given that the service implementation detects a Transient Error condition what should it do? I see two responsibilities:

  1. On the server log suitable diagnostics in order to enable operators to detect and rectify the problem.
  2. Return a meaningful error to the client.

The production of diagnostics is a system-wide issue that should be addressed by explicit architectural and design policies; by default I tend to use java.util.logging, but I’ll not dig further into that area here. I do want to look a little more deeply at the second item: returning error information to the client. I will explore three different approaches.

First, we can simply throw an exception:

    throw new WebApplicationException(
            Status.SERVICE_UNAVAILABLE);

This will result in a simple response to the client with a 503 Response Code. In many situations this is sufficient: the client doesn’t know what is broken, but can at least call a help desk and report the problem. However, sometimes giving a little more information can either assist in error reporting or serve to clarify the nature of the problem. Reading the HTTP specification concerning response codes, we see the recommendation:

Response status codes beginning with the digit "5" indicate cases in which the server is aware that it has erred or is incapable of performing the request. Except when responding to a HEAD request, the server SHOULD include an entity containing an explanation of the error situation, and whether it is a temporary or permanent condition. User agents SHOULD display any included entity to the user. These response codes are applicable to any request method.

That leads us to the second approach. In the server we can produce an error response with a String entity payload like this

    return Response.serverError()
                   .entity("Database XXX unavailable")
                   .build();

And then in the client access the message:


This approach is likely to be sufficient for GET operations. However, when we come to POSTs and PUTs, more detailed responses are needed. That leads to the third possibility: using a structured Entity response rather than a simple message. First let’s build the test for POSTing a new Edition, then look at the responses the create request might give.
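To give a flavour of that third possibility, here is a sketch of a structured error entity; the field names are entirely illustrative. In the article’s setup such a bean would be JAXB-annotated so that it could travel as the JSON response body alongside the 5xx status code; here toString() hand-rolls the JSON just to show the shape:

```java
// Sketch of a structured error entity returned in place of a bare message.
public class ServiceError {
    public final String code;        // machine-readable error code
    public final String message;     // human-readable explanation
    public final boolean retryable;  // is this a Transient Error?

    public ServiceError(String code, String message, boolean retryable) {
        this.code = code;
        this.message = message;
        this.retryable = retryable;
    }

    @Override
    public String toString() {
        // Hand-rolled JSON, standing in for JAXB/JSON serialisation.
        return "{\"code\":\"" + code + "\",\"message\":\"" + message
             + "\",\"transient\":" + retryable + "}";
    }

    public static void main(String[] args) {
        System.out.println(new ServiceError(
                "DB_UNAVAILABLE", "Database XXX unavailable", true));
    }
}
```

A client can then branch on the code and retryable fields rather than parsing free text.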

Unit Test for POST

The test is constructed as follows. First I do some preparatory work:

  • create a suitable method and annotate it as a JUnit test
  • create the client Resource object for the Editions URI
  • create the Bean containing the details for the new Edition

    @Test
    public void testPost() {

        Resource editionResource = libraryClient.resource(
                /* the Editions URI */ );

        BookEdition theEdition = new BookEdition(
              /* title, isbn etc */ );

Then I use the chained resource builder approach to specify that I am sending and accepting JSON  content, and pass the Java Bean as the edition data

    ClientResponse response = editionResource
            .contentType(MediaType.APPLICATION_JSON)
            .accept(MediaType.APPLICATION_JSON)
            .post(theEdition);

That calls the RESTful service operation and makes the response available. Now we can add assertions about the response code


And we can obtain the entity from the response and make assertions about its content too; that’s just the same as the GET example at the start of this post.

Now, what about error conditions? This is first and foremost a question of service design: what error conditions might there be in this case? What responses should the service give?

Service Errors in a POST operation

Just as with the GET case, infrastructure errors may occur; these can be treated as Transient Errors and reported using either of the two techniques I described: throwing a WebApplicationException, or creating a serverError response and populating it with an error message.

There are two other conditions to consider:

  1. Suppose that a record for the ISBN already exists: what should the service do?
  2. Suppose that the supplied data is invalid: for example in this case an invalid ISBN might be supplied. More generally, we can conceive of quite complex field-level validation and cross-field validation.

Duplicate Records

Error handling in a distributed, loosely-coupled, system is greatly simplified if services are idempotent. That is, a request may safely be issued more than once. This means that if a client submits a request and then does not see a timely response (perhaps due to a network failure) or the response is lost (perhaps due to a client crash) the client may safely resubmit the request.
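To make that concrete, here is a self-contained sketch (the classes are invented for illustration, not part of the Library service) in which the first response is lost and the client safely resubmits:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical idempotent store: inserting the same ISBN twice is harmless.
class IdempotentStore {
    private final Set<String> isbns = new HashSet<String>();
    private int responsesToLose = 1; // simulate one lost response

    String insert(String isbn) {
        isbns.add(isbn); // the work happens even if the reply is lost
        if (responsesToLose-- > 0) {
            throw new RuntimeException("response lost");
        }
        return "OK";
    }

    boolean contains(String isbn) { return isbns.contains(isbn); }
}

class RetryingClient {
    // Because insert() is idempotent, resubmitting after a lost response is safe.
    static String insertWithRetry(IdempotentStore store, String isbn, int maxAttempts) {
        RuntimeException last = null;
        for (int i = 0; i < maxAttempts; i++) {
            try {
                return store.insert(isbn);
            } catch (RuntimeException e) {
                last = e; // response lost: safe to try again
            }
        }
        throw last;
    }
}
```

The retry loop is safe precisely because repeating the insert cannot change the outcome.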

If we have adopted the principle of idempotence then it follows that an insertion encountering a duplicate record is not necessarily an error. It might simply mean that the client sent an insertion, which worked, but failed to see the response and hence has resubmitted the request. However, even in this case there are two possibilities to consider, depending upon whether the existing state exactly matches the request.

Existing Record Matches

In this case the system is already in the requested state, most likely because the current request is a resend of a previously successful request. It is reasonable to treat this as a success. The next case is less obvious: when the system record state differs from the request content.

Existing Record Does Not Match

There are many possible scenarios that might lead to a situation where there is an existing record in place that does not match the current request:

  • A different request (perhaps from a different user’s UI) has created a different record. The difference might be fundamental (a completely different book) or some small semantically trivial difference (“JRR Tolkien” versus “J.R.R Tolkein”)
  • Our previous request succeeded but then subsequently a different request updated the record
  • The current request is mistaken, it’s using the ISBN of a different Edition

In all these cases it seems clear that it is unreasonable for the system, without human intervention, to decide to replace the current record state with that requested. Hence the system cannot be taken to the requested state and we would return an error code. More on the error code in just a moment, but first an aside about an extension to this line of thinking.

Interleaved Requests

The underlying assumption behind the scenarios I explored above is that the system may receive requests in an order different from that in which they were issued, and that an optimistic locking approach will apply when designing RESTful services. That is, we will not design the distributed system along these lines:

  1. User in UI attempts to GET a record
  2. System returns 404 NOT FOUND and LOCKS, preventing anyone else from inserting
  3. User POSTs new record, which therefore cannot encounter a duplicate

Although such an approach would simplify our reasoning we find that it simply does not scale and requires excessive coupling between components. Instead I assume:

  • Requests from two users will be delivered in an arbitrary order
  • Requests from the SAME user may be delivered out of order

The second condition enables us to use highly scalable infrastructure design: we can have stateless, clustered servers, multiple routers, and use store-and-forward asynchronous transport. Generally, we find that scalability and reliability are easier to achieve if we use idempotent services that do not strictly order requests. [There are business cases where request ordering is essential, and in these cases some aspects of service design may be simplified at the expense of greater constraints on the infrastructure.]

An implication of the arbitrary ordering of requests is that the service may need to provide a way of ensuring sensible ordering of request processing. Consider a service that controls a mobile phone, allowing a customer to enable and disable a phone. A customer loses their phone and sends a DISABLE request. Then, very soon afterwards, they find the phone and so send an ENABLE request. It’s clear that it would be very annoying for the customer to have those two requests processed out of order … the phone end-state would be DISABLED. With additional information in the request, such as the time the request was issued, the service can identify a “stale” request and hence take intelligent (in)action.
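As a sketch of that idea (the class and field names here are invented; a real service would carry the timestamp in the request payload or a header):

```java
// Hypothetical phone-line state; requests carry the time they were issued.
class PhoneLine {
    private boolean enabled = true;
    private long lastRequestMillis = Long.MIN_VALUE;

    // Apply the request only if it is newer than the last one processed.
    // Returns false for a stale request: intelligent (in)action.
    boolean apply(boolean requestedEnabled, long requestMillis) {
        if (requestMillis <= lastRequestMillis) {
            return false;
        }
        lastRequestMillis = requestMillis;
        enabled = requestedEnabled;
        return true;
    }

    boolean isEnabled() { return enabled; }
}
```

If the ENABLE issued at t=2 arrives first, the DISABLE issued at t=1 is then recognised as stale and ignored, leaving the phone ENABLED.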

Handling Duplicates in the Service Implementation

To return to the service implementation, I now add duplicate detection and return suitable error conditions. First the main flow of the service implementation:

    try {
        BookEdition added
                = EditionStore.getInstance().addEdition(edition);
        return Response.ok(added).build();
    } catch (DuplicateException e) {
        BookEdition alreadyThere = e.getAlreadyThere();

This now catches a DuplicateException from the addEdition() method, which contains the state of the pre-existing record. That’s now extracted into the alreadyThere variable.
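The store side is not shown in the original code; as a sketch, addEdition() might detect the duplicate with a put-if-absent keyed by ISBN. The map-based store and the cut-down BookEdition below are assumptions for illustration:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Cut-down BookEdition: just the fields needed for this sketch.
class BookEdition {
    final String isbn;
    final String title;
    BookEdition(String isbn, String title) { this.isbn = isbn; this.title = title; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof BookEdition)) return false;
        BookEdition other = (BookEdition) o;
        return isbn.equals(other.isbn) && title.equals(other.title);
    }
    @Override public int hashCode() { return isbn.hashCode(); }
}

// Unchecked here for brevity; the real exception may be checked.
class DuplicateException extends RuntimeException {
    private final BookEdition alreadyThere;
    DuplicateException(BookEdition alreadyThere) { this.alreadyThere = alreadyThere; }
    BookEdition getAlreadyThere() { return alreadyThere; }
}

class EditionStore {
    private static final EditionStore INSTANCE = new EditionStore();
    static EditionStore getInstance() { return INSTANCE; }

    private final ConcurrentMap<String, BookEdition> byIsbn =
            new ConcurrentHashMap<String, BookEdition>();

    // Atomically insert; on a duplicate, report the pre-existing record
    // and leave the caller to decide whether it matches the request.
    BookEdition addEdition(BookEdition edition) {
        BookEdition existing = byIsbn.putIfAbsent(edition.isbn, edition);
        if (existing != null) {
            throw new DuplicateException(existing);
        }
        return edition;
    }
}
```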

Now by using the equals() method, we can determine whether this record matches the requested state.

            if ( edition.equals(alreadyThere) ) {
                return Response.ok(alreadyThere).build();
            } else {

If the state does match then we treat this as a successful (idempotent) insertion. We return the alreadyThere value, accommodating the possibility that the persisted state is enriched beyond the values specified in the insertion request.

Now for the case where the pre-existing record does not match the requested insertion. In this case I think that 409 – CONFLICT is a reasonable response code, and we note that the HTTP specification says for a 409:

The response body SHOULD include enough information for the user to recognize the source of the conflict. Ideally, the response entity would include enough information for the user or user agent to fix the problem; however, that might not be possible and is not required.

Hence we need to return something more than a string. I propose to return an object that contains both the requested data and the currently persisted state. It’s pretty obvious why we would return the current state, but why also include the rest? The idea is that we want the services to be stateless in the sense that the response can be understood in isolation, without reference to conversational state. Our response includes enough context so that the client can understand the response as it stands.

So I want to return an object of some complexity. I view “Already Inserted a Different Value” as a special case of an Optimistic Lock violation, and hence I’m going to use a more general purpose object to hold the two values of interest, the current request and the value we find in the store.

     @XmlRootElement(name = "OptimisticFailure")
     public class OptimisticFailure {
         private BookEdition m_requested;
         private BookEdition m_unexpectedValue;

         // and getters, setters and zero-arg constructor
     }

Note that we can imagine presenting this information to the user, saying “You asked to create a record like that, but we found one like this. Would you like to make any amendments?”

So, how do I code the return of such an object as part of the error processing? For this I use a specialised Exception Class.

Throwing Custom Exceptions

First I defined a Custom Exception, derived from WebApplicationException

    public class LibraryDuplicateInsertionException
                extends WebApplicationException {
        private OptimisticFailure m_failureDetails;

        public OptimisticFailure getFailureDetails() {
            return m_failureDetails;
        }

        public LibraryDuplicateInsertionException(
                               BookEdition request,
                               BookEdition alreadyThere ) {
            m_failureDetails
                  = new OptimisticFailure(request, alreadyThere);
        }
    }

I can then complete the service implementation:

            if ( edition.equals(alreadyThere) ) {
                return Response.ok(alreadyThere).build();
            } else {
                throw new LibraryDuplicateInsertionException(
                                                     edition, alreadyThere );
            }

Before running this code one further task remains: an Exception Mapping Provider is needed.

Exception Mapping Providers

An Exception Mapping Provider is responsible for creating a Response object from the Exception information. JAX-RS specifies that the provider must implement the Interface ExceptionMapper<T> and so I created this Provider class:

    @Provider
    public class LibraryDuplicateInsertionExceptionMapper
        implements ExceptionMapper<LibraryDuplicateInsertionException> {
        public Response toResponse(
                   LibraryDuplicateInsertionException ex) {
            return Response.status(Status.CONFLICT)
                           .entity(ex.getFailureDetails())
                           .build();
        }
    }

Points to note:

  • The use of the @Provider annotation, which lets the JAX-RS runtime discover the mapper.
  • The toResponse() method has responsibility for transforming the exception to a suitable response. This is a very simple example, giving a single response code 409 – CONFLICT and returning a simple payload; the Exception was designed to have suitable data available.
  • The OptimisticFailure returned by getFailureDetails() already has JAXB annotations, so it can be serialized directly as the response entity.

Finally I announce this new provider to the Wink runtime. You may recall that the Wink servlet’s init parameter refers to the WEB-INF/application properties file, which originally contained just the name of the ResourceManager class. I add the name of the new provider class to this file.
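A sketch of the resulting file (the first class name and both packages are assumptions for illustration; only the mapper class name comes from this post):

```
org.djna.library.LibraryResourceManager
org.djna.library.LibraryDuplicateInsertionExceptionMapper
```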



I can now execute test sequences such as

  1. Delete record with ISBN 1234  – this is idempotent and so always succeeds whether or not the record exists.
  2. Insert record with ISBN 1234 – should succeed, no record can exist
  3. Insert an identical record with ISBN 1234 – although it is a duplicate, the insertion is treated as successful because the record is identical.
  4. Insert a record with ISBN 1234 but differences in other fields. This fails with 409 – CONFLICT and a payload containing current record and new requested record.

Validation Errors

It should be clear that validation errors can be addressed in the same way, returning a complex object that contains the error details. I think a simple collection of Name/Value pairs may be the simplest way to capture a set of validation results. For validation of a simple Bean the name can be either a specified field or “global” for more general validation errors.
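As a sketch of that shape (the validator and its names are invented; only the ISBN-10 check-digit rule is standard), with field-level results that could equally carry a “global” entry:

```java
import java.util.ArrayList;
import java.util.List;

// A validation result: a field name (or "global") plus a message.
class ValidationError {
    final String field;
    final String message;
    ValidationError(String field, String message) {
        this.field = field;
        this.message = message;
    }
}

class EditionValidator {
    // ISBN-10: ten characters, weighted sum divisible by 11,
    // with 'X' allowed as the final check character (value 10).
    static boolean isValidIsbn10(String isbn) {
        if (isbn == null || isbn.length() != 10) return false;
        int sum = 0;
        for (int i = 0; i < 10; i++) {
            char c = isbn.charAt(i);
            int value;
            if (c == 'X' && i == 9) value = 10;
            else if (Character.isDigit(c)) value = c - '0';
            else return false;
            sum += (10 - i) * value;
        }
        return sum % 11 == 0;
    }

    static List<ValidationError> validate(String isbn, String title) {
        List<ValidationError> errors = new ArrayList<ValidationError>();
        if (!isValidIsbn10(isbn)) {
            errors.add(new ValidationError("isbn", "not a valid ISBN-10"));
        }
        if (title == null || title.trim().length() == 0) {
            errors.add(new ValidationError("title", "title is required"));
        }
        return errors;
    }
}
```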

In a previous posting I described how I exercised my RESTful services using the Firefox extension Poster and also by calling the service from JavaScript in the browser using the Dojo framework. Poster is especially useful because we can very quickly exercise a service. However for testing I want something more:

  1. To repeatedly run a set of tests covering a variety of inputs.
  2. To specify expected outcomes for these tests. Using just Poster I must inspect the values by eye, which is error prone.

In other words I want to use a tool such as JUnit. To do that we need to be able to write Java clients of our RESTful services. This is not something directly addressed by the JAX-RS specification; however, the Apache Wink framework does include APIs that enable writing clients of a RESTful service. So in the remainder of this post I will explain how I created tests for my service.

Wink Libraries

Previously, when using Wink in a JEE application deployed to WebSphere, I copied the Wink libraries into the WEB-INF/lib directory of the Web Application, and hence these libraries were available at both compile time and runtime. The JUnit tests run as stand-alone Java applications and can pick up the Wink libraries from classpaths specified in RAD. There are quite a few ways of doing that, for example simply referring to the JARs wherever Apache Wink was downloaded, or creating a User Library. My preference is to set up projects that can be checked into CVS and checked out by any developer, with no dependencies on external downloads.

Hence I created an Apache Wink project and copied the Wink download to that project.


I then set up the project build path to refer to the JARs needed from that project. I select the project and then

    rightClick –> Properties –> Java Build Path –> Libraries Tab

Then Add Jars and select JAR files from the Apache Wink project.


These JARs include the APIs we use directly and so are needed on the Build Path


Additionally, we need the JSON and logging libraries at runtime. It’s convenient to add these libraries to the Build Path (even though we don’t need them at build time) because then simple “Run As” commands pick up the correct runtime classpath.

    slf4j-simple-1.5.8.jar (or slf4j-jdk14-1.5.8.jar from here)


Note that if we now check the ApacheWink and test projects into CVS we can create a Project Set File for those two projects, which any developer can check out to immediately have a complete working test program, with no external dependencies.

Next I completed setting up the Build Path by enabling the use of JUnit 4 …

JUnit 4

Still in the Libraries tab, click Add Library, select JUnit and click Next. From the drop-down select JUnit 4 and click Finish. I prefer JUnit 4 to JUnit 3; its use of annotations simplifies developing tests.

Client Code

Now to write the test. We are going to invoke the service operation which retrieves the state of a Book Edition with a particular isbn

    GET http://myhost/LibraryWink/library/editions/{isbn}
And in our client we need a Java object that corresponds to the state returned. In the server code we already have a suitable Java Bean and so I simply copied that code into my Test (that’s just a temporary expedient; clearly it would be better to refactor shared code into a common library).

The test code looks like this:

package org.djna.library.test;

import static org.junit.Assert.assertEquals;

import org.apache.wink.client.EntityType;
import org.apache.wink.client.Resource;
import org.apache.wink.client.RestClient;
import org.junit.Test;

public class EditionTester {
    @Test
    public void testExample() {
        RestClient libraryClient = new RestClient();
        String resourceUri =
            "http://myhost/LibraryWink/library/editions/0863699936";
        Resource editionResource
                   = libraryClient.resource(resourceUri);
        editionResource.header(
                   "Accept", "application/json" );
        BookEdition edition = editionResource.get(
                      new EntityType<BookEdition>(){} );
        assertEquals("0863699936", edition.getIsbn());
    }
}

There are a few points worth noting here. First concerning the JUnit tests:

  • import static org.junit.Assert.*; Java 5 enables static imports giving access to the various assertions that express the checks performed by the test.
  • The @Test annotation specifies that this method is a test. Hence I can request “Run As Junit” on this class and the method here will be executed.
  • The assertEquals() method checks that the edition retrieved from the service has the expected ISBN.
  • Some of the code here is standard “setup” code and when we have a few more test methods we’d refactor it into a setup method, which would be annotated with @Before.

Now to look at the Wink client code. The APIs used in this test code are worth studying; they could be useful in application code too. It’s clear that writing a client of a RESTful service is very easy.

  • RestClient libraryClient = new RestClient(); gives us a client object to work with …
  • and then we can use the client to specify a resource Resource editionResource = libraryClient.resource(resourceUri);
  • Before invoking the method we need to specify the MediaType that we can accept in the response. If you look at the service code you’ll see that the @Produces annotation specifies only JSON responses and by default this is not the requested type. So we have editionResource.header("Accept", "application/json" );
  • Finally we invoke the service, specifying the expected response Class: editionResource.get( new EntityType<BookEdition>(){} );


Creating JUnit tests for the RESTful service using the Wink client APIs is straightforward. There are a few more tests to write here, for example: what happens if we specify a resource that does not exist? And I also need to test the edition creation POST operation. That’s for the next posting.

Having used Apache Wink to implement my basic JAX-RS service, which simply retrieves some information from the Library (details of a Book Edition given its ISBN), I next want to implement an operation to add an Edition to the library. This operation will use a POST method to the resource

    http://myhost/LibraryWink/library/editions
that is, the resource that specifies that entire collection. Once again, the JAX-RS programming model makes the implementation of the service operation very easy, but I stumbled across a minor annoyance in testing the operation from Dojo, so in this posting I’ll take a side trip into the Dojo JavaScript libraries.

JAX-RS POST implementation

I already have a resource implementation class and so I just need to add a method to process the POST request

    @POST
    @Consumes( { MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON } )
    @Produces( { MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON } )
    public BookEdition addNewEdition(BookEdition edition) {
        return EditionStore.getInstance().addEdition(edition);
    }

There are a few points to note here.

  • The input parameter edition is not annotated. Contrast this with the GET example,  getEditionByIsbn(@PathParam("isbn") String isbn), which enabled access to an element from the URI. Here the edition parameter is mapped to the request body, that is the content which is POSTed. Such a parameter is termed an Entity parameter. A method may have at most one entity parameter.
  • I have decided to permit both JSON and XML payloads to be Posted. JSON is very convenient from Dojo, as we shall see, but other clients may prefer XML. The Wink JAX-RS implementation will deal with deserializing JSON or XML depending upon the MediaType of the request.
  • As a design decision, I have chosen to return the state representation of the added edition. In this case it’s overkill as the state was passed in and we make no modifications in adding it to the Library. However in general it’s quite possible that there is some enrichment of data as it is inserted into the store. Hence I use the general pattern of returning a representation of what has been stored.

Testing with Poster

We can quickly test this method using a Firefox extension: Poster.


Specifying an Edition in XML

      <Edition>
        <title>Invincible</title>
        <author>John Power</author>
        <isbn>...</isbn>
      </Edition>

and the content type application/xml. The response is as expected, containing a representation of the Edition we added. In the same way, we can provide a JSON payload

      {"title": "Invincible",
       "author": "John Power",
       "isbn": "..."}

and the content type application/json. The content type of the response is determined by the HTTP header field Accept, with a value of application/xml or application/json.


Testing from Dojo

One very likely scenario for using REST services is to call an operation from a Browser-based rich client application. The Dojo JavaScript libraries are one possible technology for constructing such applications.

This is some sample code which invokes the addNewEdition() operation:

      // "response" is assumed to be the id of a status element in the page
      var myEdition = {"Edition":{"author":"x", "title":"y", "isbn":"44"}};
      var xhrArgs = {
                url: "http://myhost/LibraryWink/library/editions",
                postData: dojo.toJson(myEdition),
                handleAs: "json",
                headers: { "Content-Type": "application/json"},
                load: function(data) {
                           dojo.byId("response").innerHTML = "Message posted.";
                },
                error: function(error) {
                           dojo.byId("response").innerHTML = "Error :" + error;
                }
      };
      dojo.byId("response").innerHTML = "Message being sent…";
      var deferred = dojo.rawXhrPost(xhrArgs);

I’ll quickly summarise a few key features here, but first note in particular the actual POST request:

      var deferred = dojo.rawXhrPost(xhrArgs);

You will note that I use rawXhrPost rather than

      var deferred = dojo.xhrPost(xhrArgs);

I can find no documentation that explains why rawXhrPost() is needed, but after much experimenting and imprecating I discovered that rawXhrPost() does indeed work and that xhrPost() sends no payload.

The other salient points:

  • postData: dojo.toJson(myEdition) converts the payload object to JSON.
  • headers: { "Content-Type": "application/json"} sets the MimeType in the header, hence allowing JAX-RS to correctly interpret the JSON string – the handleAs property relates to the interpretation of any response from the service
  • The url: http://myhost/LibraryWink/library/editions is our resource URI

Next – Unit Testing

So, that’s two ways of exercising a REST service; however, as a test mechanism both leave a few things to be desired. In the next posting I’ll move on to using a JAX-RS client.