In the previous posting I used the Jackson JSON Processor libraries to parse and format JSON strings. The Java representation of the JSON string was a HashMap whose contents are named attributes, some of which are themselves HashMaps, or indeed arrays of HashMaps. This is a very flexible representation, but writing client code that works with the data is (to my eyes) quite cumbersome:

        HashMap<String,Object> payload
                    = mapper.readValue(from, typeRef);

        Boolean success = (Boolean) payload.get("success");
        System.out.println("success " + success);

        ArrayList<HashMap<String,Object>> artists
              = (ArrayList<HashMap<String,Object>>)
                    payload.get("artists");
        System.out.println("artist count " + artists.size());

        for (HashMap<String,Object> oneArtist: artists){
            System.out.println("artist name " + oneArtist.get("name"));
        }

We see access to attributes such as “success” and “artists” and the requirement for the coder to cast the values to the appropriate type. Note that any errors in either the names of the attributes or their types will not be discovered until runtime.
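To make that runtime risk concrete, here is a small self-contained sketch (plain Java, no Jackson required) showing both failure modes: a misspelled attribute name and a cast to the wrong type. Both compile cleanly and fail only when executed.

```java
import java.util.HashMap;

public class PayloadRiskDemo {
    public static void main(String[] args) {
        HashMap<String, Object> payload = new HashMap<String, Object>();
        payload.put("success", Boolean.TRUE);

        // A typo in the attribute name compiles cleanly but yields null at runtime
        Object typo = payload.get("sucess");        // misspelled on purpose
        System.out.println("typo lookup: " + typo); // prints: typo lookup: null

        // A cast to the wrong type also compiles, failing only when executed
        try {
            String wrong = (String) payload.get("success");
            System.out.println("no exception: " + wrong);
        } catch (ClassCastException expected) {
            System.out.println("ClassCastException at runtime");
        }
    }
}
```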

Coming from an Object Oriented background, my personal preference would be to have Object representations of attributes. I would like to have objects such as Artist and Albumn – I feel uncomfortable having casts such as

    (ArrayList<HashMap<String,Object>>) payload.get("artists")

scattered across my business logic. (Your tastes may differ!) This article shows how to use Jackson to populate such objects, and in particular how to customize the mapping to Java objects, which is useful in more complex cases.

Sparse Object

Let’s begin with a simple object representing the payload.


    public class CatalogueResponse {
        public boolean success;
    }

I’m creating this object on the basis of my understanding of the JSON string to be parsed. [For simplicity I’ve made the attribute public, in real life it would be private with a suitable accessor method.]

    { "success": true,
      "artists": [
            { "name": "Ashley Hutchings",
              ... }
      ]
    }

and to keep things simple I just consider one attribute: “success”.
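As a sketch of the more realistic shape mentioned above – a private field with accessors – the class might look like this. (The method names simply follow the JavaBean convention that Jackson's property discovery relies on.)

```java
public class CatalogueResponse {
    private boolean success;

    // Jackson discovers JavaBean-style accessors, so the field can stay private
    public boolean isSuccess() {
        return success;
    }

    public void setSuccess(boolean success) {
        this.success = success;
    }
}
```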

I adjust my application to specify that CatalogueResponse is the class the parser should produce.

    CatalogueResponse catalogue
             = mapper.readValue(from, CatalogueResponse.class);
    System.out.println("Catalogue " + catalogue);

This code compiles, but running it produces the exception:

   Unrecognized field "artists"

At development time this is potentially a helpful error message, telling us that our payload object does not match all the fields in the JSON string. When deployed in a production environment we would probably prefer to decouple the client from upgrades to the server that introduce new attributes. We'd request that such unmapped fields are silently ignored. This is specified using another configuration feature:

    mapper.configure(
        DeserializationConfig.Feature.FAIL_ON_UNKNOWN_PROPERTIES,
        false);
Running the application now produces the expected (if minimal) output:

       "success" : true

Objects, Names and Values

We can extend our object to allow access to the artist array. A simple possibility is to add that as an array of Objects:

    public class CatalogueResponse {
        public boolean success;
        public Object[] artists;
    }

This works, and can be convenient if we have no specific interest in artists; we just get an array of HashMaps as before. More likely we want to create suitable classes for artist and albumn:

    public class Artist {
        public String name;
        public Albumn albumns[];
    }

    public class Albumn {
        public String title;
        public Object[] properties;
    }

and our response class can now refer to Artists:

    public class CatalogueResponse {
        public boolean success;
        public Artist[] artists;
    }

The creation of these payload classes is a little tedious and error-prone but is up-front work producing classes that can be re-used many times.

Custom Mapping

Looking at the Albumn class we see that the “properties” attribute is representing a set of descriptive fields that may be attached to any albumn. This approach gives a flexible data structure that can be extended in the future.

       "properties": [
                     { "name": "artist",
                       "value" : "Michael Chapman"
                     },
                     { "name": "genre",
                       "value" : "instrumental"
                     },
                     { "name": "id",
                       "value" : "BP315CD"
                     }
       ]

The default mapping produced by Jackson would be an array of HashMaps, each HashMap containing a "name" entry and a "value" entry. It would be much more convenient from the client developer's point of view to represent the properties by a single HashMap with keys such as "artist", "genre" and so on – in effect transforming the above structure into this:

      "properties": {
                     "artist" : "Michael Chapman",
                     "genre" : "instrumental",
                     "id" : "BP315CD"
      }

This is achieved by using custom serialize/deserialize code.        

Serialization annotation

The first step is to annotate the payload class to specify that we are using a custom mapping.

    public class Albumn {
       public String title;

       @JsonSerialize(using = PropertyMapSerializer.class)
       @JsonDeserialize(using = PropertyMapDeserializer.class)
       public Map<String, String> properties;
    }

Here I'm annotating the attribute directly, specifying the JsonSerialize and JsonDeserialize information. I would usually have a private attribute with getter and setter methods, in which case I would annotate the getter with JsonSerialize and the setter with JsonDeserialize.

We now need to write the custom serializer and deserializer classes referenced in the annotations.

The Deserializer Class

We create a class

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    import org.codehaus.jackson.JsonParser;
    import org.codehaus.jackson.JsonProcessingException;
    import org.codehaus.jackson.map.DeserializationContext;
    import org.codehaus.jackson.map.JsonDeserializer;
    import org.codehaus.jackson.map.ObjectMapper;
    import org.codehaus.jackson.type.TypeReference;

    public class PropertyMapDeserializer
         extends JsonDeserializer<Map<String, String>> {

The key point here is that this class extends the Jackson class JsonDeserializer, which is specialised with the type corresponding to our property field. So the

     extends JsonDeserializer<Map<String, String>>

matches the declaration

       public Map<String, String> properties;

The class must then implement the deserialize method

       public Map<String, String> deserialize(
            JsonParser parser, DeserializationContext context)
            throws IOException,
                JsonProcessingException {

The purpose of this method is to read data from the parser and construct and return the value that will be placed into our property.

Implementing the Deserializer

The implementation is in three steps: initialisation of some variables, reading the raw data and constructing the property hash map.


We initialise two variables

    Map<String, String> retMap =
              new HashMap<String, String>();

    TypeReference<HashMap<String,String>[]> typeRef
            = new TypeReference<
              HashMap<String,String>[]>() {};

The retMap is the return value we are populating. The typeRef is going to be used when reading the JSON string to specify the type of the raw data to be produced. [This is a common Jackson idiom that I used in the previous article.]

Reading Raw Data

We read the raw data into a natural representation of the JSON string using a Jackson ObjectMapper.

       ObjectMapper mapper = new ObjectMapper();
       HashMap<String, String>[] maps
           = mapper.readValue(parser, typeRef);

In this case we’re reading an array of HashMaps. Each HashMap in the array contains the “name” and “value” corresponding to a property:


       "properties": [
                     { "name": "artist",
                       "value" : "Michael Chapman"
                     },
                     ...

We then transfer this data into the single hash map we are creating.

Populate Returned Hash Map

We simply iterate the array and extract the name and value. (Yes, there should be some error handling.)

    for (HashMap<String, String> oneMap : maps){
             String name = oneMap.get("name");
             String value = oneMap.get("value");
             retMap.put(name, value);
    }
    return retMap;


Using the HashMap

The client code can now access the Albumn properties in quite a natural way:

    String genre = albumn.properties.get("genre");

Next we need to provide the inverse functionality, to serialize to a JSON string.

Implementing the Serializer

The serialization code is also quite simple. First we create the class, which must extend an appropriate specialisation of the Jackson JsonSerializer

    import java.io.IOException;
    import java.util.Map;

    import org.codehaus.jackson.JsonGenerator;
    import org.codehaus.jackson.JsonProcessingException;
    import org.codehaus.jackson.map.JsonSerializer;
    import org.codehaus.jackson.map.ObjectMapper;
    import org.codehaus.jackson.map.SerializerProvider;

    public class PropertyMapSerializer extends JsonSerializer<
              Map<String, String>> {

Note that the specialisation matches the type of the field we are serialising.

   public Map<String, String> properties;

We then need to implement the serialisation method:

    public void serialize(Map<String, String> data,
            JsonGenerator generator,
            SerializerProvider provider) throws IOException,
            JsonProcessingException {

The requirement is to write to the JsonGenerator an appropriate representation of the Map<String, String> data received. We could do this by constructing an array of HashMaps, but instead I decided to create an array of a simple Property class. I think that this makes for more readable code.

The Property Class

The class is a simple representation of the property data with a convenient constructor.

    public class Property {
       public String name;
       public String value;

       public Property(String newName, String newValue){
          name = newName;
          value = newValue;
       }
    }


The payload we are writing is a representation of an array of these objects.

Preparing the Payload

The payload is obtained from the data parameter passed by Jackson to our serialisation method

        Property values[] = new Property[data.size()];
        int i = 0;
        for ( String key: data.keySet() ){
            values[i++] = new Property(key, data.get(key));
        }

We now have a payload to write.

Writing the payload

Once again we construct a mapper object and use it to write our property array to the generator, which Jackson supplied to our serialisation method.

        ObjectMapper mapper = new ObjectMapper();
        mapper.writeValue(generator, values);

And that completes the custom mapping.
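To summarise what the two custom classes achieve, here is a plain-Java sketch of the transformation in both directions, with no Jackson dependency. The helper names toMap and toPairs are my own, not part of the article's code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PropertyMappingSketch {

    // Deserializer direction: array of name/value pairs -> single map
    static Map<String, String> toMap(List<Map<String, String>> pairs) {
        Map<String, String> result = new HashMap<String, String>();
        for (Map<String, String> pair : pairs) {
            result.put(pair.get("name"), pair.get("value"));
        }
        return result;
    }

    // Serializer direction: single map -> array of name/value pairs
    static List<Map<String, String>> toPairs(Map<String, String> properties) {
        List<Map<String, String>> result = new ArrayList<Map<String, String>>();
        for (Map.Entry<String, String> entry : properties.entrySet()) {
            Map<String, String> pair = new HashMap<String, String>();
            pair.put("name", entry.getKey());
            pair.put("value", entry.getValue());
            result.add(pair);
        }
        return result;
    }
}
```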


Although it has taken a while to explain each step in preparing a mapping between Java classes and JSON representations, I wrote the code with very little effort; the Jackson programming model seems very effective.

One other note: You might feel that the custom mapping example I chose is somewhat contrived. In fact it is a simplified form of a real-life system I have been working with. The capability to provide a suitable custom mapping of the JSON payload is extremely useful in my real-world example.

In my last couple of postings I was considering various ways of Testing RESTful services, and specifically described using JUnit to test the service, exploiting the Apache Wink Client libraries. I was interested to see a comment from “Lior” who wanted to run tests as part of an automated build process, and hence wanted to avoid the need to start a server in order to host the service under test. “Ben F” then suggested using a Mock Library to emulate the HTTP Request and Response objects. This reminded me of a few Unit Testing principles and that took me to a whole different way of Unit Testing the service implementation classes. I believe that this approach, which I’ve implemented using JMock, is the key to creating a robust set of true Unit Tests.

So before getting to the explanation of the testing approach, I'd like to look briefly at the meaning of the tests I was previously writing. If, as I now believe, they were not "true" Unit Tests, what were they? And why does that matter?

Integration Tests and Unit Tests

The essence of the previous testing was that we were testing fully functional code, deployed to WebSphere Application Server (WAS). In fact I had some degree of cheating, in that I was not using a real persistence store but a very simple in-memory collection, but in a real development I would have been testing my real, functioning code, making database calls etc. I have been doing something much closer to Integration Testing – in that I'm exercising the code I write after it is integrated with much infrastructure code (and my persistence layer).

There's no doubt in my mind that such Integration Tests are valuable, and I am very comfortable using JUnit for this purpose. Having a battery of such tests is extremely useful in later stages of development; for example we can use them for Soak testing and Performance testing and, by submitting multiple instances simultaneously, Stress testing and Concurrency testing too. However, there are several costs to developing Integration tests. First and foremost, as Lior observed, there is a dependency upon a significant stack of software – we can't run the tests without starting WAS and a database, and even in the RAD environment this takes some time. Furthermore the tests tend to be brittle; their correctness depends upon many components. There is much material elsewhere on this topic, for example here (in this reference, the testing we have been doing is termed Functional Testing), so I won't labour the point and instead cut to the chase:

I want to write Unit tests. That is, tests which explicitly test the code I write to implement my service. While designing these Unit tests I will trust my dependencies such as Apache Wink, WAS and the database and make no attempt to test those dependencies.

We can then see that our Integration Tests serve a significant purpose in verifying that our trust in our dependency components is justified. However, the ratio of Unit Tests to Integration Tests can be quite heavily biased towards Unit Tests, provided that we can trust our dependencies – and if our dependencies themselves have good Unit tests then we can indeed trust them. [This approach is something of an over-simplification. It is remarkable how many defects are discovered when we first integrate pieces of an application. Hence I am a strong believer in early Integration Testing.]

My challenge then is to write Unit tests for the service code, and the rest of this post will explain how. The beauty of the JAX-RS programming model is that it makes this testing very easy.

Unit Testing POJOs

The key features of programming models such as JAX-RS (and EJB3 and JPA) is that we have simple Plain Old Java Objects (POJOs) that are annotated to enable the application server container runtime to provide infrastructure services. This service implementation code is executable stand-alone, and hence we can test it directly, without using any application server.

    public class BookEditionResources {

        @GET
        @Produces( MediaType.APPLICATION_JSON)
        public Response getEditionByIsbn(
                         @PathParam("isbn") String isbn ) {
            BookEdition edition
                    = EditionStore.getInstance().getEdition(isbn);
            if ( edition != null){
                return Response.ok(edition).build();
            } else {
                return Response.status(Status.NOT_FOUND).build();
            }
        }
    }
Specifically we can call

        getEditionByIsbn( /* a test value */ )

from our tests. However, things are not quite right for true unit testing, there’s a little work to do:

  • The use of Response objects ties us to the JAX-RS/Wink APIs. Although we can use these APIs in Unit tests, the use of Exceptions as discussed in this posting would make for a cleaner, more easily tested implementation.
  • The service classes are currently packaged in a WAR file, and hence are not publicly callable by our tests. Some repackaging of the code is needed.
  • There is a dependency on EditionStore; as things stand we'll not be able to test without whatever persistence mechanism it uses – this leads to the use of Mock objects.

I'll address each of these in turn. First I'll adjust the implementation, delegating the detail of the Response creation to the Wink runtime.

POJO Business Methods

The goal here is to write natural Plain Old Java code, with no special logic relating to its role in implementing a RESTful service – all the RESTful characteristics are added by annotation. So first I adjust my service method:

    @GET
    @Produces( MediaType.APPLICATION_JSON)
    public BookEdition getEditionByIsbn (
             @PathParam("isbn") String isbn
             ) throws NotFoundException {
        BookEdition edition
            = getEditionStore().getEdition(isbn);
        if ( edition != null){
            return edition;
        } else {
            throw new NotFoundException("No such isbn:" + isbn);
        }
    }
The important changes here are:

  • The return type is BookEdition, not Response; we are delegating Response construction to Wink.
  • I throw a NotFoundException when the supplied isbn cannot be found.

Hence I create a NotFoundException class, extending WebApplicationException, and also create a Mapper class:

public class NotFoundExceptionMapper
   implements ExceptionMapper<NotFoundException> {
    public Response toResponse(NotFoundException ex) {
        System.out.println("Not found Mapping " + ex);
        return Response.status(Status.NOT_FOUND).build();
    }
}

I register this mapper in my WEB-INF/application configuration class.


This combination of implementation, exception and mapper can be deployed and executed. However it’s not quite ready to test as the packaging needs adjusting, but before we get to that first let’s look at the Unit Tests we’d like to write.

Unit Testing a Service Method

Stripping away all annotations we have this service method

     public BookEdition getEditionByIsbn (
             String isbn ) throws NotFoundException {
        BookEdition edition
            = getEditionStore().getEdition(isbn);
        if ( edition != null){
            return edition;
        } else {
            throw new NotFoundException("No such isbn:" + isbn);
        }
    }

We can see here that the method has the following responsibilities:

  • To pass the String isbn parameter to the getEdition() method. Note that this responsibility of the method is very simple: just pass the parameter. [This does make us a little suspicious – should there be some validation of that input? In more realistic implementations inputs might well be validated.]
  • If a non-null value is returned from getEdition(), return that value.
  • If getEdition() returns null, then generate a NotFoundException, which includes a diagnostic indicating which isbn was not found. [This is the stateless service design principle: the client can interpret this response without having retained any record of the original request.]

These responsibilities indicate the major tests that we want to write, each of which will require control over the behaviour of getEdition(), which I achieve by using JMock, as I will explain later:

  1. Verify that the getEdition() method received the expected value. Testing this will require mocking, which I will explain later.
  2. Verify that a non-null value returned from getEdition() is indeed returned. Again, testing this will require mocking.
  3. Verify the correct production of a NotFoundException.

To give a flavour of the kind of test I am writing, here is a test of the second case, with one crucial detail, the mocking, elided:

        final String isbn = "0863699936";
        final BookEdition expectedEdition
              = new BookEdition("a", "b", isbn, new Date());

        // TODO: Initialise a mock to return that  expectedEdition

        // invoke the service method
        BookEdition retrievedEdition = manager.getEditionByIsbn(isbn);
        assertEquals(expectedEdition, retrievedEdition);

The last two lines are the meat of the test: we directly invoke the service method as a simple Java call, no HTTP involved, and verify that the expected value is obtained.

Next I want to explain how the Unit Test code here is able to compile and execute against the service code, which will eventually be deployed to WAS as a Web Application. To enable that I needed to do a little restructuring of my projects in RAD.

EAR, WAR, JAR and Project Structure

We have three executable pieces to consider: The Service EAR file that will be deployed to WAS, the Client code and the Tests. I need to enable the Client and Test code to use some of the classes contained in the EAR. I achieve this by packaging the service classes in JAR files, which are included in the EAR and also referenced stand-alone by Clients and Tests. Note that the EAR and each JAR file correspond to separate projects in RAD – when we determine the JAR files we are building we also determine the RAD project structure.

This diagram shows the relationships:


The major features are:

  • The service implementation classes, including the Resource Manager, are moved from the WAR file into their own JAR file, which is included in the EAR as a Utility JAR. Utility JARs in the EAR file are visible to the WAR and hence can still be referenced from the application configuration file.
  • The Wink jar files previously included in the WAR file are now moved to the EAR file, also as Utility JARs. [WAS classloaders are organised such that Utility JARs can “see” each other, but they cannot “see” inside WAR files.]
  • The Entity classes – the JAXB-annotated service payloads – are in their own JAR file. This separation allows us to distribute the payload definitions to development teams working on Client code. Such teams should not need to see any details of service implementation.
  • Hence, Clients need only the Entity JAR, while Test code needs both Entity and Implementation JARs.

Setting-up these relationships in RAD requires a few adjustments.

Utility JARs in the EAR

The objective is to set up the EAR with Utility JARs as described above.

Ear Structure

My steps were:

  1. Drag the 8 Wink jars (json-20080701.jar etc.) from WEB-INF/lib directory of the Web Application project to the EAR project.
  2. Create Implementation and Entity projects and move the source code from the Web Application project to those projects. At the moment, the Entity project contains just the payload class.
  3. Add the Implementation and Entity projects as Utility Jars in the EAR. Do this in the EAR project’s JEE Module Dependencies.

WAR file accessing EAR Utility JAR

The Web Application has no classes of its own and hence no compile-time dependencies; however, it does need the utility JARs on its runtime classpath, specified in the Manifest. This is most easily specified via the JEE Module Dependencies of the Web project.


I add all the Utility Jars.
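As an illustration only (the JAR names here are hypothetical; yours will match your project names), the resulting MANIFEST.MF in the WAR carries a Class-Path entry along these lines:

```
Manifest-Version: 1.0
Class-Path: Implementation.jar Entity.jar json-20080701.jar
```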

Unit Test Project – ClassPath

The unit test project needs compile-time and runtime access to the Service implementation and entities and also JUnit and JMock libraries. For simplicity of launching I construct a Build Path that contains everything needed at runtime, even though some of these items are not needed at compile-time.

First the references to the service components.

Project Dependencies

The service components are created from projects in my RAD workspace and so it’s just a matter of adding the Entity and Implementation projects to the Build Path


Next the libraries containing JUnit and JMock.

Test Libraries

I am using JMock 2.5.1 (download) and this in turn requires JUnit 4.4 (download), which is not the version of JUnit delivered with RAD 7.5. Hence I downloaded both JMock and JUnit.

I then added the following JARs to the Test project’s library path. From JUnit


And from JMock


Giving this:


Now to use JMock to remove the dependency on the persistence layer.

It’s a Mockery

In order to test without the full persistence layer we need to find a way to “inject” a mock in its place, and for that we need some cooperation from the service implementation. As things stand, our service implementation was not designed for testing, so we have no easy way to inject the mock.

So first, a touch of refactoring …

Enable Injection with Dependency Getter

Currently we have this code, using a singleton pattern to get at the persistence layer. [It was quick demo code; all criticisms may be taken as accepted.]

        BookEdition edition
                = EditionStore.getInstance().getEdition(isbn);
There are several possible injection techniques we could use (Martin Fowler lists a few). I’m going to use a technique that can be introduced without impacting any existing users of the service.

Rather than directly use the Singleton, I introduce an accessor function:

    protected EditionStore getEditionStore() {
        return EditionStore.getInstance();
    }

The service code is modified to:

        BookEdition edition
            = getEditionStore().getEdition(isbn);
You will notice that the getter is protected; this allows the Unit test to …

Inject a Mock by Overriding the Getter

I create an anonymous class, derived from the class we are testing (the BookEditionResources resource manager), and override the getter to inject the mock.

manager = new BookEditionResources(){
            protected EditionStore getEditionStore() {
                return mockStore;
            }
        };

The derived class implementation methods now all use the mockStore because they use the getter. To complete the picture I need to code to create the mockStore object and code to control and verify the mock’s behaviour.

Setting up the Mock

The full initialisation for the tests looks like this. I’ll explain the details in a moment, but first just note that the bulk of the work is done in the setup() method, which is annotated with @Before, specifying that it is initialisation code run before each test.

@RunWith(JMock.class)
public class InjectingTest  {

     Mockery mockery;
     EditionStore mockStore;
     BookEditionResources manager;

    @Before
    public void setup() {
        mockery = new JUnit4Mockery() {{
            setImposteriser(ClassImposteriser.INSTANCE);
        }};

        mockStore = mockery.mock(EditionStore.class);

        manager = new BookEditionResources(){
            protected EditionStore getEditionStore() {
                return mockStore;
            }
        };
    }

The major features here are:

  • The @RunWith(JMock.class) annotation instructs JUnit to use a JMock test executor.
  • The Mockery mockery is the factory for the mock objects that the test will use. It is an instance of JUnit4Mockery that is initialised with a (don’t you love the term?) Imposteriser.
  • The role of the imposteriser ClassImposteriser.INSTANCE is to enable the mocking of concrete classes – by default JMock only enables the mocking of interfaces.
  • The mockery is then used to mock the EditionStore: mockStore = mockery.mock(EditionStore.class);
  • That mockStore is the object I use in the overridden getEditionStore() method.

Setting Expectations

I can now define the tests, whose execution depends upon particular mock object behaviours. To illustrate that, first a test of a successful find

Expectation of a Successful Retrieval

    @Test
    public void testRest() throws Exception {
        final String isbn = "0863699936";
        final BookEdition edition
              = new BookEdition("aTitle", "anAuthor",
                                isbn, new Date());

        mockery.checking(new Expectations() {{
            oneOf (mockStore).getEdition(isbn);
            will(returnValue(edition));
        }});

        BookEdition retrievedEdition
                   = manager.getEditionByIsbn(isbn);
        assertEquals(edition, retrievedEdition);
    }

The test creates an example edition object with a known isbn, and then configures the mock object. Our test is in two parts:

  1. We want to verify that the resource manager’s getEditionByIsbn() method correctly invokes the persistent store with the isbn parameter (yes in this case this is a trivial case, but imagine a more realistic service implementation with serious transformations or complex conditionality)
  2. We want the mock object to return a known, specific value back to getEditionByIsbn(), then later we will test that the expected value was passed back.

Hence I specify two mock expectations. In the mockery.checking(new Expectations() {{  …  }}); initialiser I write:

            oneOf (mockStore).getEdition(isbn);

which specifies that the mock should expect exactly one invocation of the getEdition() method, and furthermore I specify the exact expected value of the parameter. Then I also specify what the mock should return with:

            will(returnValue(edition));
We now have a fully specified mock for this test case and can code the test:

        BookEdition retrievedEdition
                   = manager.getEditionByIsbn(isbn);
        assertEquals(edition, retrievedEdition);

This is conventional JUnit code, and in the context of our JMock environment it exercises exactly the Unit under test, with no external dependencies. This test includes the assertion checking that the expected value was returned. One final thing remains … did the mock receive exactly the expected calls? That is, were the mockery’s expectations satisfied?

Verifying That the Expectations are Satisfied.

Checking the mockery expectations is something I need to do after every test, so that’s best done in a JUnit @After method:

    @After
    public void verify(){
        mockery.assertIsSatisfied();
    }

Expectations of Not Finding an ISBN

In a similar way I can set up the mock to emulate the case of a lookup of an unknown ISBN

    @Test
    public void testNotFound() {
        final String isbn = "noSuch";

        mockery.checking(new Expectations() {{
            oneOf (mockStore).getEdition(isbn);
            will(returnValue(null));
        }});

        try {
           BookEdition retrievedEdition
                      = manager.getEditionByIsbn(isbn);
           fail("unexpected success");
        } catch (NotFoundException n){
            assertTrue(n.toString().indexOf(isbn) >= 0);
        }
    }

The mockery is simply configured to return null, which is the defined value in the case of an unknown ISBN. The test itself is written to expect a NotFoundException to be thrown, and hence to fail if the lookup should succeed. The test also checks that there is a meaningful error in the exception – the expectation is that the exception should contain the unfound isbn.


That’s a pretty long explanation of something that in practice takes a very few minutes to set up. If right at the start we plan for testing, create projects in the kind of structure shown here, and design the implementation for dependency injection, then creating Mocks and Unit Tests is very little effort.

In the previous post I described using the Apache Wink Client facilities to create a JUnit test for an operation provided by my example Library RESTful Service. The heart of the test was

        String resourceUri = ...;  // the Edition URI, built from its ISBN
        Resource editionResource
                   = libraryClient.resource(resourceUri);

        BookEdition edition = editionResource.get(
                      new EntityType<BookEdition>(){} );
        assertEquals("0863699936", edition.getIsbn());

where we create a URI for a particular Edition of a Book from its ISBN and attempt to GET the representation of that Edition’s state, eventually verifying that the expected data is retrieved.

One of the beneficial effects of writing Unit Tests is that it focuses the mind on edge cases and error conditions. On seeing this test, we wonder “What should happen if the ISBN is not found?”. That question leads us to exploit several additional features of the Apache Wink programming model in both service implementation and test client.

Service Implementation Errors (1) – HTTP Response Codes

The service can deal with the case of an unknown ISBN by returning an HTTP Response Code 404, which is defined in the HTTP protocol as Not Found.  This is simple and unambiguous, consistent with the spirit of the RESTful use of Web principles, and the client needs no additional information. Other errors require different handling as I’ll explain later.

One other point, there is an alternative Response Code that we might use in some situations: 410, Gone. This might be useful in cases where a record has been in the system and has been marked as deleted.

So, how does a JAX-RS service return a Response Code such as 404 or 410? Again, the JAX-RS programming model makes implementation easy, but we do need to adjust the service method’s return type to accommodate the two possibilities of success (in which case a representation of an Edition is returned) and a Not Found response.

  @GET
  @Path("{isbn}")
  @Produces( MediaType.APPLICATION_JSON)
  public Response getEditionByIsbn(
         @PathParam("isbn") String isbn) {
     BookEdition edition 
          = EditionStore.getInstance().getEdition(isbn); 
     if ( edition != null){
        return Response.ok(edition).build(); 
     } else {
        return Response.status(404).build(); 
     }
  }
Hence I specify a return type of Response. Then in the implementation I use either the Response.ok() or Response.status() factory methods to create a ResponseBuilder object. This builder object can be used to specify various attributes of the response and finally is used to build() the Response. Note the “chaining” approach to using the ResponseBuilder.

(For brevity, I used the literal 404; using the Response.Status enum would make for more readable code.)

That’s now enabled the service to send a 404 response when an unknown ISBN is requested. So, back to the test client to check for that response.

Client Error Handling (1) – HTTP Responses

The Wink APIs offer a ClientResponse object, used like this

    @Test
    public void testFindBad(){
        Resource editionResource = makeEditionResource("SILLY");
        ClientResponse response = editionResource.get();       
        assertEquals(404, response.getStatusCode());
    }

If the service returns a success HTTP Response Code, such as 200 OK or 201 Created, then the typed get() method gives access to the entity data

    BookEdition edition = editionResource.get(
                     new EntityType<BookEdition>(){} );

Summary so far: the Response class gives the service provider the capability of returning meaningful response codes, and its client counterpart ClientResponse enables clients to detect these response codes. Now let’s move on to another class of problem, which I call Transient Errors.

Service Implementation Errors (2) – Transient Errors

Service implementations are likely to depend upon other software components, which may fail: for example, a dependency on a database whose server might be unavailable. I term such error scenarios “Transient Errors”, the implication being that the same request, if resubmitted by the client in the future, will succeed. As service designers we must consider how to deal with such error conditions.

Just one aside about this concept: I would also treat some errors detected by the service’s own implementation as transient. For example, suppose that the implementation had some defensive programming, such as a default case in a switch statement.

  switch ( getRole() ) { 
    case Tinker:  // code 
    case Tailor:  // code 
    default: // unexpected, happens if code is defective 
  }

What to do in the default case? I treat this as a Transient Error: it will be resolved by installing a corrected version of the service implementation, and then if the client retries their request it will succeed. From the client’s perspective this is equivalent to a database being unavailable and then restarted – eventually a retry succeeds.

So, given that the service implementation detects a Transient Error condition what should it do? I see two responsibilities:

  1. On the server log suitable diagnostics in order to enable operators to detect and rectify the problem.
  2. Return a meaningful error to the client.

The production of diagnostics is a system-wide issue that should be addressed by explicit architectural and design policies, by default I tend to use java.util.logging but I’ll not dig further into that area here. I do want to look a little more deeply at the second item: the returning of error information to the client. I will explore three different approaches.

First, we can simply throw an exception:

    throw new WebApplicationException(503);

This will result in a simple response to the client with a 503 Response Code. In many situations this is sufficient. The client doesn’t know what is broken, but can at least call a help desk and report the problem. However, sometimes giving a little more information can either assist in error reporting or serve to clarify the nature of the problem. Reading the HTTP specification concerning response codes we see the recommendation:

Response status codes beginning with the digit "5" indicate cases in which the server is aware that it has erred or is incapable of performing the request. Except when responding to a HEAD request, the server SHOULD include an entity containing an explanation of the error situation, and whether it is a temporary or permanent condition. User agents SHOULD display any included entity to the user. These response codes are applicable to any request method.

That leads us to the second approach. In the server we can produce an error response with a String entity payload like this

    return Response.serverError()
                   .entity("Database XXX unavailable")
                   .build();

And then in the client we can access the message, for example:

    ClientResponse response = editionResource.get();
    String message = response.getEntity(String.class);
This approach is likely to be sufficient for GET operations. However when we come to POSTs and PUTs more detailed responses are needed. That leads to the third possibility, using a structured Entity response rather than a simple message. First let’s build the test for POSTing of a new Edition, then look at the responses the create request might give.

Unit Test for POST

The test is constructed as follows. First I do some preparatory work:

  • create a suitable method and annotate it as a JUnit test
  • create the client Resource object for the Editions URI
  • create the Bean containing the details for the new Edition

    @Test
    public void testPost() {

    Resource editionResource = libraryClient.resource(

    BookEdition theEdition = new BookEdition(
              /* title, isbn etc */ );

Then I use the chained resource builder approach to specify that I am sending and accepting JSON  content, and pass the Java Bean as the edition data

    ClientResponse response = editionResource 
                 .contentType("application/json")
                 .accept("application/json")
                 .post(theEdition);

That calls the RESTful service operation and makes the response available. Now we can add assertions about the response code:

    assertEquals(200, response.getStatusCode());
And we can obtain the entity from the response and make assertions about its content too, that’s just the same as the GET example at the start of this post.

Now, what about error conditions? This is first and foremost a question of service design: what error conditions might there be in this case, and what responses should the service give?

Service Errors in a POST operation

Just as with the GET case, infrastructure errors may occur and these can be treated as Transient Errors and reported using either of the two techniques I described: throwing a WebApplicationException or creating a serverError response, and populating it with an error message.

There are two other conditions to consider:

  1. Suppose that a record for the ISBN already exists: what should the service do?
  2. Suppose that the supplied data is invalid: for example in this case an invalid ISBN might be supplied. More generally, we can conceive of quite complex field-level validation and cross-field validation.
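
As one concrete instance of field-level validation, an ISBN-10 check digit can be verified arithmetically. This helper is a sketch of my own, not code from the Library service (the class name IsbnValidator is illustrative):

```java
// Illustrative helper: validates an ISBN-10 check digit. An ISBN-10 is valid
// when the weighted sum of its digits (weights 10 down to 1) is divisible by
// 11; the final character may be 'X', which represents the value 10.
public class IsbnValidator {
    public static boolean isValidIsbn10(String isbn) {
        if (isbn == null || isbn.length() != 10) {
            return false;
        }
        int sum = 0;
        for (int i = 0; i < 10; i++) {
            char c = isbn.charAt(i);
            int value;
            if (c >= '0' && c <= '9') {
                value = c - '0';
            } else if (c == 'X' && i == 9) {
                value = 10;          // 'X' check digit stands for 10
            } else {
                return false;        // not a digit at all
            }
            sum += (10 - i) * value;
        }
        return sum % 11 == 0;
    }
}
```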

Duplicate Records

Error handling in a distributed, loosely-coupled, system is greatly simplified if services are idempotent. That is, a request may safely be issued more than once. This means that if a client submits a request and then does not see a timely response (perhaps due to a network failure) or the response is lost (perhaps due to a client crash) the client may safely resubmit the request.

If we have adopted the principle of idempotence then it follows that  an insertion encountering a duplicate record is not necessarily an error. It might simply mean that the client sent an insertion, which worked, but failed to see the response and hence has resubmitted the request. However, even in this case there are two possibilities to consider, depending upon whether the  state exactly matches the request.
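
To make this concrete, here is an illustrative sketch (not the actual Library code; I use plain Strings for record state to keep it self-contained) of a store whose addEdition() rejects a duplicate ISBN but hands back the pre-existing record, so that the caller can decide whether the request is in fact an idempotent resend:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative store: a second insert for the same ISBN is refused, but the
// exception carries the pre-existing record so the caller can compare it with
// the current request and treat an exact match as an idempotent success.
public class EditionStore {
    public static class DuplicateException extends Exception {
        private final String alreadyThere;
        public DuplicateException(String alreadyThere) {
            this.alreadyThere = alreadyThere;
        }
        public String getAlreadyThere() { return alreadyThere; }
    }

    private final Map<String, String> editionsByIsbn = new HashMap<>();

    public void addEdition(String isbn, String edition) throws DuplicateException {
        String existing = editionsByIsbn.get(isbn);
        if (existing != null) {
            throw new DuplicateException(existing);
        }
        editionsByIsbn.put(isbn, edition);
    }
}
```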

Existing Record Matches

In this case the system is already in the requested state, most likely because the current request is a resend of a previously successful request. It is reasonable to treat this as a success. The next case is less obvious, this is when the system record state differs from the request content.

Existing Record Does Not Match

There are many possible scenarios that might lead to situation where there is an existing record in place that does not match the current request:

  • A different request (perhaps from a different user’s UI) has created a different record. The difference might be fundamental (a completely different book) or some small semantically trivial difference (“JRR Tolkien” versus “J.R.R Tolkein”)
  • Our previous request succeeded but then subsequently a different request updated the record
  • The current request is mistaken, it’s using the ISBN of a different Edition

In all these cases it seems clear that it is unreasonable for the system, without human intervention, to decide to replace the current record state with that requested. Hence the system cannot be taken to the requested state and we would return an error code. More on the error code in just a moment, but first an aside about an extension to this line of thinking.

Interleaved Requests

The underlying assumption behind the scenarios I explored above is that the system may receive requests in an order different from that in which they were issued, and that an optimistic locking approach will apply when designing RESTful services. That is, we will not design the distributed system along these lines:

  1. User in UI attempts to GET a record
  2. System returns 404, NOT FOUND and LOCKS preventing anyone else from inserting
  3. User POSTs new record, which therefore cannot encounter a duplicate

Although such an approach would simplify our reasoning we find that it simply does not scale and requires excessive coupling between components. Instead I assume:

  • Requests from two users will be delivered in an arbitrary order
  • Requests from the SAME user may be delivered out of order

The second condition enables us to use highly scalable infrastructure design: we can have stateless, clustered servers, multiple routers, and use store-and-forward asynchronous transport. Generally, we find that scalability and reliability are easier to achieve if we use idempotent services that do not strictly order requests. [There are business cases where request ordering is essential, and in these cases some aspects of service design may be simplified at the expense of greater constraints on the infrastructure.]

An implication of the arbitrary ordering of requests is that the service may need to provide a way of ensuring sensible ordering of request processing. Consider a service that controls a mobile phone, allowing a customer to enable and disable a phone. A customer loses their phone, and sends a DISABLE request. Then very soon afterwards they find the phone, and so send an ENABLE request. It’s clear that it would be very annoying for the customer to have those two requests processed out of order … the phone end-state would be DISABLED. With additional information in the request, such as the time at which the request was issued, the service can identify a “stale” request and hence take intelligent (in)action.
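
A sketch of that idea, with illustrative names (the post does not show this code): each request carries the time at which it was issued, and the service ignores any request older than the last one it has processed for the same phone:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative service: requests carry the time they were issued; a request
// older than the last one processed for the same phone is treated as stale
// and ignored, so out-of-order delivery cannot regress the phone's state.
public class PhoneService {
    public enum PhoneState { ENABLED, DISABLED }

    private final Map<String, PhoneState> states = new HashMap<>();
    private final Map<String, Long> lastRequestTime = new HashMap<>();

    // Returns true if the request was applied, false if it was stale.
    public boolean apply(String phoneId, PhoneState requested, long requestTime) {
        Long previous = lastRequestTime.get(phoneId);
        if (previous != null && requestTime < previous) {
            return false; // stale: a later request has already been processed
        }
        lastRequestTime.put(phoneId, requestTime);
        states.put(phoneId, requested);
        return true;
    }

    public PhoneState stateOf(String phoneId) {
        return states.get(phoneId);
    }
}
```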

Handling Duplicates in the Service Implementation

To return to the service implementation, I now add duplicate detection and return suitable error conditions. First the main flow of the service implementation:

    try {
        BookEdition added 
                      = EditionStore.getInstance().addEdition(edition);
        return Response.ok(added).build();
    } catch (DuplicateException e) {
        BookEdition alreadyThere = e.getAlreadyThere();

This now catches a DuplicateException from the addEdition() method, which carries the state of the pre-existing record. That’s now extracted into the alreadyThere variable.

Now by using the equals() method, we can determine whether this record matches the requested state.

            if ( edition.equals(alreadyThere )){
                return Response.ok(alreadyThere).build();
            } else {

If the state does match then we treat this as a successful (idempotent) insertion. We return the alreadyThere value, accommodating the possibility that the persisted state is enriched beyond the values specified in the insertion request.
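
Note that this comparison relies on BookEdition overriding equals(). The post does not show that method; a minimal sketch, assuming value-based equality over all four fields, might be:

```java
import java.util.Date;
import java.util.Objects;

// Sketch only: assumes BookEdition equality means all four fields are equal.
// hashCode() is overridden to stay consistent with equals().
public class BookEdition {
    private String title;
    private String author;
    private String isbn;
    private Date publicationDate;

    public BookEdition(String title, String author, String isbn, Date publicationDate) {
        this.title = title;
        this.author = author;
        this.isbn = isbn;
        this.publicationDate = publicationDate;
    }

    @Override
    public boolean equals(Object other) {
        if (this == other) return true;
        if (!(other instanceof BookEdition)) return false;
        BookEdition that = (BookEdition) other;
        return Objects.equals(title, that.title)
            && Objects.equals(author, that.author)
            && Objects.equals(isbn, that.isbn)
            && Objects.equals(publicationDate, that.publicationDate);
    }

    @Override
    public int hashCode() {
        return Objects.hash(title, author, isbn, publicationDate);
    }
}
```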

Now for the case where the pre-existing record does not match the requested insertion. In this case I think that a 409 CONFLICT is a reasonable response code, and we note that the HTTP specification says for a 409:

The response body SHOULD include enough information for the user to recognize the source of the conflict. Ideally, the response entity would include enough information for the user or user agent to fix the problem; however, that might not be possible and is not required.

Hence we need to return something more than a string. I propose to return an object that contains both the requested data and the currently persisted state. It’s pretty obvious why we would return the current state, but why also include the rest? The idea is that we want the services to be stateless in the sense that the response can be understood in isolation, without reference to conversational state. Our response includes enough context so that the client can understand the response as it stands.

So I want to return an object of some complexity. I view “Already Inserted a Different Value” as a special case of an Optimistic Lock violation, and hence I’m going to use a more general purpose object to hold the two values of interest, the current request and the value we find in the store.

     @XmlRootElement(name = "OptimisticFailure")
     public class OptimisticFailure {
         private BookEdition m_requested;
         private BookEdition m_unexpectedValue;

         // and getters, setters and zero-arg constructor

Note that we can imagine presenting this information to the user, saying “You asked to create a record like that, but we found one like this. Would you like to make any amendments?”

So, how do I code the return of such an object as part of the error processing? For this I use a specialised Exception Class.

Throwing Custom Exceptions

First I defined a custom Exception, derived from WebApplicationException:

    public class LibraryDuplicateInsertionException 
                extends WebApplicationException {

        private OptimisticFailure m_failureDetails;

        public OptimisticFailure getFailureDetails() {
            return m_failureDetails;
        }

        public LibraryDuplicateInsertionException(
                               BookEdition request,
                               BookEdition alreadyThere ) {
            m_failureDetails
                  = new OptimisticFailure(request, alreadyThere);
        }
    }

I can then complete the service implementation:

            if ( edition.equals(alreadyThere) ){
                return Response.ok(alreadyThere).build();
            } else {
                throw new LibraryDuplicateInsertionException(
                                                     edition, alreadyThere );
            }
Before running this code one further task remains: an Exception Mapping Provider is needed.

Exception Mapping Providers

An Exception Mapping Provider is responsible for creating a Response object from the Exception information. JAX-RS specifies that the provider must implement the Interface ExceptionMapper<T> and so I created this Provider class:

@Provider
public class LibraryDuplicateInsertionExceptionMapper  
  implements ExceptionMapper<LibraryDuplicateInsertionException> {

    public Response toResponse(
               LibraryDuplicateInsertionException ex) {
        return Response.status(Status.CONFLICT)
                       .entity(ex.getFailureDetails())
                       .build();
    }
}

Points to note:

  • The use of the @Provider annotation
  • the toResponse() method has responsibility for transforming the exception into a suitable response. This is a very simple example, giving a single response code, 409 CONFLICT, and returning a simple payload; the Exception was designed to have suitable data available.
  • The OptimisticFailure returned by getFailureDetails() already has JAXB annotations, so it can be serialised as the response entity.

Finally I announce this new provider to the Wink runtime. You may recall that the Wink servlet’s applicationConfigLocation parameter refers to the WEB-INF/application file, which originally contained just the name of the resource manager class. I add the name of the new provider class to this file.
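
The application file is simply a list of fully qualified class names, one per line. Assuming the package and class names used in this series (the exact resource class name may differ in your project), it would now contain something like:

```
org.djna.library.service.LibraryResource
org.djna.library.service.LibraryDuplicateInsertionExceptionMapper
```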



I can now execute test sequences such as

  1. Delete record with ISBN 1234  – this is idempotent and so always succeeds whether or not the record exists.
  2. Insert record with ISBN 1234 – should succeed, no record can exist
  3. Insert an identical record with ISBN 1234 – as it is identical, the insertion, although a duplicate, is treated as successful
  4. Insert a record with ISBN 1234 but differences in other fields. This fails with 409 – CONFLICT and a payload containing current record and new requested record.

Validation Errors

It should be clear that validation errors can be addressed in the same way, returning a complex object that contains the error details. I think a simple collection of Name/Value pairs may be the simplest way to capture a set of validation results. For validation of a simple Bean the name can be a specific field, or “global” for more general validation errors.
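
A sketch of such a collection (the names here are hypothetical; with JAXB annotations added it could be returned as an error entity in just the same way as OptimisticFailure):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative validation result: a list of name/value pairs, where the name
// is a field name or "global" for cross-field errors.
public class ValidationFailure {
    public static class FieldError {
        private String name;    // field name, or "global"
        private String message;

        public FieldError() { } // zero-arg constructor for JAXB
        public FieldError(String name, String message) {
            this.name = name;
            this.message = message;
        }
        public String getName() { return name; }
        public String getMessage() { return message; }
    }

    private List<FieldError> errors = new ArrayList<FieldError>();

    public void addError(String name, String message) {
        errors.add(new FieldError(name, message));
    }
    public List<FieldError> getErrors() { return errors; }
    public boolean isValid() { return errors.isEmpty(); }
}
```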

In a previous posting I described how I exercised my RESTful services using the Firefox extension Poster and also by calling the service from JavaScript in the browser using the Dojo framework. Poster is especially useful because we can very quickly exercise a service. However for testing I want something more:

  1. To repeatedly run a set of tests covering a variety of inputs.
  2. To specify expected outcomes for these tests. Using just Poster I must inspect the values by eye, which is error prone.

In other words I want to use a tool such as JUnit. To do that we need to be able to write Java clients of our RESTful services. This is not something directly addressed by the JAX-RS specification, however the Apache Wink framework does include APIs that enable writing clients of a RESTful service. So in the remainder of this post I will explain how I created tests for my service.

Wink Libraries

Previously, when using Wink in a JEE application deployed to WebSphere, I copied the Wink libraries into the WEB-INF/lib directory of the Web Application, and hence these libraries were available at both compile time and runtime. The JUnit tests run as stand-alone Java applications and can pick up the Wink libraries from classpaths specified in RAD. There are quite a few ways of doing that, for example simply referring to the JARs wherever Apache Wink was downloaded, or creating a User Library. My preference is to set up projects that can be checked into CVS and checked out by any developer, with no dependencies on external downloads.

Hence I created an Apache Wink project and copied the Wink download to that project.


I then set up the project build path to refer to the JARs needed from that project. I select the project and then

    rightClick –> Properties –> Java Build Path –> Libraries Tab

Then Add Jars and select JAR files from the Apache Wink project.


These JARs include the APIs we use directly and so are needed on the Build Path


Additionally, we need the JSON and logging libraries at runtime. It’s convenient to add these libraries to the Build Path (even though we don’t need them at build time) because then simple “Run As” commands pick up the correct runtime classpath.

    slf4j-simple-1.5.8.jar (or slf4j-jdk14-1.5.8.jar from here)


Note that if we now check the ApacheWink and the test projects into CVS we can create a Project Set File for those two projects, which can be checked out by any developer so that they immediately have a complete working test program, with no external dependencies.

Next I completed setting up the Build Path by enabling the use of JUnit 4 …

JUnit 4

Still in the Libraries tab, click Add Library, select JUnit and click Next. From the drop-down select JUnit 4 and click Finish. I prefer JUnit 4 to JUnit 3; its use of annotations simplifies developing tests.

Client Code

Now to write the test. We are going to invoke the service operation which retrieves the state of a Book Edition with a particular ISBN


And in our client we need a Java object that corresponds to the state returned. In the server code we already have a suitable Java Bean and so I simply copied that code into my Test (that’s just a temporary expedient, clearly it would be better to refactor shared code to a common library.)

The test code looks like this:

package org.djna.library.test;

import static org.junit.Assert.assertEquals;

import org.apache.wink.client.EntityType;
import org.apache.wink.client.Resource;
import org.apache.wink.client.RestClient;
import org.junit.Test;

public class EditionTester {
    @Test
    public void testExample() {
        RestClient libraryClient = new RestClient();
        String resourceUri =
        Resource editionResource
                   = libraryClient.resource(resourceUri);
        editionResource.header("Accept", "application/json");
        BookEdition edition = editionResource.get(
                      new EntityType<BookEdition>(){} );
        assertEquals("0863699936", edition.getIsbn());
    }
}

There are a few points worth noting here. First concerning the JUnit tests:

  • import static org.junit.Assert.*; Java 5 enables static imports giving access to the various assertions that express the checks performed by the test.
  • The @Test annotation specifies that this method is a test. Hence I can request “Run As Junit” on this class and the method here will be executed.
  • The assertEquals() method checks that the edition retrieved from the service has the expected ISBN.
  • Some of the code here is standard “setup” code and when we have a few more test methods we’d refactor it into a setup method, which would be annotated with @Before.

Now to look at the Wink client code. The APIs used in this test code are worth studying; they could be useful in application code too. It’s clear that writing a client of a RESTful service is very easy.

  • RestClient libraryClient = new RestClient(); gives us client object to work with  …
  • and then we can use the client to specify a resource Resource editionResource = libraryClient.resource(resoruceUri);
  • Before invoking the method we need specify the MediaType that we can accept in the response. If you look at the service code you’ll see that the @Produces annotation specifies only JSON responses and by default this is not the requested type. So we have editionResource.header("Accept", "application/json" );
  • Finally we invoke the service, specifying the expected response Class: editionResource.get( new EntityType<BookEdition>(){} );


Creating JUnit tests for the RESTful service using the Wink client APIs is straightforward. There are a few more tests to write here, for example: what happens if we specify a resource that does not exist? And I also need to test the edition creation POST operation. That’s for the next posting.

Having used Apache Wink to implement my basic JAX-RS service, which simply retrieves some information from the Library (details of a Book Edition given its ISBN), I next want to implement an operation to add an Edition to the library. This operation will use a POST method to the resource


that is, the resource that specifies that entire collection. Once again, the JAX-RS programming model makes the implementation of the service operation very easy, but I stumbled across a minor annoyance in testing the operation from Dojo, so in this posting I’ll take a side trip into the Dojo JavaScript libraries.

JAX-RS POST implementation

I already have a resource implementation class and so I just need to add a method to process the POST request

@POST
@Consumes( {MediaType.APPLICATION_JSON, MediaType.APPLICATION_XML} )
@Produces( {MediaType.APPLICATION_JSON, MediaType.APPLICATION_XML} )
public BookEdition addNewEdition(BookEdition edition) {
    return EditionStore.getInstance().addEdition(edition);
}

There are a few points to note here.

  • The input parameter edition is not annotated. Contrast this with the GET example,  getEditionByIsbn(@PathParam("isbn") String isbn), which enabled access to an element from the URI. Here the edition parameter is mapped to the request body, that is the content which is POSTed. Such a parameter is termed an Entity parameter. A method may have at most one entity parameter.
  • I have decided to permit both JSON and XML payloads to be Posted. JSON is very convenient from Dojo, as we shall see, but other clients may prefer XML. The Wink JAX-RS implementation will deal with deserializing JSON or XML depending upon the MediaType of the request.
  • As a design decision, I have chosen to return the state representation of the added edition. In this case it’s overkill as the state was passed in and we make no modifications in adding it to the Library. However in general it’s quite possible that there is some enrichment of data as it is inserted into the store. Hence I use the general pattern of returning a representation of what has been stored.

Testing with Poster

We can quickly test this method using a Firefox extension: Poster.


Specifying an Edition in XML

      <Edition>
        <title>Invincible</title>
        <author>John Power</author>
      </Edition>

and the content type application/xml. The response is as expected, containing a representation of the Edition we added. In the same way, we can provide a JSON payload

      {"title": "Invincible",
       "author": "John Power",

And content type application/json. The content type of the response is determined by the HTTP Header field Accept with a value of application/xml or application/json.


Testing from Dojo

One very likely scenario for using REST services is to call an operation from a Browser-based rich client application. The Dojo JavaScript libraries are one possible technology for constructing such applications.

This is some sample code which invokes the addNewEdition() operation:

      var myEdition = {"Edition":{“author”:”x”, “title”"isbn":"44"}};
      var xhrArgs = {
                url: http://myhost/LibraryWink/library/editions,
                postData: dojo.toJson(myEdition),
                handleAs: "json",
                headers: { "Content-Type": "application/json"},
                load: function(data) {
                           =  "Message posted.";
                error: function(error) {
                            = "Error :" + error;
                           = "Message being sent…"; 
        var deferred = dojo.rawXhrPost(xhrArgs);

I’ll quickly summarise a few key features here, but first note in particular the actual POST request:

      var deferred = dojo.rawXhrPost(xhrArgs);
You will note that I use rawXhrPost() rather than xhrPost().
I can find no documentation that explains why rawXhrPost() is needed, but after much experimenting and imprecating I discovered that rawXhrPost() does indeed work and that xhrPost() sends no payload.

The other salient points:

  • postData: dojo.toJson(myEdition) converts the payload object to JSON.
  • headers: { "Content-Type": "application/json"} sets the MimeType in the header, hence allowing JAX-RS to correctly interpret the JSO string – the handleAs property relates to the interpretation of any response from the service
  • The url: http://myhost/LibraryWink/library/editions is our resource URI

Next – Unit Testing

So, that’s two ways of exercising a REST service; however, as a test mechanism both leave a few things to be desired. In the next posting I’ll move on to using a JAX-RS client.

JAX-RS with Apache Wink

December 2, 2009

JAX-RS is a specification addressing the implementation of providers and consumers of REST services in Java. Apache Wink is an Open Source implementation of that specification. To give an idea of what is possible: you can annotate a Java Class like this

    @Path("books")   // the path value here is illustrative
    public class LibraryResource {

        @GET
        @Produces( MediaType.APPLICATION_JSON)
        public BookCollection getBooks() {
              // implementation here
        }
    }

and deploy this class along with the Wink framework to an Application Server and you have immediately implemented a RESTful service with a URI


In this posting I will explain in detail how to use Apache Wink to implement a service like this. I will be creating RESTful services for a simplified, fictional Lending Library. First, a reminder of the problems addressed by JAX-RS. I’ve described the service provider’s responsibilities in more detail in this posting but in summary we first need to do some up-front design work:

  • Identify some resources, in our case these could be Books, Editions, Authors, Categories and Borrowers
  • Decide on the URIs that identify these resources, both individual resources and collections of resources.
  • Design the services, giving appropriate meanings to methods such as GET, PUT, POST and DELETE

Then we write code and exploit JAX-RS (and JAXB) to address these implementation problems:

  1. Transform between the State of a Resource and its Transfer Representation. That is, manage the conversion between a Java Object and an XML or JSON representation of its attributes.
  2. Interpret the URI of a service request and direct the request to the appropriate Resource Manager, which we have developed in Java.
  3. Map the HTTP method (GET, PUT …) to appropriate methods of the Resource Manager.
  4. Enable the resource manager to use the details of the URI to identify particular resources, for example


    would identify a borrower with id 12345

  5. Give access to qualifying parameters such as search criteria


I am using Rational Application Developer 7.5  (RAD) and the WebSphere Application Server 7.0 (WAS) test environment provided by RAD. I downloaded Apache WINK and then added WINK jar files to my dynamic web project.

Download Wink

Apache WINK can be downloaded from Wink Download. I downloaded

This contains directories

  • dist – the core of Wink
  • lib –  wink supporting libraries
  • ext/wink-json-provider – JSON formatting and parsing

There is also documentation and examples.

Creating and Populating Web Project

In RAD create an Application (EAR) project and Dynamic Web Project, then add the Wink JAR files, define the Wink servlet and create packages for the Java Classes we will write shortly. The next sections describe these steps in more detail.

Wink JAR files

Copy the following jar files from the Wink distribution to the WEB-INF lib directory of the dynamic web project.

From dist


From lib:


(note that instead of slf4j-simple-1.5.8.jar you may prefer to use a different slf4j implementation such as slf4j-jdk14-1.5.8.jar; you can download that from here.)

From ext/wink-json-provider


So my LibraryEar and LibraryWink projects now look like this


Wink Servlet

Edit WEB-INF/web.xml, add the servlet definition.



This has the effect of mapping all URLs beginning with “library” to the Wink REST servlet.

The Web project now shows a servlet:


We will create our own resource manager classes with JAX-RS annotations that will implement the service function, and hook that servlet up to call our implementation: note that the servlet has an initialisation parameter, applicationConfigLocation, which refers to a file


the purpose of this file is to register our resource manager code with the Wink servlet. We’ll create that file once we have written the code.            

Creating Packages

I created two packages to differentiate two kinds of Class used in the service implementation:

  • org.djna.library.service: the resource manager classes, these being the classes that offer the service interfaces themselves
  • org.djna.library.resources: the resource state classes, which hold the data passed to and from the service methods, these classes will be serialised to and from JSON and XML


In the next section I’ll describe how to implement the state and resource manager classes.

Information using JAXB

I first create classes to hold the state of the various resources in our system. In this example they will be simple Java Beans with minimal business logic, effectively just Data Transfer Objects that will be serialised to and from JSON or XML. Something I want to study in the future is the relationship between these resource objects and JPA Entities.

Book Edition

We’ll start with a simple class representing an Edition of a Book. In this simple example the class has few attributes:

    package org.djna.library.resources;

    import java.util.Date;

    public class BookEdition {
        private String title;
        private String author;
        private String isbn;
        private Date publicationDate;

The isbn uniquely identifies the book edition.

I generated getters and setters for every field, for example

    public String getTitle() {
        return title;
    }

    public void setTitle(String title) {
        this.title = title;
    }
I also created two constructors: one taking no arguments and one allowing a fully populated BookEdition to be created.

    public BookEdition() {
        title = null;
        author = null;
        isbn = null;
        publicationDate = null;
    }

    public BookEdition(String nTitle,
            String nAuthor,
            String nIsbn,
            Date nPublicationDate) {
        title = nTitle;
        author = nAuthor;
        isbn = nIsbn;
        publicationDate = nPublicationDate;
    }
The zero argument constructor is required by JAXB.
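Putting the pieces together, the bean can be exercised like this. This is a compact sketch: the wrapper class name BookEditionDemo and the nesting are for the example only, and the JAXB annotations (covered next) are omitted so the snippet is self-contained.

```java
import java.util.Date;

// A compact, runnable sketch of the BookEdition bean described above.
public class BookEditionDemo {

    static class BookEdition {
        private String title;
        private String author;
        private String isbn;
        private Date publicationDate;

        // Zero-argument constructor, required by JAXB
        BookEdition() {
        }

        // Convenience constructor for a fully populated edition
        BookEdition(String nTitle, String nAuthor,
                String nIsbn, Date nPublicationDate) {
            title = nTitle;
            author = nAuthor;
            isbn = nIsbn;
            publicationDate = nPublicationDate;
        }

        public String getTitle() { return title; }
        public void setTitle(String title) { this.title = title; }
        public String getAuthor() { return author; }
        public void setAuthor(String author) { this.author = author; }
        public String getIsbn() { return isbn; }
        public void setIsbn(String isbn) { this.isbn = isbn; }
        public Date getPublicationDate() { return publicationDate; }
        public void setPublicationDate(Date d) { this.publicationDate = d; }
    }

    public static void main(String[] args) {
        BookEdition e = new BookEdition("The Guvnor",
                "Brian Hinton, Geoff Wall", "1900924323", new Date());
        // Prints: The Guvnor / 1900924323
        System.out.println(e.getTitle() + " / " + e.getIsbn());
    }
}
```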

Annotating for Serialization

The BookEdition class will be passed to and from the REST service and so must be serializable. This is quickly accomplished by adding annotations to the class.

@XmlAccessorType(XmlAccessType.PROPERTY)
@XmlRootElement(name = "Edition")
public class BookEdition {

These two annotations are sufficient to enable BookEdition to be transformed to and from XML and JSON.

  • The XmlAccessorType annotation, with parameter XmlAccessType.PROPERTY, specifies that the attributes to be serialised will be deduced from the presence of getters and setters in the class. This is a very simple use of JAXB, which has many other capabilities: for example, by using other XmlAccessTypes you can expose attributes directly, and if necessary you can get finer control over which fields are serialised, and over their serialised representations, by applying other annotations to individual attributes.
  • XmlRootElement specifies the root element of the XML or JSON payload.
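For comparison, a sketch of the field-level alternative mentioned above (this is not the approach used in this article):

```java
@XmlAccessorType(XmlAccessType.FIELD)
@XmlRootElement(name = "Edition")
public class BookEdition {
    // With FIELD access, JAXB serialises the private fields directly,
    // so getters and setters are not required for serialisation.
    private String title;
    private String author;
    private String isbn;
    private Date publicationDate;
}
```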

Resources using JAX-RS

Next I create a Resource Manager which will offer services to manipulate BookEditions and specify a base URI for the services. I then create the individual service methods which act on particular resources and collections of resources. In order to enable testing of the services I create a Mock Data Store for Book Editions which simply works with a few hard-coded examples.

This assumes that I have already made decisions about the URIs I intend to use to access my resources. For this simple case, with only a single kind of resource, the following URIs will be sufficient (I show only the significant portion of each URI; prepend http://myhost/rest to all of these):

  • /editions : all book editions
  • /editions?query=XXX : editions matching query
  • /editions/0858835544 : a particular edition specified by its ISBN

In the remainder of this posting I'll focus on implementing that final operation: the retrieval of an Edition's state, specified by its ISBN.

Resource Manager Class

JAX-RS has a rich set of capabilities for mapping REST URIs to methods of resource manager classes. I implemented a very simple case: a single resource manager with methods for each particular operation I wanted to expose.

Resource Manager Class and Path

The resource manager class is an arbitrary class; there are no special requirements for inheritance or interface implementation. It is annotated to specify the URIs that will be dispatched to this class.

    package org.djna.library.service;


    public class BookEditionResources {

The context root for the Web Application is LibraryWink and the servlet is mapped to library hence the base URI for the REST services exposed by this resource manager is


Next I prepared some example hard-coded data items.

Example Data

The EditionStore class is included in the project ZIP file (TBD).

Retrieve One Item – GET

The first method I implemented retrieved a single edition, specified by its ISBN. That is, we implement a GET method for the URI


The method in the resource manager looks like this

    @GET
    @Path("{isbn}")
    @Produces(MediaType.APPLICATION_JSON)
    public BookEdition getEditionByIsbn(@PathParam("isbn") String isbn) {
        return EditionStore.getInstance().getEdition(isbn);
    }
The JAX-RS runtime has a flexible algorithm for matching the request to the service method; this considers the matching of the request URI against the @Path annotations for class and method, the HTTP method (GET, POST, and so on) and the requested media types. In the general case the algorithm needs to deal with the possibility of there being many resource managers, each with many methods. There is an explanation of the algorithm here.

In this getEditionByIsbn() example we see that the method is annotated with @GET to indicate that it deals with HTTP GET requests, and with @Produces(MediaType.APPLICATION_JSON) to specify that the response is in JSON format. It is possible to specify more than one media type (for example both XML and JSON), in which case the client may specify its preference.

The @Path("{isbn}") annotation is a little more complex. Two things are happening here. First, this Path is used in conjunction with the resource manager class's @Path annotation, so that the request URI serviced by this method will look like this


where XXXX will be a specific ISBN. The second thing that is being specified is that the isbn portion of the URI is to be accessible as a parameter, the parameter having the name isbn.

This now gives a context for the final annotation, @PathParam("isbn"). The method declaration

    getEditionByIsbn(@PathParam("isbn") String isbn)

specifies that the Java method takes a single String parameter, and the annotation of that parameter tells JAX-RS to populate that parameter from the URI path.

Application Configuration

You may remember that we need to configure the Wink REST servlet to use our application code. We do this by creating a simple text file in the location specified by the servlet's applicationConfigLocation parameter, that is, a text file: WEB-INF/application

The contents of the file are a simple list of resource manager classes, in our case just
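Based on the package and class names used above, the file would contain a single line (shown here as a sketch; confirm the fully qualified class name against your own code):

```
org.djna.library.service.BookEditionResources
```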



I can now deploy the code to the WAS test environment and, using a browser, issue the GET request


The response is a simple JSON string containing the details of the Edition with ISBN 1900924323

   { "Edition": {
     "author": "Brian Hinton, Geoff Wall",
     "isbn": "1900924323",
     "publicationDate": "2002-03-27T14:45:43.437+00:00",
     "title": "The Guvnor"
   }  }


The experience of coding a RESTful service using the Apache Wink implementation of JAX-RS is pretty good: I wrote very little code and achieved a working implementation very quickly. There are a few directions for further study:

  • The relationship between our resource classes and JPA entities. Can we (and should we) annotate a class with both JPA and JAXB annotations?
  • How easy will it be to implement PUT, POST and DELETE?
  • How easily can we implement queries across collections?
  • JAX-RS has various techniques for allowing you to refactor resource manager logic, with flexible URI path specifications. Are there obvious patterns for using these capabilities?
  • JAX-RS enables various life-cycle models for resource managers. How easy are they to use?

RESTful services are particularly useful for Web 2.0 developers and are pleasingly easy to develop in Java using the Apache Wink implementation of JAX-RS. You can get more detail about REST starting here in Wikipedia. In this posting I want to illustrate the significance of JAX-RS by considering the responsibilities of a developer who wants to provide a REST service. In a more detailed posting I describe how to use Wink to create a few simple services for a Lending Library.

REpresentational State Transfer (REST) is concerned with the state of resources and the passing of representations of that state between client and server. So we need to understand:

  • what is meant by a resource
  • how URIs are used to specify particular resources
  • how requests to retrieve and manipulate the state of a resource are defined by the client
  • the representation of the state as it is transferred between client and server


Resources may be any entity (or collection of entities) owned by a service provider. They are often familiar business entities such as Customers and Orders; in our library example we might have Books and DVDs. We also treat collections of entities as resources, for example “all orders placed today” or “books written by Alastair Reynolds”.

We can easily imagine a simple mapping between records in a Relational Database and the resources managed by a service provider. However, we can also treat more abstract information as resources, for example “The temperature now in London” or “The current stock price for IBM on the NYSE”.

So our first task as a REST service provider is to identify the resources our service is managing. Typically we will give detailed consideration to how clients would like to refer to the resources, and that leads us to define the URIs by which clients will specify the resources.


Strictly speaking, REST services can use many different technologies; they are not limited to Web technologies and HTTP protocols. Here, though, I am going to focus on the most common case: accessing resources specified by a URI using HTTP (or HTTPS).

The service provider will define the URIs that specify particular resources. In our library example we might have

and we can similarly use URIs to specify more abstract information such as

We see here a more extensive URI hierarchy, which indicates the design effort that may be required in the resource identification activity so that we can define the URIs to be used.

This leads to some specific implementation coding tasks:

  1. The URI must in some way map to the code that actually manages the book resources, and similarly the URI must be associated with the resource manager for the current weather resources.
  2. The resource manager must interpret more precise URIs in order to identify specific resources. So uk/london/temperature must be interpreted correctly; the implementer needs structured access to the URI.

The implementation tasks are addressed by JAX-RS, and in the more detailed article I describe how to use JAX-RS annotations to complete the service implementation.

These ideas are extended further to allow the passing of parameters in the URI; in the example, the collection URI is modified by some search criteria.

JAX-RS has annotations to allow access to parameters of various kinds.
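As a sketch of the query case from the library example (the method and the EditionStore lookup shown here are illustrative assumptions, not code from the article):

```java
@GET
@Produces(MediaType.APPLICATION_JSON)
public List<BookEdition> findEditions(@QueryParam("query") String query) {
    // @QueryParam binds the ?query=XXX portion of the request URI
    // to the method parameter; the lookup method is hypothetical.
    return EditionStore.getInstance().findEditions(query);
}
```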


RESTful architecture using HTTP explicitly exploits HTTP methods such as GET, PUT, POST and DELETE to define actions on resources:

  • GET: retrieve a representation of the resources specified by the URI
  • PUT: replace the state of the resource specified by the URI with new values
  • POST: add a new resource; typically the URI will be that of a collection, and the new resource's ID will be generated by the resource manager or derived from the payload
  • DELETE: remove the resource specified by the URI

This too leads to specific coding tasks: the implementer needs to map from the HTTP method to the appropriate action.
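In JAX-RS this mapping is largely declarative: each HTTP method has a corresponding annotation. A sketch continuing the library example (the method names and bodies here are illustrative assumptions):

```java
@Path("/editions")
public class BookEditionResources {

    @PUT
    @Path("{isbn}")
    @Consumes(MediaType.APPLICATION_JSON)
    public void replaceEdition(@PathParam("isbn") String isbn,
            BookEdition edition) {
        // PUT: replace the state of the edition identified by the URI
    }

    @DELETE
    @Path("{isbn}")
    public void removeEdition(@PathParam("isbn") String isbn) {
        // DELETE: remove the edition identified by the URI
    }
}
```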


The resource representation returned from a GET or (for example) specified in a PUT can be in an arbitrary format as specified by the MIME type of the request; usually that type would be application/xml for XML or application/json for JSON.

From the service provider’s perspective this leads to two further tasks:

  1. To determine the request MIME type
  2. To parse requests or format responses in accordance with that type.

JAX-RS has annotation capabilities to deal with different MIME types; however, the corresponding parsing and formatting is not part of JAX-RS as such, but rather is provided by JAXB. Hence we will need to use some JAXB annotations when supporting the chosen MIME types.