Liz Douglass

Django form validation – why does the error message always appear?


Recently I’ve been working on a piece of functionality involving a Django form. The form was defined like this:

class FooForm(forms.Form):
    recipient = forms.CharField(widget=forms.HiddenInput)
    text = forms.CharField(widget=forms.Textarea)

    def clean_text(self):
        data = self.cleaned_data['text']
        message = data.strip()
        if len(message) == 0:
            raise forms.ValidationError("You need to provide some text")
        return data

It was instantiated in a handler method like so:

def bar(request):
    form = FooForm({'recipient' : some.name})
    return render_to_response('my.html', RequestContext(request, {'form': form}))

The validation was working a little too well: the error message for the text field was appearing as soon as the form was first loaded. The answer to a similar question explains why this was happening. As the Django documentation says, the form has been data bound, because a data dictionary has been provided as the first argument to the form constructor. This “trigger(s) validation of this input” (Lott, March 17 2009). So the error message appears because the text field was not bound to a value that passes validation. As the Django documentation also says, the alternative to binding data is to provide dynamic initial values like so:

def bar(request):
    form = FooForm(initial={'recipient' : some.name})
    return render_to_response('my.html', RequestContext(request, {'form': form}))

Using this, the validation is not immediately triggered and the message only appears if an invalid form has been submitted.
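The bound/unbound distinction can be sketched in plain Python. This is a toy stand-in, not Django’s actual implementation, and all names are illustrative:

```python
# Toy model of Django's bound vs unbound forms (illustrative only).
class MiniForm:
    required_fields = ("recipient", "text")

    def __init__(self, data=None, initial=None):
        self.data = data              # a data dict means the form is bound
        self.initial = initial or {}  # initial values do not bind the form

    @property
    def is_bound(self):
        return self.data is not None

    def errors(self):
        # Mirroring Django: an unbound form never reports errors.
        if not self.is_bound:
            return {}
        return {field: "This field is required."
                for field in self.required_fields
                if not (self.data.get(field) or "").strip()}


bound = MiniForm({"recipient": "alice"})            # validation will run
unbound = MiniForm(initial={"recipient": "alice"})  # no validation yet
```

Rendering `bound` would surface the missing-text error straight away, while `unbound` stays error-free until real data is submitted.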

Written by lizdouglass

November 1, 2010 at 9:49 pm

Posted in Uncategorized


Migrating to Scala 2.8


A few months ago Scala 2.8.0 was released. We migrated our project from version 2.7.7 quite soon after the announcement. Making the switch was quite straightforward: the only build change needed was to the build.scala.versions property in our Simple Build Tool build.properties file. Admittedly it took several hours to fix all the compilation errors, but once this was done we found we were able to make our code base more readable because of two shiny new things in particular:

1. Collections:

As I’ve written about before, we are using MongoDB for persistence, along with the MongoDB Java driver. This library makes heavy use of BasicDBObject, the simplest implementation of the DBObject interface, which represents “a key-value map that can be saved to the database”. BasicDBObjects are used for more than just inserting data; they are also used extensively in many of the methods defined on the DBCollection class. For example, this is the definition of the update method:

WriteResult update(DBObject q, DBObject o)

(Source: class com.mongodb.DBCollection)
(q is the query used to find the correct element and o is the update operation that should be performed)

We often use the BasicDBObject constructor that takes a Java map. When we were using Scala 2.7.7 we had a library to handle the conversion of maps from Scala to Java, and we ended up with a lot of repository methods littered with calls to asJava, like the one below:

import java.util.Date
import org.scala_tools.javautils.Imports._
import com.mongodb._
import org.bson.types.ObjectId

def updateLastLogin(personId: String) {
   val criteria = new BasicDBObject(Map("_id" -> new ObjectId(personId)).asJava)
   val action = new BasicDBObject(Map("$set" -> new BasicDBObject(Map("lastLogin" -> new Date).asJava)).asJava)
   collection.update(criteria, action)
}

In Scala 2.8 there is a new mechanism for converting collection types between Scala and Java, and thankfully all the asJava calls have disappeared from our codebase. We now have cleaner functions like this one:

def markLogin(memberId: String) = set(new MemberId(memberId).asObjectId, "lastLogin" -> new Date)

Where:

def set(id: ObjectId, keyValues: (String, Any)*) = collection.update(Map("_id" -> id), $set (keyValues: _*), false, false)

Note that we are now also using the Casbah library. The update method above is defined on the MongoCollectionWrapper in the Casbah library.

2. JSON

We have a Scala backend API and a Django frontend; the backend serves JSON to the frontend. We are using the Lift Json library to do both the serialising in the backend webapp and the de-serialising in our integration tests. Some of our tests use the Lift Json library’s “LINQ” style JSON parsing, while others use the “Xpath” style (both of which are described in the Lift Json documentation):

LINQ style:

import net.liftweb.json.JsonParser._
import net.liftweb.json.JsonAST._
...

val (data, status) = ApiClient.get(myUri)
val json = parse(data)

val myValues = for { JField("something", JString(myValue)) <- (json \ "somethingElse") } yield myValue
myValues should have length (3)
myValues should (
    contain("frog")
    and contain("tennis")
    and contain("phone")
)

Xpath style:

scenario("successfully view something") {
   val json = getMemberLiftJson(someId)
   assert((json \ "bar").extract[String] === "ExpectedString")
}

The Xpath style is readable and compact, and we’ve retained quite a few of these types of tests in our code base. Personally I find the LINQ style quite confusing: I need to refer to the Lift project github site every time I try to use it. In Scala 2.8 there is now JSON support in the standard library, and we have been using this instead for parsing in the tests. Specifically, we have been using the parseFull method like so:

scenario("bar") {
    val (content, status) = ApiClient.get(someUri)
    JSON.parseFull(content) match {
        case Some(data : List[Any]) => {
            data.length should be(4)
            data should not contain("Internet")
        }
        case json => fail("Couldn't find a list in response %s".format(json))
    }
}

Go Scala 2.8!

Written by lizdouglass

November 1, 2010 at 9:48 pm

Posted in Uncategorized


Migrating from Maven to Simple Build Tool


A while ago I moved our Scala project build from Maven to Simple Build Tool (sbt).

Why sbt?

  • sbt is made for Scala projects. Buildfiles are written in Scala and are as concise as the Buildr ones that I have worked with previously.
  • sbt has several project types including the basic and web project types. Each has multiple build tasks/actions defined and all of these can be customised.
  • sbt has support for dependencies to be declared in either Ivy or Maven configuration files. All our dependencies were already specified in pom files, so not having to migrate these (at least straight away) made the transition to sbt easier.
  • sbt compiles fast. Robert first added sbt into our project so that he could compile the code more quickly than is possible in Intellij.
  • sbt has support for ScalaTest – the framework that we use for all our unit and integration tests. When we were running our ScalaTest tests as part of our Maven build we found that we needed to include the word ‘Test’ somewhere in the class name. Forgetting this requirement cost one of our project team members several hours on one occasion.
  • We can now run both our webapp and integration tests at the same time. We’d found that it wasn’t possible to configure Maven to do this. We had instead started up a Jetty server from within our integration test project in order to run the web project.
  • sbt promotes faster development. Continuous compilation, testing and redeployment means that our work cycle is faster. This is particularly noticeable when we are working on a feature that requires us to make changes in both our Django front end and Scala backend. We can make changes in the source code in both and have both projects automatically redeploy.

Creating the sbt buildfile:

At the time of the migration to sbt our Scala backend was divided into 3 subprojects:

  • Core – a Scala project with ScalaTest unit tests
  • IntegrationTests – ScalaTest integration tests
  • Webapp – a basic webapp

Part 1: Declare all the sub-projects in the buildfile:

The first step in creating our build file was to declare all three subprojects. The core and integration subprojects are sbt DefaultProjects and have tasks such as compile and test defined. The webapp is an sbt WebappProject and has additional tasks such as jetty-run. Both the webapp and integration projects depend on the core project:

lazy val core = project("my-api-core", "Core", info => new DefaultProject(info))
lazy val webapp = project(webappProjectPath, "Webapp", info => new WebappProject(info), core)
lazy val integration = project("integration", "IntegrationTests", info => new DefaultProject(info), core)

Part 2: Getting the webapp running from sbt:

Although the webapp was declared as a WebappProject, it wasn’t possible to run it without declaring some additional Jetty dependencies. These were specified as inline sbt dependencies in the WebappProject class, which extends the sbt DefaultWebProject (please see below). Note that the port and context path can also be specified.

class WebappProject(info: ProjectInfo) extends DefaultWebProject(info) {

    val jetty7Webapp = "org.eclipse.jetty" % "jetty-webapp" % "7.0.2.RC0" % "test"
    val jetty7Server = "org.eclipse.jetty" % "jetty-server" % "7.0.2.RC0" % "test"

    override def jettyPort = 8069
    override def jettyContextPath = "/my-api"
}

Part 3: Getting the integration tests running from sbt:

As mentioned above, we need to run our webapp project at the same time as our integration test project, so that our integration tests can make calls to the webapp’s endpoints. In addition, our integration tests need a MongoDB database populated with some test data.

1) Populating the Mongo database for testing:

All our integration test classes extend from a common trait (below). This trait populates the MongoDB database at the start of each test:

trait JsonIntegrationTest extends FeatureSpec with BeforeAndAfterEach with ShouldMatchers {
    implicit val formats = new Formats {
        val dateFormat = DefaultFormats.lossless.dateFormat
    }

    override def beforeEach = {
        IntegrationDB.init
        IntegrationDB.eval(suiteData)
    }
}

The init method populates the database. The name of the MongoDB database is taken from a configuration file:

object IntegrationDB extends Assertions {
    private val config = new ConfigurationFactory getConfiguration("my-api")
    val dbName = config.getStringProperty("mongo.db")
    val db = new Mongo().getDB(dbName)

    def init = {
        val testDataFile: java.lang.String = "mongodb/IntegrationBaseData.js"
        val testDataFileInputStream = getClass.getClassLoader.getResourceAsStream(testDataFile)
        if (testDataFileInputStream != null) {
            eval(Source.fromInputStream(testDataFileInputStream).mkString)
        } else {
            fail("a message")
        }
    }

    def eval(js: String) = {
        db.eval(js)
    }
}

We are using The Guardian Configuration project to manage the config of all of our Scala sub-projects. One of the features of this library is that it can read properties from a number of sources. All of the properties for our projects are Service Domain Properties, meaning that they will be loaded from files on the classpath. The library loads the properties from whichever correctly named file it encounters first on the classpath.

As mentioned above, our IntegrationTests project has a dependency on the Core project, and therefore the config files of both projects appear on the classpath. Both projects had a properties file with the same name that specified the mongo.db property. The intention was for the integration test properties to override those in the Core project. This did not work as planned because the ordering of the two config files on the classpath could not be guaranteed. sbt does allow you to add additional items to the classpath of your project using the +++ method, and I did try to promote the integration test properties this way (below). Unfortunately it did not guarantee classpath order either.

class IntegrationProject(info: ProjectInfo) extends DefaultProject(info) {
    val pathFinder: PathFinder = Path.lazyPathFinder(mainResourcesPath :: Nil)
    override def testClasspath = pathFinder +++ super.testClasspath
}

The workaround was to set a system property called int.service.domain, which the Guardian Configuration project uses to determine the name of the properties file that should be loaded from the classpath. Our integration test project now has a properties file with a different name to the one in the core project, and the test action in the IntegrationTests project calls a method to switch to the integration test properties before the tests are run.

Now in the integration tests:

val useIntegrationConfig = true;
override def testAction = super.testAction dependsOn (task {setSystemProperty(useIntegrationConfig); None})

private def setSystemProperty(integrationConfiguration: Boolean) = {
    if (integrationConfiguration) System.setProperty("int.service.domain", integrationTestConfigDomain)
    else System.setProperty("int.service.domain", defaultConfigDomain)
}
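The idea behind the workaround can be sketched in Python, with an environment variable standing in for the JVM system property. The config names and values below are purely illustrative:

```python
import os

# Two differently named configs, so lookup never depends on classpath order.
CONFIGS = {
    "my-api": {"mongo.db": "main_db"},
    "my-api-integration": {"mongo.db": "integration_db"},
}


def load_config():
    # An explicit property selects the config by name, instead of taking
    # whichever identically named file happens to appear first.
    domain = os.environ.get("INT_SERVICE_DOMAIN", "my-api")
    return CONFIGS[domain]


# The "test action" flips the switch before the tests run.
os.environ["INT_SERVICE_DOMAIN"] = "my-api-integration"
```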

2) Starting the main web application from the Integration test project:

As I mentioned above, we’d found it wasn’t possible to start up the webapp as well as run the integration tests using Maven. With sbt it is possible. Another webapp project was declared inside the definition of the IntegrationTests project. This sbt project has a separate output path and Jetty port from the main webapp, which enables us to keep the main webapp running and run the integration tests at the same time.

class IntegrationProject(info: ProjectInfo) extends DefaultProject(info) {
    val useIntegrationConfig = true;

    lazy val localJettyPort = 8071
    lazy val localWebappOutputPath: Path = "target" / "localWebappTarget"
    lazy val localWebapp = project(webappProjectPath, "IntegrationTestWebapp",
        info => new WebappProject(info, localJettyPort, localWebappOutputPath, useIntegrationConfig), core)

    override def testAction = super.testAction dependsOn (task {setSystemProperty(useIntegrationConfig); None}, localWebapp.jettyRestart)
    lazy val startLocalWebapp = localWebapp.jettyRestart
    lazy val stopLocalWebapp = localWebapp.jettyStop
}

class WebappProject(info: ProjectInfo, port: Int, targetOutputDir: Path, useIntegrationTestConfiguration: Boolean) extends DefaultWebProject(info) {
    override def outputPath = if (targetOutputDir != "default") targetOutputDir else super.outputPath

    val jetty7Webapp = "org.eclipse.jetty" % "jetty-webapp" % "7.0.2.RC0" % "test"
    val jetty7Server = "org.eclipse.jetty" % "jetty-server" % "7.0.2.RC0" % "test"

    override def jettyPort = port
    override def jettyContextPath = "/my-api"
    override def jettyRunAction = super.jettyRunAction dependsOn (task {setSystemProperty(useIntegrationTestConfiguration); None})
}

Note that the testAction for the IntegrationTests project starts up the webapp but does not shut it down after the tests have finished. I did try several techniques for making this happen, including the one below, but this attempt started Jetty, stopped Jetty and then tried to run the tests. Please let me know if you have any better ideas 🙂

def startJettyAndTestAction = super.testAction dependsOn (localWebapp.jettyRun)
override def testAction = task{None} dependsOn (startJettyAndTestAction, localWebapp.jettyStop)

Written by lizdouglass

November 1, 2010 at 9:47 pm

Posted in Uncategorized


MongoDB


Recently I started on a project that is using some interesting technologies, including Scala, MongoDB and Django. Some of these are quite new to me and I’ve learnt a great deal. Here are some observations.

MongoDB

Mongo is a schema-less document-oriented database that stores data in binary encoded JSON documents – BSON documents. The online documentation is quite good and there is a good tutorial and an online interactive shell.

How have we been using Mongo so far?

We have two projects that interact with Mongo: one is a RESTful API back-end and the other is a tool for populating a Mongo database with data from a MySQL database. Both are written in Scala and use the Mongo Java API.

Why Mongo?

– The JSON-like documents allow us to store data in a way that is obvious, because they read like plain English. We need to store information about the users of our system. In nearly all cases the information available is different for each person – some people have no contact phone numbers, others have 6 children. Mongo has allowed us to build up profiles for our users and include only the pieces of information that we actually have for them.

– The schema-less and denormalised nature of the database means that we can modify the structure frequently. We only started this project a couple of weeks ago and have already made several quite large changes. These include how we organise the Mongo documents that we’ve been generating from a legacy MySQL database. This sort of flexibility is fantastic, especially at the beginning of a new project.

Populating the Mongo database

Our data migration project extracts data for each record in the MySQL database using a SQL query. This data is then used to create a Person domain object, which is composed of microtypes like the one below. The Person type, as well as all of these microtypes, implements the ConvertableToMongo trait:

class NextOfKin(val relationship: Relationship, val person: Person) extends ConvertableToMongo {
  def toMongoObject: DBObject = {
    new BasicDBObject(Map(
      "relationship" -> relationship.toMongoObject,
      "person" -> person.toMongoObject).asJava)
  }
}

where:

class Relationship(val description: String) extends ConvertableToMongo {
  def toMongoObject(): DBObject = {
    new BasicDBObject(Map("relationship" -> description).asJava)
  }
}

ConvertableToMongo is a trait:

trait ConvertableToMongo {
  def toMongoObject: DBObject
}

Note that we need to use the asJava method from the scala-javautils library to convert the Scala map to the Java map required by the Mongo API.

The ConvertableToMongo trait has a single method that returns a Mongo DBObject. These are inserted into a Mongo collection like this:

val usersCollection = mongo.getCollection("user")
members foreach(user => usersCollection insert(user toMongoObject))

The end result is a Mongo document like this one:

{
	"_id" : ObjectId("4c29f7fdbe924173a47a759f"),
	"firstName" : "Joe",
	"surname" : "Bloggs",
	"gender" : "Male",
	"nextOfKin" : {
		"relationship" : "Son",
		"person" : {
			"name" : "John Bloggs"
		}
	}
}

Note that unless one is specified, every document added to a Mongo collection is automatically assigned an ObjectId under the key _id.
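The composition the ConvertableToMongo trait enables can be sketched in Python, with plain dicts standing in for DBObjects. Names mirror the Scala classes above; this is a sketch, not the project’s code:

```python
# Each microtype knows how to render itself as a plain dict, and a
# parent object builds the nested document by composing those dicts.
class Relationship:
    def __init__(self, description):
        self.description = description

    def to_mongo_object(self):
        return {"relationship": self.description}


class Person:
    def __init__(self, name):
        self.name = name

    def to_mongo_object(self):
        return {"name": self.name}


class NextOfKin:
    def __init__(self, relationship, person):
        self.relationship = relationship
        self.person = person

    def to_mongo_object(self):
        # Compose the nested document from the parts' own conversions.
        return {
            "relationship": self.relationship.to_mongo_object(),
            "person": self.person.to_mongo_object(),
        }


doc = NextOfKin(Relationship("Son"), Person("John Bloggs")).to_mongo_object()
```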

Written by lizdouglass

July 19, 2010 at 8:31 am

Posted in Uncategorized


Part 3: Recursive menu building


Part 3 of the Freemarker cleanup involved refactoring menu creation. The application has several menus that appear on the left side of the browser window. These menus are either nested inside each other or stacked on top of one another.

Before part 3, the creation of the menus was done by one template called main.ftl, which included other templates (which in turn included others). This was main.ftl:

 
[#ftl]

[#if user.member]
	[#include "/member-menu.ftl"]
[#else]
	[#include "/cross-role-menu.ftl"]
[/#if]
[#if someCondition??]
	[#include "/some-menu.ftl"]
[/#if]
[#if foo.id != user.id]
	[#include "/foo-menu.ftl"]
[/#if]

All of the menu templates had some conditional logic in them – in fact, there was probably an average of about a dozen conditional statements per template. The only place this logic was tested was in the Selenium tests – obviously not ideal.

The aim of this refactoring was to get rid of all the conditional logic in the templates and replace it with tested Java code. This made it possible to render the menus from one single recursive menu template:

 
[#ftl]
[#import "/spring.ftl" as spring /]

[#if menu??]
    [@buildMenu menu=menu depth=0/]
[/#if]

[#macro buildMenu menu depth]
    [#list menu.menuEntries as menuEntry]
        [#if menuEntry.type == "LINK"]
            <div>
                <a href="[@spring.url menuEntry.link?html /]">${menuEntry.caption?html}</a>
            </div>
        [#else]
            [#if menuEntry.menuEntries?size > 0]
                [#if depth > 0]
                    <h3>${menuEntry.caption}</h3>
                    <div id="${menuEntry.name}" class="menu_sub_block">
                [#else]
                    <div id="${menuEntry.name}" class="menu_block">
                    <h5>${menuEntry.caption}</h5>
                [/#if]
                [@buildMenu menu=menuEntry depth=depth+1/]
                </div>
            [/#if]
        [/#if]
    [/#list]
[/#macro]

The new template builds all the menus from the menu entry in the model. This is added by the MenuInterceptor, which has dependencies on four factories that create the required menus. Its postHandle method creates a rootMenu that contains the other menus:

 
public class MenuInterceptor extends HandlerInterceptorAdapter {
    private FooMenuFactory fooMenuFactory;
    private CrossRoleMenuFactory crossRoleMenuFactory;
    private MemberMenuFactory memberMenuFactory;
    private SomeMenuFactory someMenuFactory;

    @Autowired
    public void setFooMenuFactory(FooMenuFactory fooMenuFactory) {
        this.fooMenuFactory = fooMenuFactory;
    }

    // omitted setters for the other factories

    @Override
    public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler, ModelAndView mv)
            throws Exception {

        if (mv != null) {
            User user = (User) request.getAttribute(RequestAttributeNames.user.name());

            Menu rootMenu = new Menu("menu");
            Menu memberMenu = memberMenuFactory.createMemberMenu(user);
            Menu crossRoleMenu = crossRoleMenuFactory.createCrossRoleMenu(user);
            Menu someMenu = someMenuFactory.createSomeMenu(user);
            Menu fooMenu = fooMenuFactory.createFooMenu(user);
            rootMenu.addEntries(memberMenu, crossRoleMenu, someMenu, fooMenu);

            mv.addObject("menu", rootMenu);
        }
    }
}

Each of the menu factories creates a Menu object. The Menu class has a list of MenuEntry objects. There are two implementations of the MenuEntry interface: Link and Menu:

The MenuEntry interface:

 
public interface MenuEntry {

    String getCaption();

    MenuEntryType getType();
}

Menu:

 
public class Menu implements MenuEntry {
    private String caption = "";

    private List<MenuEntry> menuEntries;
    private final String name;

    public Menu(String name) {
        this.name = name;
        menuEntries = Lists.create();
    }

    public void setCaption(String caption) {
        this.caption = caption;
    }

    public void addEntry(MenuEntry menuItem) {
        menuEntries.add(menuItem);
    }

    public String getCaption() {
        return caption;
    }

    public List<MenuEntry> getMenuEntries() {
        return menuEntries;
    }

    public void addEntries(MenuEntry... links) {
        menuEntries.addAll(Arrays.asList(links));
    }

    public MenuEntryType getType() {
        return MenuEntryType.MENU;
    }

    public String getName() {
        return name;
    }
}

… and Link:

 
public class Link implements MenuEntry {

    private final String captionText;
    private final SecureContextLinks.Link link;

    public Link(String captionText, SecureContextLinks.Link link) {
        this.captionText = captionText;
        this.link = link;
    }

    public String getCaption() {
        return captionText;
    }

    public String getLink() {
        return link.toString();
    }

    public MenuEntryType getType() {
        return MenuEntryType.LINK;
    }
}

Each of the factories creates a menu using the logic that was previously in the Freemarker templates. Each factory was developed using TDD and looks a bit like this:

 
public class FooMenuFactory {
    private FooRepository fooRepository;
    private final SecureContextLinks links = new SecureContextLinks();

    public FooMenuFactory(FooRepository fooRepository) {
        this.fooRepository = fooRepository;
    }

    public Menu createFooMenu(User user) {
        Menu fooMenu = new Menu("foo_menu");

        fooMenu.setCaption(user.getDisplayName());

        if (user.isMember()) {
           fooMenu.addEntry(new Link("Summary", links.getSomeSummary()));
        }

        // add other links

        return fooMenu;
    }
}

Moving to this style of menu creation reduced seven templates down to one recursive template. All of the logic that was previously in the templates was moved into Java and unit tested. (Yay!)
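The recursive walk the template performs can be sketched in Python (simplified: captions and headings omitted, names illustrative):

```python
# A menu tree: entries are either links or nested menus, and one
# recursive function renders the whole thing.
class Link:
    def __init__(self, caption, href):
        self.caption, self.href = caption, href
        self.entries = []  # a link has no children


class Menu:
    def __init__(self, name, entries=None):
        self.name = name
        self.entries = entries or []


def build_menu(menu, depth=0):
    # Emit an anchor for each link; recurse into each non-empty
    # sub-menu, choosing the CSS class by depth as the macro does.
    parts = []
    for entry in menu.entries:
        if isinstance(entry, Link):
            parts.append('<div><a href="%s">%s</a></div>' % (entry.href, entry.caption))
        elif entry.entries:
            css = "menu_sub_block" if depth > 0 else "menu_block"
            parts.append('<div id="%s" class="%s">' % (entry.name, css))
            parts.append(build_menu(entry, depth + 1))
            parts.append("</div>")
    return "\n".join(parts)


root = Menu("menu", [Menu("foo_menu", [Link("Summary", "/my-app/summary")])])
html = build_menu(root)
```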

Written by lizdouglass

January 11, 2010 at 6:19 am

Posted in Uncategorized


Part 2: SecureLinks


Part two of the Freemarker cleanup came about when we noticed that we were adding the same links many times into various models. This repetition was eliminated by adding the most common links into the model using an interceptor.

The interceptor has a postHandle method that adds an instance of our new SecureLinks class (below) into the model:

 
@Override
public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler, ModelAndView mv)
        throws Exception {

    if (mv != null) {
        mv.addObject("links", new SecureLinks());
    }
}

The SecureLinks class provides the most commonly used links in our application:

 
import my.app.LinkBuilder;
import my.app.QueryString;
import my.app.controller.secure.foo.FooController;
import my.app.controller.secure.bar.BarController;

public class SecureLinks {

    private static final LinkBuilder BUILDER = new SecureLinkBuilder();

    public Link getFooSearch() {
        return new PlainLink(FooController.class);
    }

    public Link getBarList() {
        return new LinkWithQueryString(BarController.class, new QueryString().add("letter", "A"));
    }
    
    // other methods omitted for brevity

    public static interface Link {
    }

    public static class PlainLink implements Link {
        private final Class<?> controller;
        private final String methodName;

        public PlainLink(Class<?> controller) {
            this(controller, null);
        }

        public PlainLink(Class<?> controller, String methodName) {
            this.controller = controller;
            this.methodName = methodName;
        }

        @Override
        public String toString() {
            if (methodName != null) {
                return BUILDER.linkTo(controller, methodName);
            }
            return BUILDER.linkToGet(controller);
        }
    }

    public static class LinkWithQueryString implements Link {
        private final Class<?> controller;
        private final String methodName;
        private final QueryString query;

        public LinkWithQueryString(Class<?> controller, String methodName, QueryString query) {
            this.controller = controller;
            this.methodName = methodName;
            this.query = query;
        }

        public LinkWithQueryString(Class<?> controller, QueryString query) {
            this(controller, null, query);
        }

        @Override
        public String toString() {
            if (methodName != null) {
                return BUILDER.linkTo(controller, methodName, query);
            }
            return BUILDER.linkToGet(controller, query);
        }
    }
}

Adding the links into the model in the interceptor means that there is something extra in each ModelMap that may not necessarily be used, but we thought that was worth it to remove the repetition.

Written by lizdouglass

January 6, 2010 at 5:29 am

Posted in Uncategorized


Java LinkBuilder


Shortly before Christmas, Tom and I decided to try to remove logic from some of our project’s Freemarker templates. Our first step was to move away from building up URIs inside templates, by creating some sort of Java URI builder.

Background:
We had lots of Freemarker template snippets that looked like this:

[#if member.someType.value != 'Foo']
     <div>[@macros.link_to 'Bar reports' 'member/report?id=${id}&amp;type=M&amp;database=${database}' /]</div>
[/#if]

All of them used the link_to Freemarker macro, which was defined as:

[#macro link_to caption path target='_top']
    [#if path?starts_with("/")]
        [#assign newPath=path?substring(1) /]
    [#else]
        [#assign newPath=path /]
    [/#if]
    <a href="[@spring.url '/appSecureServletPath/${newPath}' /]" target="${target}">${caption}</a>
[/#macro]

What we wanted to achieve first up:

Our aim was to replace all the uses of the link_to macro, instead generating all the URIs in Java and adding them to the model. We wanted to move to a template usage like this:

<a href="[@spring.url contactDetailsChange?html /]">Change your contact details</a>

Where the link is added in the controller like so:

modelAndView.addObject("contactDetailsChange", linkBuilder.linkTo(ContactDetailsController.class));

Creating the LinkBuilder:

We did an assessment of all the URI endpoints in our project and realised that:
– We have some controller classes with a request mapping, some controllers that have methods with request mappings and some controllers with a combination of both class and method mappings.
– We have some controllers with more than one GET request method mapping and therefore could not assume only one request method mapping per controller.

We test-drove a LinkBuilder interface and a DefaultLinkBuilder implementation covering all the combinations we found in the controllers. The idea was to link from one handler class/method to another without being concerned about the specific URI-mapped paths.

We also included methods that return a ModelAndView for redirecting and forwarding from a controller. Some of the methods also take a QueryString microtype (see below).

This is the LinkBuilder interface:

import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.servlet.ModelAndView;

public interface LinkBuilder {

    String linkTo(Class<?> controller, String methodName);

    String linkTo(Class<?> controller, RequestMethod method);

    String linkTo(Class<?> controller, String methodName, QueryString query);

    String linkTo(Class<?> controller, RequestMethod method, QueryString query);

    String linkToGet(Class<?> controller);

    String linkToGet(Class<?> controller, QueryString query);

    String forwardTo(Class<?> controller, String methodName);

    ModelAndView redirectTo(Class<?> controller, String method);

    ModelAndView redirectTo(Class<?> controller);

    ModelAndView redirectTo(Class<?> controller, String method, QueryString query);

    ModelAndView redirectTo(Class<?> controller, QueryString query);
}
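The indirection this buys can be sketched in Python: call sites name a handler and an optional query string, and a single registry owns the actual paths. The registry and names below are purely illustrative; the real implementation derives the paths from the Spring request-mapping annotations rather than a hand-written table:

```python
# Map (handler, method-name) pairs to the paths they are mounted on.
# In the real LinkBuilder this information comes from @RequestMapping
# annotations; here it is an illustrative hand-written table.
ROUTES = {
    ("ContactDetailsController", None): "/secure/contact-details",
    ("ReportController", "memberReport"): "/secure/member/report",
}


def link_to(controller, method_name=None, query=None):
    """Build a URI from a handler reference rather than a hard-coded path."""
    path = ROUTES[(controller, method_name)]
    return path + ("?" + query if query else "")


def redirect_to(controller, method_name=None):
    # Mirrors the redirectTo methods: prefix the resolved link.
    return "redirect:" + link_to(controller, method_name)


url = link_to("ReportController", "memberReport", "id=7&type=M")
```

If a mapped path ever changes, only the registry (in the real code, the annotation) changes; no call site or template needs touching.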

And the DefaultLinkBuilder:

import org.apache.commons.lang.ArrayUtils;
import org.springframework.core.annotation.AnnotationUtils;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.servlet.ModelAndView;

import java.lang.reflect.Method;

public class DefaultLinkBuilder implements LinkBuilder {

    private final String prefix;

    public DefaultLinkBuilder() {
        this("");
    }

    public DefaultLinkBuilder(String prefix) {
        this.prefix = prefix;
    }

    public String linkTo(Class<?> controller, String methodName) {
        return prefix + getControllerUrl(controller) + getMethodUrl(controller, methodName);
    }

    public String linkTo(Class<?> controller, RequestMethod method) {
        return prefix + getControllerUrl(controller) + getMethodUrl(controller, method);
    }

    public String linkTo(Class<?> controller, String methodName, QueryString query) {
        return linkTo(controller, methodName) + "?" + query;
    }

    public String linkTo(Class<?> controller, RequestMethod method, QueryString query) {
        return linkTo(controller, method) + "?" + query;
    }

    public String linkToGet(Class<?> controller) {
        return linkTo(controller, RequestMethod.GET);
    }

    public String linkToGet(Class<?> controller, QueryString query) {
        return linkTo(controller, RequestMethod.GET, query);
    }

    public String forwardTo(Class<?> controller, String methodName) {
        return "forward:" + linkTo(controller, methodName);
    }

    public ModelAndView redirectTo(Class<?> controller, String method) {
        return new ModelAndView("redirect:" + linkTo(controller, method));
    }

    public ModelAndView redirectTo(Class<?> controller) {
        return new ModelAndView("redirect:" + linkTo(controller, RequestMethod.GET));
    }

    public ModelAndView redirectTo(Class<?> controller, String method, QueryString query) {
        ModelAndView mv = redirectTo(controller, method);
        query.addToModel(mv.getModel());
        return mv;
    }

    public ModelAndView redirectTo(Class<?> controller, QueryString query) {
        ModelAndView mv = redirectTo(controller);
        query.addToModel(mv.getModel());
        return mv;
    }

    private String getControllerUrl(Class<?> controller) {
        RequestMapping annotation = AnnotationUtils.findAnnotation(controller, RequestMapping.class);
        if (annotation != null) {
            return getFirstValue(annotation);
        }
        return "";
    }

    private String getMethodUrl(Class<?> controller, String methodName) {
        Method[] methods = controller.getMethods();
        for (Method method : methods) {
            if (method.getName().equals(methodName)) {
                RequestMapping annotation = AnnotationUtils.findAnnotation(method, RequestMapping.class);
                if (annotation != null) {
                    return getFirstValue(annotation);
                }
            }
        }
        throw new IllegalArgumentException("Cannot find method with name " + methodName
                + " with a RequestMapping annotation on controller " + controller.getName());
    }

    private String getMethodUrl(Class<?> controller, RequestMethod requestMethod) {
        Method[] methods = controller.getMethods();
        for (Method method : methods) {
            RequestMapping annotation = AnnotationUtils.findAnnotation(method, RequestMapping.class);
            if (annotation != null) {
                if (ArrayUtils.contains(annotation.method(), requestMethod)) {
                    return getFirstValue(annotation);
                }
            }
        }
        throw new IllegalArgumentException("Cannot find method that can handle " + requestMethod
                + " requests on controller " + controller.getName());
    }

    private String getFirstValue(RequestMapping annotation) {
        String[] value = annotation.value();
        if (value.length > 0) {
            return value[0];
        }
        return "";
    }
}
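The annotation lookup in getMethodUrl is plain reflection. Here is a standalone illustration of the same scan, using a made-up @Path annotation in place of Spring's @RequestMapping so that the sketch compiles without Spring (the annotation and controller names are assumptions, not from our codebase):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class MethodUrlDemo {

    // Stand-in for @RequestMapping: holds a single mapped path.
    @Target(ElementType.METHOD)
    @Retention(RetentionPolicy.RUNTIME)
    @interface Path {
        String value();
    }

    static class FooController {
        @Path("/list")
        public String list() {
            return "";
        }
    }

    // Scan the controller's public methods for one with the given name
    // that also carries the mapping annotation, and return its path.
    static String getMethodUrl(Class<?> controller, String methodName) {
        for (Method method : controller.getMethods()) {
            if (method.getName().equals(methodName)) {
                Path annotation = method.getAnnotation(Path.class);
                if (annotation != null) {
                    return annotation.value();
                }
            }
        }
        throw new IllegalArgumentException("No @Path method named " + methodName);
    }

    public static void main(String[] args) {
        System.out.println(getMethodUrl(FooController.class, "list")); // prints "/list"
    }
}
```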

We have two subclasses of DefaultLinkBuilder: SecureLinkBuilder and UnsecureLinkBuilder. These simply set the appropriate servlet path as the prefix.
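The post doesn't include those subclasses, so here is a guessed sketch of the pattern. The servlet paths and the cut-down stand-in for DefaultLinkBuilder are assumptions made so the example is self-contained:

```java
public class PrefixDemo {

    // Cut-down stand-in for DefaultLinkBuilder: only the prefix handling matters here.
    static class DefaultLinkBuilder {
        protected final String prefix;

        DefaultLinkBuilder(String prefix) {
            this.prefix = prefix;
        }

        String linkTo(String path) {
            return prefix + path;
        }
    }

    // Assumed servlet paths; the real values depend on the application's web.xml.
    static class SecureLinkBuilder extends DefaultLinkBuilder {
        SecureLinkBuilder() {
            super("/secure");
        }
    }

    static class UnsecureLinkBuilder extends DefaultLinkBuilder {
        UnsecureLinkBuilder() {
            super("/public");
        }
    }

    public static void main(String[] args) {
        System.out.println(new SecureLinkBuilder().linkTo("/accounts"));   // prints "/secure/accounts"
        System.out.println(new UnsecureLinkBuilder().linkTo("/accounts")); // prints "/public/accounts"
    }
}
```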

The QueryString class knows how to add itself into the model:

import org.apache.commons.httpclient.NameValuePair;
import org.apache.commons.httpclient.util.EncodingUtil;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class QueryString {

    private final List<NameValuePair> pairs = new ArrayList<NameValuePair>();

    public QueryString add(String name, String value) {
        pairs.add(new NameValuePair(name, value));
        return this;
    }

    public boolean isEmpty() {
        return pairs.isEmpty();
    }

    public NameValuePair[] toArray() {
        return pairs.toArray(new NameValuePair[pairs.size()]);
    }

    public void addToModel(Map<String, Object> model) {
        for (NameValuePair pair : pairs) {
            model.put(pair.getName(), pair.getValue());
        }
    }

    @Override
    public String toString() {
        return EncodingUtil.formUrlEncode(toArray(), "UTF-8");
    }
}
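toString() delegates to Commons HttpClient's EncodingUtil for the form-url-encoding. A rough standalone equivalent using only the JDK's URLEncoder (an approximation of what the class produces, not the library call it actually makes) looks like this:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

public class QueryStringDemo {

    // Form-url-encode each name and value and join the pairs with '&',
    // mirroring what QueryString.toString() returns.
    static String encode(Map<String, String> pairs) throws UnsupportedEncodingException {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : pairs.entrySet()) {
            if (sb.length() > 0) {
                sb.append('&');
            }
            sb.append(URLEncoder.encode(e.getKey(), "UTF-8"))
              .append('=')
              .append(URLEncoder.encode(e.getValue(), "UTF-8"));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        Map<String, String> pairs = new LinkedHashMap<String, String>();
        pairs.put("foo", "bar");
        pairs.put("name", "a b");
        System.out.println(encode(pairs)); // prints "foo=bar&name=a+b"
    }
}
```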

Creating the LinkBuilder interface and its implementations allowed us to go through all the controllers and add links into the models. Doing this let us remove a lot of string concatenation from the templates.

Written by lizdouglass

January 5, 2010 at 10:13 am

Posted in Uncategorized

Tagged with , ,

Book Club: Working Effectively With Legacy Code – Chapters 17, 18 and 19 (Michael Feathers)

leave a comment »

This week at book club we continued the three-chapter theme, taking on chapters 17, 18 and 19 of Working Effectively With Legacy Code. Chapter 17 is the most lengthy of the three and we spent almost all of our time discussing that one.

Chapter 17: My Application Has No Structure

In this chapter Feathers discusses why code bases degrade. He says that “They (applications) might have started out with a well-thought-out architecture, but over the years, under schedule pressure, they can get to the point where nobody really understands their complete structure.” He gives three possible reasons why a team may be unaware of this happening:

  • The system is so complex that it takes a long time to get the big picture
  • The system can be so complex that there is no big picture
  • The team is in a very reactive mode, dealing with emergency after emergency so much that they lose sight of the big picture

Feathers details some suggestions for how to communicate the intent of a system including ‘Telling the Story of the System’, a process where “One person starts off by asking the other ‘What is the architecture of the system?’ Then the other person tries to explain the architecture of the system using only a few concepts.” This is useful because “Often when we force ourselves to communicate a very simple view of the system, we can find new abstractions.”

  • I think all of us have used this technique when first coming to a project to try and get an understanding of the system. I usually have this type of conversation with each pair for the first couple of weeks on a project.
  • Ahrum said that in his experience developers find it very difficult to explain a system simply because we generally go into too much detail. Details about logging and security are examples of the noise that creeps into these explanations.
  • Tom said that it’s important to speak with everyone on the project, not just the architect. He finds that architects generally focus too much on application layering.
  • I think this technique is related to Dan’s idea of having a project shaman.

Feathers also discussed Naked CRC. This is a lightweight version of CRC where there is no writing on the cards. “The person describing the system uses a set of blank index cards and lays them down on a table one by one. He or she can move the cards, point at them, or do whatever else is needed to convey the typical objects in the system and how they interact”. I’ve never used this approach before but Anita and I both said that we would like to try it because we think we’re visual learners. Tom suggested that combining the cards and a whiteboard could also be useful, especially to show interactions between components.

We also spent some time discussing other techniques for getting the big picture of an application, especially when encountering it for the first time:

  • Reading the code can be a good way to spot major concepts or the lack of them.
  • Raphael said that it’s useful to listen to the language that people use to describe the system and to compare that to the concepts in the code. Sarah and Mark both recently posted on this.
  • Tom said that he likes to make lots of mindmaps when he first goes onto a new project. He said he generally likes to observe for about a month before he attempts any large refactorings.

Written by lizdouglass

December 19, 2009 at 7:28 am

Posted in Uncategorized

Tagged with

Adding a Spring aspect

leave a comment »

Yesterday Tom and I added a Spring aspect to our front end project. My only previous encounter with aspects was when I read a Spring book about a year ago. So, what is an aspect? According to the documentation “Aspects enable the modularization of concerns such as transaction management that cut across multiple types and objects”.

Tom and I needed to redirect the user to an ‘eek’ page if they tried to perform certain transactions while their account was in a bad state. An aspect seemed like the tool for the job. But what sort of aspect? As Tom explained to me, the Around advice is the only type that can be used to change the flow of execution:

Around advice: Advice that surrounds a join point such as a method invocation…. Around advice can perform custom behavior before and after the method invocation. It is also responsible for choosing whether to proceed to the join point or to shortcut the advised method execution by returning its own return value or throwing an exception.

We created the new aspect in these steps:

1. Create a new method annotation. This annotation was applied to all the controller method join points. According to the Spring documentation this is what is known as an “@annotation pointcut designator – the subject of the join point matches only methods that have the given annotation.”

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target({ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
public @interface TransactionPermitted {
}
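The RetentionPolicy.RUNTIME element is what lets the aspect discover the annotation later through reflection. A minimal self-contained check (the class and method names here are illustrative, not from our codebase):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class RetentionDemo {

    // Without RetentionPolicy.RUNTIME the annotation would be discarded by
    // the class loader and invisible to the aspect's pointcut matching.
    @Target(ElementType.METHOD)
    @Retention(RetentionPolicy.RUNTIME)
    @interface TransactionPermitted {
    }

    static class WibbleController {
        @TransactionPermitted
        public String myMethod() {
            return "ok";
        }
    }

    public static void main(String[] args) throws Exception {
        Method m = WibbleController.class.getMethod("myMethod");
        System.out.println(m.isAnnotationPresent(TransactionPermitted.class)); // prints "true"
    }
}
```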

2. Create the aspect using TDD. Our aspect checks whether the user’s account is in a bad state and redirects to the “transactions-not-permitted” view if it is. We used the example in the documentation to create our aspect. Tom pointed out a couple of things while we were doing this that I think are important to remember:

  • Aspects need a no-argument constructor.
  • From the docs: “The value returned by the around advice will be the return value seen by the caller of the method”. We had to make sure that all the handler methods advised by this aspect also return a string (and not a ModelAndView for example).
import org.apache.log4j.Logger;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class TransactionPermittedAspect {

    private final Logger logger = Logger.getLogger(TransactionPermittedAspect.class);

    private FooRepository fooRepository;

    @Autowired
    public void setFooRepository(FooRepository fooRepository) {
        this.fooRepository = fooRepository;
    }

    @Around("@annotation(my.package.TransactionPermitted)")
    public Object intercept(ProceedingJoinPoint pjp) throws Throwable {
        logger.debug("Checking whether transactions are permitted");

        Widget widget = AspectUtils.getArgumentOfType(pjp.getArgs(), Widget.class);
        Bar bar = fooRepository.getBar(widget);

        if (bar.hasSomeBadCondition()) {
            return "transactions-not-permitted";
        }
        return pjp.proceed();
    }
}

Our aspect (above) uses AspectUtils. This class, created by Tom, abstracts the argument searching and casting:

public class AspectUtils {

    public static <T> T getArgumentOfType(Object[] args, Class<T> type) {
        for (Object arg : args) {
            if (type.isInstance(arg)) {
                return type.cast(arg);
            }
        }
        throw new IllegalArgumentException("Cannot find argument of " + type);
    }
}
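A standalone usage sketch of this helper, with a stand-in Widget class since the real argument types aren’t part of this post:

```java
public class AspectUtilsDemo {

    // Stand-in for the real domain type pulled out of the join point arguments.
    static class Widget {
    }

    // Same logic as AspectUtils.getArgumentOfType above: return the first
    // argument that is an instance of the requested type.
    public static <T> T getArgumentOfType(Object[] args, Class<T> type) {
        for (Object arg : args) {
            if (type.isInstance(arg)) {
                return type.cast(arg);
            }
        }
        throw new IllegalArgumentException("Cannot find argument of " + type);
    }

    public static void main(String[] args) {
        // Mimics ProceedingJoinPoint.getArgs(): a mixed bag of handler arguments.
        Object[] joinPointArgs = { "a string", Integer.valueOf(1), new Widget() };
        Widget w = getArgumentOfType(joinPointArgs, Widget.class);
        System.out.println(w != null); // prints "true"
    }
}
```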

3. Apply the annotation pointcut to the target handler methods. This step was trivial, but adding them using TDD was interesting. Each class with a method annotated by our pointcut now has a test like this:

@Test
public void shouldCheckThatTransactionsArePermittedForFoo() {
    assertNotNull("Transaction permission check not found", myMethod().getAnnotation(TransactionPermitted.class));
}

private Method myMethod() {
    return MethodFinder.findRequestHandlerMethod(WibbleController.class, "myMethod");
}

Note that the MethodFinder looks for the target method with a particular name and a RequestMapping annotation:

import static org.junit.Assert.fail;
import org.springframework.web.bind.annotation.RequestMapping;

import java.lang.reflect.Method;

public class MethodFinder {

    public static Method findRequestHandlerMethod(Class<?> type, String methodName) {
        Method[] methods = type.getMethods();
        for (Method method : methods) {
            if (method.getName().equals(methodName) && (method.getAnnotation(RequestMapping.class) != null)) {
                return method;
            }
        }
        fail("Cannot find method named [" + methodName + "] with RequestMapping annotation in " + type.getName());
        return null;
    }
}

Adding this around aspect took us very little time and it perfectly suited our needs.

Written by lizdouglass

December 16, 2009 at 10:19 am

Posted in Uncategorized

DefaultHttpClientIntegrationTest

leave a comment »

Today Tom and I were looking at our DefaultHttpClient class. This class wraps/adapts an Apache HttpClient. It implements our HttpWebClient interface:

public interface HttpWebClient {

    public HttpResponse post(String endpointURL, String requestBody);

    public HttpResponse get(String endpointURL, QueryString query);
}

Until today we didn’t have any tests for this class on its own, although there was some coverage from integration tests for other classes. We added tests that are very similar to some written by Cam and Silvio on another project (full credit to them). There are a couple of interesting things happening in our test class (below):

  • A Jetty server is created in the setup method. The DefaultHttpWebClient test instance makes a request to this server.
  • The findFreePort method uses java.net.ServerSocket to identify a free port for a Jetty Server to run on.
  • Each test sets the handler for the test Jetty server to a new TestHandler. The TestHandler class extends Jetty’s AbstractHandler and delegates to an overloaded handle method that is defined in each test method.
  • In the shouldPerformHttpPost test the handle method returns a response with an OK status code. It also records information from the request in a map called ‘inspector’, which is used in several assertions.

Here it is:

public class DefaultHttpWebClientIntegrationTest {

    private int freePort;
    private Server server;

    @Before
    public void setup() throws Exception {
        freePort = findFreePort();
        server = new Server(freePort);
        server.start();
    }

    @After
    public void teardown() throws Exception {
        server.stop();
    }

    private int findFreePort() throws Exception {
        ServerSocket socket = new ServerSocket(0);
        int port = socket.getLocalPort();
        socket.close();
        return port;
    }

    @Test
    public void shouldPerformHttpPost() {
        final Map<String, String> inspector = Maps.create();

        server.setHandler(new TestHandler() {
            public void handle(HttpServletRequest request, HttpServletResponse response) throws IOException {

                inspector.put("method", request.getMethod());
                inspector.put("contentType", request.getContentType());
                inspector.put("body", IOUtils.toString(request.getInputStream()));

                response.setStatus(HttpServletResponse.SC_OK);
                response.setContentType("text/plain");
                response.getWriter().print("test response");
            }
        });

        DefaultHttpWebClient client = new DefaultHttpWebClient();
        HttpResponse response = client.post("http://localhost:" + freePort + "/test", "test payload");

        assertThat(inspector.get("method"), equalTo("POST"));
        assertThat(inspector.get("contentType"), equalTo("text/xml; charset=UTF-8"));
        assertThat(inspector.get("body"), equalTo("test payload"));

        assertThat(response.isOK(), equalTo(true));
        assertThat(response.getContentType(), startsWith("text/plain"));
        assertThat(response.getResponseBodyAsText(), equalTo("test response"));
    }

    @Test
    public void shouldPerformHttpGet() {
        final Map<String, String> inspector = Maps.create();

        server.setHandler(new TestHandler() {
            public void handle(HttpServletRequest request, HttpServletResponse response) throws IOException {

                inspector.put("method", request.getMethod());
                inspector.put("foo", request.getParameter("foo"));

                response.setStatus(HttpServletResponse.SC_OK);
                response.setContentType("text/plain");
                response.getWriter().print("test response");
            }
        });

        QueryString queryString = new QueryString();
        queryString.add("foo", "bar");

        DefaultHttpWebClient client = new DefaultHttpWebClient();
        HttpResponse response = client.get("http://localhost:" + freePort + "/test", queryString);

        assertThat(inspector.get("method"), equalTo("GET"));
        assertThat(inspector.get("foo"), equalTo("bar"));

        assertThat(response.isOK(), equalTo(true));
        assertThat(response.getContentType(), startsWith("text/plain"));
        assertThat(response.getResponseBodyAsText(), equalTo("test response"));
    }

    private static abstract class TestHandler extends AbstractHandler {

        public void handle(String target, HttpServletRequest request, HttpServletResponse response, int dispatch)
                throws IOException, ServletException {

            handle(request, response);
            ((Request) request).setHandled(true);
        }

        public abstract void handle(HttpServletRequest request, HttpServletResponse response) throws IOException;
    }
}

Written by lizdouglass

December 14, 2009 at 10:19 am

Posted in Uncategorized