Devoxx 2014 Talk now freely available

My talk at DevoxxUK 2014 has now been made freely available* on Parleys. So if you would like me to reprise my previous post in full audiovisual form, go here.

* after registering for a free Parleys account.

Authored by Graham Allan.
Published on 18 February 2015.

Google AutoValue - Easier Value Objects In Java (Part 1, Configuration)

Despite my numerous gripes about working day-to-day in Scala, one of the things I do love about it is case classes. The ability to define, in a single line of code, an immutable class with equals() and hashCode() methods, a decent toString(), and accessor methods (just say no to setters, kids) is such a breath of fresh air compared to Java. Bleh, all that boilerplate just for a value object with a few fields.

Well, a team at Google think they have a decent answer for this very problem in Java: AutoValue.

AutoValue is designed to take the boilerplate out of writing value objects. Using Java’s standardised annotation processing mechanism, you define a class with the properties you want, and a code generator produces all that boilerplate for you. You don’t have to write that crap, but even better, you don’t have to read it either.

Could it be anywhere near competing with Scala’s case classes? To find out, I created a simple project to see what realistic usage of the library might look like. The rest of this post describes how to set up a realistic project to use AutoValue, and first impressions of how the IDEs interact with it. I’m not yet at the point where I can make an informed comparison to case classes.

Before getting into a ton of detail on AutoValue’s features, it was important to me that it be easy:

  • for the project to run with standard build tools, without an IDE
  • for developers new to the project, to get it checked out and usable in their IDE of choice
  • for modification of the code to be immediately recognised in their IDE

In terms of build tools, this meant checking it can work with Maven and Gradle, for IDEs it meant Eclipse and IntelliJ.

Unlike other tools in this space, such as Project Lombok, AutoValue uses the Java standard annotation processing framework defined in JSR 269. This limits how powerful the AutoValue library can be, but it also means there’s not much in the build process that’s really specific to it. In theory this experience should hold for all annotation processors.

The code

I defined one Java class, a super-simplistic representation of a car:

@AutoValue
public abstract class Car {
  public abstract String model();
  public abstract int numWheels();
}

@AutoValue tells the annotation processor to take this class and generate the value type, with all the boilerplate. The generator spits out a Java source file. I haven’t included the source here, for brevity, and because I think we’ve looked at enough of that kind of boilerplate for a lifetime. If you want to confirm the generated source is sensible, you can see what was generated here. The generated class is a package-private subclass, with equals(), hashCode(), and toString() implemented in a reasonable way, saving you the hassle. Once that’s generated, you then define a static factory method in the abstract class Car, returning new instances of the generated subclass. A little bit of hoop-jumpiness, but nothing too insane.
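To make the hoop-jumping concrete, here’s a sketch of the factory-method pattern. The AutoValue_Car below is a hand-written, simplified stand-in for what the processor generates (in a real build it comes from the annotation processor); the field names and the create() method are my own illustration:

```java
// The abstract class you write. The static factory is the only
// place that names the generated subclass.
abstract class Car {
  public abstract String model();
  public abstract int numWheels();

  public static Car create(String model, int numWheels) {
    return new AutoValue_Car(model, numWheels);
  }
}

// A hand-written, simplified stand-in for the generated subclass.
final class AutoValue_Car extends Car {
  private final String model;
  private final int numWheels;

  AutoValue_Car(String model, int numWheels) {
    this.model = model;
    this.numWheels = numWheels;
  }

  @Override public String model() { return model; }
  @Override public int numWheels() { return numWheels; }

  @Override public boolean equals(Object o) {
    if (o == this) return true;
    if (!(o instanceof AutoValue_Car)) return false;
    AutoValue_Car that = (AutoValue_Car) o;
    return model.equals(that.model) && numWheels == that.numWheels;
  }

  @Override public int hashCode() {
    return model.hashCode() * 31 + numWheels;
  }

  @Override public String toString() {
    return "Car{model=" + model + ", numWheels=" + numWheels + "}";
  }
}
```

Callers only ever see Car.create(...); the generated class name never leaks into client code.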

I also added a single JUnit test, with a single assertion: assertImmutable(AutoValue_Car.class);. Using my own Mutability Detector library, this unit test performs static analysis on the generated class, to check it’s immutable. Not only is it a good sanity check to highlight any surprises in how AutoValue generates immutable classes, but it also makes sure that the generate-and-compile portion of the build is working correctly. Note that I can only reference AutoValue_Car.class because I put my test in the same package.

Now that I had a class annotated with @AutoValue, and an expectation of what would be generated, and a way to check compilation worked, I set about seeing how such a setup would be configured across the build tools and IDEs.

Maven

I was pleasantly surprised by Maven (bet you don’t hear that often). I just added this dependency:
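The snippet itself is elided here, but it would have been something along these lines (the version number is an assumption; use whatever is current):

```xml
<dependency>
  <groupId>com.google.auto.value</groupId>
  <artifactId>auto-value</artifactId>
  <version>1.0</version>
</dependency>
```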


The project compiled, and the tests were successful. There was literally nothing else that had to be done to make sure @AutoValue was picked up by the annotation processor. Win. This may be a factor of how annotation processing is integrated into javac, but still, it’s nice when something Just Works™.

Gradle

Again, it Just Worked™: it compiled and executed tests successfully. A nice bonus in this case was that the dependency declaration was shorter, without really losing information: compile '', which is a pleasant aspect of Gradle.
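The dependency string is elided above; it would have looked roughly like this (same caveat about the version number being an assumption):

```groovy
dependencies {
    compile 'com.google.auto.value:auto-value:1.0'
}
```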

Now that we had the kind of build configuration that would allow us to run tests on CI, it was time to move on to the IDEs. My expectation is that a developer should be able to check out the project, import it into their IDE, and have it compile successfully, ready to be worked on.

Eclipse

For Eclipse, to be able to import a project, I expect to run a command from the build tool to generate the config files. The commands I used were mvn eclipse:eclipse and gradle eclipse. Unfortunately here is where the “just work-iness” ended.

Eclipse requires some configuration to tell it that it should run annotation processors in a project. This can be set in the project properties through Eclipse, and I was able to work it out through the GUI without too much furrowing of brow. However, I wanted to get to a point where a new developer could be up-and-running straight away. I never got to that point with either Maven or Gradle. The closest I got was to add two config files to source control (.factorypath and .settings/org.eclipse.jdt.apt.core.prefs). Fortunately in both cases the config files are machine-independent, so it’s pretty safe to share them with your fellow developers.

Maven also required the following snippet to be added to the maven-compiler-plugin’s <configuration> section.


There was also a weird issue I’m blaming on Eclipse. Even with the correct config files, when I imported the project, I needed to turn annotation processing off and on again to get it to work. Nasty.

Overall, not a glowing testimonial for Eclipse, or for Maven’s and Gradle’s support for it.

IntelliJ

For IntelliJ, as well as the build commands mvn idea:idea and gradle idea, I also attempted to import the project directly from IntelliJ. In this case, Maven beat Gradle. The project compiled and ran the tests successfully both using mvn idea:idea, and by importing the Maven “external model” from within IntelliJ. However, for the equivalents in Gradle, I’m still unable to find the right configuration to get it to build successfully. This could just be a general Gradle+IntelliJ problem: the resultant project configuration just doesn’t look right, regardless of annotation processors. It could be that I’m using a fairly recent version of Gradle and the IntelliJ plugin hasn’t caught up, or that I’m just less familiar with the Gradle+IntelliJ combination than Maven+Eclipse, and I’m missing something obvious. Answers on a postcard please.

Conclusion

Apart from the Gradle+IntelliJ combination, annotation processing seems to be well supported across build tools and IDEs. You may have an opinion on code generation, but when your build tool successfully builds without much configuration, and your IDE reflects changes and generates code seamlessly, it’s not that different from a vanilla compiler. Initial impressions are good.

Hopefully in a later post I’ll be able to comment on AutoValue’s feature set in detail, but for now I just wanted to see what the basic usage felt like. Time will tell whether I would consider AutoValue a reasonable compromise in the quest for something like Scala’s case classes for Java.

The entire source code listing can be found in this GitHub project:

Authored by Graham Allan.
Published on 23 July 2014.

Sealing Algebraic Data Types in Java

Recently I caught up on Dick Wall’s Devoxx UK talk, “What have the Monads ever done for us?”*. Dick does a great job of introducing terms such as “monoid”, “functor” and “monad” in a way that’s easy to grasp, even for Java-damaged minds like my own. I thoroughly recommend watching the talk to all developers who have heard these terms and want a simple way to understand them, without trudging through tutorials using silly metaphors like space suits or burritos.

Also, achievement unlocked: met Dick Wall at the speaker’s dinner. A thoroughly (and completely predictably based on the Java Posse podcasts!) lovely chap.

As well as heartily recommending you check out Dick’s talk, I wanted to pick up on something he mentioned, and suggest an approach. Dick introduces an algebraic data type, a simplified linked list implementation, using an abstract class DeadSimpleList with exactly two subclasses: DeadSimpleListNil and DeadSimpleListCons. The base class looks like this:

public abstract class DeadSimpleList<T> {
  public abstract <U> DeadSimpleList<U> map(Function<T, U> function);
  public abstract boolean isEmpty();
  public abstract String contentsAsString();
}

(For clarity, most of the useful functions and generics have been omitted.)

One of the points of these algebraic data types is that subclassing is limited and controlled. As opposed to an interface, they are explicitly intended not to be extended by users of the type. For example, Scala’s abstract Option type is extended by the concrete Some and None types, and nothing else. Some and None entirely define Option’s behaviour. You want to explicitly prevent someone coming along and adding a new extension of Option, like SomethingWhenIFeelLikeIt. If it’s possible for such a beast to exist, it becomes more difficult to reason about.

In talking about how extension should be prohibited, Dick says:

“[Haskell and Scala let you] keep the superclass public, inherit from it, in a controlled way, but then stop anyone else from inheriting from it… and I don’t know that there’s a good answer for that in Java”.

I think there is a good answer. Or at least, as good as these things can ever be in Java.

As the example continues, the two subclasses are declared as public class DeadSimpleListNil<T> and public class DeadSimpleListCons<T>, which would have to reside in different files from the superclass. I think the desired trick is to limit subclassing by reducing visibility of the abstract class’ constructor. Like so:

public abstract class DeadSimpleList<T> {
  public abstract <U> DeadSimpleList<U> map(Function<T, U> function);
  public abstract boolean isEmpty();
  public abstract String contentsAsString();

  // add this constructor
  private DeadSimpleList() { }
}


Adding the private constructor will cause the subclasses to fail to compile, because the constructor of the abstract class will not be visible. The only way to get them to compile is to move the subclasses into a scope where they can invoke the private constructor of the superclass, like so:

public abstract class DeadSimpleList<T> {
  // abstract methods, private constructor, as before

  public static final class DeadSimpleListNil<T> extends DeadSimpleList<T> {
    // method implementations as before
  }

  public static final class DeadSimpleListCons<T> extends DeadSimpleList<T> {
    // concrete constructor and method implementations as before
  }
}

Now you have complete control over how the abstract class is subclassed. Users can still reference the inner classes, and construct instances of them. Crucially, they can’t create their own subclass called RandomlyEmptyDeadSimpleList, and thus nobody has to waste any brain cycles fretting about the possibility of its existence.
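Putting the pieces together, here is a complete, minimal version of the sealed hierarchy that compiles and runs. The Cons fields and constructor are my own guess at the omitted implementations, and map()/contentsAsString() are left out to keep it short:

```java
// A sealed-in-practice list: the private constructor means only the
// nested classes, which live inside this file, can extend it.
abstract class DeadSimpleList<T> {
  private DeadSimpleList() { }  // seals the hierarchy

  public abstract boolean isEmpty();

  public static final class DeadSimpleListNil<T> extends DeadSimpleList<T> {
    @Override public boolean isEmpty() { return true; }
  }

  public static final class DeadSimpleListCons<T> extends DeadSimpleList<T> {
    private final T head;
    private final DeadSimpleList<T> tail;

    public DeadSimpleListCons(T head, DeadSimpleList<T> tail) {
      this.head = head;
      this.tail = tail;
    }

    @Override public boolean isEmpty() { return false; }
  }
}
```

Users can still write `new DeadSimpleList.DeadSimpleListCons<>("hello", new DeadSimpleList.DeadSimpleListNil<String>())`, but `class Rogue<T> extends DeadSimpleList<T>` in another file fails to compile, because the superclass constructor is not visible there.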

Although the semantics are quite different from Scala’s sealed keyword, the outcome is roughly equivalent: only when you control the source code of the base class can you add subclasses. Surprisingly for Java, this mechanism isn’t as annoyingly cumbersome as one might have predicted.

* I missed it when I attended Devoxx UK, since I was giving my talk at the same time on another track. It’s common for me to visit conferences where talks I’m interested in clash; this was the first time one of those talks was mine.

Authored by Graham Allan.
Published on 17 July 2014.
Tags: devoxxuk , java , scala , software


Graham "Grundlefleck" Allan is a Software Developer living in Scotland. His only credentials as an authority on software are that he has a beard. Most of the time.

© Copyright 2013-2017