Running Multiple Versions of a Micro Service

Last week I visited GOTO Amsterdam. There were some fantastic speakers talking about micro services. At Magnet.me we’re spending a lot of effort transforming our monolithic application into micro services, so we were curious to see whether we were on the right track. It turns out we are, and I’m very proud of the team for that. One talk in particular stood out and gave me a new insight, which I’d like to share here.

The talk was by Fred George, who presented this definition of micro services, which I thought was pretty accurate:

Fred told a very interesting story about rolling out new versions of micro services. He gave the example of a car rental company that had a micro service responsible for generating the ads shown to customers on the website. The system was connected via a message bus. Because the advertisement service was so small, it was easy for programmers to duplicate the micro service (literally just copy/paste it), optimize it, and then run the new version alongside the older version. This sounded very wrong and horrible to me at first, because you’re duplicating code and you’d have to “maintain” two versions of a service. But then Fred shed some light on the potentially huge benefits.


Running and Testing Puppet Master Locally

About a month ago we started using Puppet at Magnet.me. Puppet now automates all our server provisioning, from our dev server to the Raspberry Pi we have running at the office. While learning Puppet, we could not find any information about developing Puppet manifests locally and testing them on local copies of the machines. As a result, we ended up editing the Puppet manifests directly over an SSH connection. This prevented us from having a nice pull-request Git flow and often led to syntax errors in the manifests, making it impossible for operational nodes to update. In this blog post, I will explain the solution we came up with to test our configurations on local VMs.


Cache-Control Using Annotations With Jersey

Building RESTful services with JAX-RS is awesome, but there’s no annotation-based notation to set your Cache-Control headers. You can either set the Cache-Control headers using a filter based on a URL pattern and map your responses through there, or you can use the ResponseBuilder with the CacheControl class like this:

CacheControl control = new CacheControl();
control.setMaxAge(60);
Response.ok(myEntity).cacheControl(control).build();

However, I would rather have something like this:

@GET
@CacheMaxAge(time = 10, unit = TimeUnit.MINUTES)
@Path("/awesome")
public String returnSomethingAwesome() {
  ...
}

It turns out that’s pretty easy to set up. I mostly use either no caching at all or caching for a certain period.
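As a sketch of one possible setup (not necessarily how I implemented it originally): define the annotation yourself, then register a response filter for each annotated resource method. This assumes the JAX-RS 2.x DynamicFeature API; the CacheMaxAgeFeature name is my own.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.concurrent.TimeUnit;

import javax.ws.rs.container.ContainerResponseFilter;
import javax.ws.rs.container.DynamicFeature;
import javax.ws.rs.container.ResourceInfo;
import javax.ws.rs.core.FeatureContext;
import javax.ws.rs.ext.Provider;

// The custom annotation; the name and fields are my own.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface CacheMaxAge {
    int time();
    TimeUnit unit();
}

// Registers a response filter for every resource method carrying the annotation.
@Provider
public class CacheMaxAgeFeature implements DynamicFeature {

    @Override
    public void configure(ResourceInfo resourceInfo, FeatureContext context) {
        CacheMaxAge annotation = resourceInfo.getResourceMethod().getAnnotation(CacheMaxAge.class);
        if (annotation != null) {
            long seconds = annotation.unit().toSeconds(annotation.time());
            context.register((ContainerResponseFilter) (request, response) ->
                response.getHeaders().putSingle("Cache-Control", "max-age=" + seconds));
        }
    }
}
```

With this in place, a method annotated with @CacheMaxAge(time = 10, unit = TimeUnit.MINUTES) gets a Cache-Control: max-age=600 header without touching the ResponseBuilder.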


Using Git Submodules for Maven Artifacts Not in Central

Sometimes you come across a project (or a branch of a project) that you need but that isn’t in Maven Central. If you run your own Maven repository, that’s no problem: you simply deploy the project to it and you’re done. However, when you don’t have your own Maven repository, things get complicated.

You could install the other project in your local Maven repository using mvn install, and then tell other developers via the README that they need to do the same. However, this requires manual effort, which is always error-prone. You’d rather have Maven build that project automatically.

An alternative is to copy/paste the code from that project into your own. This is also not the best idea: you’ll lose track of the version you’ve imported, you’ll have code that doesn’t belong to you in your repository, chances are that no one will update the code after that initial import because it might break, and finally there might be licensing issues.

There is a third option which is a bit more elegant (albeit not perfect). You can import the third party project as a Git submodule or using Git subtree merging. Here are the main differences:

  • You can use submodules if you want to use the code but not import the other repository into yours. This is useful if you want to contribute back to the original repository, or if the original repository is very large and you don’t want to bloat your own. In other words: this only links to the other repository.
  • You can use subtree merging if you just want to import the code into your repository. This is useful if you just want read-only access to the other repository and you’re not planning to contribute back to that repository.

For more info about the differences, read this blog post by Atlassian.

I’m going to focus on using submodules, but you can do the same trick with Maven using subtree merging. I recommend this excellent guide on subtree merging if you decide to go that way. Here’s how it works for submodules:


The Truth About Code Reviews

During the International Conference on Software Engineering 2013, I attended a great talk by Alberto Bacchelli on modern code reviews. It resonated with my own experience with code reviews, and I’d like to share the highlights of his research, conducted among 17 developers from 16 different teams at Microsoft.

Most developers have to do some kind of code reviewing at some point. On GitHub, for example, every pull request can be seen as a code review. At companies like Microsoft, Google and Facebook, code reviews are part of the job. To make this easy for developers, they work with specialized tools like Google’s Rietveld, Facebook’s Phabricator or the open-source tool Gerrit. We do code reviews because it is well known that they offer many advantages. When I have to explain to someone why code reviews are important, I come up with the same arguments most developers do (in order of importance):

  1. finding defects
  2. code quality improvement
  3. alternative solutions
  4. knowledge transfer
  5. shared code ownership

However, research shows a different result:

Although the top motivation driving code reviews is still finding defects, the practice and the actual outcomes are less about finding errors than expected: Defect related comments comprise a small proportion and mainly cover small logical low-level issues


Visiting GitHub

After tweeting some of their employees, GitHub, the most popular online code-hosting platform, allowed us to visit one of their offices in San Francisco. GitHub was launched in 2008 and is a truly unique company. It has over 170 employees, of which 60% work outside the office, all over the world. Employees are encouraged to travel as much as they like. They have a big interactive table (photo) that shows where the employees are on a 3D map. Besides offering complete freedom in workplace and schedule, GitHub also doesn’t have any managers. Employees are expected to pick up work themselves and form their own teams. Everything happens in a distributed fashion, using online tools for discussions and votes. From hiring new employees to choosing what kind of coffee mugs should be available in the kitchen: everything is democratic. Half the office looks like a Starbucks café, for people who don’t care for a desk.

Visiting a company like GitHub, with its radically different management style, was inspiring, and it was awesome to see the people behind such a great company.


LaTeX Build Server

I recently had to work with LaTeX again. Although LaTeX has its perks, like a proper equation editor and BibTeX, I don’t like working with it for several reasons:

  1. there is no proper WYSIWYG editor for OS X that compiles while you type, and the source files are hard to read by themselves
  2. you have to manually configure it to compile LaTeX, then BibTeX, and then LaTeX twice
  3. the horrors of positioning and loading images with all the different compilers, bounding boxes, etc.
  4. the fact that they have their own Stack Exchange Q&A site is an indication of how arcane it is

Because the source files are so unpleasant to read, you always have to make sure you send the compiled latest version of your document to the right people. What I like about Markdown-related projects like Jekyll and Octopress is the way you can push the source to, for example, GitHub, and they build the site for you. I wanted that for LaTeX as well. Using a simple script, I now have my own LaTeX build server. This way my colleagues and I can always see the latest version of the document in any browser, so I don’t have to think about distributing the latest version to the right people or devices.

The end result is a nice and simple listing of the PDF, diff and log in a folder that indicates the build date:


Clean Test Classes Using JUnit’s Rules

A couple of days ago I discovered the beauty of JUnit’s TestRules while searching for an easy way to set a time-out on all tests in a test case. JUnit has a built-in rule for this called Timeout. You can apply this rule to every test in your class by setting the timeout in a field like this:

Setting a Timeout rule (see the Javadoc):
public class MyTest {

  @Rule
  public MethodRule globalTimeout = new Timeout(20);

  @Test
  public void someTest() {
      ...
  }
}
Another gem is the ExpectedException rule, which allows you to inspect a thrown exception in several ways.

Inspecting exceptions (see the Javadoc):
public static class HasExpectedException {
  @Rule
  public ExpectedException thrown = ExpectedException.none();

  @Test
  public void throwsNothing() {
      // no exception expected, none thrown: passes.
  }

  @Test
  public void throwsNullPointerException() {
      thrown.expect(NullPointerException.class);
      throw new NullPointerException();
  }

  @Test
  public void throwsNullPointerExceptionWithMessage() {
      thrown.expect(NullPointerException.class);
      thrown.expectMessage("happened?");
      thrown.expectMessage(startsWith("What"));
      throw new NullPointerException("What happened?");
  }
}

The great thing is, it’s super easy to extend one of these rules.
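For example, a minimal custom rule built on JUnit’s TestRule interface could wrap each test’s Statement to report its run time. The StopwatchRule name and behavior here are my own invention, just to illustrate the mechanism:

```java
import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;

// A hypothetical rule that prints how long each test took.
public class StopwatchRule implements TestRule {

    @Override
    public Statement apply(final Statement base, final Description description) {
        return new Statement() {
            @Override
            public void evaluate() throws Throwable {
                long start = System.nanoTime();
                try {
                    base.evaluate(); // run the actual test
                } finally {
                    long millis = (System.nanoTime() - start) / 1_000_000;
                    System.out.println(description.getMethodName() + " took " + millis + " ms");
                }
            }
        };
    }
}
```

You use it just like the built-in rules: declare a public field annotated with @Rule, and JUnit wraps every test method in the class with it.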


Integration Testing With Jetty

This is a followup after my previous post about separating JUnit tests into fast tests and integration tests. The sample code is available on GitHub.

When building a web application I like to have an integration test suite that resembles the real life situation as best as possible. The code should be able to run without too much effort from a build server like Jenkins and it should be fairly easy to maintain. In this post I will explain how I achieved these goals.

To see what this example app does, run the server by running the main method in com.alexnederlof.inttesting.MyWebServer.java and browse to http://localhost:9090. You can do this from your favorite IDE.
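The real MyWebServer lives in the linked repository; as a rough sketch, an embedded Jetty server along these lines is all it takes (the handler body here is my own placeholder):

```java
import java.io.IOException;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.eclipse.jetty.server.Request;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.AbstractHandler;

// A minimal embedded Jetty server; the actual MyWebServer in the sample
// repository may look different.
public class MyWebServer {

    public static Server createServer(int port) {
        Server server = new Server(port);
        server.setHandler(new AbstractHandler() {
            @Override
            public void handle(String target, Request baseRequest,
                    HttpServletRequest request, HttpServletResponse response)
                    throws IOException {
                response.setContentType("text/plain");
                response.getWriter().println("Hello from Jetty");
                baseRequest.setHandled(true); // mark the request as processed
            }
        });
        return server;
    }

    public static void main(String[] args) throws Exception {
        Server server = createServer(9090); // same port as in the post
        server.start();
        server.join(); // block until the server is stopped
    }
}
```

Because createServer() returns an unstarted Server, an integration test can start it in a @BeforeClass method and stop it in @AfterClass, so the suite exercises the full HTTP stack instead of mocks.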


Separating the Fast From the Slow JUnit Tests

For some time now I’ve been looking for a good way to do real integration testing with JUnit. These tests tend to be slow because the whole stack has to be built up and shut down. Furthermore, some tests need a specific environment, like a database connection, which is not available to every developer. That’s why you probably want to split up your test suite into fast and slow (or dependent) tests. JUnit has a technique to split up these tests using Categories. This lets you specify the category your test belongs to and then skip those tests in your Suite like so:

@RunWith(Categories.class)
@IncludeCategory(SlowTests.class)
@SuiteClasses( { A.class, B.class })
public static class SlowTestSuite { }

The downside here is that you have to specify all the tests in the test suite. As far as I know, JUnit has no mechanism to do this for you. You can work with the Maven Surefire plugin to filter tests in and out, but I think there’s a better way. A man by the name of Johannes Link built a great little library that does just what I want. It allows us to specify a Suite that runs all the JUnit tests it can find. It can also exclude and include certain tests. Unfortunately it doesn’t work with JUnit’s Categories system; instead it uses type inheritance. This makes it easy to filter out types of tests. For example, I define all the fast tests like this:

@RunWith(ClasspathSuite.class) // Loads all unit tests it finds on the classpath
@ExcludeBaseTypeFilter(SlowTest.class) // Excludes tests that inherit SlowTest
public class FastTests {}
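For completeness, here is a sketch of what the slow-test side could look like. SlowTest is just an empty base class, DatabaseSlowTest is a hypothetical example, and I believe cpsuite’s include-only counterpart to ExcludeBaseTypeFilter is the @BaseTypeFilter annotation, but check the library’s documentation:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.extensions.cpsuite.ClasspathSuite;
import org.junit.extensions.cpsuite.ClasspathSuite.BaseTypeFilter;

// The marker: slow tests opt in simply by extending this empty base class.
abstract class SlowTest {
}

// A hypothetical slow test; it is excluded from FastTests and picked up here.
class DatabaseSlowTest extends SlowTest {
    @Test
    public void connectsToTheDatabase() {
        // ... would talk to a real database here
    }
}

// Runs only the tests that inherit SlowTest.
@RunWith(ClasspathSuite.class)
@BaseTypeFilter(SlowTest.class)
public class SlowTests {}
```

With the two suites in place, a build server can run FastTests on every commit and reserve SlowTests for an environment that has the database available.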