Deadlock with nested connections and connection pool.

I recently had to debug a legacy application that would deadlock when its REST API was accessed by multiple threads.
The issue was in the way the REST API was calling the Oracle database and using connections from the pool.
Here is pseudo code showing the database access used by the REST API:

1 public Result getResult(){
2   Result result = null;
3   try( Connection connection = getConnection(); PreparedStatement ps = connection.prepareStatement(sql); ResultSet rs = ps.executeQuery() ){
4     if ( rs.next() ) {
5       result = new Result();
6       String id = rs.getString("id");
7       SubResult subResult = getSubResult(id); //BAD, potential DEADLOCK situation
8       result.setSubResult(subResult);
9     }
10   }catch(Exception e){
11     //log exception
12   }
13   return result;
14 }

15 public SubResult getSubResult(String id){
16   SubResult result = null;
17   try( Connection connection = getConnection(); PreparedStatement ps = connection.prepareStatement(sql); ResultSet rs = ps.executeQuery() ){
18     if ( rs.next() ) {
19       result = new SubResult();
20       //populate SubResult from the ResultSet
21     }
22   }catch(Exception e){
23     //log exception
24   }
25   return result;
26 }

27 private Connection getConnection(){
28   return dataSource.getConnection(); //acquire a connection from the pool; blocks when the pool is exhausted
29 }

The REST API call goes through the following steps:

1. Calls getResult().
2. getResult() in turn calls getConnection() on line 3, which tries to acquire a connection from the pool.
3. Before closing that connection, getResult() calls getSubResult() on line 7, which in turn tries to acquire another connection from the pool on line 17.
4. getSubResult() closes its connection.
5. getResult() closes its connection.

All seems to be OK here: the code properly acquires and releases DB connections from the pool. And it will always work as long as getResult() is never called concurrently.
But once we start calling getResult() from multiple threads we will quickly get into a deadlock by the following scenario:

1. Multiple threads call getResult() and, by entering line 3, acquire all the available connections from the pool.
2. Each thread then calls getSubResult() on line 7, which in turn tries to acquire a connection on line 17.
3. Since every connection has already been acquired, each thread sits on line 17 waiting for a connection to become available. That will never happen: the waiting threads are the very ones holding the connections, and they cannot release them because they are blocked on line 17. And this is how you get a DEADLOCK.

To fix the problem in the current code, we need to pass the connection created at line 3 down to getSubResult(), which prevents getSubResult() from requesting, and waiting for, a second connection from the pool.
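Here is a minimal sketch of that fix, in the same pseudo-code style as above (sql remains a placeholder):

public Result getResult(){
    Result result = null;
    try( Connection connection = getConnection(); PreparedStatement ps = connection.prepareStatement(sql); ResultSet rs = ps.executeQuery() ){
        if ( rs.next() ) {
            result = new Result();
            //reuse the connection we already hold: no second trip to the pool
            result.setSubResult( getSubResult(connection, rs.getString("id")) );
        }
    }catch(Exception e){
        //log exception
    }
    return result;
}

public SubResult getSubResult(Connection connection, String id){
    SubResult result = null;
    //no getConnection() call here, so this method can never block on the pool
    try( PreparedStatement ps = connection.prepareStatement(sql); ResultSet rs = ps.executeQuery() ){
        if ( rs.next() ) {
            result = new SubResult();
            //populate SubResult from the ResultSet
        }
    }catch(Exception e){
        //log exception
    }
    return result;
}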

But the best approach is to use a framework like Spring that manages connections for us and makes sure only one connection is used per thread (when you use its transaction management facilities via annotations and proxies, Spring binds the connection to a thread local).
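For illustration only, here is a minimal sketch of the Spring approach; the table names and mapping are hypothetical, and the point is that both queries run inside one @Transactional method and therefore share a single thread-bound connection:

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ResultService {

    private final JdbcTemplate jdbcTemplate; //backed by the pooled DataSource

    public ResultService(JdbcTemplate jdbcTemplate){
        this.jdbcTemplate = jdbcTemplate;
    }

    @Transactional(readOnly = true)
    public Result getResult(){
        //Spring binds one connection to the current thread for the whole
        //transaction; the nested query below reuses it instead of asking
        //the pool for a second one, so the deadlock cannot happen
        String id = jdbcTemplate.queryForObject("select id from result_table", String.class);
        Result result = new Result();
        result.setSubResult( getSubResult(id) );
        return result;
    }

    private SubResult getSubResult(String id){
        return jdbcTemplate.queryForObject(
            "select * from sub_result_table where id = ?",
            new Object[]{ id },
            (rs, rowNum) -> new SubResult() ); //populate SubResult from the ResultSet
    }
}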

If you are stuck with legacy code like this and can't use Spring, then NEVER use nested connections like in the example above.

Static state and Maven build.

Recently I had to deal with weird, occasional Maven build failures, with different plugins failing with the same exception:

Caused by: java.lang.NullPointerException
at org.apache.http.impl.conn.SystemDefaultRoutePlanner.determineProxy(SystemDefaultRoutePlanner.java:79)
at org.apache.http.impl.conn.DefaultRoutePlanner.determineRoute(DefaultRoutePlanner.java:77)
at org.apache.http.impl.client.InternalHttpClient.determineRoute(InternalHttpClient.java:124)

Since it was only failing on my local box and a subsequent build would run fine, I kept ignoring it, until it started to fail constantly on our Jenkins box after I added a new plugin to one of our sub-projects' pom.xml.
My first reaction was to blame the new plugin, which took me in a totally wrong direction.

I finally decided to look at the SystemDefaultRoutePlanner.java source code and discovered that it was reading static state, the JVM-wide default ProxySelector, without a null check. But why would that be null?
The only explanation was that some plugin or unit test from a previous project's build was setting it to null.

Keep in mind that Maven by default builds all projects within a single JVM instance, so static state changed during one project's build is visible to every project build that follows.

After some online research I found that one of our Maven sub-projects was using the SoapUI plugin for integration testing, and it was setting that static state to null and never reverting it after the tests. And since Apache HttpClient, which SystemDefaultRoutePlanner is part of, is used by many plugins, subsequent project builds failed when trying to download the new plugin I mentioned earlier.

There are currently defects filed against the SoapUI plugin and HttpClient for this problem. Here is the one for HttpClient: https://issues.apache.org/jira/browse/HTTPCLIENT-1766.

This is a good example of why you should take extra precaution when calling static methods or setting global variables inside unit tests: they can break not only other unit tests but also the Maven build itself.
The best practice is to avoid altering static state in unit tests at all, but if you really need to, use JUnit's @After annotation, which lets you clean up and revert any static state you altered inside the test (it runs even if the test fails or throws an exception).
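Here is a minimal sketch of that cleanup pattern, using the JVM-wide default ProxySelector from the story above as the static state being altered:

import java.net.ProxySelector;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class ProxyAlteringTest {

    private ProxySelector originalSelector;

    @Before
    public void rememberStaticState(){
        //capture the JVM-wide default before the test touches it
        originalSelector = ProxySelector.getDefault();
    }

    @Test
    public void testThatAltersGlobalState(){
        ProxySelector.setDefault(null); //the kind of change that broke the build
        //...exercise the code under test...
    }

    @After
    public void revertStaticState(){
        //runs even if the test fails or throws, so later builds see a sane JVM
        ProxySelector.setDefault(originalSelector);
    }
}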

Extensible Java Enum.

As you know, you can't extend an enum in Java to add new enumeration values, since enums are implicitly final. But what about the case where you want to use an enum in your code while also allowing additional values to be represented by that enum's type? For example, let's take a very simple enum that defines some type of event:
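A minimal sketch of such an enum, along with a handler hard-wired to it (everything beyond the EventTypes name and the handleEvent method discussed below is my guess):

public enum EventTypes {
    CREATED, UPDATED, DELETED;
}

public class EventHandler {
    //tied to the concrete enum: no new event kinds without editing EventTypes
    public void handleEvent(EventTypes event){
        System.out.println("Handling event: " + event);
    }
}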

Here you can see that we won't be able to introduce additional events without modifying the EventTypes enum. This becomes an issue when we don't control all the event types, for example when they are passed to us from an external system.

To solve this issue we leverage the fact that enums can implement interfaces: we declare a parent interface that our EventTypes enum implements, and change the handleEvent method to take an object of that interface type instead of EventTypes.
Here is the revised code:
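What follows is a minimal sketch of that revision; the EventType interface name and the external implementation are my guesses:

//parent type that both the built-in enum and external values can implement
public interface EventType {
    String name();
}

public enum EventTypes implements EventType {
    CREATED, UPDATED, DELETED;
    //enums already provide name(), so there is nothing extra to implement
}

//an event kind supplied by an external system, unknown to EventTypes
public class ExternalEventType implements EventType {
    private final String name;
    public ExternalEventType(String name){ this.name = name; }
    public String name(){ return name; }
}

public class EventHandler {
    //now accepts the interface, so any EventType value works
    public void handleEvent(EventType event){
        System.out.println("Handling event: " + event.name());
    }
}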

Continue reading Extensible Java Enum.

Fix those pesky eclipse build errors.

Has Eclipse kept showing mysterious build errors on projects (or even on files) that just won't go away no matter how many times you build and refresh the project(s)?
If you are in a situation like that and are sure there should not be any errors, try the following to force-clear them:

  • Close Eclipse (this is important since Eclipse writes errors back to the marker files on close).
  • Go to the following directory under your Eclipse workspace:
    yourpath/.metadata/.plugins/org.eclipse.core.resources/.projects/
  • Open a terminal there and run the following command:
    find . -name ".markers" -delete

    You can also delete the .markers files manually.

  • Open Eclipse and refresh your projects. The errors should be gone.

Hadoop, trying it.

Full Project: https://github.com/tsolakp/hadoop-sample

There has been a lot of talk about Hadoop lately, but with very little explanation of what it actually is. Unfortunately, some of the tutorials have not been simple enough to give me the big picture of what this technology is all about.
After reading the comprehensive Yahoo tutorial at http://developer.yahoo.com/hadoop/tutorial/ and trying it out, I want to share my observations.

What it is and what it is not.

First of all, Hadoop is simply a batch application. It is designed to process large amounts of data (for example, in the form of large data files) in parallel over a distributed system where each machine is called a node.

It is not a database, unlike the related Apache projects Hive and HBase.

It consists of two major parts: a distributed batch application that manages the job on each node, and HDFS, a distributed file system that lets Hadoop transparently spread large files across nodes so that the job on each node can access and process the file or its parts.

One probably obvious but crucial aspect of Hadoop is that it makes sure the processing code (i.e. the Mapper and Reducer Java classes) runs close to the data, which means that, unlike Spring Batch for example, it consists of multiple processes running on each node, controlled by a master "namenode".
This makes it difficult to debug, but fortunately you can run Hadoop jobs just like a plain Java "main" application for testing, before deploying to a "pseudo" (all nodes run on a single machine) or "real" (nodes are actually separate machines) environment.
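To make that concrete, here is the classic word-count job as a minimal sketch of my own (not code from the linked project); the main method is exactly the kind of plain Java entry point you can run locally before deploying:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    //runs on whichever node holds the input split, close to the data
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            for (String token : line.toString().split("\\s+")) {
                word.set(token);
                context.write(word, ONE); //emit (word, 1) for every token
            }
        }
    }

    //receives all counts for one word and sums them
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable count : counts) sum += count.get();
            context.write(word, new IntWritable(sum));
        }
    }

    //plain Java entry point: runnable locally, or submitted to a cluster
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}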

Continue reading Hadoop, trying it.

EasyMock gets more power with PowerMock.

There are a lot of mocking libraries for Java, and EasyMock is my favorite. It served me well until I had to unit test legacy code that did not completely adhere to IoC (inversion of control) principles: it had a lot of static method calls and created service/helper objects directly instead of using dependency injection. Unfortunately, EasyMock cannot mock static methods or objects created directly inside the class or method under test, and it has other limitations, such as handling final classes.
This forced me to use other mocking libraries until I encountered PowerMock. It provides all the ammunition (and more) for dealing with these issues and integrates nicely with EasyMock.
Here I'll show you how to use EasyMock with PowerMock to handle common mocking cases that were not possible with EasyMock alone.
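As a taste, here is a minimal sketch of mocking a static method with PowerMock's EasyMock API; the LegacyIdGenerator class is a made-up stand-in for the kind of legacy code described above:

import static org.easymock.EasyMock.expect;
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.easymock.PowerMock;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
@PrepareForTest(LegacyIdGeneratorTest.LegacyIdGenerator.class) //class whose static method we mock
public class LegacyIdGeneratorTest {

    //stand-in for legacy code with a static call baked in
    public static class LegacyIdGenerator {
        public static long nextId(){ return System.nanoTime(); }
    }

    @Test
    public void mocksStaticMethod(){
        PowerMock.mockStatic(LegacyIdGenerator.class);     //enable static mocking
        expect(LegacyIdGenerator.nextId()).andReturn(42L); //plain EasyMock expectation
        PowerMock.replayAll();

        assertEquals(42L, LegacyIdGenerator.nextId());     //the code under test would call this internally

        PowerMock.verifyAll();
    }
}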

Continue reading EasyMock gets more power with PowerMock.

Ubuntu and Dual Monitor with ATI x300.

I am kind of new to Linux, otherwise I probably would not be writing this article or have spent so much time figuring out the solution I am about to describe.
Anyway, I have this old IBM T43 that I use for work, and ever since I installed Ubuntu on it I just could not get the laptop's monitor to behave as the second monitor
with the big Dell one as the primary.
I installed the "Monitors" app for Gnome, and every time I selected the Dell as primary, Gnome would freak out and show total garbage on both screens.
After some Googling, all I could conclude was that the main problem was my ATI X300 mobile graphics card, which was no longer supported by AMD/ATI, and
the only supported open source drivers (which I was using) did not work very well in a dual monitor setup.
Also, the only suggested fixes required complicated xrandr commands or xorg.conf changes that did not work in my case.
What actually worked is a pretty easy fix that I found by paying closer attention to what Gnome's "Monitors" app was doing when applying the changes.

Continue reading Ubuntu and Dual Monitor with ATI x300.

JSP myths.

Since JSPs came out as part of J2EE, there has been a lot of talk about how ASP-like and badly designed they were.
But since there was no alternative at the time, the Java community had to embrace and use them. In the end, JSP made it into most J2EE-based web applications.

Then, in the last few years, JSP started to get bad press again with the advance of AJAX, component-based frameworks like Wicket, JSF, and Tapestry, and template-based frameworks like FreeMarker.

Most of the bad press was around how difficult JSP is for HTML developers to work with, JSPs containing too much Java code, JSPs being difficult to debug, JSP not being suitable for developing component-based web applications, and JSP becoming obsolete since what was done in JSP can now be done with AJAX in the client browser.

There are some valid points here, and I am personally not a fan of JSP either, but I want to set things straight and show that those issues have nothing to do with JSP, that JSP is not dead yet, and that it can play very nicely with other web frameworks and AJAX.

Continue reading JSP myths.

Maven2 eclipse plugin and custom or pom packages.

Update: This approach will also work with "ear" projects and only requires adding the plugin definition, without changes to the Maven Eclipse plugin.

Recently I had to create a Maven project that produced a war file and then packaged it into a tar before deploying it to a Maven repository, without deploying the war itself. This forced me to declare my pom packaging as "pom". Everything was fine until I tried to run "mvn eclipse:eclipse", which gave me an error saying it does not support "pom" packaging.
After googling, I stumbled upon this ticket:

http://jira.codehaus.org/browse/MECLIPSE-310

which suggested specifying "war" (or "jar") packaging in the Eclipse plugin configuration, like so:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <configuration>
                <packaging>war</packaging>
            </configuration>
        </plugin>
        ....
    </plugins>
</build>

Continue reading Maven2 eclipse plugin and custom or pom packages.

Issues with dependency injection in latest frameworks.

Here I am going to show some important limitations of the Java and Guice approaches to dependency injection, particularly around where the wiring logic should go.

Java Dependency Injection

import javax.inject.Inject;
import javax.inject.Named;

public interface LogService {
    public void log(String result);
}

@Named("FileLogService")
public class FileLogService implements LogService {
    public void log(String result){}
}

@Named("ConsoleLogService")
public class ConsoleLogService implements LogService {
    public void log(String result){}
}

public class LogClient {

    @Inject
    @Named("FileLogService")
    private LogService logService = null;

    public void doSomething(){
        logService.log("doSomething");
    }
}

This is one way of injecting a dependency into client code using Java annotations. As you can see, we have hard-coded LogClient to FileLogService, and there is no way to have LogClient use ConsoleLogService instead.

Continue reading Issues with dependency injection in latest frameworks.