Friday, March 26, 2010

Using ROO with MySQL and Maven Tomcat plugin

I spent some time getting Roo to work with both MySQL and the Maven Tomcat plugin in such a way that I could run integration tests from Eclipse, deploy to Tomcat, and use JNDI.

This posting is really more about how to use the Maven Tomcat plugin; it just so happens that I ran across this whilst using Roo.

The Roo tutorial seems a bit terse when it comes to these topics, which is a shame. I think Roo would be more useful if it helped set up the Maven Tomcat plugin and configure the JDBC bits. Perhaps this will help document the problem and at least one possible solution.

I used Roo to set up MySQL (see guide) and then tried to run mvn tomcat:run. It failed. Then I spent some time figuring out how best to configure the Maven Tomcat plugin without breaking the integration tests that I wrote.

Along the way, I learned quite a bit about the Maven Tomcat plugin, Roo and Spring 3. I figured I would share this with like-minded folks on my team and beyond.

Here is what I came up with for the pom.xml.

I could not get the XML stuff to show up correctly on Blogger; go here to see the code listings.


Roo added the following to pom.xml:
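The original listing did not survive Blogger's formatting, but for MySQL, Roo's persistence setup adds a JDBC driver dependency along these lines (the version number here is illustrative, not necessarily what Roo picked):

```xml
<!-- MySQL JDBC driver, added by Roo's persistence setup for the MYSQL database option -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.12</version>
</dependency>
```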


By the way, Roo fills your application with artifact ids that are prepended with com.springsource. I am not sure why it does this, but it probably has something to do with having their own repositories and wanting to control the jar files that get used, for customer support reasons. It is either that or a secret plot to take over the world.

Roo also added a Tomcat plugin. The problem is that I need a context.xml file for Tomcat so I can set up my JNDI datasource.

I reconfigured the tomcat plugin as follows in pom.xml:
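The original listing was lost to Blogger's formatting; the following is a minimal sketch of the kind of reconfiguration I mean (the configurationDir parameter name and the version numbers are assumptions, so check the plugin documentation for your version):

```xml
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>tomcat-maven-plugin</artifactId>
    <configuration>
        <!-- assumed parameter: point the plugin at ${basedir}/tomcat so it
             reads conf/context.xml from there instead of target/tomcat -->
        <configurationDir>${basedir}/tomcat</configurationDir>
    </configuration>
    <dependencies>
        <!-- MySQL driver on the *plugin* classpath so the container-managed
             JNDI DataSource can load com.mysql.jdbc.Driver -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.12</version>
        </dependency>
    </dependencies>
</plugin>
```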










Notice that I added the MySQL jar file to the list of dependencies for the plugin so that Tomcat can find the MySQL driver and I can avoid the dreaded ClassNotFoundException.

By the way, I tried many things before getting to this point. This is just the first one that worked. I am sure there are other ways, and possibly even better ways. I got the above through trial and error, reading the (sparse) Maven Tomcat plugin documentation, and the Tomcat manual.

Also notice that I have a tomcat property set. This tomcat folder will be under the project root directory, ${project.dir}/tomcat. This was the key to configuring the JNDI datasource for my database.

In the ${project.dir}/tomcat dir I have a conf directory. The conf directory has the following files:

$ pwd

$ ls
context.xml tomcat-users.xml web.xml

The tomcat-users.xml and web.xml were generated by the Tomcat plugin under target/tomcat. I just copied them here.

The context.xml is where I set up the JDBC resource, as follows:


<!-- the opening of this Resource element was lost to Blogger; the name,
     driver class, and URL here are reconstructed examples -->
<Resource name="jdbc/mydb" auth="Container" type="javax.sql.DataSource"
          driverClassName="com.mysql.jdbc.Driver"
          url="jdbc:mysql://localhost:3306/mydb"
          username="mydb" password="mydb"
          maxActive="20" maxIdle="10" maxWait="-1" />

(Now would be a good time to add the applicationContext.xml file to the watched resources (note to self)).

Notice that I have configured the Resource for my database.

In my application context file, I just add the following:
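The listing here was lost as well, but the Spring 3 side amounts to a JNDI lookup of the container resource. A sketch using the jee namespace (the bean id and jndi-name are illustrative):

```xml
<!-- look up the container-managed DataSource bound in context.xml;
     resource-ref="true" prefixes the name with java:comp/env/ -->
<jee:jndi-lookup id="dataSourceJNDI"
                 jndi-name="jdbc/mydb"
                 resource-ref="true"/>
```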

The problem is that now my code is somewhat tied to running in a container, which is exactly what I don't want. I want to be able to run this application context without change in my integration tests.

I want to use this dataSource when I am testing and such.
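For testing outside the container, the alternative is a plain DBCP datasource defined alongside the JNDI one. A sketch, assuming Commons DBCP (the connection settings shown are placeholders):

```xml
<!-- container-free DataSource for integration tests, using Commons DBCP;
     url and credentials are placeholders -->
<bean id="dataSourceDBCP" class="org.apache.commons.dbcp.BasicDataSource"
      destroy-method="close">
    <property name="driverClassName" value="com.mysql.jdbc.Driver"/>
    <property name="url" value="jdbc:mysql://localhost:3306/mydb"/>
    <property name="username" value="mydb"/>
    <property name="password" value="mydb"/>
</bean>
```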

To achieve this in Spring 3, I can use SpEL (the Spring Expression Language) as follows:

<!-- the bean and property this ref belongs to were lost to Blogger;
     "dataSource" is a reconstructed property name -->
<property name="dataSource"
    ref="#{ systemProperties['ac.testing']=='yes' or systemProperties['ac.development']=='yes' ? 'dataSourceDBCP' : 'dataSourceJNDI' }"/>

The above basically states that if the ac.testing or ac.development system property is set to yes, then use dataSourceDBCP; otherwise use dataSourceJNDI.

I like SpEL.

Then to run with dataSourceDBCP instead of dataSourceJNDI (for whatever reason), you can pass a system property as follows:

$ mvn tomcat:run -Dac.testing="yes"

You could set up a Maven profile or some such so that you don't ever have to pass this whilst running from Maven.
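A sketch of such a profile, assuming the plugin accepts a systemProperties configuration block (that parameter name is an assumption):

```xml
<!-- activate with: mvn tomcat:run -Ptesting -->
<profiles>
    <profile>
        <id>testing</id>
        <build>
            <plugins>
                <plugin>
                    <groupId>org.codehaus.mojo</groupId>
                    <artifactId>tomcat-maven-plugin</artifactId>
                    <configuration>
                        <systemProperties>
                            <!-- makes the SpEL expression pick dataSourceDBCP -->
                            <ac.testing>yes</ac.testing>
                        </systemProperties>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>
```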