Small and Simple Web Applications - the Friki Way (Part 6)
Frank Carver,
December 2003
Abstract
This article is the sixth of a series which laments the bloated and unmaintainable state of so many J2EE web applications and looks at ways to keep web applications small, simple and flexible. The series uses the author's Friki software as a case study, and discusses ways to design and build
a powerful, flexible and usable web application in less space than
a typical gif image.
This article carries on working through the user requirements, improving the web application with several pages and the navigation between them.
Introduction
If you've read the first, second, third, fourth and fifth articles, you should be aware that the aim of this project is to develop a small, simple and understandable "Wiki" (editable web site) application. We've considered and decided to defer the relatively heavy decision of how to store and retrieve pages by introducing a Java interface, decided on and implemented a simple "template" system to display pages, and are building a supporting test suite and an automated build script as we go along. Last session we built, deployed and tested a working web application, although it didn't actually do much.
First, let's recap what our "customer" wants, and what we have got so far:
- each page may be viewed using its own unique URL DONE
- page content may contain links to other pages by name DONE
- links to nonexistent pages will be marked with a "?" DONE
- page content may be created and edited using just a browser
- at least 100 different pages can be stored
What do we write next?
It should be pretty obvious that the next most important feature is the ability to edit pages. Without this, it's unlikely that anyone would care how many pages could theoretically be held in the system. However, after thinking about how far we got last time, we have another surprisingly tough choice to make. Last session we made a kind of "absolute minimum" solution to the task - all we show is page names, not the actual content. It's pretty obvious that real users are going to want more. Way back in part one we built a template mechanism for just this purpose, and by now I'm certainly itching to use it for real. Doesn't it make more sense to get the page viewing working "properly" before we move on to the next task?
This is a tough choice because there are sensible arguments for both approaches. Moving forward while leaving unfinished work behind you can easily create stress and general unhappiness with the code, reducing the effectiveness and speed of further development. The "Pragmatic Programmers" suggest that it's a good idea to "fix broken windows" (see Hunt & Thomas, 2000). On the other hand, time is precious. It's rarely a good investment to spend time on something which may only be changed or discarded later. The Extreme Programming crowd chant "YAGNI: You Ain't Gonna Need It!" (see Beck, 2000). In your own projects, you will likely come up against this issue more often than you think. The important thing is to be aware of it and make a conscious, informed choice rather than just following assumptions.
In this case I'm going to decide that making progress on the next goal is more important, because I reckon being able to create page content in a browser will help us understand more about the page layout and look-and-feel needed to get viewing working right. Of course, this is only a guess, so I may be wrong, but even if this is not the perfect choice in hindsight, it's still better than doing nothing while we try and work out which way to go.
Fourth task: page content may be created and edited using just a browser
The first thing to understand about editing anything in a browser is how the information moves back and forth between the server (where our application is running) and the client (the browser). Let's imagine a user wants to edit the page "FrikiRocks". We'll assume for the moment that this page already exists, and contains some text.
- At the start of this operation, the server has no idea that someone might want to edit this page, so we have to start by telling it. So the first information transfer is a message from the browser to the server indicating an edit request for a named page.
- It's almost always handy to see what was there originally, then we can be happy that our changes are sensible. So the next information transfer is a reply carrying the current content of the page back to the client.
- While the user is making her changes, the new text only exists in the memory used by the browser. To keep the new text more permanently, it needs to be sent back to the application on the server for storage. So the next information transfer is a message from the browser to the server containing the changed text.
- Wiki implementations differ in the next step. Some Wikis send back a simple confirmation message ("page FrikiRocks successfully updated"), some offer a preview of the final page and allow the user to agree/re-edit/cancel the changes, some just re-show the new page as if the user had asked to view it, and some send a "browser redirect" to tell the browser to go and fetch the new page itself. All of these approaches have advantages and disadvantages, and we don't really know (yet) what our users will prefer. Our original rule applies: "do the simplest thing", so we can show people something working, and get concrete suggestions for improvements later. So we will just send a confirmation, for the moment.
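The transfers described above can be made concrete with a little sketch. The "/edit" and "/update" paths and the "page"/"content" parameter names match the servlets built later in this article, but the helper class itself is invented for this example and is not part of Friki:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Purely illustrative sketch of the two browser-to-server messages in the
// edit flow. The paths and parameter names match the servlets built later
// in this article; the class itself is hypothetical.
public class EditFlowSketch
{
    static final String BASE = "http://localhost:8080/frikidemo";

    // URL-encode a value, as a browser does when submitting a form.
    static String enc(String value)
    {
        try
        {
            return URLEncoder.encode(value, "UTF-8");
        }
        catch (UnsupportedEncodingException e)
        {
            throw new RuntimeException(e); // UTF-8 is always available
        }
    }

    // Message 1: a GET request asking to edit a named page.
    public static String editRequestUrl(String pageName)
    {
        return BASE + "/edit?page=" + enc(pageName);
    }

    // Message 3: the body of the POST request carrying the changed text
    // back to the server (an HTML form encodes its fields exactly like this).
    public static String updateRequestBody(String pageName, String newContent)
    {
        return "page=" + enc(pageName) + "&content=" + enc(newContent);
    }

    public static void main(String[] args)
    {
        System.out.println(editRequestUrl("FrikiRocks"));
        System.out.println(updateRequestBody("FrikiRocks", "For Sure"));
    }
}
```

Message 2 (the reply carrying the current content) and message 4 (the confirmation) are ordinary HTML responses, so there is nothing to encode on the client side.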
Anyway, enough theorising. Let's get down to work. Remember that we always start by adding a new test to our growing test suite. In this case we are testing information flows between client and server, so that sounds like it belongs in "RemoteTests" to me:
RemoteTests.java
package tests;
import junit.framework.*;
public class RemoteTests extends TestCase
{
public static TestSuite suite()
{
TestSuite ret = new TestSuite();
ret.addTest(new TestSuite(EditPageTest.class));
ret.addTest(new TestSuite(WebPageTest.class));
return ret;
}
}
Note that I have added the new test above the one we wrote last time, so it will be run first. This is a handy trick during development. Remote tests can be relatively slow, and we want to know as soon as possible if our new test is passing or failing. We still need to run those other tests to make sure our code hasn't broken anything, but the important thing is to keep the cycle of testing, learning and improving the code as short as possible.
Obviously, there is no EditPageTest yet, so we'd better write one:
EditPageTest.java
package tests;
import junit.framework.*;
import com.meterware.httpunit.*;
public class EditPageTest extends TestCase
{
WebConversation http;
WebResponse response;
public void setUp()
{
http = new WebConversation();
}
public void testFetchPageForEdit()
throws Exception
{
response = http.getResponse(
new GetMethodWebRequest("http://localhost:8080/frikidemo/edit?page=FrikiRocks"));
assertEquals("example page should return a code of 200 (success)",
200, response.getResponseCode());
}
}
Run it, and see what we get:
Testcase: testFetchPageForEdit took 1.016 sec
Caused an ERROR
Error on HTTP request: 404 Not Found [http://localhost:8080/frikidemo/edit?page=FrikiRocks]
That's OK, and just what we expected, so let's make this little test pass by adding an "edit" operation to our application, so we can at least return something. Note that our test doesn't yet say anything about what information should be returned, so the easiest thing is just to tweak the existing web.xml file to add another mapping for "edit". Note in particular that I haven't introduced a new servlet, just added more mappings to the existing ShowServlet. This is important, because it lets us test that the config file changes are correct without getting distracted by having to write and test a separate servlet. We know that ShowServlet "works", so we use it to make sure our new configuration and test code are correct. Neat.
web.xml
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE web-app
PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN"
"http://java.sun.com/dtd/web-app_2_3.dtd">
<web-app>
<servlet>
<servlet-name>show</servlet-name>
<servlet-class>friki.ShowServlet</servlet-class>
</servlet>
<servlet-mapping>
<servlet-name>show</servlet-name>
<url-pattern>/show</url-pattern>
</servlet-mapping>
<servlet>
<servlet-name>edit</servlet-name>
<servlet-class>friki.ShowServlet</servlet-class>
</servlet>
<servlet-mapping>
<servlet-name>edit</servlet-name>
<url-pattern>/edit</url-pattern>
</servlet-mapping>
</web-app>
Well, it passes this first test, but of course it's a long way from editing a page yet. I think the next test should be one to prompt us to write that new servlet. But, before we go on, have you noticed a similarity between the code in "WebPageTest.java" from last session, and "EditPageTest" from this session? I have, and I want to remove the duplication before we move on. First, let's copy the common code out to a new class:
RemoteWebTest.java
package tests;
import junit.framework.*;
import com.meterware.httpunit.*;
public class RemoteWebTest extends TestCase
{
WebConversation http;
WebResponse response;
public void setUp()
{
http = new WebConversation();
}
public void fetchPage(String url)
throws Exception
{
response = http.getResponse(new GetMethodWebRequest(url));
assertResponseCode("URL '" + url + "' should return 200 (success)", 200);
}
public void assertResponseCode(String message, int code)
throws Exception
{
assertEquals(message, code, response.getResponseCode());
}
}
Note that we have put an assertion that the page returns a success code in the "fetch" method. This may or may not be a good idea in the long run, but it certainly helps simplify our code right now. Now we can trim our existing test cases:
EditPageTest.java
package tests;
import junit.framework.*;
public class EditPageTest extends RemoteWebTest
{
public void testFetchPageForEdit()
throws Exception
{
fetchPage("http://localhost:8080/frikidemo/edit?page=FrikiRocks");
}
}
WebPageTest.java
package tests;
import junit.framework.*;
public class WebPageTest extends RemoteWebTest
{
public void testApplicationPresent()
throws Exception
{
fetchPage("http://localhost:8080/frikidemo");
}
public void testServerPresent()
throws Exception
{
fetchPage("http://localhost:8080/");
}
public void testExamplePage()
throws Exception
{
fetchPage("http://localhost:8080/frikidemo/show?page=ExamplePage");
}
}
That's a lot neater. It's easier to read and understand (it clearly shows that our tests don't actually test much, for example.) We've isolated the dependencies on the HTTPUnit test framework into a single class, and things are generally smaller. Now, back to adding that next test.
The idea of the next test is pretty simple. To start with we just want to check that when we ask to edit the page "FrikiRocks", we get back the content in an editable box. However, for an HTML page to make sense, it probably needs to contain a load more than that: body tags, head tags, product logos, descriptive text, positioning, copyright messages and so on.
We could build our tests by hard coding the whole HTML page in the test, and writing asserts that ensure that every single character is sent correctly. This is simple to write at the start of a project, and can seem a good idea. But beware. Any change to the "look and feel" of the application means that the HTML for many pages will change. Any change to the HTML for lots of pages is likely to break lots of tests, even though what each test is supposed to be testing probably still works. Tests that can fail when unrelated code, data or configurations change are known as "brittle". Brittle tests are one of the main reasons why automated testing is considered hard to do, expensive to maintain, and easy to get wrong.
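To see the difference concretely, here is a small illustrative sketch (plain string handling, nothing to do with HTTPUnit or Friki itself): a whole-page comparison fails the moment any decoration changes, while a targeted extraction of just the piece under test survives it. The pages and the helper below are invented for this example.

```java
// Illustrative sketch: why targeted checks are less brittle than
// whole-page comparisons. The pages and helper are hypothetical.
public class BrittlenessSketch
{
    // Extract the content of the first <textarea>...</textarea> on a page,
    // ignoring everything else around it.
    public static String textareaContent(String html)
    {
        int open = html.indexOf("<textarea");
        if (open < 0)
        {
            return null;
        }
        int start = html.indexOf('>', open) + 1;
        int end = html.indexOf("</textarea>", start);
        if (end < 0)
        {
            return null;
        }
        return html.substring(start, end);
    }

    public static void main(String[] args)
    {
        String original =
            "<html><body><form><textarea name='content'>For Sure</textarea></form></body></html>";
        String redecorated =
            "<html><body><h1>New Logo!</h1><form><textarea name='content'>For Sure</textarea></form></body></html>";

        // A whole-page assertion breaks as soon as the decoration changes...
        System.out.println(original.equals(redecorated));  // prints false
        // ...but a targeted check still finds what it is looking for.
        System.out.println(textareaContent(redecorated));  // prints For Sure
    }
}
```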
So, let's plan ahead, and build our application with testing in mind. In this case we are testing that (regardless of whatever else there is on the page) it contains a form with a textarea containing the text we want. If we define that the form has a specific "id" attribute, we can easily ignore the rest of the page, and concentrate our attention:
EditPageTest.java
package tests;
import junit.framework.*;
import com.meterware.httpunit.*;
public class EditPageTest extends RemoteWebTest
{
public void testFetchPageForEdit()
throws Exception
{
fetchPage("http://localhost:8080/frikidemo/edit?page=FrikiRocks");
WebForm form = response.getFormWithID("editform");
assertEquals("textarea 'content'", "For Sure", form.getParameterValue("content"));
}
}
Note that some HTTPUnit code has crept back in (for the moment, at least). This may well end up being pushed "up" into the RemoteWebTest parent class, but there's no real need or reason to do that yet. We can leave that decision until we have more than one place where the code is used. Meanwhile, this test fails, of course. We don't have any code in our real application to produce a form or textarea yet.
First, let's change our web.xml deployment descriptor to use a real EditServlet class rather than reusing ShowServlet:
web.xml
...
<servlet>
<servlet-name>edit</servlet-name>
<servlet-class>friki.EditServlet</servlet-class>
</servlet>
...
Now, we take the quickest route to working code. Copy the old ShowServlet code to a new file, and modify it to generate an "edit" page instead of a "show" page:
Just one thing to mention before all the more knowledgeable or thoughtful readers start to complain. You will notice below that I "hard code" HTML into the code of a servlet. This is not a good idea: it's hard to maintain, easy to screw up, and bloats the code with unneeded data. However, at this point in developing this application it is a reasonable thing to do. Remember that our plan is eventually to move all HTML out to external templates, but doing that now would unnecessarily complicate things. We are not testing (or writing) the HTML used by the final application; we are testing the information flow between pages during editing, and this is merely a simple and robust way to do it.
EditServlet.java
package friki;
import java.io.IOException;
import java.io.Writer;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class EditServlet
extends HttpServlet
{
public void doGet(HttpServletRequest req, HttpServletResponse res)
throws ServletException, IOException
{
String name = req.getParameter("page");
Writer writer = res.getWriter();
writer.write(
"<html><head><title>Page " + name + "</title></head>" +
"<body>\n" +
"<h2>Edit Page '" + name + "'</h2>\n" +
"<form id='editform' method='POST' action='update'>\n" +
" <input type='hidden' name='page' value='" + name + "'>\n" +
" <textarea name='content'>For Sure</textarea>\n" +
"</form>\n" +
"</body></html>"
);
writer.flush();
}
}
Run it, and it all works. Excellent!
Note that we're still hard coding the page content at the moment as well, because that's all our current tests need. In effect, our system has only one page. Sure, we'll need to fetch pages from our page store at some point, but until we have tests for that, there's no point worrying about the code.
Now, in order to change this content and send it back, we'll need some sort of "send" button. So let's add this to our test. Just as before, we'll look for a specific "id" for the button, rather than a particular name or value, so that page designers or user-interface specialists can tune up the interface without breaking simple functionality tests like this one.
EditPageTest.java
...
public void testFetchPageForEdit()
throws Exception
{
fetchPage("http://localhost:8080/frikidemo/edit?page=FrikiRocks");
WebForm form = response.getFormWithID("editform");
assertEquals("textarea 'content'", "For Sure", form.getParameterValue("content"));
SubmitButton sendButton = (SubmitButton)form.getButtonWithID("sendbutton");
assertTrue("form needs a 'send' button", sendButton != null);
}
...
EditServlet.java
...
writer.write(
"<html><head><title>Page " + name + "</title></head>" +
"<body>\n" +
"<h2>Edit Page '" + name + "'</h2>\n" +
"<form id='editform' method='POST' action='update'>\n" +
" <input type='hidden' name='page' value='" + name + "'>\n" +
" <textarea name='content'>For Sure</textarea>\n" +
" <input type='submit' id='sendbutton' name='SEND' />\n" +
"</form>\n" +
"</body></html>"
);
writer.flush();
...
Good. Now we have a button, we can use it to submit the form, and see what happens. Remember that full details of all the HTTPUnit APIs I use here can be found at http://httpunit.sourceforge.net/doc/api/index.html
EditPageTest.java
...
SubmitButton sendButton = (SubmitButton)form.getButtonWithID("sendbutton");
assertTrue("form needs a 'send' button", sendButton != null);
WebRequest sendback = form.getRequest(sendButton);
sendback.setParameter("content", "For Sure.\nThanks, Dude!");
response = http.getResponse(sendback);
assertResponseCode("upload should return a code of 200 (success)", 200);
...
Can you guess what happened? Error on HTTP request: 404 Not Found [http://localhost:8080/frikidemo/update]. It looks like HTTPUnit is happy with what we are asking it to do - it has examined the form and its button, and worked out that it should send the content to "update". Unfortunately, we have no update servlet to receive the new page content. Let's quickly make one. And don't forget to add it to the web.xml mappings.
UpdateServlet.java
package friki;
import java.io.IOException;
import java.io.Writer;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class UpdateServlet
extends HttpServlet
{
public void doPost(HttpServletRequest req, HttpServletResponse res)
throws ServletException, IOException
{
String name = req.getParameter("page");
Writer writer = res.getWriter();
writer.write(
"<html><head><title>System Message</title></head>" +
"<body>Page '" + name + "' updated successfully</body></html>");
writer.flush();
}
}
web.xml
...
<servlet>
<servlet-name>update</servlet-name>
<servlet-class>friki.UpdateServlet</servlet-class>
</servlet>
<servlet-mapping>
<servlet-name>update</servlet-name>
<url-pattern>/update</url-pattern>
</servlet-mapping>
...
Now it works again. Good.
One thing is beginning to bug me, though. We now have three servlets containing very similar code. I'm sure we can factor out that duplication before we move on. Let's start (as usual) by copying out the common bits to a new class:
FrikiServlet.java
package friki;
import java.io.IOException;
import java.io.Writer;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public abstract class FrikiServlet
extends HttpServlet
{
protected void doBoth(HttpServletRequest req, HttpServletResponse res)
throws ServletException, IOException
{
String name = req.getParameter("page");
Writer writer = res.getWriter();
process(name, writer);
writer.flush();
}
public void doPost(HttpServletRequest req, HttpServletResponse res)
throws ServletException, IOException
{
doBoth(req, res);
}
public void doGet(HttpServletRequest req, HttpServletResponse res)
throws ServletException, IOException
{
doBoth(req, res);
}
protected abstract void process(String name, Writer writer)
throws IOException;
}
Note that two of our existing servlets use "doPost", and one uses "doGet". I've chosen to use one of the common idioms for this, and implemented both doGet and doPost in the base class, but called a single method from both. This is handy, because it allows us to freely choose whether to use HTTP GET or POST requests anywhere in our application. Note also that I have made this class "abstract", and added an abstract method to do the actual processing (the bit where the three servlets differ.) Now we can strip out all that duplicated code in the three servlets:
ShowServlet.java
package friki;
import java.io.IOException;
import java.io.Writer;
public class ShowServlet
extends FrikiServlet
{
protected void process(String name, Writer writer)
throws IOException
{
writer.write(
"<html><head><title>Page " + name + "</title></head>" +
"<body>This is Page '" + name + "'</body></html>");
}
}
EditServlet.java
package friki;
import java.io.IOException;
import java.io.Writer;
public class EditServlet
extends FrikiServlet
{
protected void process(String name, Writer writer)
throws IOException
{
writer.write(
"<html><head><title>Page " + name + "</title></head>" +
"<body>\n" +
"<h2>Edit Page '" + name + "'</h2>\n" +
"<form id='editform' method='POST' action='update'>\n" +
" <input type='hidden' name='page' value='" + name + "'>\n" +
" <textarea name='content'>For Sure</textarea>\n" +
" <input type='submit' id='sendbutton' name='SEND' />\n" +
"</form>\n" +
"</body></html>"
);
}
}
UpdateServlet.java
package friki;
import java.io.IOException;
import java.io.Writer;
public class UpdateServlet
extends FrikiServlet
{
protected void process(String name, Writer writer)
throws IOException
{
writer.write(
"<html><head><title>System Message</title></head>" +
"<body>Page '" + name + "' updated successfully</body></html>");
}
}
Don't forget to re-run all the tests after these changes. The tests we have been building as we go along form the essential "safety net" to allow us to do sweeping simplifications like this without fear.
I think this is a reasonable place to stop this session. We've done enough to be confident that we have the process for editing pages in place. As usual, I encourage you to think about other tests you could add to the test suite, and make your own decisions on whether they would be useful, or whether they might make the test suite more "brittle". And please remind any web UI designers you happen to meet about the huge testability benefits of using "id" attributes in their web designs.
How are We Doing?
We still haven't made a Wiki yet! But we have just about completed another of our user goals, and have built, deployed and run a growing web application. Best of all, we didn't abandon our commitment to testing everything (even things that seemed easy to write and hard to test). We can say with confidence that whatever we do next, it won't sneakily break what we have written so far. And our application still fits in a 7K war file.
Next session we will attack the last of these customer goals, which requires that we finally tie together the web user interface with the page storage. With any luck, we'll be able to use that page template code, too! Hmm? You said I promised a usable web application this time? Point your browser at http://localhost:8080/frikidemo/edit?page=FrikiRocks and have a play. It doesn't store changes (we haven't done that goal yet), but you can enter text, click buttons and watch it do its stuff. It may look rough, but it can be used . . . sort of.
References
- Hunt A and Thomas D, "The Pragmatic Programmer", Addison Wesley, 2000. ISBN 020161622X
- Beck K, "Extreme Programming Explained", Addison Wesley, 2000. ISBN 0201616416
Unit Testing Database Code
by Lasse Koskela
Have you ever tried to write unit tests for a class that does some data munging on a database? Many have tried and surrendered after a while, for a number of reasons. Some have complained about the tests running too long, or about the tests needing a set of fixed test data which easily gets out of sync. Most problems related to testing database code can be summarized as a lack of encapsulation.
This article's goal is to show some ways to organize your database
code in such a way that writing those unit tests with JUnit and its
extensions becomes possible.
We'll use a fictitious Data Access Object pattern (DAO) implementation called UserDAO as an example. The actual pattern is not relevant here, so we've left out most of the elements the pattern suggests. For more context on the DAO pattern itself, please refer to the pattern documentation at http://java.sun.com/blueprints/corej2eepatterns/Patterns/DataAccessObject.html (our UserDAO and User classes map to the CustomerDAO and Customer in the blueprints).
In general, the key in writing testable database code is to separate
logic from access. For example, a DAO class should not encapsulate
both the code for querying data over JDBC and the code for obtaining
the JDBC connection. Listing 1 shows an example of this kind of flaw.
Listing 1. Badly encapsulated database code
public class MyNonTestableUserDAO implements UserDAO {
private Connection getConnection() throws SQLException {
return DriverManager.getConnection(
"jdbc:mckoi://localhost/",
"admin_user",
"aupass00");
}
public User createUser(String userId, String firstName, String lastName)
throws DAOException {
try {
PreparedStatement ps = getConnection().prepareStatement(SQL_INSERT);
ps.setString(1, userId);
ps.setString(2, firstName);
ps.setString(3, lastName);
ps.executeUpdate();
ps.close();
return new User(userId, firstName, lastName);
} catch (SQLException e) {
throw new DAOException(e.getMessage());
}
}
}
The mock approach
The problem in testing the DAO class in Listing 1 is that unless
we can replace the JDBC connection implementation, running the
test successfully would require a real database with the right
data. Now, how do we manage to do that?
One could intercept the getConnection() call with the help of AspectJ or another AOP framework, but that's too much work and results in unnecessarily complex code. Alternatively, one could make the getConnection() method protected and subclass the DAO class in the test code, overriding that particular method, which is already a pretty clean and compact solution (illustrated in Listing 2).
Listing 2. Letting the test code extend the class under test,
overriding the nasty getConnection() method
public class MyTestableUserDAO1 implements UserDAO {
protected Connection getConnection() throws SQLException {
return DriverManager.getConnection(
"jdbc:mckoi://localhost/",
"admin_user",
"aupass00");
}
public User createUser(String userId, String firstName, String lastName)
throws DAOException {
try {
PreparedStatement ps = getConnection().prepareStatement(SQL_INSERT);
ps.setString(1, userId);
ps.setString(2, firstName);
ps.setString(3, lastName);
ps.executeUpdate();
ps.close();
return new User(userId, firstName, lastName);
} catch (SQLException e) {
throw new DAOException(e.getMessage());
}
}
}
public class TestMyTestableUserDAO1 extends TestCase {
public void testCreateUser() throws Exception {
// configure a mock implementation for the java.sql.Connection interface
final MockConnection mock = new MockConnection();
mock.setExpectedCloseCalls(0);
mock.setupAddPreparedStatement(new MockPreparedStatement());
// replacing the real Connection implementation with
// a mock implementation
UserDAO dao = new MyTestableUserDAO1() {
protected Connection getConnection() {
return mock;
}
};
// exercise the class under test and assert expectations
User user = dao.createUser("laskos", "Lasse", "Koskela");
assertNotNull(user);
assertEquals("laskos", user.getUserId());
assertEquals("Lasse", user.getFirstName());
assertEquals("Koskela", user.getLastName());
// afterwards, we can check with the mock implementation that the
// class under test collaborated with it as expected
mock.verify();
}
}
Often the best approach, in my opinion, is to fix the root problem
-- the bad encapsulation. Once the logic inside getConnection() is
moved out of the class under test, it is trivial to pass in a mock
implementation in the unit test code instead of the real thing.
Listing 3 illustrates this change.
Listing 3. A better structure enabling us to test the class
under test as-is
public class MyTestableUserDAO2 implements UserDAO {
private Connection connection;
public MyTestableUserDAO2(Connection connection) {
this.connection = connection;
}
public User createUser(String userId, String firstName, String lastName)
throws DAOException {
try {
PreparedStatement ps = connection.prepareStatement(SQL_INSERT);
ps.setString(1, userId);
ps.setString(2, firstName);
ps.setString(3, lastName);
ps.executeUpdate();
ps.close();
return new User(userId, firstName, lastName);
} catch (SQLException e) {
throw new DAOException(e.getMessage());
}
}
}
public class TestMyTestableUserDAO2 extends TestCase {
public void testCreateUser() throws Exception {
// configure a mock implementation for the java.sql.Connection interface
final MockConnection mock = new MockConnection();
mock.setExpectedCloseCalls(0);
mock.setupAddPreparedStatement(new MockPreparedStatement());
...
// replacing the real Connection implementation with
// a mock implementation
UserDAO dao = new MyTestableUserDAO2(mock);
// exercise the class under test and assert expectations
User user = dao.createUser("laskos", "Lasse", "Koskela");
assertNotNull(user);
assertEquals("laskos", user.getUserId());
assertEquals("Lasse", user.getFirstName());
assertEquals("Koskela", user.getLastName());
// afterwards, we can check with the mock implementation that the
// class under test collaborated with it as expected
mock.verify();
}
}
Note that even though this example hands an instance of
java.sql.Connection to the DAO implementation, it could just as easily
be a javax.sql.DataSource or some custom interface for ultimately
obtaining a JDBC connection.
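As a sketch of that "custom interface" option: the interface, class and method names below (ConnectionProvider, ProviderBasedDAO, CountingProvider) are our own invention, not part of the article's code. The point is only that once the DAO depends on a tiny interface rather than a concrete connection, a test can hand in any stand-in it likes, with no database at all:

```java
import java.sql.Connection;
import java.sql.SQLException;

// Hypothetical "custom interface" for obtaining a JDBC connection.
interface ConnectionProvider
{
    Connection obtainConnection() throws SQLException;
}

// A DAO skeleton that depends only on the interface above.
class ProviderBasedDAO
{
    private final ConnectionProvider provider;

    public ProviderBasedDAO(ConnectionProvider provider)
    {
        this.provider = provider;
    }

    // A real DAO method would prepare and execute a statement here;
    // this sketch only shows where the connection would come from.
    public Connection openConnection() throws SQLException
    {
        return provider.obtainConnection();
    }
}

// A trivial counting stub a test could use in place of a real provider.
class CountingProvider implements ConnectionProvider
{
    int calls = 0;

    public Connection obtainConnection()
    {
        calls++;
        return null; // a mock Connection implementation would go here
    }
}

public class ProviderSketch
{
    // Exercise the DAO with the stub and report how many times it asked
    // for a connection.
    public static int demo()
    {
        CountingProvider stub = new CountingProvider();
        ProviderBasedDAO dao = new ProviderBasedDAO(stub);
        try
        {
            dao.openConnection();
            dao.openConnection();
        }
        catch (SQLException e)
        {
            // cannot happen: the counting stub never throws
        }
        return stub.calls;
    }

    public static void main(String[] args)
    {
        System.out.println(demo()); // prints 2
    }
}
```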
For details about writing tests using the mock objects approach and
the different frameworks at your disposal, please refer to the
resources section.
The sandbox approach
As always, there's more than one way of doing things. If refactoring the code to accommodate the mock objects approach illustrated above is too big a task, and if it's acceptable to have the unit test run for a bit longer, there's always the option of using the real database and simply setting up a "sandbox" for the test code to play with.
A great tool for this alternative method of testing, which I refer to as sandboxing, is dbUnit (http://dbunit.sourceforge.net). The dbUnit framework lets the developer define a data set which is automatically inserted into the real database before the test code runs, and cleaned up afterwards if necessary.
Listing 4. The "sandbox" approach
public class MyNonTestableUserDAO implements UserDAO {
private Connection getConnection() throws SQLException {
return DriverManager.getConnection(
"jdbc:mckoi://localhost/",
"admin_user",
"aupass00");
}
public User createUser(String userId, String firstName, String lastName)
throws DAOException {
try {
PreparedStatement ps = getConnection().prepareStatement(SQL_INSERT);
ps.setString(1, userId);
ps.setString(2, firstName);
ps.setString(3, lastName);
ps.executeUpdate();
ps.close();
return new User(userId, firstName, lastName);
} catch (SQLException e) {
throw new DAOException(e.getMessage());
}
}
}
public class TestMyNonTestableUserDao extends DatabaseTestCase {

    private static final String TESTDATA_FILE =
        "TestMyNonTestableUserDao-dataset.xml";

    public TestMyNonTestableUserDao(String testName) {
        super(testName);
    }

    // dbUnit uses this method to obtain a connection to the database which
    // it is supposed to set up as a sandbox for the actual test methods
    protected IDatabaseConnection getConnection() throws Exception {
        Class driverClass = Class.forName("com.mckoi.JDBCDriver");
        String url = "jdbc:mckoi://localhost/";
        String usr = "admin_user";
        String pwd = "aupass00";
        Connection jdbcConnection = DriverManager.getConnection(url, usr, pwd);
        return new DatabaseConnection(jdbcConnection);
    }

    // dbUnit uses this method to obtain the set of data that needs to be
    // inserted into the database to set up the sandbox
    protected IDataSet getDataSet() throws Exception {
        return new FlatXmlDataSet(new FileInputStream(TESTDATA_FILE));
    }

    public void testCreateUser() throws Exception {
        UserDAO dao = new MyNonTestableUserDAO();
        User user = dao.createUser("laskos", "Lasse", "Koskela");
        assertNotNull(user);
        assertEquals("laskos", user.getUserId());
        assertEquals("Lasse", user.getFirstName());
        assertEquals("Koskela", user.getLastName());
        makeSureUserWasInserted(user);
    }

    private void makeSureUserWasInserted(User user)
            throws AssertionFailedError, Exception {
        Connection jdbcConnection = getConnection().getConnection();
        // actual verification omitted for brevity ...
    }
}
Note that the test data is located in an XML file named
TestMyNonTestableUserDao-dataset.xml in the local file system.
Listing 5 shows a possible example of its contents.
Listing 5. A sample dataset file for the dbUnit test in Listing 4
<?xml version='1.0' encoding='UTF-8'?>
<dataset>
<MY_USERS USER_ID='fb' FIRST_NAME='Foo' LAST_NAME='Bar'/>
<MY_USERS USER_ID='tgpaul' FIRST_NAME='Thomas' LAST_NAME='Paul'/>
<MY_USERS USER_ID='efh' FIRST_NAME='Ernest' LAST_NAME='Friedmann-Hill'/>
</dataset>
For details about writing tests with dbUnit, please refer to the Resources section.
Resources
Discuss this article in The Big Moose Saloon!
Return to Top
JavaRanch Newsletter
The Coffee House
Half Cowboy Plus Protocol
by Solveig Haugland
It was a cold, snowy December in JavaRanch land. Brenda and Lacey had
switched the coffee house's wooden swinging doors for reinforced
double-pane storm swinging doors. The cowboys' long oilskin dusters were
gone and western-yoke North Face coats predominated on the back of coffee
drinkers' chairs. Even the most self-consciously rugged cowboys were
asking Brenda and Lacey for Peppermint Patties and Irish Coffees, open
and unashamed.

It was, in fact, a December like any other on the ranch. There were just
two new things. One of them was the new green velvet sofa from Silo and
Barrel that Lacey had ordered late one night after too many Moroccan Dark
Roast espressos. The other wasn't.

Her name was Christine, and she wasn't green or velvety, but a lot of the
cowboys thought about doing stuff with her wherein a couch might come in
handy. She had dark brown hair, light brown eyes, and kind of looked like
Sandra Bullock in those movies where she's supposed to be unattractive
and lonely but there's no way she would be in real life. She was helping
Lacey and Brenda turn the Coffee House into an Internet Cafe, putting in
some Samba servers and upgrading to Redhat 9, and she spent the rest of
the time chasing away Steve Ballmer with a shotgun whenever he came
sniffing around the back door.

All this would have been fine except that the chairs, the new sofa, the
floor, and a little ridge halfway up the wall that was apparently wide
enough to sit on if you sat real still, were filling up every day about
7 AM and weren't emptying til closing time. Normally that would be fine
but there wasn't any turnover. Everyone ordered one cup of coffee and sat
there mooning. The ranchers had nothing to do all day but look at the sky
and say yep and nope and it shore did look like snow, and so they did
that inside with the extra twist that they were mooning all day looking
at Christine.
"Lacey, if we don't do something about the Christine situation, our
margins are going to be zilch," said Brenda, as she stumbled over a
slender dark-haired cowboy who was standing behind the broom closet door.
"Even during haying season we get more business." She reached under the
counter for some cups and surprised Zeke and Sid, who had camped out
there overnight.

"Tarnation, Zeke!" Lacey shouted. "You and Sid know better than that.
The health department don't allow no cowboys with the cups. You wanna
hang out with cups, you go help Christine set up printing."
"Dagnabbit, Lacey, don't you shout so loud! You're going to embarrass
me and Sid here. Not that that newfangled silky shirt of his ain't
embarrassment enough already."

"That lady at the dry goods store said that I looked like Ed Zander in
it, so you just shut your mouth, Zeke. Besides, that new haircut o'yours
makes you look like a diseased Shetland."

"Oh, shut your trap." Zeke had a sneaking feeling that Sid was telling
the truth, and vice versa. "Lacey, we'd love to go make polite
conversation with Christine but I don't know what the heck to talk to
her about. How am I going to woo a lady like her when all I got to talk
about is boll weevils, winter wheat, and how once upon a time I got 98%
on my SCJP?"

"Oh, for Pete's sake," sighed Brenda. Pete prairie-dogged his head up
out of the Moroccan Roast bean barrel and Brenda pushed him back down
again. "Christine's a girl, ain't she?"
"Well, yes, that ain't really a topic of dispute," said Zeke.
"Then get out your notes from that Dating Design Patterns talk we done
gave you a few months ago and sic'em on her! Why, I heard her muttering
about how she can't make that Samba server hook up with some of this ol'
cranky hardware that we're makin' her work with. That points clearly to
a standard implementation of goTo Guy! A guy like you could be
troubleshootin' her USB in no time, if you take my meaning."

"It ain't hard to establish simple conversational state," Brenda added.
"It ain't hard in Java and it ain't hard with the FEMALE platform. Heck,
you can just implement Dating Savant and ask her opinion about anything.
You know darn well if you got ears that she'll talk forever about that
danged SCO thing."

"Anyone in the county with ears knows that," grumbled Pete. Brenda
whacked him back down into the bean barrel.

"And she's new around here, that's for sure," nodded Lacey. "Why don't
you go ahead and do a Peters Inverse Newbie strategy of Encapsulated
Big Fat Opening? There's that Forty-Fifth Annual Snowshoe Race for Peace
and Wheat Subsidies comin' up next month and you could sure invite her
to just come along with the gang."
Sid looked puzzled. "But there is no gang. Nobody ever goes to it."
Lacey sighed. "Well, she don't know that, now, do she? You get yourself
through her firewall by just pretending it's a group activity, you get a
couple of us to say yeah sure, we're going if we can, and then on the
night in question it's just you and Christine snowshoeing your way along
in the moonlight."

"And if you talk to us real nice --- I mean real nice, and you fix that
fence out back for us too so Steve Ballmer can't sneak in no more with
that dagnabbed Windows Server 2003 --- we just might let it slip that
Brenda here used to date one of you and it was real nice, but she wasn't
ready to commit at the time. Women trust other women when we talk about
fellers."
Sid and Zeke looked at each other, hope dawning in their eyes.
"You mean, we could just talk to her about regular things at first, help
her out with stuff, pretendin' like we ain't just tryin' to woo her?
Won't she know that we've got Exterior Motives?"

"Heck no," Brenda assured them. "Women would go insane if we knew what
men were thinking. If you say you want to help her with that router,
she's gonna think, hmm, this guy wants to help me with my router. And
then she'll owe you a favor, and she'll buy you a cup o'coffee, and then
you'll go to that snowshoe thing, and so on. When we say Encapsulated
Big Fat Opening, we mean Encapsulated Big Fat Opening."

"Well, that sounds like fun and all, but excuse me, when do I get to
kiss her? All that sounds like somethin' I might do with my nephews.
Nice but, you know, not all that satisfying." Sid was looking a little
skeptical.
"Oh, that's easy. Well, first you ask me and Brenda if we done heard
anything from her. That's a little Mediator action. And you also go back
to Kansas City and you go to a good store this time, and get good
haircuts and good pants. That'll turn her head a little; you men let
yourselves get a little crusty but, as much as I hate to admit it, Sid,
you got a well formed set of gluts, and Zeke, this might surprise you
but you got a twinkle in your eye that can make a girl feel kinda
friendly."

"Back on topic, Lacey! You don't wanna blow them up too much ahead of
time. Anyway, the point is, you get pretty with the self decoration
strategy of Decorated Visitor Honeypot, you get to know her, you check
with us every few days, we put in a good word for you once that fence is
fixed, and then one night when she's here late you just show up with a
couple bottles of wine and go all the way running the 30% Solution
strategy of Interested Listener. Just ask her about herself. Brothers,
sisters, high school prom, what in the world she personally thinks Sun's
strategy is, just all about her. And then about an hour and a half in,
you're gonna be switching to promiscuous mode and after that you're on
your own."

"That's it?" Sid and Zeke wanted to believe but this was all new
territory, actually being on the reference implementation team.

"That's it. Oh, and if you want a refresher there's a nice big dating
design pattern relationships poster on the door of the bean storage shed
out back."
Brenda wanted to reassure them. "This whole dating design patterns
system is straight from the Gang of Four Cowboys. They dated all over
the Chisholm Trail for ten years. You've heard the stories. You've seen
the dating design pattern study groups. You've heard about the
refactorings in the divorce groups. You've seen Larry Ellison showing up
at hostile takeovers with a blond on each arm. It's all about proven
reusable solutions to recurring problems."

And it was. Sid and Zeke flipped for it and Sid tried his luck first.
He did goTo Guy which, contrary to popular articles, is not harmful.
Supplemented with some Interested Listener, he and Christine became an
item and soon were instantiating each other regularly.

Zeke, disappointed that he lost the flip, went back to Kansas City for a
new haircut anyway, where he met a dancehall floozy with a heart of gold
who had never met a guy who wanted to hear her talk about herself, and
who coincidentally had always wanted to get her SCJP. She moved back
with him and they're very happy, reading Head First Java together in the
mornings and walking the fenceline spraying thistle in the afternoons.

And Brenda and Lacey's margins went back up to even better than before,
and they had enough money to hire Ellen to help Christine out, and the
whole thing started all over again.
"Dating Design Patterns" is available through SoftPro, Powell's,
Nerdbooks, and Amazon.com. To read excerpts, visit
www.datingdesignpatterns.com

Solveig Haugland is an independent trainer and author near Boulder,
Colorado. She spends her time working on training through
GetOpenOffice.org for those switching from Microsoft Office
(www.getopenoffice.org); Techwriter Stuff: The Single Source, t-shirts
and posters for editors and techwriters and those who love them; and of
course her satire of the whole patterns thing, Dating Design Patterns,
the original reusable dating solutions.
Scripting Ant
Lasse Koskela
Accenture Technology Solutions
Copyright © 2003 Lasse Koskela.
Introduction
During the past couple of years I have more than once heard someone
ask how to write an Ant build script intelligent enough to recognize
the existence of a newly added J2EE module automatically -- without
anyone adding an entry to a .properties file or touching the build
script itself. In fact, I was one of those someones.
For starters, let us state that we often shouldn't embark on such a
scripting mission at all. In the end, Ant is a build tool, not a
scripting language. There is a completely valid argument that a build
process should be explicit, and it stops being explicit once the build
script incorporates dynamic behavior such as deciding that one
subdirectory should be processed with a different target than the next.
The intent here is not to learn a particular scripting language but to
learn how scripting languages in general can be employed within an Ant
build script.
I wrote this short article because I wanted to know how to do it. I
wanted to learn. Please keep that in mind when the inevitable sense
of insanity penetrates your mind :)
Available tools for scripting Ant
You may have noticed that Ant provides a <script> task as part
of the optional tasks. The <script> task can be used to
execute any scripting language supported by the
Bean Scripting
Framework (BSF) from the Apache Jakarta Project (donated by IBM).
The supported scripting languages include, among others, JavaScript
and Jython.
I'll cover two of these languages, namely JavaScript and Jython,
through a relatively simple example: dynamically calling a
parameterized Ant target for each web application directory in our
project's file system. I might come back and add example solutions
for some other supported languages, and of course the reader is free
to submit her own solutions for the community to see.
Installing support for JavaScript and Jython
Although Ant's <script> tag supports all sorts of languages,
each of them requires downloading one or more 3rd party libraries.
The dependencies are listed in the
Ant Manual's
Library Dependencies page ("Ant Tasks" -> "Library Dependencies").
The most essential component is the BSF .jar file, which includes
the implementation of the <script> task. In addition, we need
some language specific libraries in order to be able to use JavaScript
and Jython.
For JavaScript, we only need js.jar from the
Mozilla Rhino
project. Unzip the Rhino distribution .zip file and copy js.jar
into your ANT_HOME/lib directory.
Jython, however, is a bit more complicated. First, we need to download
a "self-extracting" .class file from
Jython.org.
Then, we need to install the Jython interpreter by calling
java jython-21 on the downloaded .class file. This launches
a graphical installer, which guides the user through installing the
full Jython package somewhere on the user's computer. We're only
interested in a single .jar file, jython.jar, which can be found
under the installation directory. Again, copy jython.jar into your
ANT_HOME/lib to make it available to the Ant runtime.
Your ANT_HOME/lib directory should now include at least the
following .jar files:
optional.jar: includes the <script> task definition (should come as
part of the Ant distribution)
bsf.jar: the scripting framework implementation
js.jar: the implementation for JavaScript
jython.jar: the implementation for Jython
That should do it. Now, before we move on to the real thing, let's
run a simple build script to verify that everything is set up
correctly.
Copy-paste this build.xml somewhere and run it. If everything is in
place, the outcome should include a nice greeting from each of the
script blocks.
<?xml version="1.0"?>
<project name="AntScriptingTest" default="test-all" basedir=".">

    <target name="test-all" depends="test-jython, test-javascript"/>

    <!--
      - Runs a minimal script to make sure that all necessary libraries are
      - available for Jython scripting.
      -
      - No, it's not a typo. The script task only recognizes Jython's old
      - name, "JPython".
      -->
    <target name="test-jython">
        <script language="jpython">
            <![CDATA[
import sys
print "Hello from Jython!";
            ]]>
        </script>
    </target>

    <!--
      - Runs a minimal script to make sure that all necessary libraries are
      - available for JavaScript scripting.
      -->
    <target name="test-javascript">
        <script language="javascript">
            <![CDATA[
importPackage(java.lang);
System.out.println("Hello from JavaScript!");
            ]]>
        </script>
    </target>
</project>
Basics
It's cool to be able to run scripting languages from within an Ant
build. However, in order to be actually useful, the script code has
to integrate with the surrounding build script somehow.
The script can access two kinds of objects from the surrounding
build script. First of all, any properties defined with the
<property> tag that are visible to the <script> tag
encapsulating a script can be accessed within the script. In other
words, a property named "foo" in the build script is
accessible as a variable named "foo" inside the script block.
Second, the <script> tag provides a bunch of implicit
variables, namely project, self and any targets within
the build script. project is a reference to the top-level
object, the root element of the build script, <project>.
self is a reference to the encapsulating <script> task.
Finally, the implicit target references are named after the targets
themselves. For example, a target named "foo" would be accessible
for the script via the implicit variable "foo". We'll see how these
implicit variables can be used later in the examples.
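To make this concrete, here is a minimal build-script sketch (the target name "show-implicits" and the property "greeting" are hypothetical, chosen just for illustration) that touches both kinds of access: a property read as a plain variable, the implicit project reference, and self as the script task itself.

```xml
<target name="show-implicits">
    <property name="greeting" value="Howdy"/>
    <script language="javascript">
        <![CDATA[
// The property "greeting" is visible as a plain variable...
importPackage(java.lang);
System.out.println(greeting + " from project " + project.getName());
// ...and "self" refers to the enclosing <script> task
self.log("logged via the script task");
        ]]>
    </script>
</target>
```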
Regardless of the scripting language being used, the Ant API is
something a developer needs to look at. Every Ant entity (project,
target, task, and so on) is represented by a corresponding Java
class in the Ant API. Also, the need for scripting the build script
in the first place usually suggests that something has to happen
dynamically, and that something has to do with building the project.
This most definitely means that the script needs to manipulate the
existing build script's structure by adding tasks, creating
dependencies between targets, setting properties, etc.
Furthermore, many, if not all, of the supported scripting languages
provide access to plain old Java classes. This is particularly
useful if you need to communicate with, say, a 3rd party application
server through their proprietary communications library. Yes, you
could write your own Ant task for doing that, but it's often easier
to do the same in a scripting language.
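As a quick illustration of that point (a sketch using only standard JDK classes rather than a vendor library, with a hypothetical target name), a script block can call arbitrary Java code directly:

```xml
<target name="whoami">
    <script language="javascript">
        <![CDATA[
// Plain JDK classes are directly accessible from the script
importPackage(java.net);
var host = InetAddress.getLocalHost().getHostName();
self.log("building on host " + host);
        ]]>
    </script>
</target>
```

A proprietary client library would be used the same way, as long as its .jar is on Ant's classpath.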
The Example
Let's assume that we have the following directory structure:
/src
    /main
    /web1
        /WEB-INF
        /WEB-INF/web.xml
    /web2
        /WEB-INF
        /WEB-INF/web.xml
build.xml
webapps-js.txt
webapps-jy.txt
Yes, I know. It doesn't make much sense. Let's just accept it as it
is and focus on the main subject -- the scripting part.
What we want to accomplish is that whenever the developers add a new
web application directory into a certain location in the file system
(say, adding "web3" next to "main", "web1", and "web2"), the build
script wraps that new directory into a nice WAR archive without
the developers touching the build script.
More specifically, we want to accomplish this by writing a small
script that scans the file system for suitable directories, and then
calls a parameterized Ant target which performs the WAR generation
in a standard manner.
The common part
Let's start with the parameterized target our script is supposed to
call:
<target name="genericwebapp">
    <property name="webappFile" value="${pname}.war"/>
    <property name="webappRoot" value="src/${pname}"/>
    <delete file="${webappFile}" failonerror="false"/>
    <war destfile="${webappFile}" webxml="${webappRoot}/WEB-INF/web.xml"
         basedir="${webappRoot}" excludes="**/web.xml"/>
</target>
The target expects a parameter named "pname" to indicate the name of
the web application's root directory. For example, "web1". It then
proceeds to make a WAR archive named "${pname}.war", accordingly.
Next, let's see how we end up running the scripted parts in the
first place.
First, we need a top-level target that orchestrates the necessary
sub-targets in order to get a happy ending. We'll have one such
target for each scripting language we're going to use
("build_javascript" and "build_jython"). Second, we need a target
to encapsulate the script block itself (again, one for each
scripting language). Finally, we need some kind of a placeholder
target ("build_generated") for the scripts to mess around with.
Below is an example of such a build script:
<?xml version="1.0"?>
<project name="AntScripting" default="build_jython" basedir=".">

    <!--
      - A top-level target for running our JavaScript implementation.
      -->
    <target name="build_javascript" depends="setup_javascript, build_generated"
            description="Performs a build using JavaScript for scripting"/>

    <!--
      - A top-level target for running our Jython implementation.
      -->
    <target name="build_jython" depends="setup_jython, build_generated"
            description="Performs a build using Jython for scripting"/>

    <!--
      - This target acts as a template to which the "setup_*" targets
      - dynamically add "antcall" tasks for calling the "genericwebapp" target
      - for each webapp directory.
      -->
    <target name="build_generated"/>

    <!--
      - Dynamically creates "antcall" tasks into the "build_generated" target
      - based on the underlying directory structure using Jython.
      -->
    <target name="setup_jython">
        <property name="webAppsFolder" value="src"/>
        <script language="jpython" src="webapps-jy.txt"/>
    </target>

    <!--
      - Dynamically creates "antcall" tasks into the "build_generated" target
      - based on the underlying directory structure using JavaScript.
      -->
    <target name="setup_javascript">
        <property name="webAppsFolder" value="src"/>
        <script language="javascript" src="webapps-js.txt"/>
    </target>

    <!--
      - A generic target for generating a .war from a web application directory.
      -->
    <target name="genericwebapp">
        <property name="webappFile" value="${pname}.war"/>
        <property name="webappRoot" value="src/${pname}"/>
        <delete file="${webappFile}" failonerror="false"/>
        <war destfile="${webappFile}" webxml="${webappRoot}/WEB-INF/web.xml"
             basedir="${webappRoot}" excludes="**/web.xml"/>
    </target>
</project>
As you can see from the dependencies of our top-level targets, Ant
will first execute the "setup_xxx" target encapsulating our script
code, and then proceed to execute the "build_generated" target,
which our script should have populated with appropriate tasks from
within the "setup_xxx" target.
At this stage, it's probably good to introduce the two ways the
<script> tag can be used.
The first way is to embed the actual script code inside the
<script> element (encapsulated within a CDATA block) as
illustrated by our test build.xml earlier. That's the simplest
option as long as your script doesn't get too verbose.
The other option used in this larger example is to write the actual
script code into a separate text file and have the <script>
tag point to it using the "src" attribute. This is the best choice
when the code grows to such proportions that it begins to make the
build script itself difficult to read. Also, having the script in a
separate file lets the developer write the scripts using the IDE of
his choice, if there is one available.
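In outline, the two forms look like this (the external file name here is just an example, not from the project above):

```xml
<!-- inline: fine while the script stays short -->
<script language="javascript">
    <![CDATA[
importPackage(java.lang);
System.out.println("inline script");
    ]]>
</script>

<!-- external: keeps a longer script out of the build file -->
<script language="javascript" src="mysetup-js.txt"/>
```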
Enough talking. Let's get down to business.
Script implementation in JavaScript
Since we'll be using the file system and the Ant API, we need to do
some importing. The Mozilla Rhino JavaScript engine supports the
following kind of import syntax:
importPackage(java.lang, java.util, java.io);
importPackage(Packages.org.apache.tools.ant);
importPackage(Packages.org.apache.tools.ant.taskdefs);
In other words, packages not within the standard java.* tree need to
be prefixed with "Packages." in order to work. Individual classes
can be imported with a similar syntax using the importClass()
function.
After importing the needed Java libraries, we can access them just
like in plain Java. For example, printing to standard output can be
done with
System.out.println("Hello from JavaScript!");
Without further talk, here's the full source code for our JavaScript
implementation. I have commented it with the intent of communicating
what is happening.
importPackage(java.lang, java.util, java.io);
importPackage(Packages.org.apache.tools.ant);
importPackage(Packages.org.apache.tools.ant.taskdefs);

// A "constant" for the file separator character
var S = File.separator;

// The main method (called from the bottom of the file).
function main() {
    // "srcRoot" is the folder in which all web modules should reside
    var srcRoot = new File(System.getProperty("user.dir") + S + webAppsFolder);
    // Loop through all web modules and setup the antcall for each of them
    var iterator = findWebModules(srcRoot).iterator();
    while (iterator.hasNext()) {
        addCallToGenericWebAppTarget(iterator.next());
    }
}

// Returns a java.util.List of the directories under "srcRoot" containing a
// web module.
function findWebModules(srcRoot) {
    // "webModules" will contain the list of matching folders
    var webModules = new ArrayList();
    // Loop through the directory contents
    var modules = srcRoot.list();
    for (var i = 0; i < modules.length; i++) {
        var moduleDir = new File(srcRoot.getPath() + S + modules[i]);
        // If the sub directory looks like a web application, add it to the list
        if (isWebModule(moduleDir)) {
            webModules.add(moduleDir.getName());
        }
    }
    return webModules;
}

// Determines whether the given directory contains a web module
function isWebModule(directory) {
    var webXml = new File(directory + S + "WEB-INF" + S + "web.xml");
    return webXml.exists();
}

// Creates an "antcall" task for the "genericwebapp" target and configures
// the parameters according to the given web module
function addCallToGenericWebAppTarget(module) {
    // Create an "antcall" task which can be used for executing a target
    // within the same build script. Note the use of the implicit variable
    // "project"...
    var callTask = project.createTask("antcall");
    // The target we want to call is "genericwebapp"
    callTask.setTarget("genericwebapp");
    // Configure the parameters for the "antcall" task, namely the webapp
    // directory's name
    var param = callTask.createParam();
    param.setName("pname");
    param.setValue(module);
    // Add the created task into the body of the "build_generated" target.
    // Note the use of an implicit reference to the "build_generated" target...
    build_generated.addTask(callTask);
    System.out.println("added a call to genericwebapp for module " + module);
}

// Encapsulate everything nicely inside a main() method
main();
Running the build.xml with "ant build_javascript" should produce an
output similar to the following:
C:\AntScripting>ant build_javascript
Buildfile: build.xml
setup_javascript:
[script] added a call to genericwebapp for module web1
[script] added a call to genericwebapp for module web2
build_generated:
genericwebapp:
[delete] Deleting: C:\AntScripting\web1.war
[war] Building war: C:\AntScripting\web1.war
genericwebapp:
[delete] Deleting: C:\AntScripting\web2.war
[war] Building war: C:\AntScripting\web2.war
build_javascript:
BUILD SUCCESSFUL
Total time: 1 second
Script implementation in Jython
Again, the first thing to do in our Jython script is to import the
needed libraries. The syntax for doing this is slightly different
(Jython is basically "Python for Java"...) from JavaScript:
import java.util as util
import java.lang as lang
import java.io as io
We have just imported the listed Java packages as Python
modules into our script. From now on, we can refer to classes
in those packages with the alias ("util" for "java.util" for example).
In other words, java.util.ArrayList would be accessible using the
notation "util.ArrayList".
Note that while I could have used Java's System.out.println() for
writing to standard output, I've chosen to use Jython's print
statement, which is less verbose, as is typical of scripting
languages.
# 1) using Java's System.out.println()
import java.lang as lang
lang.System.out.println("Hello from Jython!");
# 2) using Jython's "native" print function
print "Hello from Jython!";
So, let's see what our example looks like written in Jython:
import java.util as util
import java.lang as lang
import java.io as io

### A "constant" for the file separator character
S = io.File.separator;

### the main() method represents the entry point into this script file
def main():
    srcDir = io.File(webAppsFolder);
    if srcDir.exists() != 1:
        raise Exception("Root folder '" + str(webAppsFolder) + "' does not exist");
    # Loop through all web modules and setup the antcall for each of them
    webModules = findWebModules(srcDir);
    for i in range(webModules.size()):
        addCallToGenericWebAppTarget(webModules.get(i));

### Retrieves a list of web module directories under the given root directory.
def findWebModules(rootDir):
    list = util.ArrayList();
    for subdir in rootDir.list():
        # If the sub directory looks like a web application, add it to the list
        if isWebRoot(rootDir.getPath() + S + subdir):
            list.add(subdir);
    return list;

### Determines whether the given path represents a valid web application root.
def isWebRoot(path):
    webxml = io.File(path + S + "WEB-INF" + S + "web.xml");
    if webxml.exists():
        return 1;
    else:
        return 0;

### Creates an "antcall" task into the "build_generated" target
### for the given webapp
def addCallToGenericWebAppTarget(webModule):
    # Create an "antcall" task which can be used for executing a target within
    # the same build script. Note the use of the implicit variable "project"...
    antCallTask = project.createTask("antcall");
    # The target we want to call is "genericwebapp"
    antCallTask.setTarget("genericwebapp");
    # Configure the parameters for the antcall task, namely the webapp
    # directory's name
    param = antCallTask.createParam();
    param.setName("pname");
    param.setValue(webModule);
    # Add the created task into the body of the "build_generated" target.
    # Note the use of an implicit reference to the "build_generated" target...
    build_generated.addTask(antCallTask);
    print "added a call to genericwebapp for module " + str(webModule);

### Execute the main method which encapsulates all code in this script file
main();
Running the build.xml with "ant build_jython" should produce an
output similar to the following:
C:\AntScripting>ant build_jython
Buildfile: build.xml
setup_jython:
[script] added a call to genericwebapp for module web1
[script] added a call to genericwebapp for module web2
build_generated:
genericwebapp:
[delete] Deleting: C:\AntScripting\web1.war
[war] Building war: C:\AntScripting\web1.war
genericwebapp:
[delete] Deleting: C:\AntScripting\web2.war
[war] Building war: C:\AntScripting\web2.war
build_jython:
BUILD SUCCESSFUL
Total time: 2 seconds
Summary
We have just seen how a number of scripting languages can be embedded
in an Ant build script with relative ease, while still packing enough
power to actually accomplish something. If you now have an urge to
start scripting your builds right away, I've collected some useful
links in the Resources section.
Have fun!
Resources
Use JSIS to Roundup Java Source Information
by Steve Blake
The Java sources that make up your project or enterprise are a
valuable asset. They are a virtual gold mine of information just
waiting to be tapped. Static source analysis tools are becoming an
increasingly important part of nearly all development methodologies,
including XP and agile programming. Now JSIS Tools has released a new
API called JSIS that gives Java developers a simple but powerful means
of parsing Java sources and resolving names to their declarations.
JSIS is the Semantic Interface Specification for Java technology, a
de-facto standard Java API that combines the capabilities of a Java
parser, Java reflection, and a compilation unit manager in one simple
package. The JSIS API provides well defined primitive functionality
covering Java syntax and semantics. Secondary queries layered on top
of JSIS provide services for tools in any problem domain.
JSIS extends the semantic capability of reflection with the syntactic
parsing capability of a compiler. The synergistic result is a simple
and easy to use API that offers more semantic functionality than
reflection and more element classification and syntactic source
location information than a Java parser. Using JSIS, Java programmers
can create high quality, semantically aware source code analysis and
inspection tools for use on their own projects or to offer for
commercial sale.
Written in Java, JSIS is "write once, run anywhere". It seamlessly
supports Java 1.2, 1.3.x, and 1.4. Take a look at the JSIS Javadocs
or the JSIS API sources.
Java Sources Are Similar to XML Documents
JSIS works much the same way as the well known DOM and SAX parsing models do
for XML. You can view your sources as having a structure similar to XML
documents; Java elements each have starting and ending points and are
hierarchical. You can select specific elements as you would with a DOM
parser, or you can extend a parser handler just like you would do with a SAX
event-driven parser.
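The DOM/SAX comparison can be made concrete with the JDK's own SAX API. This sketch uses only standard javax.xml and org.xml.sax classes, not JSIS: an event-driven handler receives each element as the parser reaches it, which is the same pattern a JSIS ParserHandler follows for Java elements.

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// A minimal SAX-style walk: the parser pushes "element started" events to a
// handler in document order, much as a JSIS handler is fed each Java element.
public class SaxAnalogy {
    public static List<String> elementNames(String xml) throws Exception {
        final List<String> names = new ArrayList<String>();
        DefaultHandler handler = new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                names.add(qName); // record each element as its event arrives
            }
        };
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new ByteArrayInputStream(xml.getBytes("UTF-8")), handler);
        return names;
    }

    public static void main(String[] args) throws Exception {
        // A Java class body can be pictured as a similar element hierarchy.
        String xml = "<class><method><statement/></method><field/></class>";
        System.out.println(elementNames(xml)); // prints [class, method, statement, field]
    }
}
```

The same traversal could instead be driven top-down, selecting specific elements as with a DOM tree; the event-driven form is shown because it mirrors the ParserHandler examples later in this article.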
JSIS lets you access all the elements of your Java sources including
compilation units, declarations, names, statements, expressions, comments and
tokens. It supports analysis of implicit elements such as inherited
members. The exact starting and ending location of each element is
recorded: line and column numbers define the source span each element occupies.
While JSIS works in tandem with reflection, it goes beyond reflection by
providing expressions and statements and resolving all identifiers and qualified
names to their declarations. A gateway allows you to get reflection
objects from JSIS declarations and construct JSIS declaration objects from
reflection objects.
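For comparison, here is what plain java.lang.reflect offers on its own — a standard-library sketch, independent of JSIS. Reflection can enumerate declarations such as method names, but it exposes no statements, expressions, or source positions, which is the gap the article says JSIS fills.

```java
import java.lang.reflect.Method;
import java.util.Arrays;

// Reflection reaches declarations (classes, methods, fields) but stops there:
// no statement or expression structure, and no line/column source spans.
public class ReflectionLimits {
    public static String[] methodNames(Class<?> cls) {
        Method[] methods = cls.getDeclaredMethods();
        String[] names = new String[methods.length];
        for (int i = 0; i < methods.length; i++) {
            names[i] = methods[i].getName();
        }
        Arrays.sort(names); // getDeclaredMethods() returns no particular order
        return names;
    }

    public static void main(String[] args) {
        // We can list Math's declared method names, but reflection cannot tell
        // us what statements those methods contain or where they begin in source.
        System.out.println(Arrays.asList(methodNames(Math.class)).contains("sqrt")); // prints true
    }
}
```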
Here is a
chart comparing JSIS features with Reflection and other Java parsers.
How JSIS Stacks Up

                                  JSIS   Java Parsers   Reflection
Compilation Units
  Find by class name              Yes                   Yes
  Find by source file name        Yes    Yes
  Find all in a path              Yes
  Find all in the environment     Yes
Packages
  Find by package name            Yes                   Yes
  Find all in a path              Yes
  Find all in the environment     Yes
  Find subpackages                Yes
Syntactic Parsing
  Event driven Parser             Yes
  Element classification          Yes
  Declarations                    Yes    Yes            Yes
  Statements                      Yes    Yes
  Expressions                     Yes    Yes
  Tokens (commas, brackets, etc)  Yes    Yes
  Comments                        Yes    Yes
  Line numbers and spans          Yes
Semantic Resolution
  Class/Interface references      Yes                   Yes
  Method invocations              Yes
  Field references                Yes
  All identifiers                 Yes
  All qualified names             Yes
A Herd Of Potential Applications
JSIS supports the creation
of nearly any application that requires static semantic and syntactic
information about a Java source, program, or environment. The following
list illustrates just some of the potential tool applications that can be built using JSIS:
- Browsing and Navigation Tools
- Refactoring Tools
- Code Formatting Tools
- Code Restructuring and Source Tools
- Coding Style and Compliance Tools
- Data Flow Analysis Tools
- Cross Reference Tools
- Dependency Tree Analysis Tools
- Design Tools
- Document Generation Tools
- Invocation Call Tree Analysis Tools
- Language-sensitive Editing Tools
- Language Translation Tools
- Quality Assessment Tools
- Thread Analysis Tools
- Test-case Generation and Coverage Analysis Tools
- Quality & Complexity Metrics Tools
- Reverse Engineering Tools
Sample Application: Finding Unused Declarations
To illustrate how to use JSIS, several small example
applications are included with the API bundle. Take a look at one of them,
a program that finds and prints a list of unused declarations including private
fields, private methods and constructors, parameters, and local declarations.
It illustrates the typical pattern of many JSIS programs:
- JSIS program setup, setting reference parsing and setting the source path
- Selecting a compilation unit by name from the environment
- Syntactic analysis using the JSIS.Parser class to parse only non-token elements
- Semantic analysis to resolve the identifier to its declaration name element
- Use of java.util classes with JSIS elements to make lists and sets
- Release of compilation unit resources
The main method takes two parameters, the unit name and
the source path location:
Unused usage: java com.JSISTools.Examples.Unused <compilation_unit_qualified_name> <source_path>
For example:
java com.JSISTools.Examples.Unused java.lang.String C:/Java/src
The sources of this sample are presented in a browsable
HTML format generated by JTNav, an application built and shipped with JSIS.
You can explore the program by clicking on the links of names that resolve to
the declarations. Start with the class that has the main method:
com.JSISTools.Examples.Unused.
One of the first things to do is get the compilation unit
named in the first argument:
compUnit = Environment.compilationUnit(args[0]);
Next, at line 64 there is an instantiation of the class
com.JSISTools.Examples.UnusedParserHandler, which extends
com.JSISTools.JSIS.ParserHandler and implements the StartElement method:
UnusedParserHandler handler = new UnusedParserHandler();
Parser parser = new Parser(handler);
A new Parser is created and the parse is started:
parser.parse(compUnit, false); // false suppresses parse of tokens
StartElement classifies the elements, placing the names of
the desired declarations, like parameters, into a list:
switch (element.kind())
{
    case Declaration.PARAMETER:
    {
        declarationNamesList.add(((Declaration) element).parameterName());
        break;
    }
    // further cases handle fields, methods, constructors, and local declarations
}
Elements that are identifiers are resolved to their
declared name:
Name name = ((Expression) element).referencedName();
Names of declarations that belong to the analyzed compilation unit are added to
a set:
if (name.enclosingCompilationUnit().equals(currentUnit))
{
    referencedNameSet.add((Name) name);
}
Later, an unusedNamesList is populated by testing which
declaration names are not contained in referencedNameSet:
for (int i = 0; i < declarationNamesList.size(); i++) {
    name = (Name) declarationNamesList.get(i);
    if (!referencedNameSet.contains((Name) declarationNamesList.get(i))) {
        unusedNamesList.addLast(name);
    }
}
The list is sent to System.out.
System.out.println("Unused Declaration Names: ");
for (int i = 0; i < unusedNamesList.size(); i++) {
    name = (Name) unusedNamesList.get(i);
    System.out.println("    Line " + name.span().firstLine
        + " " + name.kindImage()
        + " " + name.image());
}
Finally, the compilation unit resources are released, an
important step when analyzing lots of units:
compUnit.release();
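Stripped of the JSIS types, the heart of the walkthrough above is a plain set difference over java.util collections. Here is a minimal stand-alone sketch of that step, with made-up strings standing in for JSIS Name elements:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedList;
import java.util.List;
import java.util.Set;

// Every declared name that never appears in the referenced set is unused.
// The declaration names here are invented purely for illustration.
public class UnusedSketch {
    public static List<String> unused(List<String> declared, Set<String> referenced) {
        List<String> unusedNames = new LinkedList<String>();
        for (int i = 0; i < declared.size(); i++) {
            String name = declared.get(i);
            if (!referenced.contains(name)) {
                unusedNames.add(name); // declared but never referenced
            }
        }
        return unusedNames;
    }

    public static void main(String[] args) {
        List<String> declared = Arrays.asList("count", "helper", "tmp");
        Set<String> referenced = new HashSet<String>(Arrays.asList("count"));
        System.out.println(unused(declared, referenced)); // prints [helper, tmp]
    }
}
```

A HashSet makes each membership test O(1) on average, so the check scales comfortably even when a compilation unit declares many names.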
Conclusion
Give JSIS a try next time you want to prospect for
information from your sources. Many of the things you can do with
reflection can be done easily with JSIS. If you have one source or
thousands, JSIS will scale to automate your static analysis requirements.
Visit www.jsistools.com for more
information and get a JSIS Personal Evaluation version Free!
Discuss this article in The Big Moose Saloon!
Return to Top
|
JavaRanch GO Ladder - Interested?
by Johannes de Jong
As some of you might know, Bert Bates, yep our famous
author, is quite an ace at GO. A while ago I suggested
to him that we start a JavaRanch GO ladder. Bert
replied back that he thought it was a great idea, and
that I had to come up with suggestions. Now, let's face it,
this time of the year is a busy time, and I simply did
not manage to find the time to hit the www to find out
how a GO ladder is supposed to be organized and what
it entails.
So, before I actually go out and do the leg work, I'd
love to know if there is interest amongst our regulars
for a Java Ranch GO ladder. Please be so kind and drop me a line if you are.
Discuss this article in The Big Moose Saloon!
Return to Top
|
The Big Moose Saloon Question and Answer of the Month
Mosey on in and pull up a stool. The
JavaRanch Big Moose Saloon is the place to get your Java questions answered.
Our bartenders keep the peace, and folks are pretty friendly anyways, so don't
be shy!
Question: Do certification exams encourage Theoretical Programmers?
Over in the Programmer Certification (SCJP) forum, visiting author Khalid A. Mughal opened a bottle of worms with this thread starter:
People can pass the Programmer Certification exam without having written any code. Is this what the present Programmer Certification exam encourages? Emphasizing theory and no practice? Should the exam be changed? Do employers know or care that they are getting "theoretical" programmers?
In a similar vein, one could ask the following questions:
Would you feel safe if people who had only read the driving manual and never driven a real car were unleashed onto the unsuspecting rush-hour downtown traffic?
Would you vote for a sheriff that had practiced horse-riding only on a rocking horse?
(I know at least which horse to bet on.)
[Disclaimer: No innuendos to the sheriffs that go galloping on this site.]
What do you think?
Now, mosey on o'er, see what folks are saying and chime in with some thoughts of your own.
Discuss this article in The Big Moose Saloon!
Return to Top
|
Movin' them
doggies on the Cattle
Drive
It's where you come to learn Java, and just like the cattle drivers of the
old west, you're expected to pull your weight along the way.
The Cattle Drive forum is where the drivers
get together to complain, uh rather, discuss their assignments and encourage
each other. Thanks to the enthusiastic initiative of Johannes de Jong, you can keep
track of your progress on the drive with the Assignment Log. If you're tough enough
to get through the nitpicking, you'll start collecting moose heads at
the Cattle Drive Hall of Fame.
For those of you who are tucked safely in the warmth and comfort of
your homes during this holiday season, spare a thought for those
rugged souls out on the cattle drive.
Picture a cold, clear night with the Milky Way splashed across a midnight
blue backdrop, the gentle rustling of the cattle as they bed down for
the night making a comforting sound. Crouching cowboy style beside the
laptop, contemplating the past month on the drive........
Fresh rider on the Drive...
Want to give a hearty welcome to newcomer John Cooper. He's our newest
greenhorn, and he's workin' on his second assignment.
Then there's Kate Head, who has been on the drive for a while, but
keepin' a low profile. She's just signed on to the assignment log after
baggin' her first moose. She's currently workin' on OOP-2, and I
believe she's findin' it a bit slippery!
Another moose on the wall for...
You may have noticed a bit of fancy rope work by Greg Neef. Yup, he
passed Servlets 4b on the first try, which is a first in itself.
Mighty fancy ropin', cowboy, even if you only have 1 "G" in your name!
Tough act to follow!
Other than that, it's been mostly a quiet month on the drive. We're down
to 10 active drivers, but I'm sure that'll change right quick, as soon
as certain drivers who shall remain anonymous realize that they are not
just in the red, but on the inactive list! (You out there, Pauline?)
Then the trail will be as noisy and dusty as usual.
Nitpicking is hard work
too... We know they're workin' reeeeeally
hard, after all, we've seen what those assignments look
like once the nitpickers have combed through 'em. Hats
off to Marilyn deQueiroz, Pauline McNamara, and Jason Adam for their dedication
and patience with the pesky vermin that always manage to
make their way into those assignments.
Joinin' the Drive
You think ya got what it takes? You ready for some work and some good learnin'? Then pull up a stool and read up on Joinin' the Drive. Good luck!
As we drive on into 2004, I'd like to leave you all with this Holiday
wish:
May your code always compile on the first try,
And your programs run exactly as expected.
See ya out on the trail!
Cattle Drive update by Carol Murphy
|
Mosey on over to The Cattle Drive Forum in The Big Moose Saloon!
Return to Top
|
Book Review of the Month
JUnit in Action by Vincent Massol and Ted Husted
If you've ventured into a bookstore lately, you may have noticed that the number of titles available on agile methodologies is multiplying more rapidly than the populations of some third-world countries. Leafing through any one of these titles while sipping an espresso in the bookstore's coffee bar, you'll quickly figure out that repeatable, automated unit tests are a good thing, and that JUnit is the unit testing framework most often used for Java unit testing. A couple of mochachino grande's later, and you've read enough to convince you that your continued survival rests on writing these automated unit tests. Unfortunately, and before your caffeine buzz even wears off, you're struck with the realization that while you're motivated and ready to go, you're just not sure exactly how to go about writing tests for many of your J2EE components.
"JUnit in Action" picks up where these other texts leave off. This is not a book on test-driven development, and it's not a book trying desperately to convince you of the value of tests. The book's goal is to demonstrate exactly how to write comprehensive unit tests for the various components of your J2EE applications. Writing tests for servlets, filters, JSPs, taglibs, database components, and EJBs are all covered in detail, as are testing strategies using mock objects and Cactus. Not only are you shown how to write the tests, but also how to write testable code. Along the way, the author points out useful best practices and how to use design patterns to improve your tests and the code you are testing. Code examples are thoroughly documented throughout the text in order to illustrate the techniques being discussed.
"JUnit in Action" is the definitive how-to manual for unit testing J2EE components. Pick up one of the other books if you're looking for something more motivational, but when you're ready to sit down and bang out some code, you'll want this book at your side.
(Jason Menard - Bartender, November 2003)
More info at Amazon.com | More info at Amazon.co.uk
Other books reviewed in November:
Discuss this book review in The Big Moose Saloon!
Return to Top
|
We haven't any book promotions remaining in December. Following are the scheduled book promotions for early January:
Return to Top
|
Managing Editor: Dirk Schreckmann
Comments or suggestions for JavaRanch's Journal can be sent to the Journal
Staff.
For advertising opportunities contact the Journal
Advertising Staff.
|