
What is JUnit & How To Use It?


Pithy is built using Apache Ant, and Ant has JUnit support as an optional package. With junit.jar on the classpath, Ant should detect it automatically. If you want to keep your default classpath clean (my preference when trying to have a code tree with minimal external dependencies), the quickest way to enable it is to copy junit.jar into the $ANT_HOME/lib directory. With junit.jar in place, the Ant <junit> task is enabled.


With that in place, we can organise the test classes. Rather than mix the test code with the actual code, it is better to have a separate test hierarchy with its own class tree. Here's the opening of build.xml:


<project name="PithyDerby" default="compile" basedir=".">
    <property name="sourceDir" value="src" />
    <property name="outputDir" value="bin" />
    <property name="testDir" value="tests" />
    <property name="testOutputDir" value="testbin" />

    <path id="classpath">
        <pathelement location="${outputDir}"/>
        <pathelement location="lib/derby.jar"/>
    </path>

    <path id="classpath.test">
        <pathelement location="${testOutputDir}"/>
        <path refid="classpath"/>
    </path>



We define testDir and testOutputDir properties (and, in a change from the original build file, we remove the init target and set all the properties up front). We then define two classpaths: one with everything needed to run Pithy, and one which includes that classpath plus the compiled test classes. The next thing we need is a target to compile any tests we have.


    <target name="compile-tests" depends="compile">
        <javac srcdir="${testDir}" destdir="${testOutputDir}">
            <classpath refid="classpath.test"/>
        </javac>
    </target>

And finally, we get to the meat of the testing: the <junit> task.

    <target name="test" depends="compile-tests">
        <junit>
            <classpath refid="classpath.test"/>
            <formatter type="brief" usefile="false"/>
            <test name="com.runstate.pithy.PithTest"/>
        </junit>
    </target>

</project>


The classpath is the same one we used to compile the test classes. The formatter element controls how the test results are displayed: with the type attribute set to "brief", only information on failed test cases is shown; the other options are "xml", for feeding the results to other processes, and "plain", which reports on every test run. By default the output goes to a file (named after the test being run), so we set the usefile attribute to false to see the results on the console.
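As a sketch of the alternatives, a variant that keeps the default file output and writes XML reports for later processing might look like this. Note the reports directory name is my assumption, not part of the original build file, and the directory needs to exist before the task runs:

```xml
<!-- Hypothetical variant: XML reports, one file per test class,
     written under reports/ for a CI server or stylesheet to consume. -->
<junit>
    <classpath refid="classpath.test"/>
    <formatter type="xml"/>
    <test name="com.runstate.pithy.PithTest" todir="reports"/>
</junit>
```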


Finally, we can have any number of test elements; these take the name of the test class to be run as an attribute. In the excerpt above we want to run com.runstate.pithy.PithTest, so the file should be found at tests/com/runstate/pithy/PithTest.java. Why create it in the same package as the tested class? Apart from being easier to remember, it means the test case can exercise package-private methods rather than just the normally visible ones, and because the tests are kept separate from the production code, there is still no danger of the production code becoming inflated with test code.
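To illustrate the visibility point, here is a minimal stand-alone sketch; the class and method names are invented for illustration, not taken from Pithy. The package-private method is callable only because both classes live in the same package.

```java
// Hypothetical sketch: normalise() has package-private (default) access,
// so only classes in the same package as PithHelper can call it.
class PithHelper {
    String normalise(String s) { return s.trim(); } // package-private
}

public class PackagePrivateDemo {
    public static void main(String[] args) {
        PithHelper h = new PithHelper();
        // This call compiles only because PackagePrivateDemo shares
        // PithHelper's package; from any other package it would not.
        if (!h.normalise("  pithy  ").equals("pithy")) {
            throw new AssertionError("unexpected result");
        }
        System.out.println("package-private access ok");
    }
}
```

A test compiled from a separate source tree still gets this access, as long as its declared package matches the production class's package.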


The PithTest class is just a simple unit test class for testing equality operations with Pith. If you run "ant test" now, ant will compile the code, then the test code and then run and report on the result of that test.
$ ant test
Buildfile: build.xml
prepare:
compile:
compile-tests:
test:
[junit] Testsuite: com.runstate.pithy.PithTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.004 sec

You now have your basic framework set up for running tests. But what we really want to test in this retrofit is the database class, PithyDBDerby.java. (I will admit to one tweak to the original PithyDB interface: the ability to start the database empty has been added, so that it can be reset to a known state.)
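To make the reset-to-known-state idea concrete, here is a hypothetical in-memory stand-in for that kind of interface; the real class talks to Derby, and the start(true) usage is inferred from the test code shown below, so treat the names and signatures as illustrative.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical in-memory stand-in, not the real Derby-backed class.
public class InMemoryPithyDB {
    private final Map<String, List<String>> data = new HashMap<>();

    // start(true) discards any existing data: a known clean state.
    public void start(boolean empty) { if (empty) data.clear(); }

    public void add(String category, String text) {
        data.computeIfAbsent(category, k -> new ArrayList<>()).add(text);
    }

    public List<String> get(String category) {
        return data.getOrDefault(category, new ArrayList<>());
    }

    public static void main(String[] args) {
        InMemoryPithyDB db = new InMemoryPithyDB();
        db.add("Cat", "hello");
        db.start(true); // reset to empty before a test run
        System.out.println("entries after reset: " + db.get("Cat").size());
    }
}
```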


Rather than write code for each test to set up and shut down the database, we can use TestCase's setUp and tearDown methods:
import junit.framework.TestCase;

public class PithyDBDerbyTest extends TestCase
{
    PithyDBDerby pd;

    public void setUp()
    {
        pd = new PithyDBDerby();
        pd.start(true);
    }

    public void tearDown()
    {
        pd.stop();
    }



Before each test method is called, setUp runs: it creates a PithyDBDerby instance and starts it with no existing data. When the test method has completed, tearDown is called and the database is shut down. Now we can start writing tests for the database, for example:
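That lifecycle can be sketched in plain Java. This is roughly what a JUnit 3 runner does under the hood, simplified to leave out result collection and error handling; the class and method names here are illustrative:

```java
import java.lang.reflect.Method;

// Simplified sketch of a JUnit 3-style runner: for every public testXxx
// method, create a fresh instance, call setUp(), the test, then tearDown().
public class MiniRunner {
    public static class SampleCase {
        final StringBuilder log;
        SampleCase(StringBuilder log) { this.log = log; }
        public void setUp() { log.append("setUp "); }
        public void tearDown() { log.append("tearDown "); }
        public void testOne() { log.append("one "); }
        public void testTwo() { log.append("two "); }
    }

    public static void main(String[] args) throws Exception {
        StringBuilder log = new StringBuilder();
        for (Method m : SampleCase.class.getMethods()) {
            if (!m.getName().startsWith("test")) continue;
            SampleCase c = new SampleCase(log); // fresh fixture per test
            c.setUp();
            m.invoke(c);
            c.tearDown();
        }
        System.out.println(log.toString().trim());
    }
}
```

Each test method appears bracketed by its own setUp/tearDown pair, which is exactly why an expensive fixture gets rebuilt for every test.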


public void testAdd()
{
    Pith p = new Pith(null, "TestCategory", "TestString");
    pd.add(p);
    ArrayList r = pd.get("TestCategory");
    assertEquals(p, r.get(0));
}

public void testMultipleAdd()
{
    Pith p1 = new Pith("MultipleAdd", "Test String");
    Pith p2 = new Pith("MultipleAdd", "Other Test String");

    pd.add(p1);
    pd.add(p2);

    ArrayList r = pd.get("MultipleAdd");
    assertEquals(p1, r.get(0));
    assertEquals(p2, r.get(1));
}

These simple tests make sure that we can write Pith objects out and retrieve them again. To add this test class, we add another test element to the build.xml file's test target:
    <target name="test" depends="compile-tests">
        <junit>
            <classpath refid="classpath.test"/>
            <formatter type="brief" usefile="false"/>
            <test name="com.runstate.pithy.PithTest"/>
            <test name="com.runstate.pithy.PithyDBDerbyTest"/>
        </junit>
    </target>

And run ant test again:
test:
[junit] Testsuite: com.runstate.pithy.PithTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.004 sec

[junit] Testsuite: com.runstate.pithy.PithyDBDerbyTest
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 3.284 sec

[junit] ------------- Standard Output ---------------
[junit] PithyDBDerby in use
[junit] PithyDBDerby in use
[junit] ------------- ---------------- ---------------

There's only one problem here: in the output we see a message emitted each time a database is created and started. Every test we add to PithyDBDerbyTest means one more run of the setUp and tearDown routines, and "create an SQL database" is not a light operation in anyone's book.
Unit-testing methodology would suggest that the tests are not finely grained enough, and that a "mock object" could be used to supply a faked database connection to the PithyDBDerby class. That is potentially a lot of changes to the code; the pragmatic approach is to reduce the number of setUp and tearDown calls and do the expensive work only once per test class.


To get there, we need to understand how JUnit actually runs the tests. When a TestCase is run, the test runner looks for a static method suite() which should return a Test, the base interface for tests in JUnit. TestCase implements Test, as does another class, TestSuite, which can be used to compose tests together. If we provide our own static suite method in a TestCase, we can return our own TestSuite.


public static Test suite()
{
    TestSuite t = new TestSuite(PithyDBDerbyTest2.class);
    TestSetup wrapper = new TestSetup(t) {
        protected void setUp() { oneTimeSetUp(); }
        protected void tearDown() { oneTimeTearDown(); }
    };

    return wrapper;
}

Here we create a TestSuite out of the test class, and then use a TestSetup decorator (from the junit.extensions package) which wraps the TestSuite in its own setUp and tearDown routines. We then have to make the PithyDBDerby field static, and rename our previous setUp and tearDown routines to oneTimeSetUp and oneTimeTearDown. I've put this version of the test into PithyDBDerbyTest2.java, so add


<test name="com.runstate.pithy.PithyDBDerbyTest2"/>


to the junit element in build.xml, and rerun the tests. Now the database is created only once, however many tests are in the TestCase. The obvious warning here is that you have lost some control over the state of the database for each test, so make sure the tests themselves do not interfere with each other.
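A tiny sketch of that interference risk, in plain Java with illustrative names: with a one-time fixture, whatever one test leaves behind is visible to the next.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the pitfall: a shared one-time fixture is not reset
// between tests, so test order and leftover state start to matter.
public class SharedFixturePitfall {
    static final List<String> db = new ArrayList<>(); // one-time fixture

    static void testAddsRow() { db.add("row"); }

    static void testExpectsEmpty() {
        // Surprises anyone assuming a clean database: the row added by
        // the previous test is still there.
        System.out.println("rows seen by second test: " + db.size());
    }

    public static void main(String[] args) {
        testAddsRow();
        testExpectsEmpty();
    }
}
```

Either make each test tolerant of existing data (for example by using unique category names, as the tests above do), or explicitly clean up what a test creates.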


Hopefully this process will inspire you to start incorporating testing within your build process. When you start redeveloping your code, as well as writing your new code with tests up front, you can reduce your exposure to problems in the older code by writing tests for the behaviour the new code expects: an organic approach to better code quality.
