Sunday, 27 December 2009

3G Internet

We live in two places, one is a rural area with no broadband service available (I did ask) and the other is right in the city where such problems do not happen. For a long time we relied on dialup. Most of the time dialup is okay in the city. We get 50k and that's enough for casual surfing and email. I can even run a remote terminal at that speed as long as I tune it a bit (limit the colours etc). But in the country the speed falls to about 20k and while that is still okay for surfing and email it doesn't get me much else. I should add that I use broadband at work for things that need more speed and are work related and I don't spend much time on YouTube which needs either speed or patience.

But the rural place at 20k was just too annoyingly slow. With no wired broadband option available I checked if we could install a satellite dish. Yes, but expensive and data capped. Then my neighbour happened to mention that his mate had visited and was using his 3G card in his laptop. I had assumed that, while we get cell phone coverage, we wouldn't get 3G here. That's because my Vodafone phone reported no 3G coverage. But this was a Telecom card and it worked fine.

This got me interested. It has the advantage that I can carry it with me, so it doesn't matter whether I am in the city or the rural place. So we got one. It plugs into the USB port and it is something like 5 times faster than dialup in the country. In the city it is a lot faster again. From a cost point of view it came out about the same as the landline phone connection. Since we only ever used the rural phone for dialup we turned that off, so the change was cost neutral. I'm not certain that the 3G card is actually using real 3G in the rural setting, but whatever it is doing works.

Still one problem though. With two laptops in the house we needed to share the connection. At first we shared the connection through a desktop machine plugged into a wireless router but it didn't perform well. I suppose it wasn't really worse than the dialup it was sharing before but I wanted to do better.

So now we have a little box with a wireless aerial and a USB port. It took a little bit of configuring but we got there. The 3G card is from Sierra Wireless but it was supplied by Telecom. The little box is a Netcomm N3G002W. It supports WPA2 and is basically transparent. We plug it in and our laptops can get internet. It is also small enough to carry with us anywhere, including on holiday. So, when we feel like it, we carry our laptops, the 3G card and the Netcomm and we can sit in our hotel room and get to the internet. We ignore the (generally ripoff) connection that sometimes comes with the room. We no longer need a spare desktop in each place to share the connection.

Monday, 7 December 2009


I got all the firewood for next winter cut yesterday. I find this very satisfying and go all smug about how 'green' we are in this regard. The wood grows on our property and trees either fall down from time to time or need thinning. I ought to be specific there. Our native bush is protected from any cutting, but taking fallen wood is okay, so we do. But the non-forested part is fair game and we let the odd tree grow here and there. They need cutting down every so often because otherwise we'd have all forest and no pasture (which would make the sheep unhappy). It is surprising how fast those trees grow.

If I were really, really green I would cut everything with an axe. This would probably make my wife quite pleased because she likes to see me chopping wood. But I'd never get it all done. So I use a chainsaw for most of it, though I split it with an axe, which is actually easier than the chainsaw.

So the wood gets all piled up in a covered area where it can dry nicely over the summer. Come winter it will be ready for burning. Our wood burner is connected to a wetback to the hot water cylinder so as well as heating the house it heats the water. During summer the hot water comes from the solar heater on the roof. The system depends on a small electric pump, just as the wood requires fuel for the chainsaw, so not completely green, but close. The fossil fuel energy used is minimal compared to the output and, of course, this is reflected in the costs ie very close to zero operational cost.

Sunday, 6 December 2009

Gadget update

The new gadget is working out well. It doesn't suck power: I can get a day's use out of it as long as I charge it most evenings. I don't sit listening to music all day anyway and, if I need it, I have a charger with me.

I was a bit disappointed that my ring tones don't come through the earpieces, just a generic ring tone. I assign ring tones to different frequent callers so I know who's calling. However the caller name is displayed on the unit itself. I have just a little difficulty reading this without reading glasses, but I can, just.

The earpieces screw quite firmly into my ears, which blocks out external noise very well. But it does mean when I want to talk to someone (other than on the phone!) I do have to pull them out. The wired headset earpieces didn't fit as snugly, and they kept falling off until I made my own 'ear hooks'. So this is just different, not necessarily better or worse.

But the good news is I can start keeping my phone in the belt pouch I bought for it long ago. The wired headset always got in the way when I tried it before. And I can pause the music without pulling out the phone, which was the primary goal.

Sunday, 1 November 2009

New gadget

I got another gadget today.

One of the things I like to avoid doing is pulling my phone out of my pocket. I have ringtones defined for my frequent callers and voice dialing set up for them as well. Using the headset on my existing phone (SE G502) I can take calls and make them with one button press on the headset. It plays music through the headset and interrupts the music on incoming calls. This is all good.

I also use the phone to tell me the time in preference to wearing a watch.

But what it doesn't do: well, there isn't a manual way to turn the music off or adjust the volume without getting out the phone. Also the plug falls out very easily. I replaced the headset to try and solve this but it still falls out.

So I got an HBH-DS980. It is a headset with a little dongle attached that hangs around my neck.
It talks bluetooth to the phone (no plug) and has music controls, caller display, track display and time displays. Pairing it to the phone was a snap. Solves the above problems nicely.

It will probably suck more power. But so far it hasn't.

Saturday, 24 October 2009

The Log Blog

When you log stuff from a Java program, if you really don't know what you're doing (or don't care) you'll use System.out.println("something to log"). This is not too bad for a tiny project. It works and it is very simple to implement.

But the fairly obvious downside is that if you want to turn the logging off you have to go edit the code. Finding all those log statements is a pain.

There are several logging libraries that have been around for years, notably Log4j, which solve this. They usually have these features:
  • A configuration file external to the program can turn logging on or off
  • You select the logging you want by level, so you can get minimal logging or more detailed logging by changing the external file.
  • You select which classes you want to log at whatever detail as well.
  • You can change the format to include the logging location if you want so it is easy to find just where the logged message came from.
I started off not liking the way Log4j has to work. You have to define a static in every class to enable logging for that class. It seemed like more work than I wanted to do (yes, just one line, I am really lazy). I built my own logger which worked differently, but I've since come around to the Log4j approach if only because all the open source libraries use that as well. Plus I am pretty sure mine was slow.
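The pattern in question is that one static line per class. Here is a minimal sketch of it, using java.util.logging so the snippet needs no extra jars (the Log4j version differs mainly in the factory call); the Invoice class is invented for illustration:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class Invoice {
    // One static logger per class; the class name becomes the logger name,
    // which is what lets an external config file target logging per class.
    private static final Logger LOG = Logger.getLogger(Invoice.class.getName());

    public void post() {
        // FINE is roughly Log4j's DEBUG; only emitted when enabled in config
        LOG.log(Level.FINE, "posting invoice");
    }
}
```

That one line per class is the whole cost of joining the scheme.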

But there is a subtlety going on with this. You don't actually want to use Log4j directly because then you commit to using it for all your logging. If some library you want to use has decided to use another logger then you have to have two config files etc. This is bad. What you really want is an interface that delegates to a logger you can choose later.

This is the thinking behind Apache Commons Logging (JCL). It is an interface and one of the implementations is Log4j. There are others. Lots of the libraries you want to use call JCL. So one obvious thing to do is put JCL calls into your program and then configure it to use Log4j.

Except that there are two problems with JCL. The first is that the API could be better and the second is that there are class loader problems with JCL. It is inclined to not find the implementation classes (link).

So enter SLF4J. This is a newer interface which is slicker and simpler and doesn't have the classloader problems. It includes a dummy JCL interface which maps to the SLF4J interface so your libraries will still work. And there are log4j and other implementations. The particularly nice thing about the interface is that you can get away with using fewer isDebugEnabled() method calls because it evaluates the output later than JCL, ie after it has decided whether to log or not. That means your complex log message doesn't incur any overhead if logging is turned off.
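The deferred evaluation can be shown with a toy logger. This is not the real SLF4J API (SLF4J uses {} placeholders; MessageFormat here uses {0}), just an illustration of the principle: with string concatenation the message is built before the call whether or not it gets logged, while a parameterized call formats nothing until after the level check.

```java
import java.text.MessageFormat;

// Toy illustration of deferred formatting; TinyLogger is an invented name.
public class TinyLogger {
    private final boolean debugEnabled;
    int messagesFormatted = 0; // counts how often we paid the formatting cost

    public TinyLogger(boolean debugEnabled) {
        this.debugEnabled = debugEnabled;
    }

    // Pattern plus argument, formatted only if the level is enabled.
    public void debug(String pattern, Object arg) {
        if (!debugEnabled) {
            return; // cheap boolean check; the message string is never built
        }
        messagesFormatted++;
        System.out.println(MessageFormat.format(pattern, arg));
    }
}
```

With debug off, a call like debug("state is {0}", bigObject) costs one boolean check, so the old isDebugEnabled() guard becomes unnecessary.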

It doesn't stop there. Like JCL, SLF4J supports several logging implementations apart from log4j. There is Java utils logging (JUL) which is bundled with Java. In a perfect world this would be the one we all use, but in practice almost no one does. This seems to be because it often takes more than a configuration file to set up. You have to write code, and no one wants to. We tried using JUL with Glassfish, using SLF4J. It does work, but we could not find a way to direct different output to different files so we went back to log4j.

But we did hit odd problems with log4j, I forget just what they were now, and they were mostly on JBoss rather than Glassfish. I think the embedded (JBoss) log4j collided with our configuration somehow.

Then we started using logback. This is a descendant of log4j, so it is an implementation and SLF4J plays happily with it. Logback allows us to have separate configuration files for each application on the same server, plus separate log files.

We're very comfortable with this combination. There are some tricks to getting logback running optimally, but these are covered well in the online docs. We also realised that not configuring it generates a lot of log messages and slows the application down. This is a fair call on their part though. The default is to log everything, what else could it be? So if you go this way do make sure you supply a logging configuration file.
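For reference, a minimal logback.xml along these lines; the file name and pattern are illustrative, not our production config:

```xml
<configuration>
  <!-- one appender per application log file -->
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/myapp.log</file>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <!-- without an explicit root level, logback defaults to DEBUG for everything -->
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```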

Friday, 2 October 2009

My First Open Source Project

I finally got around to doing this.

A couple of years ago I got sick of the way we were writing our documentation in Word and having to edit every document before each release to update the version number. I also got sick of the way we distributed discussion documents (again in Word) and people edited their comments into them. I'd get about eight copies back with conflicting comments. Finally, I got sick of the way I could never find any document a week after I put it into our 'Document Hiding System'.

I put together a simple arrangement that keeps the documents in XML rather than Word. They can be committed to subversion along with the source code. We get change tracking and merging for free. And we can bulk-generate PDFs with any version stamp (or other change) we like.

After I got all this going I discovered DocBook, which is similar. But it has a lot more tags and is missing some features of mine. Fewer tags is easier when editing XML, even with a friendly editor like the XML plugin inside Eclipse.

We integrate document publishing into our software builds now.

Since I've got a blog on Google I thought I might as well use their O/S project repository.

Take a look at maduradocs. I'm not yet using their source repository, but everything you need is in the zip attachment. I welcome comments and suggestions.

Edit: I've subsequently moved this to a maven project on github.

Monday, 28 September 2009

Stand-alone Java

This week I had to do something with Java I haven't done in a while. I had to build a small stand-alone application. It is to be deployed on a live site to manage some changes to the database there. I was going to just slip it into the web application so I would have all the usual stuff you get with that, ie libraries all there, database connections defined etc. But they want to run this with the app server turned off. So it has to be stand alone.

You can do these things quick and dirty but our support people are stressed out enough without giving them grief over a hard-to-install application. But I didn't want to go the whole way and use a real installer 'cos this is a one-off.

The first problem is that all those jar files are not in the classpath. Previously I have just wrapped the thing in a shell script (a bat script in this case because it is Windows). One of the differences between bat and sh is that the * doesn't do the same thing. In shell scripts it normally makes a list of all the files and that is easy for building class paths. Bat files don't do that.
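The sh behaviour I mean is plain glob expansion; a quick sketch (the demo_lib directory and jar names are invented):

```shell
# sh expands the * into an actual file list, which bat files do not.
mkdir -p demo_lib
touch demo_lib/a.jar demo_lib/b.jar
CP=
for jar in demo_lib/*.jar; do
  CP="$CP:$jar"
done
CP=${CP#:}   # strip the leading colon
echo "$CP"   # demo_lib/a.jar:demo_lib/b.jar
```

From there, java -cp "$CP" YourMainClass picks up every jar present.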

When I can't use a shell script I fall back on ant which does a good job of putting together a classpath of everything in a directory and running the program. But they would have to install ant just for that. Seems like I could make their lives easier.

I read up on putting stuff in the manifest file in the jar I'm building and that is the answer. This is what my manifest file ends up looking like:

Manifest-Version: 1.0
Ant-Version: Apache Ant 1.7.0
Created-By: 1.6.0_0-b11 (Sun Microsystems Inc.)
Build-Date: September 29 2009
Version: 02.05.07
Company: someone
Class-Path: lib/backport-util-concurrent.jar lib/cglib.jar lib/cl
asses12.jar lib/commons-beanutils.jar lib/commons-collections.jar lib
/commons-dbcp.jar lib/commons-digester.jar lib/commons-discovery.jar
lib/commons-pool.jar lib/dom4j.jar lib/ehcache.jar lib/ejb.jar lib/ej
b3-persistence.jar lib/habanero-core.jar lib/habanero-listener-select
ica.jar lib/habanero-test.jar lib/habanero-wip-hibernate.jar lib/hibe
rnate-annotations.jar lib/hibernate-commons-annotations.jar lib/hiber
nate.jar lib/jaxrpc.jar lib/jcl-over-slf4j.jar lib/jdom.jar lib/jta.j
ar lib/junit.jar lib/logback-classic.jar lib/logback-core.jar lib/oro
.jar lib/quartz.jar lib/saaj.jar lib/slf4j-api.jar lib/spring-aop.jar
lib/spring-beans.jar lib/spring-context.jar lib/spring-core.jar lib/
spring-dao.jar lib/spring-hibernate3.jar lib/spring-jdbc.jar lib/spri
ng-remoting.jar lib/spring-support.jar lib/spring-web.jar lib/tsom_co
re.jar lib/wsdl4j.jar lib/xercesImpl.jar lib/xml-apis.jar

The Main-Class is what gets launched by default. If you double-click the jar file (under Windows anyway) it will run this main class. The Class-Path contains all my jar file references. You can see they are all in the lib directory and that is a relative reference. Java will look for the lib dir in the current directory and find the files in there.

To get this I use the following ant script in my build.

<path id="class.path">
    <fileset dir="${build.release.dir}/lib">
        <include name="*.jar"/>
    </fileset>
</path>

<pathconvert property="class-path" pathsep=" " dirsep="/">
    <path refid="class.path"/>
    <map from="${build.release.dir}${file.separator}lib" to="lib"/>
</pathconvert>

<jar destfile="${build.release.dir}/${}" compress="true" basedir="${build.classes.dir}">
    <manifest>
        <attribute name="Build-Date" value="${TODAY}"/>
        <attribute name="Version" value="${build.version}"/>
        <attribute name="Company" value="${}"/>
        <attribute name="Class-Path" value="${class-path}"/>
        <attribute name="Main-Class" value=""/>
    </manifest>
</jar>
I already copied all the jar files into the lib dir (actually ${build.release.dir}/lib) and this is just building the classpath and the manifest.

Once the jar is built I zip everything up using this:

<zip destfile="${build.release.dir}/${}.zip">
    <zipfileset excludes="*.zip" dir="${build.release.dir}"/>
</zip>

Actually I have slipped some other stuff into my build.release.dir such as a pdf containing release notes and some config files. So the install is just an unzip, preserving the directories.

How about those database connections? I use system properties for that. I set up all the db stuff using the Spring Framework and my beans file contains this:

<bean id="propertyPlaceHolder" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE"/>
    <property name="location" value=""/>
</bean>

That tells Spring to look for the properties file specified but if there is a system property that will override anything in the file. Then I can add this:

<bean id="Product2DataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
    <property name="driverClassName"><value>oracle.jdbc.driver.OracleDriver</value></property>
    <property name="url"><value>jdbc:oracle:thin:@${host}:${port}:${database}</value></property>
    <property name="username"><value>${product.user}</value></property>
    <property name="password"><value>${product.password}</value></property>
</bean>

This is standard Spring, of course. I expect the target environment will have different values, but probably the port will be the same. They can specify system properties in the command line that launches the program, something like:

java -Dhost=xyz -jar migrator.jar

I could have added some code to self-unzip the file but then I might as well put in a proper installer. This will keep it simple enough for the support people to use (they're clever guys, but they have a lot to do).

Saturday, 22 August 2009

Unit testing code

I had to build a complex module this last week and hand it to another developer to integrate into the app. This reminded me of a usually unmentioned value of unit tests: documentation.

Unit tests are all about writing little test jackets of code that exercise the production code and make sure it works. At their best they are designed to test the output and scream if it is not right, though lots of tests just print out something for a human to examine and verify.

I usually write unit tests to verify my code anyway, so of course I delivered my test to the other developer so that she could see how to call my module. She could run the test, step through the code and explore how it works, and she could verify my assumptions about the inputs (the spec was vague) because she is getting the real inputs and I had to mock up mine for the test.

When I have to work with code I don't know I usually look for a unit test first to tell me how it works. Perhaps projects should consider putting less effort into documentation (which no one reads) and more into unit tests. Documentation drifts out of date. Unit tests either work or they don't and even the exercise of fixing them can be useful in finding out how things work. Better to keep them maintained than fix them on the fly though.
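As a tiny illustration of a test doubling as documentation (the class, method, and numbers are all invented, and it uses a plain check rather than JUnit so it is self-contained):

```java
// A unit test that shows a caller everything they need: how to construct
// the object, what the inputs mean, and what output to expect.
public class InterestCalculatorTest {
    public static void main(String[] args) {
        InterestCalculator calc = new InterestCalculator();
        // balance 1200.00 at 5% per annum -> 5.00 interest for one month
        double result = calc.monthlyInterest(1200.00, 0.05);
        if (Math.abs(result - 5.00) > 1e-9) {
            throw new AssertionError("expected 5.00 but got " + result);
        }
        System.out.println("ok");
    }
}

class InterestCalculator {
    // monthly interest = balance * annual rate / 12
    double monthlyInterest(double balance, double annualRate) {
        return balance * annualRate / 12.0;
    }
}
```

Anyone handed this file can see the calling convention and the expected behaviour without opening a Word document.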

Tuesday, 11 August 2009

A new monitor

The new monitor arrived today from Dell. I expected it on Thursday but it turned up two days early. I won't complain.

It is an SX2210, a 21.5" (we still use inches to measure these things, how long before everyone switches to metric?) and it plugs into my laptop. The resolution is slightly lower than the laptop but not that I notice. I like to have a second monitor when I am working and this will do the job nicely.

It has a webcam built in and some connections for a sound box that Dell sells. I don't do video calls and I already have good speakers when I want to watch a DVD. It also has a load of USB connections, so I could use it as a USB hub, but I don't see that happening either. What I wanted was a damn good monitor with a crisp, clear image, and I got that. It says it does HD and there's an HDMI socket in the back. So I guess when I start using HD stuff this will work.

Its max resolution is 1920 x 1080. My laptop is 1920 x 1200 and my other monitor (24") is the same (there is a good reason for having two of them but let's not get boring). What happens to Linux when it changes monitors? Good things. It auto senses and snaps up the right resolution no sweat. I haven't seen Windows XP do that, maybe Vista or 7 does, anyway Linux definitely handles it nicely. The third monitor (3? Yes, long story, but they are all in regular use) is not a wide screen job and nothing flash. I don't recall the numbers for it but they are smaller. Anyway, under XP I always had to manually modify the settings, under Linux I do not.

Friday, 7 August 2009


I ran into an interesting problem last week concerning ThreadLocal. This is a class in Java. The details are here. You save something in a ThreadLocal, usually held somewhere static, and any calls to fetch it out from the same thread will get the same value out again. You get the benefits of a static while staying thread safe.

This is really useful, but there are dangers, specifically problems with side effects, one of which I am about to describe.

Okay, we're about to commit using Hibernate. One of the fields in the Hibernate object maps to an object. Hibernate calls the object's method during the commit (I forget which method, probably serialize) and that method tries to get a configuration object from ThreadLocal. If it fails to get the configuration it instantiates one and carries on, loading the resulting configuration object into the ThreadLocal.

At this point bad things happen. That configuration object needs certain values and the default configuration does not do the job. We must have the configuration object loaded by something else and never by default.

This is an example of a side effect. Deliberate side effects can be good things. This one was not. It took a lot of time to figure out what was going on. We had no knowledge of the ThreadLocal. This was not our code, I won't embarrass the open source project it belongs to, but it would have been a lot better if an exception had been thrown if the configuration object was missing. We would have known what the problem was over a week ago.

So if you use ThreadLocals, or even statics, make sure you are really, really careful how they are initialised. You probably want to have them initialised in just one place using one mechanism.
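What "one place using one mechanism" might look like is a holder that refuses to improvise a default. This is a sketch with invented names, using a String as a stand-in for the real configuration object:

```java
// A ThreadLocal holder that fails fast instead of quietly creating a
// default configuration as a side effect of some unrelated call.
public class ConfigHolder {
    private static final ThreadLocal<String> CONFIG = new ThreadLocal<String>();

    // The single, explicit initialisation point.
    public static void set(String configuration) {
        CONFIG.set(configuration);
    }

    public static String get() {
        String cfg = CONFIG.get();
        if (cfg == null) {
            // Throwing here points straight at the real problem
            // instead of burying it under a bad default.
            throw new IllegalStateException("configuration not initialised for thread "
                    + Thread.currentThread().getName());
        }
        return cfg;
    }

    public static void clear() {
        CONFIG.remove();
    }
}
```

Had the library in question done this, the missing configuration would have announced itself on the first call rather than surfacing as mysterious commit failures.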

Thursday, 6 August 2009

Thunderbird and Ubuntu

I run Thunderbird as my email client and some days it refuses to start.
I've found that reinstalling it and then rebooting helps, but not always.
Today I think I found the real reason is something to do with the dual monitor arrangement I have.
Usually there is a second monitor attached to my laptop. When the second monitor is missing but I have not changed the setting it screws up Thunderbird.

So a simple change of screen resolution (from 3840 x 1200 to 1920 x 1200 in my case) sorts it.
Now I can stop reinstalling and rebooting.

Glassfish plugin for Eclipse

I use Eclipse for Java development and usually the target application server is SJSAS, Sun's J2EE appserver. Sun seem to have trouble sorting out their naming of things sometimes. Recall Java version 1.5 or 5 or ??? It doesn't matter, but I think the story is that since they open sourced it they're calling it Glassfish. So, while my customer is calling it SJSAS I'm seeing Glassfish. This is fine.

At the moment the version we want is 2.1. There are preliminary versions of V3 available but not something I have time for just now.

What I do need is a better way to start and stop it and deploy the app for testing. They do this with a plugin in Eclipse which supports several app servers like JBoss etc. I tried various instructions I found on the web, people have gone to a lot of trouble on these. But the one that actually worked was here. The others involved downloading during the server setup and they gave me an ArrayIndexOutOfBoundsException.

They basically use the Eclipse update mechanism to pull the plugin from
Then you can configure the server by using the server window, right click and pick 'New' and follow your nose.

However the update took a long time to install after it had pulled it down. I almost gave up waiting. It was sitting for ages taking about 50% of the CPU and looking a lot like an infinite loop from the outside. But it did finish eventually. I haven't verified all the functions but it definitely starts the server and stops it. I need to try out the deployment options.

Under JBoss my build just copies the ear to the deploy directory. That reloads automatically but if it doesn't (I don't always trust it) I can restart it from the plugin. I'm expecting to replicate that functionality for Glassfish but it won't be quite the same. I can copy to an autodeploy directory which ought to work but previous experiments seemed to fail on this.

I will persevere.

Friday, 17 July 2009

Desktop shortcuts

Under Linux these are called launchers. It took a little while for me to spot this and I was trying to make these the hard way. But Ubuntu's right click menu on the desktop has a 'create launcher' option and that pretty much does it. You get to pick an application type, name it and put in a command. Simple. Well, not quite.

It is simple if you are just trying to run something like an executable which will pop up its own window and let you drive it. I wanted to run a build script and have it wait when it finished so I could see what happened.

The last bit is really important because builds often fail and it is, obviously, a waste of time to go test against a build that did not make it.

I googled and found various advice but none of it quite worked for me, though it did for other people. Here is what I ended up doing:

The command is
gnome-terminal --window-with-profile="keep" -e "/home/mrkimi/project/buildscripts/"

People suggested xterm instead of gnome-terminal but that did not work for me. Another thing that did not work was using $HOME instead of the full path. But the command is only half the story.

If you open up a terminal (and this is, of course, a gnome-terminal) there is a menu along the top of the window (no, I never noticed it before either). Use the Edit menu to open up the profiles dialog and create a new profile. I called mine 'keep'. Now edit the profile and look under the 'Title and Command' tab. There's a setting for 'When command exits' and I picked 'Hold the terminal open'.

This means that the terminal window doesn't just close, it waits there until I close it manually.

Some people suggested using the 'read' command to make the terminal wait, and that would save messing about with these profiles and settings. I gather read works like pause under Windows. Except that it didn't for me, so I'm going with this approach.

Now I have my build script on a desktop icon just as I wanted and I can see how it finished.

You might wonder why I don't just launch the build from Eclipse. It is just an ant script and Eclipse is good at that. Well, normally I do but this particular project was put together by people who didn't know that and some of the decisions they made were a little strange. Some things you have to just live with, but a decent shortcut (sorry, launcher) to the build script makes it easier.

Friday, 10 July 2009

Linux is fast

I have now officially switched my laptop to Ubuntu 8.04 (Hardy Heron) from Windows XP.
The laptop is a Dell Inspiron 9400. I got it about two and a half years ago and I might have upgraded it sooner except that the new ones all come with Vista and I heard so much bad stuff about Vista that I held back. I figured the way to get past Vista was to switch to Linux, and rather than deal with a new machine and a new o/s at the same time I could do it on the current machine at my own pace.

I got a new disk: a 2.5" SATA drive that fits into the hard disk slot in the laptop, so I can keep the Windows disk for when I need it. I have been slowly walking stuff over to the new disk. I need this machine as my development environment so everything had to be verified before I could switch. I do this stuff carefully.

Anyway, I got to the point where I could run a build on my current project under Linux. This is Eclipse plus Ant building an ear file for deployment on JBoss and Glassfish. Everything works. But, even more interesting: it works much faster.

Doing a full build under Windows takes 11 minutes and 40 seconds. Doing the same thing under Linux takes 44 seconds. Yes, 11 minutes shorter. Okay, I had a think about that and I find if I turn off Norton's virus monitoring for the relevant directories it drops to around 5 minutes. But that is still way too slow. My colleagues are using Windows XP on quad core desktops, newer and faster than my laptop. Their build times are two minutes. I'm still ahead.

Some of the difference can be accounted for by the new and much cleaner disk; it is possibly a faster disk too, not sure. Otherwise the hardware is identical. Even so, that seems a lot of time to account for with just the disk.

I have heard about Windows 7, and I've heard it is better than Vista. But right now I'm pretty happy with what I have. I've got a faster machine and I did not have to buy a new one.

Thursday, 2 July 2009


I got mine working. It was not easy or obvious and it took a fair bit of searching to find the right combination. Under Windows this is easy because there are no choices (well, no obvious ones to me). Under Linux you have to know more about what is going on and there is a lot to know. My approach to these things is to find out just enough to get it working. I have a lot of hard problems to solve, I don't want any more, I just want this to work.

Under Windows I thought I just installed Subversion, actually I think someone set it up and put it all in a zip file which is what I loaded. That all just worked. So I took a closer look at my Windows system.

There is a plugin called Subversive from Polarion. It depends on a connector called JavaHL which is a Win32 binary file, so no use under Linux. Okay, so I need to do something different. Googling revealed another product called Subclipse which is (I think) from Tigris, who also do the Subversion server. Got that? Three products that start with 'Sub', easy to get mixed up here.

Both Subclipse and Subversive can use the JavaHL binary. Some people have pulled the JavaHL source and compiled it. But that seems to be no longer necessary because you can install it using the Package Manager now. I tried this, including editing the eclipse.ini file to add the special option but it did not work for me.

The other approach that both of these clients use is the SVNKit which is Java all the way down. I installed that and it works fine.

This is the list of relevant stuff in my Eclipse 'Software Updates and Add-ons' dialog, under the Installed Software tab.

This is what my subversion page looks like in Window -> Preferences: Team/SVN: SVN Connector.

I have yet to find any missing features with this approach. Maybe there is a speed issue but I'm on a slow link at the moment and it is... slow, but no slower than I would expect. It handles file compare just fine; some things I read suggested some combinations of things wouldn't do that.

The crucial URLs you need (and these things seem to shift over time so this is only correct today):

Just to be precise about versions:
Ubuntu 8.04 Hardy Heron
Eclipse 3.4

Friday, 12 June 2009


When I first heard about twitter I mostly ignored it, like I ignored Instant Messaging. I have better things to do with my time than watch what other people are doing. IM always seemed to be just another source of interruptions. I shut down my email much of the time and check it when I'm not doing something more important. If people need to contact me urgently they can phone (unless I have turned my phone off... which I do).

Actually I even filter emails into ones that are addressed to me and ones that are only CC'd to me. The CC ones go into a different folder. I only check those when I am bored. Given that some of the people I work with only seem to know about the 'reply all' button, this saves me a lot of time. It is also quite nice to just clear that folder, knowing there is never anything important in it.
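The rule itself is trivial; a minimal sketch in Python, assuming a hypothetical address of mine and ignoring everything a real mail filter has to cope with:

```python
import email

ME = "me@example.com"  # assumption: my address, purely illustrative

def folder_for(raw_message: str) -> str:
    """Route mail the way my filter does: anything addressed to me
    goes to the inbox, mail where I am only CC'd goes to a
    low-priority folder I clear out when I am bored."""
    msg = email.message_from_string(raw_message)
    if ME in msg.get("To", ""):
        return "Inbox"
    if ME in msg.get("Cc", ""):
        return "CC"      # never anything important in here
    return "Inbox"       # everything else stays visible

direct = "To: me@example.com\nFrom: boss@example.com\n\nhi"
copied = "To: team@example.com\nCc: me@example.com\nFrom: boss@example.com\n\nFYI"
print(folder_for(direct))  # Inbox
print(folder_for(copied))  # CC
```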

But back to Twitter. I work in a team spread over different locations. When time gets tight there are a lot of 'where are you up to?' emails from project managers. They like to have conference calls too, not quick ones either. It gobbles up time like crazy. So I started using Twitter. If they want to find out what I'm up to they can check that. Saves a lot of back and forth.

Even more useful, I RSS my own Twitter into my email (this is simple enough on Thunderbird, probably on other email clients too). It seems narcissistic, but this gives me a simple way to track what I am doing. It comes in handy for filling out timesheets. The Twitter page itself is not very clear as to what time an entry was made, but the email always has a timestamp. It doesn't have to be dead accurate.
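The timesheet trick boils down to reading timestamps out of the feed. A rough sketch in Python, with a made-up feed fragment standing in for the real Twitter RSS (which carries more fields than this):

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

# Hypothetical fragment of a Twitter RSS feed; entries and times invented.
FEED = """<rss><channel>
  <item><title>writing the status report</title>
        <pubDate>Fri, 12 Jun 2009 09:30:00 +1200</pubDate></item>
  <item><title>fixing the build</title>
        <pubDate>Fri, 12 Jun 2009 11:05:00 +1200</pubDate></item>
</channel></rss>"""

def timesheet_lines(feed_xml):
    """Pull (time, text) pairs out of an RSS feed -- roughly what I
    read off the emailed entries when filling out timesheets."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        when = parsedate_to_datetime(item.findtext("pubDate"))
        yield when.strftime("%H:%M"), item.findtext("title")

for when, text in timesheet_lines(FEED):
    print(when, text)  # e.g. "09:30 writing the status report"
```

It doesn't have to be dead accurate, which is why a few character's worth of title plus a timestamp is plenty.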

I could write a small application to do this, but it wouldn't tell my project manager what I am doing. Twitter manages both. I like the succinct format, just a few characters to say what I need and move on. Works for me.

Friday, 5 June 2009

Oracle XE

There are lots of things I don't like about Oracle, but some of them are historical. Back in the '90s I wrestled with upgrading from something like Oracle 6 to 7 on an Ultrix box for a week. It was just an upgrade on a dev machine and it was far more complicated than it should have been. They're better than that now. Even so I grit my teeth and worry whenever I have to do anything that looks like installing Oracle.

For developers it always seemed overkill that installing a database left you with an application server and a bunch of other stuff as well. I just wanted a database. I have a different application server (and, more to the point, my customers have a different application server). So I have to have two of them running on my dev machine.

Enter Oracle XE. They have done something good here. It is easy to install, free to developers and is just a database with the overhead of a database not a big fat application server. It runs fine on Windows and it runs fine on Linux (Ubuntu anyway, where I've tried it). In both cases the install is clean and simple. Go Oracle.

It doesn't allow you to have multiple databases, but it does allow you to have multiple schemas. I tend to work on multiple projects at once, which really need multiple databases (the projects typically have the same schema names but different data and slightly different definitions). There was a time when this would have been a pain, but the support folks in my day job have got their act together and built decent DB scripts for each project. So when I switch I can run the script and I'm done. There's an overhead there, but it is less of an overhead than running full Oracle.

With those issues sorted, Oracle XE is a good development DB. Though these days I like to hide everything behind Hibernate to keep things portable.
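For what it's worth, pointing Hibernate at XE is just a properties change. Something like the fragment below; the connection details are the XE defaults (port 1521, SID XE) and the dialect is what I'd expect for a Hibernate 3.x setup, so adjust to taste:

```
hibernate.connection.driver_class=oracle.jdbc.OracleDriver
hibernate.connection.url=jdbc:oracle:thin:@localhost:1521:XE
hibernate.dialect=org.hibernate.dialect.Oracle10gDialect
```

Swap those three lines for the target database's driver, URL and dialect and the rest of the code shouldn't care.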

Tuesday, 2 June 2009

Ubuntu Printing

I'm migrating my laptop (Dell Inspiron 9400) to Ubuntu 'Hardy Heron' 8.04.

Tonight I tried hooking up to the printer, which is an HP 3420 connected through a TrendNet TE100-P1U print server, and that is plugged into a wireless hub. My laptop talks to the hub just fine already. The internet connection is on the hub as well.

Setting up the printer is fairly simple. On Windows I had the following specified:
port name: IP_192.168.123.10
printer name or IP address:
queue name: U1

For Ubuntu I found I just had to pick:
New Printer
LPD/LPR Host or Printer
Give it a host name ( in my case). This is a fixed address; I don't think you'd want a DHCP address for your print server.
Ignore the Queue requested
Specify the printer type (HP then 3420 in my case)
...and then the fun starts. Well, no. It prints blank pages.
It sounds like it is printing, looks like it is printing, but what I get is a blank sheet of paper.
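As an aside, the Windows settings above collapse into a single CUPS device URI. With my print server's address and queue name it would be something like:

```
lpd://192.168.123.10/U1
```

That's the same thing the LPD/LPR wizard choices end up building for you.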

I found a few references on this. The most useful was here, but it is kind of terse. The link given there (http://localhost:631) gets you browsing the CUPS UI. It isn't all that clear what you have to do. However, I had not seen the CUPS UI before; it is probably a nicer way to control the printers than the UI on the desktop.

What I did was pick 'Modify Printer' and work through the wizard sequence to the page shown in the image. By default the other driver option for the HP 3420 was picked, and the solution was to pick the one I show in the pic. Then it all came right; the test page printed and so on.

The underlying issue seems to be that the default driver uses only the coloured inks and never the black ink. Like lots of people, our coloured cartridges are empty and we only use the black one, so nothing printed.

From reading several other entries this is not specific to the 3420, or even HP printers. So your fix may vary a little from mine.