My first Cassandra contribution

Somewhat surprisingly, and rather by accident, today I became a Cassandra contributor. I had a problem with my work project: our bulk-loading script would not work together with Cassandra authentication (which I described in an earlier article on this blog), so we decided to try solving the issue ourselves.

The solution was quite simple, but it gave me a bit more Cassandra knowledge and understanding. If you are interested in contributing to Cassandra, take a look at this problem and its solution, or even try to reproduce the bug (on Cassandra 1.1.0-rc1 or earlier) and solve it on your own. As I said, it was simple, so you won't get frustrated with the problems you face, but I think it's a good start for something more. Here is the link to the issue in Cassandra's bug tracker:

https://issues.apache.org/jira/browse/CASSANDRA-4155

Working on interesting things, being nicely paid for it, and contributing to remarkable Open Source projects at the same time – could it be any better? ;)

Installing Hadoop on Ubuntu 11.10 Oneiric Ocelot

Just a quick tip if you are trying to set up Hadoop on your Ubuntu 11.10 and wondering whether the Maverick packages will work – yes, they will. Just follow the installation guide, inserting this:

deb http://archive.cloudera.com/debian maverick-cdh3 contrib
deb-src http://archive.cloudera.com/debian maverick-cdh3 contrib

in /etc/apt/sources.list.d/cloudera.list – that’s it!
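The whole setup boils down to a few commands. This is a sketch of the steps as I remember them (the archive key URL and the hadoop-0.20 package name follow the CDH3 installation guide of that era; double-check them against Cloudera's docs before running):

```
# Add the CDH3 Maverick repository (works fine on Oneiric too)
echo "deb http://archive.cloudera.com/debian maverick-cdh3 contrib" | \
    sudo tee /etc/apt/sources.list.d/cloudera.list
echo "deb-src http://archive.cloudera.com/debian maverick-cdh3 contrib" | \
    sudo tee -a /etc/apt/sources.list.d/cloudera.list

# Import Cloudera's archive signing key so apt trusts the repo
curl -s http://archive.cloudera.com/debian/archive.key | sudo apt-key add -

# Refresh the package index and install Hadoop
sudo apt-get update
sudo apt-get install hadoop-0.20
```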

OK, not exactly – some people (I was one of them) hit a NullPointerException, something similar to this:

Error: java.lang.NullPointerException
        at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:768)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2683)
        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2605)

If you see this, check your Hadoop host settings (preferably use an IP address instead of a hostname) and/or /etc/hosts, which may contain a strange entry with something like .(null) – leave just one proper hostname on that line. For me, it started to work after this fix.
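For reference, the broken /etc/hosts entry looked roughly like this (the hostname and IP below are placeholders, not from my actual machine):

```
# Broken: a stray "(null)" token glued to the hostname confuses Hadoop's
# reverse-lookup code and triggers the NullPointerException above
127.0.1.1   myhost.(null) myhost

# Fixed: one proper hostname per line
127.0.1.1   myhost
```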

I won’t risk saying that it’ll be OK for a production environment (a production server with Ubuntu? OK…), but for testing – it works perfectly.