Tony Blair – Peace Envoy

I’ve just read this article about Tony Blair, our beloved former Prime Minister, who is stepping down from his role as Middle East Peace Envoy.

http://www.bbc.co.uk/news/uk-32905468

Apparently he feels that he can best serve by not having any formal role. Now, I speak as someone who doesn’t really know what he’s done behind the scenes since 2007, but I do think there is something he could do that might make a big difference. That is to apologise for the mess the Middle East has been in ever since he and George W Bush invaded Iraq. There would be no ISIS, no crisis in Libya and Egypt, if they hadn’t so dramatically upset an already very delicate and sensitive region. To this day I can find no credible reason why they did this AND why they thought it was a very good idea.

As a peace envoy to most Arab states he must be as welcome as an undertaker at a wedding measuring the guests up for their coffins. He should never have accepted the role in the first place, and should instead have slunk away to hide under the nearest stone.

Watership Down – Richard Adams

Some time ago I read ‘Duncton Wood’ by William Horwood. As I’ve said before, I remember reading and enjoying it when I was younger, but now that I’m older I was less keen. One of the reasons for this was its length. When I was younger that didn’t bother me; in fact, the longer the better, since that meant I could stay longer ‘inside’ the book. Now, and perhaps it is because I was re-reading the book, I was less thrilled and felt that it had been dragged out longer than it needed to be. It was therefore with a slight sense of trepidation that I picked up my old battered copy of ‘Watership Down’. It has probably been about twenty years since I last read this book and I was afraid that re-reading it might alter the fond memories I had of it.

I needn’t have worried. Perhaps the mark of a great book is the joy to be found in re-reading it and discovering new things and new pleasures. Perhaps it is a credit to the author, who made the story just long enough, avoided too much sentimentality and wrote a great story. The fact that the characters are rabbits does not disguise the fact that the challenges they face and overcome are human. We all face threats and problems and have to find solutions if we are to move on.

One thing that I noticed more with this re-reading is Richard Adams’s love of the English countryside, beyond the obvious use of plants for the characters’ names. The seasons, the sounds and the smells help bring the book to life. Another thing I discovered is that Richard Adams is now 94 and hopefully still hale and hearty. Long may that continue.

When writing my book reviews I don’t say much about the plot. I don’t particularly want to spoil it for anyone picking it up for the first time. I first read my Nan’s copy sometime in the late 70s or the very early 80s. I remember being very proud that she let me borrow the book and then the enjoyment of reading it. My own copy has a still from the animated film, which I remember watching at the local cinema and being a touch disappointed with. I couldn’t quite understand the need to change the plot, it simply doesn’t need it, although the animation was rather good. Another thing my book has is an ‘official’ Watership Down bookmark that I made. It was rather sweet to find it and realise how much I enjoyed this book.

Curiously though, I’ve never read any of Richard Adams’s other books. Perhaps I ought to.

Is TDD Dead?

I have been watching the Google Hangouts debate between Martin Fowler, Kent Beck and David Heinemeier Hansson concerning Test Driven Development and why we do it. The debate is titled ‘Is TDD Dead?’, a rather emotive title, guaranteed to grab attention! It worked with me. I’ve been using TDD for a number of years now and I must admit I’d rather use it than not. For me, untested code is potentially full of bugs and very prone to becoming bloated and ‘smelly’. A good test harness helps prevent this, and it should also aid in understanding and documenting the code.
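To illustrate the kind of test harness I mean, here is a minimal, hypothetical example of a TDD-style unit test in C# (the class, method and values are invented for illustration, and it assumes the NUnit test framework):

```csharp
using NUnit.Framework;

// A hypothetical class under test: written just after (or, in strict TDD,
// just enough to make) the failing test below pass.
public class PriceCalculator {
    public decimal ApplyDiscount(decimal price, decimal percent) {
        return price - (price * percent / 100m);
    }
}

[TestFixture]
public class PriceCalculatorTests {
    [Test]
    public void ApplyDiscount_TakesPercentageOffThePrice() {
        var calculator = new PriceCalculator();

        // 10% off 100 should be 90
        Assert.AreEqual(90m, calculator.ApplyDiscount(100m, 10m));
    }
}
```

Small tests like this both catch regressions and document, in executable form, what the code is supposed to do.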

In the debate David HH takes the anti-TDD stance, not because he’s against testing code but because he’s arguing against the ‘cargo cult’ that seems to have grown up around TDD. There is a danger in taking TDD too far. One of the comments many people make when first using TDD is that the code ends up in lots of little classes, so instead of searching through one large file for the relevant piece of code you end up searching through lots of files.

In the debates the person I’ve enjoyed listening to and the one I’ve admired the most is Kent Beck. He argues from a pragmatic, common-sense point of view. One thing I particularly liked was his emphasis on design. Kent Beck uses TDD to give him confidence in the code but he doesn’t use it to force a design on the code. I think that might be the crux of David HH’s argument.

Anyway tonight I’m going to attend a talk about ‘Is TDD dead?’ at XP Manchester. It will be interesting to know what others think.

Use the new keyword if hiding was intended

A couple of months ago at work we put some new web services into production. All seemed to go well and everything seemed nice and fast, but we noticed that some of these services were using a lot of CPU and memory. We had used JMeter to load test these services quite extensively, so I was fairly convinced that there wasn’t really a memory leak, but of course one can never be completely certain.

Now, in these web services we know that one of the methods was used pretty extensively so to begin with I focussed my efforts on it to see if there was anything that might explain this curious behaviour.

It turns out that there was a memory leak, and it wasn’t in the depths of the code but right at the top. I’ve got an ASMX web service whose code-behind starts like this:

public class CustomWeb : System.Web.Services.WebService {
    private CustomService customService;
    private SitesListService sitesListService;

    public CustomWeb() {
        customService = new CustomService();
        sitesListService = new SitesListService();
    }

    public void Dispose() {
        customService = null;
        sitesListService = null;
        base.Dispose();
    }
    // ... rest of the class removed
}

I’ve removed most of the code, but you can see a fairly plain constructor and a Dispose method. Pretty basic stuff. This compiles perfectly well, but a warning was displayed regarding the Dispose method. It says:

CustomWeb.Dispose() hides inherited member ‘System.ComponentModel.MarshalByValueComponent.Dispose()’. Use the new keyword if hiding was intended.

Initially I read this quite literally. I didn’t want my Dispose method to be hidden; in fact I wanted it to be called, to set the objects created in the constructor to null and free up memory. As this web service inherits from WebService, I thought that perhaps its Dispose method needed to be overridden, but doing that caused a compiler error (the base class’s Dispose method isn’t marked virtual, so it can’t be overridden).

I’ve never used the new keyword in a method signature before, and using it to hide a method didn’t really make sense. However, that apparently is what I should be doing, and this was the cause of my memory leak. My Dispose method should be written like this:

public new void Dispose() {
    customService = null;
    sitesListService = null;
    base.Dispose();
}

It would seem that the previous version, without the new keyword, prevented the base.Dispose method from being called, so every request to the web service resulted in CPU and memory usage much higher than expected. At a guess, there might have been less damage caused by not having a custom Dispose method at all: that way the base method would have been called and the resources freed, and the objects created in the constructor would eventually have been picked up by the garbage collector.
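The difference between hiding and overriding is worth seeing in isolation. Here is a minimal, self-contained C# sketch (the class names are invented for illustration, not taken from the web service above) showing what the new keyword means. Note that new only makes the hiding explicit and silences the compiler warning; a hidden method is never called polymorphically, which is why chaining to base.Dispose() from the derived method matters:

```csharp
using System;

class BaseService {
    public void Dispose() {
        Console.WriteLine("BaseService.Dispose");
    }
}

class DerivedService : BaseService {
    // 'new' declares that hiding BaseService.Dispose is intentional.
    // Without it the runtime behaviour is identical, but the compiler
    // emits the CS0108 warning quoted above.
    public new void Dispose() {
        Console.WriteLine("DerivedService.Dispose");
        base.Dispose();   // without this call the base clean-up never runs
    }
}

class Program {
    static void Main() {
        var derived = new DerivedService();
        derived.Dispose();
        // prints: DerivedService.Dispose
        //         BaseService.Dispose

        BaseService asBase = derived;
        asBase.Dispose();
        // prints: BaseService.Dispose
        // Hiding is not overriding: through a base-typed reference,
        // only the base method runs.
    }
}
```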

However, I’m glad we made this mistake, since it illustrates a couple of things. Firstly, don’t ignore the warnings. Yes, your code might have built, but it’s worth checking the warnings; the clue’s in the name, after all. Secondly, now I know a little bit more about web services. It’s odd to think that these three letters, ‘new’, have resulted in memory for my web services running at a consistent level, whereas previously it used to climb to about 1GB before dropping, and in CPU usage now being 3 – 9% (depending on load) as opposed to 35% – 60%. That’s quite something.