An alternative CV strategy

This is my second attempt at writing this. Memo to self: after hitting the Post button, make sure the post has actually been published before navigating away from the page…


I’ve been reading a fair number of CVs recently, and I’ve been struck by just how much experience everyone seems to have. At least, everyone claims to have a breadth of experience that I just can’t match. I haven’t counted, but I suspect most of the CVs I’ve been looking at have listed over 100 technologies. In light of this, I’ve been considering how I’ll market myself when I’m next interested in getting a job.


There are a few things in my favour which most candidates don’t have, mostly in terms of community – MVP awards, book reviewing, web articles, this blog, newsgroup posts, open source contributions etc – but I don’t know how much attention prospective employers really pay to that kind of thing. What I find frustrating is the way that traditional CVs don’t really convey any of what I find important – either as a potential employee or as someone involved (to whatever extent) in the hiring process. I have begun to wonder whether a list of values would do me any favours:


  • I prefer working code over perfect UML
  • I prefer whiteboards over Visio
  • I prefer code which can easily be read over code which runs 5% faster but no-one else understands
  • I prefer code reviews which force me to change my design over reviews which stroke my ego
  • I prefer being laughed at due to my trousers over being disrespected for being sloppy
  • I prefer going home at 5 to sleep on a problem over staying at the office until midnight and then being useless the next day
  • I prefer carrots over sticks
  • I prefer progress over process
  • I prefer keen developers with much to learn over experienced developers who feel they have nothing to learn
  • I prefer close collaboration over the heroic coder mentality
  • I prefer solving problems people are having in the real world over providing marketing with a new toy to show off

Maybe that doesn’t go far enough towards selling me though. How about some more direct statements?


  • I write clean code in a timely manner
  • I test my work and refactor mercilessly
  • I don’t assume my code is perfect
  • I love to learn new techniques and technologies
  • I love to teach, and can explain things clearly
  • I pick up new things quickly
  • I have an affinity for code which lets me solve issues quickly
  • I bring passion to whatever I do

If someone presented me with a CV based on the above lists, I’d be interested. Yes, I’d probably check that the candidate had worked in some sort of similar area before, but frankly if you take a bright person and ask them to learn Java or C#, it’s not going to take them that long to do it. Learning design principles takes longer (I’ll let you know if I ever think I’ve finished!) but with good mentoring, it’s not a problem.


CVs can’t be trusted. People can write pretty much anything on them. However, they’re making a choice about what image to present to the world – and that choice itself makes a statement. I want to work with smart people who love what they do. I want to see a spark in their eyes when they tell me what they’ve been up to. At an interview, I want them to be so busy getting me enthusiastic about what they’ve been looking at that I don’t have time for the standard questions.


You may well consider the lists above to be unprofessional to an extent. I agree – but I’m not sure whether it’s a problem. I enjoy my work immensely – so much so that I hardly think of it as work for a lot of the time. That’s not to say it’s not important to do a professional job – but there’s often not much of a gap between what I’m interested in for fun and what I earn money doing.


I suspect if I gave an unconventional CV to an agency they’d either demand a rewrite or they’d change it themselves. Maybe they’d be right to do so – maybe managers aren’t really keen on this sort of thing. What do you think? Comments are always welcome on my blog, but I’m particularly keen on feedback this time, as it could have a real bearing on what I do when I’m next in the job market.


Elegant comparisons with the null coalescing operator


A while ago I commented on how I’d like a return? statement, which
only returned if the return value was non-null. The purpose of this was to remove
the irritation of implementing Equals and IComparable.CompareTo
on classes with several properties. For an example of the kind of thing I mean,
consider an Address class with properties Country,
State, City, ZipCode and HouseNumber.
(Apologies to readers who aren’t American – while I feel a traitor to my country for
using state instead of county and zip code instead of post code, I’m guessing there are
more readers from the US than from elsewhere.)



This Address class needs (for whatever reason) to be comparable to itself,
comparing the properties in the order described above, in normal string comparison order.
Let’s see how annoying that is without doing anything clever. (I haven’t included
any property implementations or constructors, but I’m sure you can all guess what they’d
look like. Similarly, I haven’t overridden object.Equals or object.GetHashCode,
but the implementations are trivial.)


using System;

public sealed class Address : IComparable<Address>
{
    string country;
    string state;
    string city;
    string zipCode;
    int houseNumber;
    
    public int CompareTo(Address other)
    {
        if (other==null)
        {
            return 1;
        }
        int ret = country.CompareTo(other.country);
        if (ret != 0)
        {
            return ret;
        }
        ret = state.CompareTo(other.state);
        if (ret != 0)
        {
            return ret;
        }
        ret = city.CompareTo(other.city);
        if (ret != 0)
        {
            return ret;
        }
        ret = zipCode.CompareTo(other.zipCode);
        if (ret != 0)
        {
            return ret;
        }
        return houseNumber.CompareTo(other.houseNumber);
    }
}


That’s ignoring the possibility of any of the properties being null. If
we want to include that possibility, it’s worth having a static helper method which
copes with nulls, along the lines of object.Equals(object, object).
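Such a helper might look something like the following sketch (the name NullSafeCompare is my own invention here, not anything in the framework), treating null as less than any non-null value:

```csharp
using System;

public static class Helper
{
    // Null-safe comparison, along the lines of object.Equals(object, object).
    // Treats null as less than any non-null value, and two nulls as equal.
    public static int NullSafeCompare<T>(T first, T second)
        where T : class, IComparable<T>
    {
        if (first == second) // same reference, or both null
        {
            return 0;
        }
        if (first == null)
        {
            return -1;
        }
        if (second == null)
        {
            return 1;
        }
        return first.CompareTo(second);
    }
}
```

Each property comparison in CompareTo could then go through this method instead of calling CompareTo directly on a possibly-null field.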



Now, if we don’t care about doing more comparisons than we really want to and
potentially creating an array each time, it wouldn’t be hard to implement a series
of overloaded methods along the lines of:


public static int ReturnFirstNonZeroElement(int first,
                                            int second,
                                            int third)
{
    return first != 0 ? first :
           second != 0 ? second :
           third;
}


(The array part would be when you implement ReturnFirstNonZeroElement(params int[] elements)
after you’d got enough overloads to get bored.)
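That params overload might look something like this (a sketch, with the class name invented for the example):

```csharp
public static class ComparisonHelpers
{
    // Returns the first non-zero value in the array, or 0 if every
    // element is zero (or the array is empty). Note that the params
    // form allocates an array on each call, unlike the fixed-arity
    // overloads.
    public static int ReturnFirstNonZeroElement(params int[] elements)
    {
        foreach (int element in elements)
        {
            if (element != 0)
            {
                return element;
            }
        }
        return 0;
    }
}
```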



That still ends up being a lot of code though, and it’s doing unnecessary comparisons.
I’m not keen on micro-optimisation, of course, but it’s the inelegance of it that
bothers me. It feels like there must be a way of doing it nicely – and with
C# 2.0 and the null coalescing operator, there is. (At this point I’m reminded that
the irritation actually came when writing Java, which of course doesn’t have anything
similar. Grr.) For those who are unaware of the null coalescing operator (and it’s one of the
least well publicised new features in C# 2.0) see
my brief coverage of it.
Now consider the following helper method:


public static int? CompareFirstPass<T>(IComparable<T> first, T second) 
    where T : IComparable<T>
{
    if (first == null)
    {
        // If both are null, the properties are equal - carry on comparing
        return second == null ? (int?)null : -1;
    }
    // Assume CompareTo deals with second being null correctly
    int comparison = first.CompareTo(second);
    return comparison==0 ? (int?)null : comparison;
}


In short, this returns the result of the comparison if it’s non-zero, or null otherwise.
Now, with the null coalescing operator, this allows the Address class implementation
of CompareTo to be rewritten as:


public int CompareTo(Address other)
{        
    return other==null ? 1 :
           Helper.CompareFirstPass(country, other.country) ??
           Helper.CompareFirstPass(state, other.state) ??
           Helper.CompareFirstPass(city, other.city) ??
           Helper.CompareFirstPass(zipCode, other.zipCode) ??
           houseNumber.CompareTo(other.houseNumber);
}

It’s short, simple and efficient. Now, doesn’t that make you feel better? :)

Broken windows and unit testing

There’s quite possibly only one person in the world reading this blog who doesn’t think it’s got anything to do with Vista. The windows in the title have nothing to do with Microsoft, and I’m making no assertions whatsoever about how much unit testing gets done there.


The one person who understands the title without reading the article is Stuart, who lent me The Tipping Point before callously leaving for ThoughtWorks, a move which has significantly reduced my fun at work, with the slight compensation that my fashionable stripy linen trousers don’t get mocked quite as much. The Tipping Point is a marvellous book, particularly relevant for anyone interested in cultural change and how to bring it about. I’m not going to go into too much detail about the main premises of the book, but there are two examples which are fascinating in and of themselves and show a possible path for anyone battling with introducing agile development practices (and unit testing in particular) into an existing environment and codebase.


The first example is of a very straightforward study: look at unused buildings, and how the number of broken windows varies over time, depending on what is done with them. It turns out that a building with no broken windows stays “pristine” for a long time, but that when just a few windows have been broken, many more are likely to be broken in a short space of time, as if the actions of the initial vandals give permission to other people to break more windows.


The second example is of subway trains in New York, and how an appalling level of graffiti on them in the 80s was vastly reduced in the 90s. Rather than trying to tackle the whole problem in one go by throwing vast resources at the system, or by making all the trains moderately clean, just a few trains were selected to start with. Once they had been cleaned up, they were never allowed to run if they had graffiti on them. Furthermore, the train operators noticed a pattern in terms of how long it would take the “artists” in question to apply the graffiti, and they waited until three nights’ work had been put in before cleaning the paint off. Having transformed one set of trains, those trains were easier to keep clean due to the “broken windows” effect above and the demotivating aspects of the cleaning. It was then possible to move onto the next set, get them clean and “stable”, then move on again.


I’m sure my readership (pretentious, eh?) is bright enough to see where this is leading in terms of unit testing, but this would be a fairly pointless post if I stopped there. Here are some guidelines I’ve found to be helpful in “test infecting” code, encouraging good practice from those who might otherwise be sloppy (including myself) and keeping code clean once it’s been straightened out in the first place. None of them are original, but I believe the examples from The Tipping Point cast them in a slightly different light.


Test what you work with


If you need to make a change in legacy code (i.e. code without tests), write tests for the existing functionality first. You don’t need to test all of it, but do your best to test any code near the points you’ll be changing. If you can’t test what’s already there because it’s a Big Ball of Mud then refactor it very carefully until you can test it. Don’t start adding the features you need until you’ve put the tests in for the refactored functionality, however tempting it may be.
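A sketch of what that first step might look like: before touching anything, pin down whatever the legacy code currently does with a simple “characterisation” test. Everything here – the class, the method, the tax rate – is hypothetical, and I’ve used a plain assertion rather than any particular test framework:

```csharp
using System;

// A stand-in for some untested legacy code (entirely hypothetical).
public static class LegacyPriceCalculator
{
    public static decimal TotalWithTax(decimal net)
    {
        return net * 1.175m; // whatever it currently does
    }
}

public static class CharacterisationTests
{
    // Record the existing behaviour - right or wrong - so that any
    // accidental change made while refactoring shows up immediately.
    public static void TotalWithTaxMatchesCurrentBehaviour()
    {
        decimal total = LegacyPriceCalculator.TotalWithTax(100m);
        if (total != 117.5m)
        {
            throw new Exception("Behaviour changed: got " + total);
        }
    }
}
```

The point isn’t that the pinned-down behaviour is necessarily correct – it’s that once the test exists, you can refactor with some confidence, and the next person to arrive finds code that’s already under test.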


Anyone who later comes to work on the code should be aware that there are unit tests around it, and they’re much more likely to add their own for whatever they’re doing than they would be if they were having to put it under test for the first time themselves.


Refactor aggressively


Once code is under test, even the nastiest messes can gradually get under control, usually. If that weren’t the case, refactoring wouldn’t be much use, as we tend to write terrible code when we first try. (At least, I do. I haven’t seen much evidence of developers whose sense of design is so natural that elegance flows from their fingers straight into the computer without at least a certain amount of faffing. Even if they got it right for the current situation, the solution isn’t likely to look nearly as elegant in a month’s time when the requirements have changed.)


If people have to modify code which is hard to work with, they’ll tend to add just enough code to do what they want, holding their nose while they do it. That’s likely to just add to the problem in the long run. If you’ve refactored to a sane design to start with, contributing new elegant code (after a couple of attempts) is not too daunting a task.


Don’t tinker with no purpose


This almost goes against the point above, but not quite. If you don’t need to work in an area, it’s not worth tinkering with it. Unless someone (preferably you) will actually benefit from the refactoring, you’re only likely to provoke negative feelings from colleagues if you start messing around. I had a situation like this recently, where I could see a redundant class. It would have taken maybe half an hour to remove it, and the change would have been very safe. However, I wasn’t really using the class directly. Not performing the refactoring didn’t hurt the testing or implementation of the classes I was actually changing, nor was it likely to do so in the short term. I was quite ready to start tinkering anyway, until a colleague pointed out the futility of it. Instead, I added a comment suggesting that the class could go away, so that whoever really does end up in that area next at least has something to think about right from the start. This is as much about community as any technical merit – instead of giving the impression that I had my own personal “not invented here” syndrome (and not enough “real work” to do), the comment will hopefully provoke further thought into the design decisions involved, which may affect not just that area of code but others that colleagues work on. Good-will and respect from colleagues can be hard won and easily lost, especially if you’re as arrogant as I can sometimes be.


Don’t value consistency too highly


The other day I was working on some code which was basically using the wrong naming convention – the C# convention in Java code. No harm was being done, except everything looked a bit odd in the Java context. Now, in order to refactor some other code towards proper encapsulation, I needed to add a method in the class with the badly named methods. Initially, I decided to be consistent with the rest of the class. I was roundly (and deservedly) told off by the code reviewer (so much for the idea of me being her mentor – learning is pretty much always a two-way street). As she pointed out, if I added another unconventional name, there’d be even less motivation for anyone else to get things right in the future. Instead of being a tiny part of the solution, I’d be adding to the problem. Now, if anyone works in that class, I hope they’ll notice the inconsistency and be encouraged to add any extra methods with the right convention. If they’re changing the use of an existing method, perhaps they’ll rename it to the right convention. In this way, the problem can gradually get smaller until someone can bite the bullet and make it all consistent with the correct convention. In this case, the broken windows story is almost reversed – it’s as if I’ve broken a window by going against the convention of the class, hoping that all the rest of the windows will be broken over time too.


This was a tough one for me, because I’ve always been of the view that consistency of convention is usually more important than the merit of the convention. The key here is that the class in question was inconsistent already – with the rest of the codebase. It was only consistent in a very localised way. It took me longer to understand that than it should have done – thanks Emma!


Conclusion


Predicting and modifying human behaviour is an important part of software engineering which is often overlooked. It goes beyond the normal “office politics” of jockeying for position – a lot of this is just as valid when working solo on personal code. Part of it is a matter of making the right thing to do the easy thing to do, too. If we can persuade people that it’s easier to write code test-first, they’ll tend to do it. Other parts involve making people feel bad when they’re being sloppy – which follows naturally from working hard to get a particular piece of code absolutely clean just for one moment in time.


With the right consideration for how future developers may be affected by changes we make today – not just in terms of functionality or even design, but in attitude, we can help to build a brighter future for our codebases.