June 2008 Archives

Over the years, I have tried many different approaches to software estimation: from the Ceiling and Weller method (look up at the ceiling, scratch your chin and say "Well-errr") and FITA analysis (Finger In The Air), through various formal and informal metric-based estimation techniques, to function point schemes such as IFPUG.

Today, I had something happen that reminded me why software estimation is so hard. I've just wasted about 8 hours trying to figure out what I thought was some bizarre firewall problem with Windows Vista SP1 x64, when it turned out that some code I wrote was actually working correctly: it was picking up an HTTP proxy preference pointing to a server that no longer exists. I'd simply forgotten that I'd fixed that bug, and that I had the proxy set in my preferences.

I took two lessons from this frustrating day.

  1. We need a better error message when your proxy is no longer available
  2. Software estimation is hard
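On that first lesson: here is a minimal sketch, in Python rather than our actual codebase, of the kind of error message I mean. The function name and the idea of a `host:port` proxy preference are hypothetical; the point is to name the configured proxy in the failure message instead of letting a raw connection error look like a firewall problem.

```python
import urllib.request
import urllib.error

def fetch_via_proxy(url, proxy_host):
    """Fetch a URL through an HTTP proxy, naming the proxy clearly on failure.

    proxy_host is a hypothetical 'host:port' setting read from user preferences.
    """
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy_host, "https": proxy_host})
    )
    try:
        return opener.open(url, timeout=5).read()
    except urllib.error.URLError as e:
        # Point the user at the configured proxy, rather than letting a raw
        # connection error suggest a firewall problem on the target host.
        raise ConnectionError(
            f"Could not reach configured HTTP proxy '{proxy_host}': {e.reason}. "
            "Check your proxy preference - the server may no longer exist."
        ) from e
```

An error like that, naming the stale preference, would have saved me most of those 8 hours.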

My wife frequently struggles to understand how I can finish work one night complaining that I am 3 days behind, and by the very next afternoon be caught up or even a little ahead.

The best way I have come up with to explain this is to ask her to imagine that I had given her 60 Sudoku puzzles, all rated as 10-minute puzzles. Should take about 10 hours, right? Now see how long it actually takes to do each puzzle.

Sudoku is the closest analogy to computer programming that I can find for "normals" - i.e. people who don't code. This only really works if the person does Sudoku puzzles, but my wife does, so it works in our house. Sudoku is a numeric, analytical, problem-solving activity. While there are tricks and techniques for solving some puzzles, each one presents a significantly different challenge. Looking at a puzzle, it is hard to tell whether it is going to be hard or easy. You can get stuck down blind alleys and have to start all over. When you "get into the zone" you can make surprising intuitive leaps that often defy verbal explanation afterwards. Finally, solving a hard Sudoku puzzle quickly involves a fair degree of luck and depends on your state of mind at the time. There is a great amount of satisfaction to be gained from solving a Sudoku puzzle, along with a high degree of frustration when you cannot solve one - you know it must be possible, after all.

So. Each task (solving a Sudoku puzzle) should take about 10 minutes, within a certain error range. In software estimation, a really good developer doing a well-known and well-defined task can only hope to get their estimate to around 25% accuracy (a "hard estimate") - and then there are always the odd random occurrences that throw you way off.

I've been in interviews where the candidate swears blind that they always finish a task on time.  This tells me two things about that candidate:

  1. They are a liar
  2. They either never estimate up-front how long something is going to take, or they over-subscribe to the Scotty principle of estimation (or, as Microsoft are fond of saying, under-promise and over-deliver)

Does this mean that we should give up on estimation? Of course not. Planning a project is going to be pretty hard if you have no idea of roughly when you are going to be finished, what the end result is going to do or how much it is going to cost. Not to mention that, when estimating a large number of tasks, you can rely on some of the work you finish early offsetting the work that takes longer than expected. However, understanding the nature of software development and how it differs from, say, laying bricks makes you more likely to succeed.
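That offsetting effect is easy to demonstrate with a quick Monte Carlo sketch of the 60-puzzle thought experiment. All the specific numbers below (a 5% chance of a 4x blow-up, 25% noise) are my own assumptions for illustration, not measurements:

```python
import random
import statistics

def simulate_schedule(n_tasks=60, estimate=10.0, noise=0.25,
                      blowup_chance=0.05, blowup_factor=4.0, trials=10_000):
    """Monte Carlo sketch of the 60-puzzle thought experiment.

    Each task is estimated at `estimate` minutes; the actual time varies
    uniformly by +/- `noise` (the 25% "hard estimate" band), and occasionally
    blows up entirely (the day lost to a stale proxy setting).
    """
    random.seed(42)  # fixed seed so runs are reproducible
    totals = []
    for _ in range(trials):
        total = 0.0
        for _ in range(n_tasks):
            actual = estimate * random.uniform(1 - noise, 1 + noise)
            if random.random() < blowup_chance:
                actual *= blowup_factor  # the occasional random disaster
            total += actual
        totals.append(total)
    mean = statistics.mean(totals)
    rel_spread = statistics.pstdev(totals) / mean
    return mean, rel_spread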

The problems inherent in software estimation also help me to understand why Agile software development methodologies work. 

  • You are forced to break problems down into small bits that can be managed, tracked and measured.
  • Small iterations mean that you can only get so far behind.
  • You re-estimate and re-prioritise work at each iteration when a hard estimate on that task is possible (and therefore your estimates are more likely to succeed)
  • You frequently listen to the person who will be using the software
  • You can load your iterations so that, when you finish a task early, there is always work of the right priority to the customer to pick up - and, when something takes longer, work of the appropriate priority that you can drop off the list

There are also many other techniques that people can adopt from Agile methodologies, or just from common sense, to reduce the amount of time lost when a task is taking longer than expected, such as:

  • Limit distractions. You need time to concentrate on a single problem otherwise you will never finish it.
  • Pair programming. Identify when a second pair of eyes is needed and quickly ask for help in your team (this can be hard when your team are distributed across multiple time-zones).
  • Daily progress meetings. The guy who has been stuck for 24 hours on what he thinks is a firewall issue is quickly identified.
  • Lack of ego. I'm the "Windows guy" in our company; if I say something is a firewall issue with Windows Vista SP1 x64 and a probable collision with the new Eclipse 3.4 launcher executable, then the new intern who only knows about the Mac he used at college should feel like he can ask a "dumb" question and say "Have you checked your proxy settings after that bug you were fixing yesterday?"

Anyway. All common sense stuff, and writing this post has been a nice way to get rid of the frustration of discovering why I was so stupidly stuck for the past day. Now to stop distracting myself further and get on with some more work :-)

Many people know that Team System and Team Foundation Server are incredibly extensible platforms. Mike Azocar (a fellow Team System MVP) has come up with a great idea: a contest to see who can come up with the coolest Team System add-on. If you have an idea that you've been waiting to try out, then now might be a good time, as you get the chance to win a one-year Team Suite MSDN Subscription (worth over $10,000) among other valuable prizes, plus exposure for your gadget on most of the world's leading Team System blogs.

Because I doubt I'm going to be able to do better than my robot rabbit TFS add-on, I'm going to help judge the competition.  I am really looking forward to seeing what folks can come up with.

For more information, see Mike's blog post.  Happy coding!

I am proud to announce that Microsoft have just published a joint case study with us on the success Thomson Reuters have had using Team Foundation Server in a mixed development shop. This customer is particularly interesting, not just because they keep giving us great feedback that we have been incorporating into Teamprise, but also because they are a large, well-known and well-respected brand. From the case study:

"The Online Services group at Thomson Reuters is responsible for the storage and retrieval of online assets. Of the 220-member team, approximately 150 are development engineers or quality engineers. Although the team does some programming using the Microsoft® .NET Framework, the group primarily develops in Java on computers that run a variety of operating systems, including Linux, Linux 64, UNIX, Macintosh, and Windows®. About 90 percent of the programmers in Online Services work in Eclipse or Rational Application Developer (RAD), and up to 50 percent of the testers work in Eclipse. All of the team’s build computers run UNIX or Linux."

Anyway, thanks to Mac and the people at Thomson Reuters for agreeing to share their experiences.  Hopefully other organizations considering Team Foundation Server to manage the whole software development process will find the case study interesting.

To read the case study in full, see Microsoft Case Studies: Thomson Reuters Unify Development Processes with Team Foundation Server and Teamprise.  I've also got a PDF version available here.

Last Check-in Date Explained

I've been doing a lot of work with the Team Foundation Server 2008 SP1 Preview, and even recorded a podcast about it (also see Brian Harry's blog post for more details on TFS 2008 SP1 features).

One out of the many new features introduced in TFS 2008 SP1 is the "Last Check-in" column in the source control explorer. It is a handy little thing that I think a lot of people will find useful. 

Last Check-in Date Column in Visual Studio Source Control Explorer

However, a couple of warnings about behaviour that you might not expect at first:

  1. The date shown for folders is the date that the folder was added, not the date that any contents of that folder were last checked in.  That means you cannot use it to drill down to the most recently changed files - to find those you should still do a "View History" on the parent folder and look at the changesets.
  2. If you are using a Visual Studio 2008 SP1 client (or Teamprise 3.1 for that matter when it is released) and you point it at a server prior to TFS 2008 SP1 (i.e. TFS 2005 or the RTM release of TFS 2008) then you do not get any data in this column because the server doesn't send back that data to the client.
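As a sketch of that "View History on the parent folder" workaround from the command line: with the TFS command-line client (tf.exe from Team Explorer), something like the following lists the most recent changesets under a folder. The server path here is a placeholder - substitute your own.

```shell
# $/MyProject/src is a placeholder server path.
# /recursive covers the folder's contents, /stopafter limits the output
# to the most recent changesets.
tf history "$/MyProject/src" /recursive /noprompt /stopafter:10 /format:brief
```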

Otherwise it works pretty much as you would expect.  Most usefully, you can sort on the column to find the recently changed files in a big list of files.


This blog is licensed under a Creative Commons License.