Monday, January 11, 2010

Technical debt or tumour?

I've found that technical debt is not a good enough metaphor for doing it quick instead of doing it right.
Telling management that doing it quick incurs a debt that has to be paid later helps them understand the problem, but I think the problem is much worse than just incurring debt.

Because, for me, incurring debt implies that you can get away with it by paying it back later with some interest, and that this interest is localised to the bit we just rushed.
This might be true if the rushed area is isolated and its interface is well designed. But how often is that the case?

I think most people agree with the "broken window" effect: an already broken/neglected code base tends to degrade faster than one which isn't.

If we consider technical debt a "broken window", we can see that it might behave more like a tumour. Areas that depend on the broken bit now need to adapt to it, which will possibly increase the debt, which in turn might complicate other areas that depend on the newly affected area, and so on... spreading like a tumour...

On top of this, even areas that are not directly related to the broken bit might start being neglected because of the broken window effect.

So, if the area is removed fast enough (the debt is paid in a timely manner), you're safe; otherwise you're dead, because your technical debt might have already leaked into too many places and soon people will start talking about the big rewrite...

Tuesday, December 1, 2009

I was hired in December last year to lead and mentor a team of 4 devs and a tester. It was my first time as a team leader, even though I had already taken on a mentoring role before, when I worked closely with some colleagues. I want to write down what went right, what could have been better and what lessons I have learned from this experience.

Context:
A rewrite of one of the company's main server applications to make it more flexible, faster and more reliable. The team would be doing it the XP way. I was the only one in the team familiar with XP practices.

the team:

The team consisted of two senior devs, one with 10+ and the other with 5+ years of experience, a junior dev with almost 2 years of experience, me, and a young and inexperienced tester.

The team was never steady: the tester was almost always busy testing other applications or bug fixes from the previous versions.
The senior devs, who built the previous version, kept being pulled away from the project to fix problems/bugs related to it.
The junior dev was still on another project and only joined the team 4 months after it started.

I was able to pair programme for most of the first 6 months with one dev at a time, while the others were busy doing things unrelated to the project.
the way I mentor:

As an XP advocate, I used pair programming as a way to pass on knowledge of the practices. I believe that being assertive and making people think is more effective, as they will find the answers by themselves and so retain the findings more easily. Some people might think this way comes across as patronising and arrogant.
I thought that if people were truly interested in learning, they would be thankful and happy that I didn't give them the recipe and that I went out of my way to explain how things work when they admitted they were lost. But I was wrong.
Most people think differently from me, and even though they were interested in learning, the method I used, which had been effective with other people before, was not suitable for them. I failed to realise that this way of mentoring was the root cause of the problems that arose.

the progression:

While the progression with one of the senior devs was steady and fast, as he asked for help when he was lost and tried to take in my advice, the other senior dev resisted and never asked for help, so his progress was fairly slow.
I have to say that I'm a very patient person and I let people take their time to think about the problem and the solutions. If I find that people are not communicating, I ask questions like: What's the problem we have to solve? What do we want to accomplish here? How can we fix this now? Can you see a better way of doing this? Depending on the way you take these questions in, you get different outputs.
If I'm not afraid to be wrong, I think for a bit, take a stab at it and say something; worst case scenario, I'm wrong and I learn something.
But if I'm afraid to be wrong... I might spend a lot of time thinking, or just wishing I wasn't being put in a position where I have to provide answers.
The silence or the questions that come next only make the situation worse, as I've already closed my mind to finding a solution and I'm just wishing for the situation to end. I might then say something just to get out of that uncomfortable position, without really thinking about it, and chances are it will be wrong or just plain illogical.
So I always tried to guide them to the solution by asking more questions or saying things like: What if such and such happens? When that didn't work, I tried to suggest that maybe they should read up on the subject on the web or buy a book about it. This might come across wrong depending on how I say these things. I tried to be sensible, and I think they took it the right way, but I'm not too sure. In the end, though, this approach was preventing work from getting done and slowing the whole thing down.

With some bumps along the way, we were able to build some software using tools that were new to them, like a mocking framework. They started testing their software with three kinds of tests (have a look at how I test software), we had a CI server that continuously integrated our code and ran our tests, and we paired all the time to spread knowledge. This on top of other practices like iterative development...

the XP values and corporate values bump into each other:

I mostly struggled to pass on values that went against the corporate culture. Things like high code quality, with simple things like naming or just plain consistent formatting, were a struggle all the way. I tried to instil this attention to detail by showing them that consistent code takes less time to read and is much more pleasant, and that good naming is as important as anything else.
Test driven development was way beyond their capacity, as they had their debugging habits too deeply ingrained in their way of working. I used every example I could to show them what they were missing by writing the tests last, but to no avail.

I now think that the push model of programming might be a mental leap that some people find very hard to get their heads around. And that is why TDD hasn't taken off yet, and also because it's hard to do well. It took me 3 years to kind of master it and I know I still have a lot to improve.

Overall:

What I think I failed at most was not making retrospectives part of the development process; there, these and other problems would have surfaced much earlier and saved a lot of time.

I also took some decisions unilaterally, which kind of killed the team spirit. Even if I was sure their contributions would not be that big, I should have discussed the problems within the team and facilitated an agreement instead of almost imposing decisions on others.

Sunday, November 1, 2009

How I test my apps...

Introduction:


After reading XP books 3 or 4 years ago, I had the necessary information to get an insight into why tests are so important. I had a course at Uni where we went through most of the XP practices on a project, which I think was very important, but the concepts didn't sink in at that time.


About a year ago, I had the opportunity to start a project from scratch, and that was a great chance to put into practice what I had read and had tried to apply in other projects with only limited success because of the quality of their code bases.


After reading the xUnit Test Patterns book, I had a zillion different concepts in my head. The book is really comprehensive. Still, it's not easy to get people to agree on terms like functional testing, acceptance testing, end-to-end testing and so on... So, I'll use what I think suits me, giving an explanation of what I mean by each.


Unit Tests - Tests that are focused on the smallest unit. In the case of Java, that's a single class. These tests are usually very comprehensive: they test pre and post conditions of methods, messages sent to collaborators and class invariants.


Integration tests - Tests that integrate my code with one or more external libraries and/or frameworks. These tests usually wire my application using Spring/Guice, do some database stuff, send mails, send messages through a messaging service like JMS or invoke web services.


Acceptance tests - Tests that exercise the whole system from the user's perspective. In a web application I'd use Selenium. I've also used Fitnesse to talk to a server through XMPP.
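Just to illustrate what I mean by this level of test, here's a minimal sketch of a browser-driven test using the Selenium WebDriver Java API as an example; the page, element names and URL are all invented.

import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class PlaceOrderAcceptanceTest {

    @Test
    public void aCustomerCanPlaceAnOrder() {
        // drives the whole system through the browser, just like a user would
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/shop");
            driver.findElement(By.name("productCode")).sendKeys("book");
            driver.findElement(By.id("placeOrder")).click();
            assertTrue(driver.getPageSource().contains("Order accepted"));
        } finally {
            driver.quit();
        }
    }
}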


TDD:


TDD is a practice that I just love and can't really code without anymore. Even when I do a spike, it's so ingrained in me that I have to start by writing a test. And maybe that's why I'm not so impressed with REPL consoles.

When I started doing some katas in Scheme using /The Little Schemer/ book, I had to create a small set of functions that let me create named tests, plus a reporting utility to flag which tests failed.

REPL consoles are cool, but they are just like manual testing: it doesn't scale and you can't reuse later what you've done.

I can now go back to the scheme code and know exactly what the function does just by looking at its tests. :)


TDD also helps me design my software using a push model. I create functions/classes as I need them, instead of trying to guess what will be necessary. I've heard of people who are very good at building applications from the bottom up; I'm the opposite, I suck at it!


My style of TDD is very mock based; I also stub, but only values, not objects whose behaviour I want to verify.
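To illustrate the difference (the class names here are made up and I'm using Mockito-style syntax just as an example): the clock below is only stubbed to supply a value, while the log is the collaborator whose behaviour the test verifies.

import static org.mockito.Mockito.*;

import org.junit.Test;

public class AuditTrailTest {

    @Test
    public void logsTheCurrentTimeWithEachEntry() {
        // stubbed value: I only care about what it returns
        Clock clock = mock(Clock.class);
        when(clock.now()).thenReturn(1000L);

        // mocked collaborator: I care about the message it receives
        Log log = mock(Log.class);

        new AuditTrail(clock, log).record("logged in");

        verify(log).write("1000 logged in");
    }
}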


My desires while testing:


Have good branch code coverage.

No duplication of tests, i.e., minimise the number of tests that break for the same reason.

Use TDD to drive my design.

Fast build to have feedback as soon as possible.


So how do I test/design my applications:


I usually write ONE acceptance test per story.

If I'm using Fitnesse, I create the test remotely, disable it on the server, import it to my local server and re-enable it locally. I only re-enable it on the server after I've completed the story. Now, if I run this test it will fail. I think it's very important to keep verifying that your tests fail when they should fail; otherwise the test is useless or testing the wrong thing.


In order to carry on, I usually have to create an event that represents the user interaction. It can be just a wrapper for the HTTP parameters, an XMPP message, etc...
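For instance, for an HTTP request it can be a small immutable object wrapping the parameters. The names below are invented, just to have something concrete to refer to in the next steps.

import java.util.Collections;
import java.util.Map;

// A hypothetical event representing a "place order" user interaction.
public final class PlaceOrderEvent {

    private final Map<String, String> parameters;

    public PlaceOrderEvent(Map<String, String> parameters) {
        this.parameters = Collections.unmodifiableMap(parameters);
    }

    public String customerId() {
        return parameters.get("customerId");
    }

    public String productCode() {
        return parameters.get("productCode");
    }
}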


I then write ONE integration test for the component that will process the event. I create the expected response, feed the event to the component and check the result against it; at this point, of course, the test fails because I don't get the expected response yet.
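A sketch of what such a test can look like, continuing the invented example above (OrderModule, OrderComponent and OrderResponse are hypothetical, and I'm using Guice for the wiring just as an example):

import static org.junit.Assert.assertEquals;

import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

import com.google.inject.Guice;
import com.google.inject.Injector;

public class OrderComponentIntegrationTest {

    @Test
    public void processesAPlaceOrderEvent() {
        // the component is wired the same way it is wired in production
        Injector injector = Guice.createInjector(new OrderModule());
        OrderComponent component = injector.getInstance(OrderComponent.class);

        Map<String, String> params = new HashMap<String, String>();
        params.put("customerId", "customer-42");
        params.put("productCode", "book");

        // this assertion only passes once the whole path has been
        // test-driven and wired together
        assertEquals(OrderResponse.accepted("customer-42"),
                     component.handle(new PlaceOrderEvent(params)));
    }
}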


Next, I create ONE unit test for the component using the same event I created for the integration test. This is usually the simplest happy path. I code the component until I get the expected response. Most of the time, in order to get the expected response I have to collaborate with other classes/libraries, so I mock those and stub their responses.
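Continuing the same invented example, the unit test for the component mocks the collaborators it needs and stubs their responses:

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.*;

import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

public class OrderComponentTest {

    @Test
    public void acceptsAnOrderForAKnownCustomer() {
        // hypothetical collaborators discovered while making the test pass
        CustomerDirectory directory = mock(CustomerDirectory.class);
        OrderBook orderBook = mock(OrderBook.class);
        when(directory.knows("customer-42")).thenReturn(true);

        OrderComponent component = new OrderComponent(directory, orderBook);

        Map<String, String> params = new HashMap<String, String>();
        params.put("customerId", "customer-42");
        params.put("productCode", "book");

        OrderResponse response = component.handle(new PlaceOrderEvent(params));

        assertEquals(OrderResponse.accepted("customer-42"), response);
        // the message sent to the collaborator is part of the expected behaviour
        verify(orderBook).record("customer-42", "book");
    }
}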


Next, I go to the collaborators and write unit tests for those and code them using the same approach.


This will lead me down a path between objects until I hit a boundary or reach a dependency that already does what I need. So, when all of the mocked collaborators have been coded with the functionality I want, I'm done.


So far I've been navigating down the path; now it's time to go back up and wire everything together using a DI library.
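With Guice, for example, this can be the hypothetical module that the integration test above loads (Spring would do just as well; the bindings are made up):

import com.google.inject.AbstractModule;

public class OrderModule extends AbstractModule {

    @Override
    protected void configure() {
        // bind the interfaces discovered on the way down to their real implementations
        bind(CustomerDirectory.class).to(DatabaseCustomerDirectory.class);
        bind(OrderBook.class).to(JmsOrderBook.class);
        // OrderComponent itself has an @Inject constructor, so it needs no explicit binding
    }
}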


When done, I should be able to run the integration test and see it pass. If it doesn't, that means I have written tests that assumed something that was not true. I've found that just by looking at the expected and actual values of the integration test I can see straight away what the problem is.


With this I have my happy path implemented. :D

This is all good but I need to accommodate different scenarios, maintain class invariants, check pre and post conditions and handle errors.


By the way, if my system is only a single component, then the acceptance test I wrote should pass now. If it isn't, I have to choose between coding one component at a time until each is completely done, or coding the happy paths for all of them until I get the acceptance test to pass and coming back to each one later for the rest of the coding. There are pros and cons to each approach and I don't have a preferred way yet...


So, to finish off the components by tackling the different scenarios, handling errors, etc., I use the following rule:

If a scenario/error doesn't need to integrate with an extra external library, I just add unit tests to the appropriate places. Otherwise, say you need to send an email when an error occurs but not when everything is OK, I create an integration test and go from there all the way down as if it were a new feature.
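Sticking with the invented example, a scenario that stays within the code I already have, like rejecting an order for an unknown customer, only gets a unit test in the appropriate place:

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.*;

import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

public class OrderComponentRejectionTest {

    @Test
    public void rejectsAnOrderForAnUnknownCustomer() {
        CustomerDirectory directory = mock(CustomerDirectory.class);
        OrderBook orderBook = mock(OrderBook.class);
        when(directory.knows("stranger")).thenReturn(false);

        OrderComponent component = new OrderComponent(directory, orderBook);

        Map<String, String> params = new HashMap<String, String>();
        params.put("customerId", "stranger");
        params.put("productCode", "book");

        assertEquals(OrderResponse.rejected("stranger"),
                     component.handle(new PlaceOrderEvent(params)));
        // no extra integration is involved, so no new integration test is needed
        verifyZeroInteractions(orderBook);
    }
}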

Class invariants and pre and post conditions can usually be handled within unit tests only.
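For instance, with a hypothetical Order class, a precondition and a simple invariant can be pinned down like this:

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class OrderTest {

    // precondition: an order must have a positive quantity
    @Test(expected = IllegalArgumentException.class)
    public void rejectsANonPositiveQuantity() {
        new Order("book", 0);
    }

    // invariant/post condition: the quantity handed in is the quantity kept
    @Test
    public void keepsTheQuantityItWasCreatedWith() {
        assertEquals(2, new Order("book", 2).quantity());
    }
}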


I keep doing this until all the scenarios/errors/etc. are covered, and that means the story is done done. Normally, when handling errors, I try to figure out whether the error can be handled then and there or whether it has to be handled up the chain, which might mean writing tests for handling the error in another class.


How does this fulfil my desires:


Have good branch code coverage - The unit tests completely driven by TDD make sure I get all my code covered.


No duplication of tests - Because I've minimised the number of integration and acceptance tests, I usually get at most three broken tests when the happy path doesn't work.


Use TDD - By going layer by layer down the path I could TDD all my code in an organised way.


Fast build - The slow tests are the acceptance and integration ones. By restricting those, I speed up the build a lot.


Do I get lots of bugs:


Exploratory testing is of course a very important part of testing. I'm not going to say that no bugs were found, I would be lying, but all the bugs found so far were overlooked scenarios or just misunderstandings of the requirements.

JB Rainsberger gave a talk about how integration tests are a scam. I agree with him, so I use them only to test the integration itself and not all the possible combinations. The same goes for acceptance tests. And if the client wants to write their own, that's OK, but they are not going to run as part of the build.


What happens when I find a bug:


It depends on the bug. :P

If I can look at the logs and see in which object the problem is, I write a test with the appropriate context for the object and provide the input. If it fails, I fix the bug and then check, through the process that found it, that it's now fixed.

If I don't know the object but I know that the problem is in a given component, I'll write an integration test to find where the problem is. After I find the appropriate object I write a unit test and delete the integration test.

If I have no clue, I write an acceptance test and go from there all the way down. But when I find the object, I just write a unit test and delete all the other tests up the chain.


I have found that with proper logging and strict invariants in my classes I can find the errors straight away.


I just started reading Steve Freeman's and Nat Pryce's book and I found that they do some things that I also do, so I must have gotten something right. :)

Sunday, October 25, 2009

Passionate Developers

background:

I learned how to programme in 1996 when I went to Uni, just by myself: I did all the exercises in the Pascal book that was referenced in the course bibliography.
I spent hours and hours doing every exercise, even when they were quite similar to the ones I had just done, and sometimes I even repeated some. I really enjoyed it.

But I went off to do something else afterwards, like playing computer games and Magic: The Gathering for the next 5 years. I still kick myself for those wasted years, but some people have told me that strategy games and MTG might have developed skills in me that are useful for programming. I don't know if that's true or not...

The truth is that in 2001 most of my friends/mates at Uni were working and making money out of the dot com bubble while I was spending money on cards and computer games. So I totally missed it, and I think that was good, because some of them got burned for life by the way things were done in those days.

what makes me tick:

When I got bored of playing games and Magic, I decided to finish my degree. And as soon as I started coding again the feeling that I had when doing those Pascal exercises came back. I thought: What have you been doing? All the fun is in programming...

I set out to get back all those wasted five years.

And I coded quite a lot in those days, because I would try to do as much as possible, especially in group assignments, where I didn't mind that my group mates slacked, as it meant more coding for me to do. :)
I would redo courses to improve my grades and do more coding. I ended up taking 25% more courses per year than a normal student. Still, my grades skyrocketed. I was just having the most fun of my life.

And then I made a stupid mistake: I embarked on the business side of development because I didn't want to be just a code monkey, even though I loved coding so much.
My degree thesis was about business concepts for describing an organisation. I would sit down with someone, jot down all the tasks and the contexts in which they were performed, and describe the interactions within the organisation. It was tedious and it took a lot of willpower to do it. Still, my professors offered me the opportunity to do a Masters, seeing the amount of effort I had put into finishing that thesis. I was also offered the chance to help lecture a course on Distributed Systems, which had been my favourite course. I wasn't a very good teacher, but it was about how to code and I loved it. I realised that coding was the way, the thing that I really loved.

passionate people and not so passionate people:

What I felt at that time was that I bonded with some students who shared this passion. I might not have been very good at passing on information, but I definitely nailed passing on passion and enthusiasm.

I dropped out of the Masters halfway through and went to code for a consulting company. I wanted to know what the enterprise world was like. My skills were in Java, so I went to write Java code.
My hunger for knowledge kept growing tremendously. I started reading book after book. I would work hours and hours, even on weekends. There were people in the company with 10+ and even 20+ years of experience who knew so much about software, and I really enjoyed having lengthy conversations with them. I learned a lot.

Ken Thompson, in Coders at Work, says that what he looks for in a developer is passion. If I ever have to hire someone, I will definitely look for enthusiasm and passion. I firmly believe that enthusiasm and passion, when put into practice, produce the best results.

When you love what you do, you don't want to just get it done. You want to be proud of it. You don't slow down because you know that more is coming. You still take your time because things have to be done right but you get excited when you finish something and you look forward to the next task.

I found in the software craftsmanship community a bunch of people that love what they do and want to improve the state of things in our industry.

I think it's a pity that people come to software development for the wrong reasons, be it money or anything else. I hear so many people saying that we don't need more developers, we desperately need better ones. I wonder if it isn't the same in other industries. I know of some people who went into Medicine for the wrong reasons and are practising today, and they are not good.

My contribution to improving things is to become the best I can be and to try to set a good example wherever I work. I also try to pass on my enthusiasm and passion to others, but I know it can't really be passed on; people have to take it up themselves. If someone tried to turn me into a salesman, regardless of how much passion they have for their profession, it would fail because I don't like sales.

working for people instead of working for companies:

I was hired at the company I'm now about to leave by another very passionate developer/manager. We had loads of fun and interesting conversations before he left the company 4 months later. We paired some of the time, we learned new technologies and we discussed stuff we read in books, blogs and mailing lists.

When you work in an environment like that where one of your hobbies is also your work, you don't care if it's Monday or Friday because all the days are nice, fun and very interesting.

When things stop being like that you have to either change your organisation or change your organisation, I was told. So, after trying to change my organisation for some time, I decided it was time to change my organisation and leave.

I think I found another place with passionate people where I can have that feeling again. And that made me realise that it's not so much about the company where I work, it's the people in it. Passionate people who make me feel excited and have lots of fun while doing things I'm proud of. That's where I want to be.

Tuesday, January 6, 2009

Can I do it

It's the third time I'm trying to keep a blog.
This one I'd been planning to start for a while, but lack of time mostly prevented me from doing it.

I'll possibly rant a lot, as I stumble through stuff quite often; sometimes it's my own fault, other times not so much. :)

Monday, May 7, 2007

First Post

I have other blogs, but I thought I should have a new one, started from scratch.
It will be about prototypes, tests and ideas I have on software design, testing, implementation... basically about the whole life cycle of building software.
It's going to be fun for me to write, and maybe it will be useful for somebody else.