
Test-driven Development in Agile Projects

Giv Parvaneh


15:15 UK time, Thursday, 19 November 2009

Hi, I'm Giv Parvaneh, and I'm a senior PHP web developer at the BBC. When I started working here earlier this year, I quickly realised that I would have to re-think the way I develop web applications. As a public service and one of the most visited websites in the UK, the BBC has an obligation to deliver quality products; the websites we build not only have to conform to web standards and be fully accessible, they also need to scale to handle millions of users. Quickly hacking things together isn't really an option here...

The need for speed

Developers at the BBC tend to use agile methods as a way to quickly release iterations of products. But where does test-driven development fit in with such short development and release cycles? How can we maintain the quality of our code when things need to change so fast?

A confession

OK: I'm guilty of releasing untested code in the past when I've needed to meet deadlines. I'm probably not alone. And if you're a developer who's worked in a big team, you'll know that you often have to work with code written by your predecessors or by contractors who didn't create 'test-friendly' code. Often this means wasting time re-writing that code so it can be tested at all. Naturally, this eats into your valuable development time during a sprint.

The Agile Manifesto states that we should value "Working software over comprehensive documentation". Some might see a problem here, because this seems to place a higher importance on "working software" than on the quality of that software. The way I see it, you can't truly maintain "working software" if it's not well tested... and the truth remains that many developers compromise good test coverage in order to complete tasks by the deadline. So what can we do?

A compromise

One thing I love about Python is its built-in support for documentation and testing. Even if you're not in the habit of writing unit tests for your code, you should at least comment it so that you and other developers know what your classes and methods do. Python's "doctest" module kills two birds with one stone: as you write your comments, you can optionally specify how the code should be used and what the expected outcome should be. The comments now serve as valuable documentation for your code, and they can be executed as a test to ensure your logic is intact.

This method is not supposed to replace unit tests. In fact, there's only so much you can do with doctests, and your code can quickly become a huge mess if you try to write too many. But the idea is that they can be added quickly when you don't have time to write a full test suite. Some tests are better than no tests.

Let's have a look at an example in Python first:

def greetings(your_name):
    return 'hello, %s!' % (your_name)

This function takes a person's name as an argument and returns a greetings message. Let's add this description and a test to the docstring:

def greetings(your_name):
    """
    This function takes a person's name and 
    returns a greetings message
    >>> greetings('Auntie')
    'hello, Auntie!'
    """
    return 'hello, %s!' % (your_name)

When you run this script through the doctest module, Python knows that anything after ">>>" in the docstring should be executed as code, and that the line beneath it is the expected return value.
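
In case you haven't used doctest before, here's a minimal sketch of one way to run the example above, assuming the function and its docstring live in a file called example.py (the file name is my own choice, not from the example):

if __name__ == '__main__':
    # Collect and run every doctest found in this module's docstrings.
    # Prints nothing if all the examples pass, and reports any failures.
    import doctest
    doctest.testmod()

Running python example.py will then execute every ">>>" example in the module's docstrings and compare the results against the expected output shown below each one.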

We use PHP at the BBC, so I did some research and came across a PHP equivalent called phpdt (PHP DocTest), which can be installed via PEAR. Here's the PHP version of the above function:

/**
* This function takes a person's name and 
* returns a greetings message
* 
* <code>
* greetings('Auntie');
* // expects:
* // 'hello, Auntie!'
* </code>
*/
function greetings($your_name)
{
    return 'hello, ' . $your_name . '!';
}

You can test this by running $ phpdt example.php in your terminal. The comments inside the <code> tags will be interpreted as PHP code.

Testing times

These are simple examples, but you can see how easily you could combine your comments with some testing to make sure that any accidental changes in your code are caught. Again, doctests aren't meant to replace unit tests, but they might help to keep things working when you're faced with the need to write code faster than your tests can keep up.
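
To give a flavour of what this catches, here's roughly the report you'd see if someone accidentally changed the Python function to return 'hi, %s!' and then ran python example.py with the testmod() snippet from earlier (output abridged; the file name is still my own assumption):

$ python example.py
**********************************************************************
File "example.py", line 5, in __main__.greetings
Failed example:
    greetings('Auntie')
Expected:
    'hello, Auntie!'
Got:
    'hi, Auntie!'

The failing example, the expected value and the actual value are all reported, so the accidental change is spotted as soon as the tests run.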

I plan to try adding doctests to my code, as well as continuing to write proper unit tests in PHPUnit. I'd be really interested to hear from anyone who has tried this approach: on large applications, is this a good way to move fast without letting your test coverage slip?


Comments

  • Comment number 1.

    "Cutting tests to meet a deadline is like taking up smoking to lose weight."

  • Comment number 2.

    If you're dropping or cutting back on tests to meet your deadline, you're doing it wrong. Compromise on features, not on quality. Features which didn't make it into this release will make it into the next. Tests which didn't make it into this release are unlikely ever to make it into a release.

  • Comment number 3.

    @craigwebster Well, if we're making sweeping generalisations, then if your team's culture requires you to cut back on tests to meet your deadline then you've got far, far bigger problems. :-)

    Even the best development teams build up a little technical debt over time and in my experience, that debt often creeps in near a tight deadline. There are valid reasons for introducing the debt as a trade off against other factors (not delivering a feature or delivering late, for example) and it's OK to make that decision sometimes.

    So long as you commit to paying up the debt again quickly, that is. ;)

  • Comment number 4.

    @mathie That's a fair point. There are exceptions to every rule, but in order for them to be exceptions and not the norm it's good to establish the rule first - even if that is a sweeping generalisation.

    In my experience the only kind of technical debt that gets repaid is tangled but well tested code that was written in a hurry. Maybe it's not DRY or KISS or maybe it's particularly hacky, but it works and can be proven to work so it's okay. This debt can be removed with a reasonable degree of confidence that nothing is breaking thanks to the tests. If the code isn't tested then the debt tends to stay because no one has time to write the tests now, and eventually knowledge of how that code is used disappears. When that happens any change may introduce a bug so no change happens and the technical debt lives forever.

  • Comment number 5.

    As a developer, saying "compromise on features, not test coverage" is easy. But in real world situations, convincing clients and management that for whatever reason feature X will be delayed by some time because you don't have time to write tests is less than trivial.

    Maybe one day everyone will understand the processes and the reasons for them as well as the developers themselves, but educating clients on the actual benefits of unit testing isn't simple.

  • Comment number 6.

    @Glen In real world situations, repeatedly compromising on test coverage and code quality leads to the project carrying so much technical debt that it contracts septicaemia. More than once I've overheard developers state that they're waiting for a poisoned project to die so they can ditch the client. That, to me, is unprofessional. Have the courage to face your client down. We should adopt the oath 'epi dhlhsei de kai adikihi eirxein'.

