As discussed in the previous article, BDD evolved from TDD. So there must have been something amiss in TDD that led to this evolution.

For those who don't know about TDD, it is a development practice that is more technology focused in nature. It states that:

  • As a developer, you should always write a test first for the unit you are going to code. This test would obviously fail in the beginning, as there is no implementation yet.
  • Then implement the unit, i.e. write the code.
  • Finally, verify that the implementation gets the test to pass.

Consider the same addition example. If we were to follow TDD,

  • we would write a test for the addition method first, say a test that checks the result of adding two positive numbers;
  • only then would we write the implementation of the addition method so that the test passes.
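As a minimal sketch of this flow (assuming Python and the standard unittest module; the function name add is just an illustration), the test is written first and fails until the implementation exists:

```python
import unittest

# Step 1: write the test first. Running it before add() exists
# (or with a stub) fails, which is the expected starting point of TDD.
class TestAddition(unittest.TestCase):
    def test_adds_two_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

# Step 2: write just enough implementation to make the test pass.
def add(a, b):
    return a + b

if __name__ == "__main__":
    unittest.main()
```

Running the file now reports the test as passing, completing the write-test, implement, verify cycle described above.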

But TDD gives no clarity on how much to test, i.e. should we test for negative numbers, should we test for decimals, and so on. Hence there is a clear chance of missing an expected behavior of the method while implementing it.

It is not driven by a specification or the so-called desired behavior. And because of this, the boundaries are not clear.

This very reason set up the premise for BDD. It advocates that the business value is specified upfront in terms of the desired behavior of the unit, and that is what gets implemented.

When the team sits together to define the expected behavior, it talks beyond just the need for an addition method. The team agrees on all the behavior that the business expects from the addition method.
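For instance, if the team agrees that addition must also handle negative and decimal inputs, that agreement can be captured directly as executable checks. A minimal sketch in Python (the specific behaviors listed are assumptions for illustration, not a complete business specification):

```python
import math

def add(a, b):
    return a + b

# Each case below encodes one behavior the team agreed on,
# so the boundaries of the unit are explicit and testable.
agreed_behaviors = [
    (2, 3, 5),        # two positive numbers
    (-2, -3, -5),     # two negative numbers
    (2.5, 0.1, 2.6),  # decimal numbers
]

for a, b, expected in agreed_behaviors:
    # isclose avoids spurious failures from floating-point rounding
    assert math.isclose(add(a, b), expected)

print("all agreed behaviors pass")
```

Because every agreed behavior is listed in one place, the team knows exactly how much to test and where the unit's responsibilities end.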

So the most important reason why BDD is a better practice is that, at its core, it brings business and technology together. The team now knows how much to test in one go, and where the boundaries are.
