Prescriptive Advice For Successful Unit Testing

At the beginning of the Vista (then Longhorn) project our team decided that we would implement unit tests. This was the first attempt in our locale to use them. We had some successes and some failures, and out of that I have learned several things. This is an attempt to codify what I have learned and to set out a prescription for what I feel it would take to leverage unit tests fully. What follows are my recommended practices for implementing unit tests:

- Unit tests must be written by the developers of the code. Having a separate test team implement them doesn't work as well. First, the tests take longer to write. The person developing the code knows how it is supposed to operate and can write tests for it quickly; anyone else has to spend time learning how the code is intended to be used from the minimal (at best) documentation or from conversations with the developer. Having the developer write the unit tests also makes them act as a form of documentation for everyone to follow. Second, the tests take a lot longer to come online: they become available days, weeks, or even months after the code is checked into the product.
- Unit tests must be written and passing before the code is checked in. Writing unit tests should be considered part of the development process. Nothing is code complete unless there are passing unit tests for it. Checking in code without passing unit tests should be treated like checking in a build break. That is, it is unacceptable.
- The unit tests must never be allowed to fail. Just recently I saw a bug in our database about a failing unit test; the bug had been filed several months ago. This just cannot be allowed to happen if you are going to get the full value out of the unit tests. Unit tests should act as canaries: they are the first things to fall over when something goes wrong. If they ever break, they must be fixed immediately.
- Before any checkin, the existing unit tests must be run and must pass 100%. This is a corollary to points 2 and 3, but I want to make it explicit. Just as you would get a buddy build, a code review, and a smoke test before every checkin, you must also pass the unit tests.
- Unit tests must be granular and comprehensive. A test case that plays back some media is not really a unit test for a media pipeline, and it certainly isn't sufficient. At a minimum, each external interface should be tested for all expected values. An even better system of unit tests would verify internal interfaces directly as well. A rule of thumb is that unit tests should achieve at least 60% code coverage before the first checkin is made. A sketch of what this kind of granular testing looks like appears after this list.
- Standardize a mechanism for storing, building, and running unit tests. Don't leave it up to each individual or small team. There should be a standard harness for the tests to be written in. There should be a convention, followed by all, for where to check them into the source tree. Unit tests must be built regularly. In my opinion, the unit tests should be written in the same harness used by the test team for their tests. The unit tests should be checked in right alongside the code that they are testing and should be built with each build. A build break in a unit test is just as bad as a build break in the shipping code.
- The unit test framework must be lightweight. If the framework is to be the same one the test team uses (highly recommended), it must be one that can be run easily. If running unit tests requires anything more than copying some files and running an executable, it is too heavy. Expecting developers to install a whole test framework just to run their tests is a prescription for disaster. A sketch of the sort of lightweight harness I have in mind also appears after this list.
- Run unit tests as part of the daily testing process. If the tests use the same harness, they can be leveraged as part of the daily tests. The tests for external interfaces can be especially useful in assessing the quality of the product.
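To make point 5 concrete, here is a minimal sketch of what granular coverage of a single external interface might look like. The ClampPlaybackRate function and its supported range are hypothetical, invented purely for illustration; the point is that every expected value and every boundary gets its own explicit check rather than being exercised indirectly by one end-to-end playback run.

```cpp
#include <cassert>
#include <cstdio>

// Hypothetical external interface under test: clamp a requested playback
// rate to the range the pipeline supports.
double ClampPlaybackRate(double requested)
{
    const double kMinRate = 0.5;
    const double kMaxRate = 2.0;
    if (requested < kMinRate) return kMinRate;
    if (requested > kMaxRate) return kMaxRate;
    return requested;
}

// One granular check per expected value and per boundary.
static void TestClampPlaybackRate()
{
    assert(ClampPlaybackRate(1.0) == 1.0);  // nominal value passes through
    assert(ClampPlaybackRate(0.5) == 0.5);  // lower boundary preserved
    assert(ClampPlaybackRate(2.0) == 2.0);  // upper boundary preserved
    assert(ClampPlaybackRate(0.1) == 0.5);  // below range clamps up
    assert(ClampPlaybackRate(9.0) == 2.0);  // above range clamps down
}

int main()
{
    TestClampPlaybackRate();
    std::printf("all unit tests passed\n");
    return 0;
}
```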
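For points 6 and 7, the harness itself does not need to be elaborate. Below is a sketch, under my own assumptions rather than any framework we actually shipped, of a lightweight runner: each test registers itself in a shared list, everything builds into one executable alongside the product code, and running the tests means running that executable and checking its exit code.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <utility>
#include <vector>

// One shared registry of test cases; every test file adds its own entries.
struct TestCase { std::string name; std::function<bool()> run; };

static std::vector<TestCase>& Registry()
{
    static std::vector<TestCase> tests;
    return tests;
}

// Registering a test is a single line in the file that defines it.
struct Register
{
    Register(const char* name, std::function<bool()> fn)
    {
        Registry().push_back({name, std::move(fn)});
    }
};

// Illustrative test case; a real one would exercise a product interface.
static Register addsUp("math_adds_up", [] { return 2 + 2 == 4; });

int main()
{
    int failures = 0;
    for (const auto& test : Registry())
    {
        const bool passed = test.run();
        std::printf("%-20s %s\n", test.name.c_str(), passed ? "PASS" : "FAIL");
        if (!passed) ++failures;
    }
    // A non-zero exit code lets the daily build or a pre-checkin script treat
    // a failing unit test exactly like a build break.
    return failures;
}
```

Because the runner exits non-zero on any failure, the same executable serves the developer before checkin and the test team in the daily run.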