Modern programming environments have come a long way, with powerful development libraries, rapid development frameworks and fancy new language features. But I think the biggest advance in the past ten years has been the widespread adoption of automated testing. In my opinion, there are really two basic ways to produce quality code.
<li><strong>Quality by Specification</strong> - tightly controlling the product specification and defining up front as much of the functionality as possible to minimize unknowns and limit project risk.</li>
- Quality by Specification focuses on making the definition of the project as stable as possible so that the code-base can settle into a stable configuration and quality assurance staff can verify the program against a written definition. By keeping changes few and predictable, testers can verify the product's functionality with a manageable amount of effort.
- Pros – The earlier the scope of the project is understood, the easier it is to plan and execute. For line of business applications and enterprise projects where the requirements are well-known ahead of time, this approach may work well.
- Cons – Often software is not completely understood before the user has interacted with it. As a result, either the specification process is long and expensive, or software is not specified well enough. This is especially true for a very dynamic product, such as a consumer web site where the features can change very rapidly.
<li><strong>Quality by Verification</strong> - continuously monitoring and verifying program outputs with automated scripts to make sure they match up with the programmer's intent.</li>
- Quality by Verification focuses on making tests automated and quickly repeatable so the project can change rapidly without overwhelming the quality assurance staff. As the programmers' intent changes, the tests move with it. Automated tests also uncover hidden dependencies between different pieces of a large project: as breaks ripple through a project under rapid change, the tests highlight the issues and point developers to the problem.
- Pros – Results can be seen earlier, and the project can change and grow more quickly as requirements are discovered. Automated testing also makes it practical to work in dynamic development environments, such as scripting languages, where there is no compiler to catch basic mistakes. Automated tests can live as long as the code and offer continual validation of program output long after they are created.
- Cons – Automated testing can be difficult without a disciplined approach to software architecture. It is often not feasible to retrofit automated testing onto a code-base that was not designed with it in mind.
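To make the "verification" side concrete, here is a minimal sketch of what an automated check might look like, using NUnit-style attributes. The `PriceCalculator` class and its discount behavior are hypothetical, invented purely for illustration:

```csharp
using System;
using NUnit.Framework;

// Hypothetical class under test: applies a percentage discount to an order total.
public class PriceCalculator
{
    public decimal ApplyDiscount(decimal total, decimal percent)
    {
        if (percent < 0 || percent > 100)
            throw new ArgumentOutOfRangeException("percent");
        return total - (total * percent / 100m);
    }
}

[TestFixture]
public class PriceCalculatorTests
{
    // Encodes the programmer's intent as an executable, repeatable check.
    [Test]
    public void TenPercentOffOneHundredIsNinety()
    {
        PriceCalculator calc = new PriceCalculator();
        Assert.AreEqual(90m, calc.ApplyDiscount(100m, 10m));
    }
}
```

When the intended behavior changes, the assertion changes with it, and every subsequent run re-verifies the rest of the suite against that intent.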
I hope it’s clear that the most successful teams use elements of both approaches, but I think it’s been interesting to see the shift from specification-based quality management to an increased emphasis on automated testing.
What approaches do your teams use? Where do the best quality gains come from?
I’m a little late on this announcement, but last week Eli Lopian announced that the newest version of TypeMock (version 4.1) now supports NCover 2.0.
If you haven’t tried TypeMock yet, you should. TypeMock is a powerful type replacement framework that surpasses anything else I’ve ever seen. It uses the CLR profiling APIs to let you swap out virtually any type on the fly while testing, which makes it especially useful for testing code that wasn’t designed to be mocked easily. For instance, you can use it to mock up pieces of the .NET Framework libraries, a third-party vendor component, or legacy code written before you settled on a testable design approach.
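As a rough sketch of that idea, a test using TypeMock's classic `MockManager` API looked something like the following. The `OrderService` and `LegacyDatabase` classes are hypothetical, and the exact TypeMock method names and signatures may differ between versions, so treat this as illustrative rather than definitive:

```csharp
using TypeMock;
using NUnit.Framework;

[TestFixture]
public class OrderServiceTests
{
    [Test]
    public void TotalsAreComputedWithoutHittingTheRealDatabase()
    {
        MockManager.Init();

        // Intercept the next LegacyDatabase instance created anywhere in the
        // code under test -- no interface or virtual methods required.
        Mock dbMock = MockManager.Mock(typeof(LegacyDatabase));
        dbMock.ExpectAndReturn("GetOrderCount", 3);

        // Hypothetical class under test, which news up a LegacyDatabase internally.
        OrderService service = new OrderService();
        Assert.AreEqual(3, service.CountOrders());

        MockManager.Verify();
    }
}
```

Because the interception happens through the profiler rather than through inheritance, the class under test needs no special design to be mockable, which is exactly what makes this approach attractive for legacy code.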
Collecting coverage data on large assemblies can be tricky at times. When the CLR loads an assembly, NCover analyzes the assembly to note the classes and methods that it contains. In most situations this analysis takes less than a second, but some projects with large assemblies (many megabytes) can run into longer load times.
Yesterday an NCover user reported an interesting issue when running NCover on a service. The user reported that he would start collecting coverage data on the service, it would run for about 30 seconds, and then the service would die. We asked him to send us a copy of the NCover log, and it looked an awful lot like an unhandled exception was occurring in his application. The only problem was, when he ran the service normally, no unhandled exception occurred.
It turns out that the Windows Service Control Manager only gives a service 30 seconds to start by default; after that, it kills the service’s process, assuming that something is wrong. To get the Service Control Manager to wait long enough for the service to initialize properly, we need to increase the service timeout, which can be overridden with a Windows registry value. To override it, open up regedit and browse to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control.
If it’s not already there, create a new DWORD value named ServicesPipeTimeout. Set that value to the number of milliseconds that you would like the Service Control Manager to give your service to start up. Note that the new timeout does not take effect until Windows is restarted.
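For example, importing a .reg file like this one sets a 60-second timeout (0xEA60 hex is 60,000 milliseconds); the exact number is just an illustrative choice:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control]
; 0x0000EA60 = 60000 ms = 60 seconds
"ServicesPipeTimeout"=dword:0000ea60
```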
Problems like this can be pretty tough to solve, so please don’t hesitate to contact us when you run into issues. Just submit a ticket through the support section of the website, and someone from our support team will help you troubleshoot your problem.
In the past two weeks we’ve worked hard to resolve those small nagging bugs that show up after any release, and we’re proud to announce the resulting NCover 2.0.2. Among the improvements in this release are:
<li>Fixed a NullReferenceException that bothered many users generating HTML reports, especially those using C++ code.</li>
<li>Resolved a problem in the HTML output that caused links on modules pages to not work.</li>
<li>Fixed other miscellaneous bugs in the way HTML output is rendered.</li>
<li>Tweaked NCoverExplorer to give better status indication in the saving dialog, so that large coverage XML files don't make NCoverExplorer look like it is not responding.</li>
<li>Fixed issues involving running NCover 2 under TestDriven.NET on 64-bit Windows.</li>
<li>And...several other minor fixes.</li>
Download NCover 2.0.2, and please don’t hesitate to contact our support team either via the forum or by submitting a trouble ticket if you run into any problems.