Programming the Microsoft Windows Driver Model

The Windows Hardware Quality Lab

Microsoft really wants hardware devices and their associated drivers to meet certain minimum standards for quality, interoperability, and ease of use for consumers. To that end, Microsoft established the Windows Hardware Quality Lab (WHQL) in 1996. WHQL's basic mandate is to publish and administer an evolving set of Hardware Compatibility Tests for systems and peripherals. Successfully passing these tests confers three basic benefits:

Your starting point for working with WHQL is http://www.microsoft.com/hwdq/hwtest. As I remarked in Chapter 1, it's important to get started early in a development project because there are a number of legal and business hurdles to surmount before you even get to the point of asking WHQL to test your hardware and software. Once past those hurdles, you will need to acquire systems and hardware that you wouldn't necessarily need to own except that you need them for some of the prescribed tests for your class of device.

Running the Hardware Compatibility Tests

When you're ready to begin the WHQL certification process, you'll start by running the relevant Hardware Compatibility Tests. Microsoft distributes the HCT as part of certain MSDN subscriptions and beta programs. You can also download the tests over the Internet. The installation wizard allows you to pick one or more categories of test to perform. If you're just testing the driver for one device, I recommend that you select just the one test category that includes your device in order to minimize the number of meaningless choices that you might have to make later. For example, if you install tests for Pointing And Drawing Devices and Smart Card Readers, you'll have to pick one device in each category before the Test Manager will let you begin testing in either category.

To provide a concrete example, I decided to run the tests for a USB gaming mouse for which I wrote the driver. Figure 15-14 shows how I selected the relevant test category while installing the HCT.

Figure 15-14. Selecting a test category.

The HCT setup program automatically kicks off a wizard that allows you to select a device for test. A dialog box reminds you that you need to have the hardware and software installed at this point. For each of the categories you installed, you'll fill in a dialog box like the one shown in Figure 15-15. My device appears as a HID-compliant mouse. Microsoft is listed as the manufacturer because HCT thinks that my device uses the standard MOUHID.SYS driver. In fact, my mouse is a nonstandard HIDCLASS minidriver with many features that need to be tested beyond the basic things that HCT will test.

Figure 15-15. Selecting a device for testing.

HCT presents several additional dialog boxes before it gets to the point where testing can begin. Figure 15-16 illustrates the basic test manager dialog box. Ideally, you would just press the button labeled Add Not Run Tests, which would populate the right-hand pane with all of the tests. A bit of circumspection is called for here, however.

Figure 15-16. The Test Manager dialog box.

One of the tests, the ACPI stress test, runs for many hours, if it runs at all. Many computers can't run this test, and the laptop on which I was doing this testing is one of them. To run this test, you need XP Professional on a desktop system that supports the S1 and S3 states or a notebook that supports S1 or S3. (I was using Windows XP Home Edition because USB wake-up stopped working on the notebook when I upgraded, and USB wake-up testing was the only reason I bought that particular notebook.) I suspect that I'll never own a computer that can run this test because I tend to buy computers with the operating system preinstalled and then upgrade the operating system as part of a beta program, whereupon power management stops working.

The USB Manual Interoperability test requires several hundred dollars' worth of multimedia hardware that I would have no use for beyond running this one test suite. (Figure 15-17 is a screen shot from a test run when I made the mistake of allowing this test to commence.) This test is pretty important from the hardware point of view because it verifies that commonly used USB devices will continue to work with your device plugged in and vice versa.

Figure 15-17. Required hardware topology for the USB Manual Interoperability test.

Other tests are actually useless for telling me anything about the quality of my own driver. The DirectInput Mouse test verifies that Microsoft's drivers interact correctly with DirectInput, a fact I never doubted. The USB Selective Suspend test isn't currently very important for a HID device because HIDCLASS never suspends a device in the first place: most devices can't wake up without losing an input event. In fact, all of the automated USB tests relate to hardware issues. I decided to let them run in this particular example because I was working closely at the time with a leading firmware engineer in getting this product to market. When I was done selecting the tests that I expected to be able to perform (whether or not they would succeed was a different question, to which I actually wanted the answer), my Test Manager dialog box looked as shown in Figure 15-18.

Figure 15-18. I'm ready to start testing.

The very first thing the test engine does is engage the driver verifier on the wrong drivers and reboot the computer. Remember that HCT thinks MOUHID.SYS is the driver for my mouse. In reality, the verifier should be getting turned on for my minidriver instead. Attempting to do that by hand would invalidate the test run, though, so I allowed the test run to continue. I'm told that newer versions of the HCT will do a better job of identifying which driver needs to be tested. I later ran tests with the verifier turned on for my driver. It was a good thing I did because I caught a rookie mistake in the way my HID minidriver was forwarding a device IRP_MJ_POWER with the minor function code IRP_MN_SET_POWER and the power type SystemPowerState after waiting for its interrupt-endpoint polling interrupt request packet (IRP) to finish.
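For reference, here is a minimal sketch of the basic forwarding discipline for a system SET_POWER IRP that a WDM driver passes down without acting on it. This is not the code from my minidriver; the device extension layout, the LowerDeviceObject field, and the helper's name are assumptions made for the example.

```c
#include <wdm.h>

// Hypothetical device extension; the field name is an assumption.
typedef struct _DEVICE_EXTENSION {
    PDEVICE_OBJECT LowerDeviceObject;   // next-lower device object in the stack
} DEVICE_EXTENSION, *PDEVICE_EXTENSION;

// Sketch: pass a system-state IRP_MN_SET_POWER down untouched. Two of the
// rules the driver verifier enforces for power IRPs are that the driver call
// PoStartNextPowerIrp before the IRP leaves it and that it use PoCallDriver
// (not IoCallDriver) to send the IRP to the next driver.
NTSTATUS ForwardSystemSetPower(PDEVICE_OBJECT fdo, PIRP Irp)
{
    PDEVICE_EXTENSION pdx = (PDEVICE_EXTENSION) fdo->DeviceExtension;

    PoStartNextPowerIrp(Irp);               // allow the next power IRP to be dispatched
    IoSkipCurrentIrpStackLocation(Irp);     // reuse our stack location for the lower driver
    return PoCallDriver(pdx->LowerDeviceObject, Irp);
}
```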

The Mouse Functionality test (see Figure 15-19) is the one most relevant to the quality of my driver in that it verifies whether I am actually delivering mouse reports in the format expected by the system. Because my mouse lacks an actual wheel (users can program some of its buttons to act as a wheel), I had to fudge part of the functionality test with another mouse attached to the same system.

Figure 15-19. The Mouse Functionality test.
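To illustrate what "the format expected by the system" means, a simple relative mouse report might be laid out as in the sketch below. The exact layout is dictated by the report descriptor your device publishes, so treat this structure and its field names as hypothetical rather than as the format my device used.

```c
#include <wdm.h>

// Hypothetical 4-byte relative mouse input report, matching a typical
// boot-style report descriptor: one button byte followed by signed
// X, Y, and wheel deltas.
#pragma pack(push, 1)
typedef struct _MOUSE_INPUT_REPORT {
    UCHAR Buttons;   // bit 0 = left, bit 1 = right, bit 2 = middle
    CHAR  DeltaX;    // signed relative X motion
    CHAR  DeltaY;    // signed relative Y motion
    CHAR  Wheel;     // signed wheel detents; 0 when the device has no wheel
} MOUSE_INPUT_REPORT, *PMOUSE_INPUT_REPORT;
#pragma pack(pop)
```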

The Public Import and Signability tests both asked whether my product installs it's [sic] own driver. I answered that it does and pointed the test engine to a directory where I had placed my INF and all the other files that get installed on any platform. The import test verified that my driver wasn't calling any verboten kernel-mode functions. The signability test verified, among other things, that all files copied by my INF file were in fact present. (Recall that CHKINF doesn't do this.)

The CHKINF test ran CHKINF on the wrong INF file, namely the Microsoft-supplied INPUT.INF. Being a good citizen, I ran CHKINF myself. The PERL test script initially failed because it lacked a copy of STRICT.PM, which I found in the HCT directory and copied by hand. The test report told me that a RunOnce entry running CONTROL.EXE (my solution to a client request to automatically launch their control panel) was not allowed because it didn't involve RUNDLL32. Since I had always regarded that particular client request as a bad idea, I resolved to use the test failure as a lever to get my client to change his mind. Mind you, I'm sure I could have thought of a way to use RUNDLL32 to launch a control panel applet, but doing that would defeat the real but unstated goal of the test, which is to make sure that a server-side install can proceed without the intrusion of user-interface elements.

The remainder of the tests I scheduled happened without my needing to intervene, which is why I guess they're called automated tests. In the end, I got the test log shown in Figure 15-20.

The reason that the Enable/Disable test failed to generate a log is that it generated an exception in user mode. Some part of the test engine caught the exception and silently terminated that test.

Figure 15-20. Test results after running selected tests.

I worked with my firmware engineer colleague to iron out the failures in the various USB tests. In doing this, it would have been very helpful to correlate test failures with the HCT documentation entries for the same tests. For example, the USB Address Description test log referred to a test assertion numbered 9.22.6. After opening the HCT 10.0 documentation from the Start menu, I browsed to the section labeled Resources/WHQL Test Specification/Chapter 9 USB Test Specification/USB Test Assertions/Address Test Assertions, where I found the information shown in Figure 15-21. Test assertion number 9.22.6 is, uh, well, something important, probably.

Figure 15-21. Documentation for test assertions.

You'll notice that many things went wrong in the testing process. To summarize:

What you would do in a similar situation is ask for help. WHQL personnel monitor several newsgroups on the msnews.microsoft.com news server, including microsoft.public.development.device.drivers and microsoft.public.windowsxp.winlogo. WHQL also responds to e-mail requests for assistance at addresses accessible from the WHQL home page, http://www.microsoft.com/hwdq/hwtest.

Submitting a Driver Package

The last step in running the Hardware Compatibility Tests would be to create a WHQL submission package. You'll want to do this separately for each operating system that your driver supports and then gather together the resulting CAB files in one convenient place. Your next step, which I think you should actually have performed months prior, would be to visit http://winqual.microsoft.com and get yourself signed up as a WHQL client company.

Given a login ID and a password, you can log on through the winqual page to do any of several things:

For this chapter, I wanted to create a new submission for a new hardware device. Figure 15-22 is a screen shot showing the starting point for a brand-new submission.

Figure 15-22. Initial screen for a new WHQL submission.

From the point shown in Figure 15-22, Web forms lead through the process of characterizing your submission in a relatively painless way. You'll answer questions such as these:

I learned a few tricks in the process of running through the Web forms for the first time. As I mentioned, you want to be sure to have all the distribution packages and test results handy. You have plenty of time to finish the process, but the Web application will time out after about an hour, so don't plan on having a power lunch in between steps. Some of the choices you make can't be undone except by backing up. Choosing the wrong directories for certain options can add hours to the process if it forces the application to navigate large directory trees in its search for files. The forms warn you about the last two of these gotchas, so I don't think you're likely to go wrong.
