Friday, August 25, 2006

Using Windows to Drive High-Speed Digital Logic

Scott Hanselman is obviously working on another Coding for Fun article.  ;-) 

It looks like he's simply trying to drive a high-powered IR transmitter directly from a Windows application to simulate a Sony remote control. However, he's also running into a challenge that anyone who has tried to use Windows for real-time systems has experienced: user mode comes with no guarantees about when, or for how long, your process gets its timeslices.

I once wrote software that had to do something very similar (this was in an era before .NET). In my case, I was flashing an 8-bit Atmel AVR microcontroller over its SPI bus. The AVR was installed in a device that [I believe] used the RTS line of an RS-232 port to both clock and set the SPI data, which was interesting, to say the least.

As background, SPI (Serial Peripheral Interface) is a three-wire protocol that uses a Master/Slave configuration (in this case, the AVR was always the Slave device).  There's a Master-In-Slave-Out line (MISO), a Master-Out-Slave-In line (MOSI), and a Serial Clock line (SCLK) controlled by the master device.  While SCLK is low, each device is free to change the state of its respective data line.  The slave reads in the next bit from the MOSI line on the rising edge of SCLK, and the master reads in the next bit from the MISO line on the falling edge of SCLK.  Then the cycle repeats.
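To make that concrete, here is a minimal sketch in C of one byte of a bit-banged master-side transfer.  The set_sclk(), set_mosi() and read_miso() functions are hypothetical stand-ins for whatever actually drives the pins:

#include <stdint.h>

/* Hypothetical pin-level helpers -- stand-ins for the real hardware access. */
extern void set_sclk(int level);   /* drive SCLK high (1) or low (0) */
extern void set_mosi(int level);   /* drive MOSI high or low         */
extern int  read_miso(void);       /* sample the current MISO level  */

/* Shift one byte out MSB-first while shifting one byte in. */
uint8_t spi_transfer_byte(uint8_t out)
{
    uint8_t in = 0;
    int bit;
    for (bit = 7; bit >= 0; bit--) {
        set_mosi((out >> bit) & 1);  /* change data while SCLK is low    */
        set_sclk(1);                 /* rising edge: slave latches MOSI  */
        set_sclk(0);                 /* falling edge: master reads MISO  */
        in = (uint8_t)((in << 1) | (read_miso() & 1));
    }
    return in;
}

Note that MOSI only changes while SCLK is low; that property is what the RC trick described below exploits.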

The In-Circuit Programming circuitry of this particular device used a Resistor-Capacitor (RC) network and an inverter to both delay and invert the SCLK logic and put the results onto the MOSI line.  So, by carefully timing the transitions of SCLK, I was able to clock in a bit and then prepare for the next bit.

Example: to clock in the value 1011, I might have to do something like this (each column below is one time step, and the RC delay is roughly two columns):

               _______   _____        _        ___
SCLK/RTS _____|       |_|     |______| |______|

Which, due to delay and inversion, puts the MOSI line into the following state:
         _______                 _______________
MOSI            |_______________|
              ^         ^            ^        ^
Reads         1         0            1        1

The delay that I had to work with was set by a combination of the RC values, the inverter's input threshold voltage, and the inverter's switching time.  Roughly put, though, it was a very small value, on the order of microseconds, much like Scott's requirements for the Sony IR remote protocol.  So, in certain critical areas, I had to make low-high-low SCLK transitions that were shorter than the delay factor; otherwise, I would get a wrong value clocked in.
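To put rough numbers on that: for an RC network charging toward the supply rail, the time to cross the inverter's input threshold falls out of the standard capacitor charging curve, V(t) = Vdd * (1 - e^(-t/RC)).  Here's a back-of-the-envelope sketch in C; every component value in it is invented for illustration:

#include <math.h>
#include <stdio.h>

int main(void)
{
    double R   = 10e3;    /* ohms -- hypothetical value         */
    double C   = 470e-12; /* farads -- hypothetical value       */
    double Vdd = 5.0;     /* supply voltage                     */
    double Vth = 2.5;     /* inverter input threshold (assumed) */
    double t;

    /* Solve Vdd * (1 - exp(-t/RC)) = Vth for t. */
    t = -R * C * log(1.0 - Vth / Vdd);

    printf("delay ~= %.2f microseconds\n", t * 1e6); /* ~3.26 for these values */

    /* The delay is linear in R, which is why swapping in a larger
       resistor (as I describe below) stretches the timing window. */
    return 0;
}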


I no longer have the source code to refer to, but I believe that I ended up writing a command-line application in C (no fancy C++ stuff) that called Win32 API functions, much like Scott is doing.  However, like Scott, I was not getting consistent results, regardless of what priority my process was set to.  The explanation I came up with: the Windows kernel could steal timeslices from my process at any time it had something more important to do.  My user-mode application was always going to be a second-class citizen.
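For flavor, here is the general shape such a user-mode attempt takes: bump the priority as high as user mode allows, toggle RTS with EscapeCommFunction, and busy-wait on QueryPerformanceCounter in between.  This is a sketch from memory, not my original program, and the COM port name and pulse width are placeholders:

#include <windows.h>
#include <stdio.h>

static LARGE_INTEGER freq;

/* Busy-wait for roughly 'us' microseconds using the performance counter. */
static void spin_wait_us(double us)
{
    LARGE_INTEGER start, now;
    QueryPerformanceCounter(&start);
    do {
        QueryPerformanceCounter(&now);
    } while ((double)(now.QuadPart - start.QuadPart) * 1e6
             / (double)freq.QuadPart < us);
}

int main(void)
{
    HANDLE com;

    QueryPerformanceFrequency(&freq);

    /* Ask the scheduler for as much as user mode can get.  Even at this
       priority, the kernel can still preempt the loop -- which is exactly
       the inconsistency described above. */
    SetPriorityClass(GetCurrentProcess(), REALTIME_PRIORITY_CLASS);
    SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_TIME_CRITICAL);

    com = CreateFileA("\\\\.\\COM1", GENERIC_READ | GENERIC_WRITE,
                      0, NULL, OPEN_EXISTING, 0, NULL);
    if (com == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "could not open COM1\n");
        return 1;
    }

    /* One short low-high-low SCLK pulse on the RTS line. */
    EscapeCommFunction(com, SETRTS);   /* SCLK high */
    spin_wait_us(5.0);                 /* placeholder pulse width */
    EscapeCommFunction(com, CLRRTS);   /* SCLK low  */

    CloseHandle(com);
    return 0;
}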


Now, one member of the project took my prototype and set off to write a kernel-mode driver (that was too advanced for me at the time, and probably still is today).  Even with that approach, he still ran into timing issues every now and then, depending on what else was running on the system.


Instead of trying to make Windows speed up, I took the opposite approach and made the hardware slow down.  By increasing the resistor's value, I increased the delay factor to the point where I could get consistent results from the Windows app.


I wish I still had all of that hardware and source code today.  I would love to see if the same issues exist on modern hardware, as opposed to the PIII-500 with 256MB of RAM running Win98SE.  But, based on what Scott is experiencing, it looks like the issues are still there.

