Little problem with my 1st PIC code..Please Help!!!

joeyd999

Joined Jun 6, 2011
5,287
Ok, Eric.

I figured that if I'm going to do this, I want to show you an example of what I consider to be good coding practices.

First, you'll notice there is a lot of code, and much of it is overkill for your little application. But, it is easily extensible for when you want to add features as your skill improves.

The first (and most important!) thing I would like you to remember is this:

Never, never, NEVER, ever hard code time delay loops. I've seen this a lot here, and it is an incredibly bad practice. What's the point of using a 2 MIPS processor (assuming 8 MHz) when you are going to spend 1.9999 MIPS in a tight, closed timing loop? You could be using those instruction cycles for something useful, like some advanced digital signal processing or perhaps creating a responsive user interface (i.e. press a button, and something happens *immediately*). Since you are a beginner, I urge you not to get into the habit of using loops for timing!

Notice also the code is broken into two files: a .inc (header) file and a .asm (code) file. I put program definitions in the .inc and code in the .asm. This helps to keep things nice and clean. I try not to hard code constants in the .asm file; instead, I set them as labels (equ) or string substitutions (#define) in the .inc file. This is similar to how one would write good code in 'C'.
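
For example, the kind of thing that lives in the .inc looks like this (an illustrative fragment only, with made-up names; the attached .inc is the real thing):

Rich (BB code):
;
;  illustrative .inc fragment (made-up names)
;
LED1        equ     0               ; LED 1 on bit 0 of the LED port
LED2        equ     1               ; LED 2 on bit 1
#define     LEDPORT     PORTC       ; string substitution for the LED port
#define     BLINKMS     d'500'      ; LED on/off period, in milliseconds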

I also usually have multiple files, one per "device" in my system, each containing all the code for that device (i.e. a device driver). In this way, I can reuse the code for other projects that use similar devices.

Pay special attention to the "system clock". Every single project I do has a master system clock based on TMR0. Each time TMR0 rolls over, an interrupt is generated (every 1.024 ms in this case). The servicing routine increments the _TIMER register and computes a value in the _TMRCHG register whose bits indicate which bits of _TIMER changed when it was incremented. Later on, you will see this construct is *invaluable* in creating periodic timing signals that your main program can use to drive its execution.
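
To give you a feel for it, the servicing routine boils down to something like this (just a sketch, not the attached code verbatim; register and bit names vary with the device, and I've left out the usual context save/restore and any other interrupt sources):

Rich (BB code):
;
;  sketch of the TMR0 "system clock" service, as called from the
;  interrupt handler.  8 MHz INTOSC gives Fcyc = 2 MHz; TMR0
;  prescaled 1:8 rolls over every 256 * 8 * 0.5 us = 1.024 ms.
;
tmr0_isr
        bcf     INTCON,T0IF     ; acknowledge the TMR0 rollover
        movf    _TIMER,W        ; W = old tick count
        incf    _TIMER,F        ; bump the free-running tick counter
        xorwf   _TIMER,W        ; W = bits that flipped on this tick
        iorwf   _TMRCHG,F       ; hold them until the main loop consumes them
        return                  ; (the real code may differ in detail)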

The main program is a simple loop. It does three things: it gets the current system time (into the "synchronized" TIMER and TMRCHG registers), it gets the current state of the "keypad" (in your case, only one pushbutton), and it processes a "state machine" that drives your LEDs based upon time and pushbutton presses.
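
In skeleton form it is roughly this (hypothetical labels, just to show the shape; the attached fourleds.asm is the authoritative version):

Rich (BB code):
;
;  rough skeleton of the main loop (hypothetical labels)
;
main
        call    gettime         ; snapshot the system clock
        call    getkey          ; sample/debounce the pushbutton
        call    statemachine    ; drive the LEDs from time + key events
        goto    main            ; note: no delay loops anywhere
;
gettime
        bcf     INTCON,GIE      ; disable interrupts for an atomic copy
        btfsc   INTCON,GIE      ; re-test, the standard precaution in case
        goto    gettime         ;  an interrupt hit during the bcf
        movf    _TIMER,W        ;
        movwf   TIMER           ; "synchronized" time for this pass
        movf    _TMRCHG,W       ;
        movwf   TMRCHG          ; tick bits that changed since last pass
        clrf    _TMRCHG         ; mark those changes as consumed
        bsf     INTCON,GIE      ;
        return                  ; e.g. TMRCHG bit 5 pops about every 33 ms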

I'll let you "discover" how the GETKEY routine works to debounce the pushbutton, as well as how the state machine works (hint: use the simulator and watch windows).

By the way, please notice that the pushbutton is checked every 33 ms, regardless of what is going on elsewhere in the program. Therefore, the response, as far as the user is concerned, will be instantaneous.

The code, especially the state machine, could actually be further simplified without departing from my "good practices" principles, but I wanted to try to keep things clear so as not to confuse you too greatly.

The code is written to use the 8 MHz INTOSC. Put both files into a directory, make a project, and add 'fourleds.asm' to the project. Set 'case sensitivity' to off and the radix to decimal, otherwise you will get build errors.

Be sure to remove the .txt from the end of the file names.

BTW, I wrote this quickly, and only simulated it. Hopefully it will work without problems on your hardware.

When you have questions, I'll be happy to explain more. Good luck and have fun!

EDIT: I just noticed some of the comments in the state machine are wrong...I copied and pasted and didn't update the comments. Sorry.
 

Attachments


MMcLaren

Joined Feb 14, 2010
861
THE_RB said:
then wait for button released as a debounce (after the event)
But wouldn't that prevent the LEDs from flashing as long as a button is being held pressed? Then again, I'm not sure I have the OP's LED sequences figured out.

I was wondering if the OP shouldn't include the switch test as part of his loop timing. For example, build the entire LED loop timing sequences around a 25-msec switch sample/debounce interval. I would also use a switch state latch and simple parallel switch state logic to allow detecting (and acting on) the "new press" state while ignoring the other switch states in order to keep the display loop going even when a button is being held in the pressed state. Here's a quickly thrown together example (with my apologies to the OP if it seems a bit advanced):

Rich (BB code):
;
;  swnew  __---___---___---____   sample active lo switches
;  swold  ___---___---___---___   switch state latch
;  swnew  __-__-__-__-__-__-___   changes, press or release
;  swnew  __-_____-_____-______   filter out 'release' bits
;    
loop
        DelayCy(25*msecs)       ; 25 msec debounce interval
        comf    PORTC,W         ; sample active lo switches
        andlw   b'00000100'     ; on the RC2 pin
        xorwf   swold,W         ; changes, press or release
        xorwf   swold,F         ; update switch state latch
        andwf   swold,W         ; filter out 'release' bits
        skpnz                   ; new press? yes, skip, else
        goto    display         ; branch
newpress
        movf    timer,F         ; display on (timer running)?
        skpnz                   ; yes, skip (turn it off), else
        goto    newcyc          ; branch, start new display cycle
        clrf    timer           ; clear the timer and
        clrf    PORTC           ; clear the LEDs (display "off")
display
        movf    timer,F         ; timer running?
        skpnz                   ; yes, skip, else
        goto    loop            ; loop, wait for new press
        decfsz  timer,F         ; 500 msecs interval?
        goto    loop            ; no, branch, else
newcyc
        movlw   500/25          ; 500 msecs / 25 msec loop time
        movwf   timer           ; reset timer for 500 msecs
        movlw   1<<LED1|1<<LED2 ; mask for LED 1 & 2
        xorwf   PORTC,F         ; toggle LED 1 & 2 (1/2 sec intervals)
        movlw   1<<LED2|1<<LED3 ; mask for LED 3 & 4
        btfsc   PORTC,LED1      ; LED 1 lighted? no, skip, else
        xorwf   PORTC,F         ; toggle LED 3 & 4 (1 sec intervals)
        goto    loop            ;
;
 

MMcLaren

Joined Feb 14, 2010
861
Never, never, NEVER, ever hard code time delay loops. I've seen this a lot here, and it is an incredibly bad practice. What's the point of using a 2 MIPS processor (assuming 8 MHz) when you are going to spend 1.9999 MIPS in a tight, closed timing loop? You could be using those instruction cycles for something useful, like some advanced digital signal processing or perhaps creating a responsive user interface (i.e. press a button, and something happens *immediately*). Since you are a beginner, I urge you not to get into the habit of using loops for timing!
Joey, please forgive me but would you consider qualifying that statement with something like "in my opinion", please? What you may consider "incredibly bad practice" may well be perfectly "viable practice" in many situations. Suggestions like this are subjective at best and make me cringe when I see them stated so authoritatively.

I hope you accept my comment in the positive manner that I intended.

Cheerful regards, Mike
 

joeyd999

Joined Jun 6, 2011
5,287
Joey, please forgive me but would you consider qualifying that statement with something like "in my opinion", please? What you may consider "incredibly bad practice" may well be perfectly "viable practice" in many situations. Suggestions like this are subjective at best and make me cringe when I see them stated so authoritatively.

I hope you accept my comment in the positive manner that I intended.

Cheerful regards, Mike
Hi, Mike.

No forgiveness necessary, as you haven't offended me, but...

I'm going to stand my ground here, and I intended the statement to sound as it did. Delay loops are bad practice. Period.

I've been coding for 30 years. Never seen a hard coded loop that was necessary (*unless* interrupts were not available, which is pretty much never these days), and they usually cause problems (like lack of UI responsiveness, among many others).

I accept your comment in the positive manner it was intended, and I hope you accept mine similarly.
 

MMcLaren

Joined Feb 14, 2010
861
I've been coding for 30 years. Never seen a hard coded loop that was necessary (*unless* interrupts were not available, which is pretty much never these days), and they usually cause problems (like lack of UI responsiveness, among many others).
Joey, may I ask how much of that 30 years has been with PIC microcontrollers Sir?
 

joeyd999

Joined Jun 6, 2011
5,287
Joey, may I ask how much of that 30 years has been with PIC microcontrollers Sir?
I've programmed PICs almost from the beginning. Shipped a product based on the 16C54 in 1992. That was actually my 1st experience with OTP micros. Prior to that it was either an off-chip UV-erasable EPROM or a factory-masked ROM (aaarrrrggh!).

That (1992) was when I fell in love with Microchip, and have been faithful ever since.

And, yes, I did use timing loops in the '54, as it had no interrupt facilities.

I published a short bio here:

http://forum.allaboutcircuits.com/showpost.php?p=376421&postcount=225
 

MMcLaren

Joined Feb 14, 2010
861
Hi Joey,

Sounds like you have much more PIC experience than I do. Bravo!

Thirty years ago (1981) I had just finished developing the firmware for the EPG (Electronic Program Guide), the earliest version of what became The TV Guide Channel. My contract was for little cash up front but produced about 100K in royalties.

Later... Regards, Mike
 

Thread Starter

Eric007

Joined Aug 5, 2011
1,158
@joeyd999: I had a quick look at your code and I must say that your approach and code is brilliant!!!

I really like the way you broke your code into a .inc file and .asm file! From now on I will adopt this style...

But I still need to have a proper look at your code and understand everything...
Tomorrow I will write your code into my chip and see how it behaves...

My design worked perfectly with my code but I know that one can tell that I'm still a beginner when it comes to PIC programming...

Thanks for all your comments/explanations and everything else...

However, I was wondering if you could provide me with links or a PDF file so I can read and learn more, because I want to be able to use all the features of the PIC chip...

Thx!!!
 

joeyd999

Joined Jun 6, 2011
5,287
@joeyd999: I had a quick look at your code and I must say that your approach and code is brilliant!!!
Thanks! IMHO, I agree... :D

I really like the way you broke your code into a .inc file and .asm file! From now on I will adopt this style...
It will serve you well...especially as your code grows larger.

My design worked perfectly with my code but I know that one can tell that I'm still a beginner when it comes to PIC programming...
I was too, once.


Thanks for all your comments/explanations and everything else...
You're welcome.

However, I was wondering if you could provide me with links or a PDF file so I can read and learn more, because I want to be able to use all the features of the PIC chip...

Thx!!!
I am pretty much self taught. There is not a particular reference that I can think of to point you to. Microchip has a lot of good stuff on their website (though I often don't like their code samples). Best thing to do is just practice, practice, practice...and have fun!
 

THE_RB

Joined Feb 11, 2008
5,438
But wouldn't that prevent the LEDs from flashing as long as a button is being held pressed? Then again, I'm not sure I have the OP's LED sequences figured out. ...
Not in most cases:
1. press button -> instant LED on
2. delay 100 ms
3. LED off
4. delay 200 ms -> returns
(now calls "debounce" to ensure buttons are free for >20 ms)

For simple flash and mode changeover type code that should work fine, and it was suggested because it is very easy to understand and implement even with beginner level skills. :)

...
I was wondering if the OP shouldn't include the switch test as part of his loop timing. For example, build the entire LED loop timing sequences around a 25-msec switch sample/debounce interval. I would also use a switch state latch and simple parallel switch state logic to allow detecting (and acting on) the "new press" state while ignoring the other switch states in order to keep the display loop going even when a button is being held in the pressed state.
...
Absolutely! That is a professional and highly functional way of doing it. My suggestion was not meant to be an example of the very best way but was just a very simple way that would require minimal adaptation of the OPs code.


And regarding the "hard coded timing loops" I'm with you on that one. Although I'm no stranger to doing specialised timing systems with a PIC, it is still really handy and good practice sometimes just to hard code it.

I'm finding I am doing that more and more since switching to doing most of my PIC coding in C now.

In a simple PIC C project something like:
// show an indicator flash
if(whatever)
{
LED =1;
delay_ms(100);
LED =0;
delay_ms(200);
}

Can be both good, fast coding practice to get the job done (and easy to tune the delays), and if delay_ms() is a function, it also results in quite compact code.
 

joeyd999

Joined Jun 6, 2011
5,287
In a simple PIC C project something like:
// show an indicator flash
if(whatever)
{
LED =1;
delay_ms(100);
LED =0;
delay_ms(200);
}

Can be both good, fast coding practice to get the job done (and easy to tune the delays), and if delay_ms() is a function, it also results in quite compact code.
I guess this is acceptable if all you are doing is flashing an LED and doing *nothing else* (or you don't mind burning 600,000 instruction cycles between doing other things!).

On the other hand, with the code I supplied Eric, you can do this:

Rich (BB code):
main	

	...
	...
	call	flashled
	...
	...

	goto	main

	...

flashled

	btfss	flash	;if flag bit flash=1, then flash led.
	return		;nope.

	btfsc	tc131ms	;time to change led state?
	btg	led1	;yup, toggle it

	return		;and go do something else
No wasted instruction cycles. They are all available to do something (many things) else concurrently.

I like to think about instruction cycles like pennies. Spending a few of them here and there unwisely is no big deal. But when you start throwing away millions of them, well, now you're talking about real money...
 

THE_RB

Joined Feb 11, 2008
5,438
I get your point, and respect it! I've done apps that squeezed every last cycle out of the poor little PIC.

But I've also done apps where the ENTIRE app was something like this:
while(1)
{
read_adc();
if(adc > x) LED = 1;
Delay_mS(100);
LED = 0;
Delay_mS(400);
}

And I was not at all worried about wasting cycles... ;)
 

joeyd999

Joined Jun 6, 2011
5,287
I get your point, and respect it! I've done apps that squeezed every last cycle out of the poor little PIC.

But I've also done apps where the ENTIRE app was something like this:
while(1)
{
read_adc();
if(adc > x) LED = 1;
Delay_mS(100);
LED = 0;
Delay_mS(400);
}

And I was not at all worried about wasting cycles... ;)
Thanks, THE_RB...without realizing it, you have proved my original point -- that hard coded delay loops are incredibly bad practice -- dramatically!

Before I explain, let me preface with this:

These days, a typical application for me is a self-contained battery-operated hand-held instrument. It usually consists of some sort of sensor (or sensors) for measuring some quantity(ies), an analog front-end for driving the sensor(s) and conditioning the sensor signals, a keypad (perhaps multiplexed) or one or more pushbuttons, a display (could be just LEDs or a multiplexed or character LCD, or a combination of the above), and a sounding device (a speaker or piezo element). The unit may also have to communicate with other devices (via RS-232, an optical interface, or a 1 or 2 wire interface, etc.) with either standard or proprietary protocols.

The analog front-end may include EEPOTs or PGAs (for automatic calibration under CPU control). The sensor may also require "excitation" of some form, perhaps a periodic, well-defined signal from the CPU (either constant or variable, such as from a PWM). In this case, analog conversions are usually required to be synchronized with the excitation signal to eliminate synchronous noise.

Generally, I utilize at least two analog channels of the on-chip A/D converter. In addition, a particular application may also require an off-chip A/D like a high-resolution delta-sigma. I may require other hardware support, like off-chip EEPROMs (either SPI, I2C, or 1-wire), etc.

The "user interface" is extremely important. This is what my customers see when using my products. As a general rule, I insist that when a button is pressed, something must happen immediately as feedback to the user (i.e. a beep from the speaker, something changing on the display, an LED turning on or off, etc.). Any delay will make the user think the product is broken.

In addition, each button or key on the keypad may have multiple functions which can be activated by actions such as "press", "press and release", "press and hold", "press, hold, and repeat", "double-click", "multi-click", etc.

In the background, the CPU is constantly reading the A/D converter(s) on multiple channels, performing signal processing (perhaps utilizing single or double precision floating point math), monitoring the overall system (i.e. battery levels, self-diagnostics), etc.

Finally, real-time results must be displayed in a timely and consistent fashion (like 3 or more updates on the LCD per second).

Mostly, I use 18F series parts of various flavors, depending on the peripherals I need for the application. A typical application requires about 10,000 lines of assembly code (including comments, definitions, and white space). The code must be readable and understandable (for easy maintenance), and, most importantly, extensible. I usually add features over time, and they need to "drop in" seamlessly without affecting the operation or timing of the original code.

I can tell you with 100% confidence that, given one of these applications, a single hard-coded timing loop will break the entire system. It is simply impossible to write such a beast, and have it function properly, with such loops.

Now, back to your example:

Your "application" may be simple, but it does not work, at least from the viewpoint of reliability, which I insist on. The problem is that, if your analog input is close to the threshold value to turn on your LED, the LED may turn on, or it may turn off. Or it may blink randomly. Did you intend this?

The smart guy in the back of the room says, "well, just add a little bit of digital hysteresis." Yes, but how much? And, you lose resolution and accuracy.

The source of the problem is noise. No matter how good a PIC A/D converter is, there will always be noise in your input signal, and it will cause a variation in your conversion results.

So, here's another assertion I will make, mostly to make MMcLaren cringe:

Never, never, NEVER, ever rely on a single A/D conversion to be representative of your input signal. It is incredibly bad practice.

Microchip provides you with an *awesome* A/D converter that can perform thousands of conversions per second. Why do just 2 per second? Simply averaging, say, 256 conversions every 1/2 second will improve your signal-to-noise ratio (for asynchronous noise) by a factor of 8 (the square root of the number of samples). Now, you can more reliably test against a fixed threshold, and your LED will be more consistent. If you wish, you can also provide digital hysteresis (to prevent random blinking) that amounts to a threshold of only a fraction of a bit.
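
To make that concrete, the accumulation itself is only a handful of instructions. Here is a sketch (not the attached code: acc0..acc2 and smpctr are made-up names, the A/D register and bit names and the banking vary by device, and I'm assuming a 10-bit right-justified result with a conversion already completed each time this is called, e.g. once per system tick):

Rich (BB code):
;
;  sketch: sum 256 conversions into a 24-bit accumulator acc2:acc1:acc0.
;  the average (sum/256) is then simply the top two bytes, acc2:acc1.
;  bank selection omitted for clarity.
;
adacc
        movf    ADRESL,W        ; low byte of the finished conversion
        addwf   acc0,F          ; acc0 += ADRESL
        movf    ADRESH,W        ; high byte (0..3, right justified)
        btfsc   STATUS,C        ; carry out of acc0?
        incfsz  ADRESH,W        ; yes, use ADRESH+1 (cannot wrap here)
        addwf   acc1,F          ; acc1 += high byte (+ carry)
        btfsc   STATUS,C        ; carry out of acc1?
        incf    acc2,F          ; propagate it into acc2
        bsf     ADCON0,GO       ; kick off the next conversion
        decfsz  smpctr,F        ; 256th sample? (smpctr starts at 0)
        return                  ; not yet, keep accumulating
        ;
        ; block complete: use acc2:acc1 as the averaged reading, then
        ; clear acc2:acc1:acc0 before the next 256-sample block
        return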

The smart guy in the back of the room says, "ok, then why not implement multiple conversions/accumulations within the delay routine?"

That will definitely help eliminate asynchronous noise, but at a significant cost: you must account for the number of conversions and the time consumed by them, and you are still susceptible to synchronous noise. In the former case, your code becomes unmaintainable -- if you change the number of conversions or the Tad time, you must also adjust your static timing loop. In the latter case, your digital value may change over time (drift in a sinusoidal fashion) due to noise caused by the CPU, or by CPU-driven signals, at a different period than your A/D accumulation period. Think of a "beat" frequency.

The smart guy in the back of the room says, "ok, then let's just run the A/D accumulations in the background (via interrupts), synchronized with system noise, and go back to our original loop in the foreground."

Getting better. But now your loop time is indeterminate, especially if you are running other interrupts in the background. Each interrupt steals instruction cycles from your main code, causing your hard coded delays to become randomly longer than desired.

The only real solution to these problems is to eliminate the hard coded delays and run all your timing from a consistent system clock, similar to the code I provided Eric.
 

MMcLaren

Joined Feb 14, 2010
861
Joey,

Could I impose and take advantage of your expertise, please? Do you have any experience with Maxim/Dallas OneWire™ devices? If so, is there a way to eliminate the delays in the critical "r/w slot" timing? The driver (below) was designed to support any clock, 4..32 MHz (or 4..40 MHz with a minor change), and run on 12, 14, or 16 bit core devices.

I believe this may be one example where using a cycle accurate fixed delay subsystem makes sense but I'd be very excited to see a simpler/better solution.

Thanks in advance. Regards, Mike

Rich (BB code):
;
;  Ow.rwByte, send byte in WREG. send 0b11111111 to read a byte 
;  with read result in 'OwByte'.
;                                 Mike McLaren, K8LH
        radix   dec
Ow.rwByte
        movwf   OwByte          ; 
        movlw   8               ;
        movwf   BitCtr          ;
        rrf     OwByte,W        ; put bit 0 in Carry
rwloop  call    Ow.rwBit        ; send bit in Carry
        rrf     OwByte,F        ; 
        decfsz  BitCtr,F        ; done (all 8 bits)?
        goto    rwloop          ; no, loop, else
        retlw   0               ; exit
Rich (BB code):
;
;  Ow.rwBit (4..32 MHz clock), send bit in Carry. use carry = 1
;  to read bit from DS18B20, placing read result in carry.
;
        radix   dec
Ow.rwBit
        movlw   MaskLo          ; start 60 us rw slot
        tris    GPIO            ; falling edge             0 us
        goto    $+1             ;
        goto    $+1             ;
        skpnc                   ; skip if bit = '0', else
        movlw   MaskHi          ; mask to release buss
        tris    GPIO            ; low pulse is 1..8 us
        uDlyCy (14*usecs-8)     ; 14 us minus 8 cycles
        btfss   owpin           ; sample owpin at exactly 14 us
        clrc                    ; clear Carry if '0'
        uDlyCy (47*usecs-3)     ; balance of 60 us slot
        movlw   MaskHi          ; mask to release buss
        tris    GPIO            ; read/write slot ends at 61 us
        retlw   0               ;
 

joeyd999

Joined Jun 6, 2011
5,287
Do you have any experience with Maxim/Dallas OneWire™ devices? If so, is there a way to eliminate the delays in the critical "r/w slot" timing?
Never used it. Took a look at the data sheet. Gimme some time to think about it.

The driver (below) was designed to support any clock, 4..32 MHz (or 4..40 MHz with a minor change), and run on 12, 14, or 16 bit core devices.
Sexy code. I like it. It is especially nice that it accommodates both hardware and clock speed changes (though, reassembly is necessary, of course).

I believe this may be one example where using a cycle accurate fixed delay subsystem makes sense but I'd be very excited to see a simpler/better solution.
Doesn't violate my principles...if interrupts are not available, delay loops are OK, maybe even unavoidable. In order to use this part, assuming it could not be interrupt driven, interrupts must be turned off or you will never get the timing right. And, if interrupts could be used, you'd be limited to only one source (the timer driving the one-wire interface), or, if supported, one high-priority interrupt. So you'd pretty much have to block execution of all other code anyway.

I had a similar issue with the Microchip UNI/O interface. The signal edges (in my implementation) are only 10 us apart. With an 8 MHz processor, that's only 20 instruction cycles. Not really enough time to do context save/restore + bit manipulations. I had to punt. Basically, shut down the entire system, stop all interrupts, do the necessary UNI/O processing, and restart/resynchronize the entire system. The total UNI/O stream took at most a few hundred milliseconds, and I arranged it so that UNI/O processing took place during a time when it was not expected that the user would be manipulating the device.

The nice thing about the UNI/O is the timing is self-calibrated, and the bit frequency can vary 10:1 (from 10 kbps to 100 kbps). I was running 50 kbps at 8 MHz, so I could also run as high as 16 MHz or as low as 1.6 MHz without modifying the code.

Your code looks pretty optimal. But I'll see if there is something I can do with it. No promises. :)

Edit: When Microchip develops a hardware peripheral for one-wire and/or UNI/O on the PIC, then even *these* loops won't be necessary!

Edit 2: BTW, when I mentioned 1-wire in my post above, I was not referring to the Maxim OneWire interface...I've actually developed a few of my own single wire interfaces (PIC-to-PIC) that do operate within the context of interrupts.
 

MMcLaren

Joined Feb 14, 2010
861
Joey, thank you for taking time and for taking a peek.

My apologies to Eric for the slight diversion.

Eric, how's it going? Are you adding an LCD or LED display now?

Regards to all, Mike
 

joeyd999

Joined Jun 6, 2011
5,287
Joey, thank you for taking time and for taking a peek.

My apologies to Eric for the slight diversion.

Eric, how's it going? Are you adding an LCD or LED display now?

Regards to all, Mike
Yes, this thread got hijacked, didn't it...

Couple of quick notes before getting back on topic:

In my post above, I indicated that 256 A/D accumulations improve S/N by 8:1. Obviously, SQRT(256)=16, so it should have been 16:1. The number of available bits increases by 8, and the bit jitter (due to asynchronous noise) shifts right by 4 bits.

Regarding the UNI/O, to optimize for maximum bit rate (for any Fosc < 16 MHz), I needed specific delays of 1, 2, 3, 4, 5, 6, 7, 9, 14, 15, 19, and 24 instruction cycles. For 1, 2, and 3, I used nops and bra $+2; for everything else, here is my solution:

Rich (BB code):
;************************************************
;** DELAYnn -- delay for nn instruction cycles ** 
;**   including CALL and RETURN                **
;************************************************

delay24	rcall	delay05
delay19	rcall	delay04	
delay15	nop
delay14	rcall	delay05
delay09	bra	$+2
delay07 nop
delay06 nop
delay05	nop
delay04	return	

;** END DELAYnn **
It uses no registers, 9 code words, and 2 stack levels.
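
Each entry costs exactly its named number of cycles, rcall and return included, and clobbers nothing, so a bit-banged edge can be placed on an exact cycle. For example (hypothetical port/pin name):

Rich (BB code):
        bsf     LATB,UNIO_DAT   ; drive an edge (hypothetical pin)
        rcall   delay09         ; burn exactly 9 cycles
        btg     LATB,UNIO_DAT   ; next edge lands 10 cycles after the first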
 

MMcLaren

Joined Feb 14, 2010
861
Regarding the UNI/O ... here is my solution: ... It uses no registers, 9 code words, and 2 stack levels.
So UNI/O timing is such that you can use the same code, delays, and timing with any clock between 1.6 and 16 MHz? If so, I don't think you could come up with much simpler or tighter delay code. It doesn't mess up the status register and it could be easily modified to support the 14-bit core devices. It's not really very "general purpose" but then it doesn't have to be for that application.

Regards, Mike
 

joeyd999

Joined Jun 6, 2011
5,287
So UNI/O timing is such that you can use the same code, delays, and timing with any clock between 1.6 and 16 MHz? If so, I don't think you could come up with much simpler or tighter delay code. It doesn't mess up the status register and it could be easily modified to support the 14-bit core devices. It's not really very "general purpose" but then it doesn't have to be for that application.

Regards, Mike
Yes, the UNI/O automatically adjusts for bit rates from 10 kbps to 100 kbps. With my solution, in order to run above 16 MHz, additional delays would need to be added. Below 1.6 MHz, there is no solution without further optimizing the code, which I don't think is possible.

BTW, the Microchip solution (AN1183) gives a data rate of 37.04 kbps (@ 8 MHz), which is 25% slower than mine, and it *does not* support data buffering (even though the app note claims to). Mine supports 16-byte write and 64-byte read buffers.
 

MMcLaren

Joined Feb 14, 2010
861
Joey, I apologize for dragging this out, but what was the device you were using? I remember coming across a Microchip product with UNI/O a couple of years ago but I can't remember what it was... AN1183? Serial EEPROM?
 