So here's what I want to do:
Read a serial stream using an AVR (ATMega8) (serial stream protocol details here):
http://www.guysoflidar.com/files/v1_protocol.pdf
I'm quite familiar with the AVR, just never had to write a software serial decoder before...
Would you suggest I use the device's hardware timers and trigger an interrupt when it reaches a certain count?
It sounds like normal async serial, but with a lot more data bits than 8. I've never used an AVR, but if there is an interrupt on level change (and poll the line at first so you can synchronize with the idle state), you could use that to handle all the receiving before returning.
With that many bits there isn't any room to drift out of sync. Writing the timed code loop would be simpler if the CPU oscillator/timer were matched exactly to the bit rate; otherwise there'd be a lot of fractional CPU cycles to account for.
That's assuming you even need to read all the bits, which is the worst case.
I wouldn't rely on interrupts for much here, except perhaps to start reception (if possible), unless you really need the receive to interrupt something else. It could also be error-prone if other interrupts were able to affect the receive timing.
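The sync-on-idle-then-sample idea above can be prototyped off-target in plain C. In this sketch, `sample()` stands in for reading the RX pin once per bit time (on hardware it would be a timed pin read), and the 1-start/N-data/1-stop frame shape is just an assumption for illustration:

```c
#include <stdint.h>
#include <stddef.h>

/* Stand-in for the RX pin: walks a precomputed array of line samples,
   one sample per bit time. */
typedef struct { const uint8_t *samples; size_t pos; } line_t;

static int sample(line_t *ln) { return ln->samples[ln->pos++]; }

/* Receive one async frame: wait for the idle line (high), then the
   falling start edge, then shift in nbits data bits LSB-first and
   consume the stop bit.  Returns the decoded word. */
static uint32_t rx_frame(line_t *ln, int nbits)
{
    while (sample(ln) == 0) ;   /* wait for idle (line high) */
    while (sample(ln) == 1) ;   /* wait for start bit (line goes low) */

    uint32_t word = 0;
    for (int i = 0; i < nbits; i++)
        word |= (uint32_t)sample(ln) << i;   /* data bits, LSB first */

    (void)sample(ln);           /* consume the stop bit */
    return word;
}
```

On the real part the two `while` loops would be the polled synchronization, and each `sample()` would be taken mid-bit after a timed delay; the level-change interrupt could replace the second loop.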
No, that's a combined synchronous serial, much like the 1-Wire protocol. Measure the time of each falling edge, then measure the time to the next rising edge: the 1/0 state depends on whether the edge spacing is more or less than ~380 µs. Very easy using something like the PIC's input capture function.
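That edge-spacing rule is easy to prototype off-target. A minimal sketch, assuming capture timestamps in microseconds and taking a low pulse longer than ~380 µs as a 1 (the protocol doc may define the polarity the other way):

```c
#include <stdint.h>
#include <stddef.h>

#define THRESHOLD_US 380u  /* low-pulse width dividing 0 from 1 (from the post) */

/* Decode bits from paired falling/rising edge timestamps (µs), as a
   capture peripheral would deliver them: each falling-to-rising gap
   is one bit.  Long low pulse = 1 is an assumption here. */
static void decode_edges(const uint32_t *falling, const uint32_t *rising,
                         size_t nbits, uint8_t *out)
{
    for (size_t i = 0; i < nbits; i++) {
        uint32_t low_width = rising[i] - falling[i];
        out[i] = (low_width > THRESHOLD_US) ? 1 : 0;
    }
}
```

On an AVR the timestamps could come from Timer1's input capture (or from polling a pin against a free-running timer), with the edge-select bit flipped between falling and rising captures.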
Ah, this type of serial is new to me; that's pretty interesting, thanks for the correction. Yeah, the PIC's input capture peripheral sounds good for that. I don't know what the ATMega8 has, though (of course it can be polled and timed too if need be).