05 March 2015

The History of Jitter

Figure 1: The story of jitter spans 45-baud telegraph machines to 160-Gbaud optical fiber
Jitter is a signal-integrity gremlin that's been with us for a long time. In fact, it's been with us since before anyone really needed to care about it. But as time has worn on, our perception of jitter has certainly changed, and with it our approaches to diagnosing it, measuring it, and ultimately dispatching it. Here, we'll begin a traversal of the "jitter story," surveying where we've been, where we are, and where we may be going in our dealings with the phenomenon.

There's no simple, straight path through the history of jitter. Rather, it's a story of numerous instruments, inventors, and twists and turns. We know, however, that it is born of the ascent of serial data rates, from a 45-baud telegraph receiver to the venerable 9-pin serial port to optical fiber carrying signals out to 160 Gbaud and up (Figure 1). Along the way, we've seen real-time oscilloscopes, sampling oscilloscopes, time-interval analyzers, phase-noise analyzers, and bit-error-rate (BER) testers thrown at the problem in our efforts to understand and tame it.

Figure 2: Jitter happens when data edges and their associated clock signals aren't marching in step
To take a step back for a moment, why do we care about jitter? The short version: It causes bit errors. Fundamentally, jitter is a horizontal (or time-based) phenomenon in which the edges of waveform transitions arrive early or late with respect to the clock that is latching the signal. If, for instance, the data edge arrives after its companion clock edge, the receiver latches the previous value rather than the new one, so a bit that was supposed to be latched as high is latched as low. Wrong edge timing begets incorrect latching, which begets bit errors.
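To make that failure mode concrete, here's a minimal Python sketch. It isn't drawn from any real receiver design, and the picosecond numbers are invented for illustration, but it shows how a data transition that lands after its sampling clock edge gets captured as the old, wrong value:

```python
# A toy model of latching a data bit at a clock edge. All times are in
# picoseconds and are made-up values for illustration only.

def latched_value(data_edge_time_ps, clock_edge_time_ps, new_bit, old_bit):
    """Return the bit captured at the clock edge.

    If the data transition lands at or before the clock edge, the receiver
    sees the new bit; if it arrives late, the receiver still sees the old one.
    """
    return new_bit if data_edge_time_ps <= clock_edge_time_ps else old_bit

clock_edge = 500  # nominal sampling instant (ps)

on_time  = latched_value(450, clock_edge, new_bit=1, old_bit=0)  # edge arrives early enough
jittered = latched_value(530, clock_edge, new_bit=1, old_bit=0)  # edge arrives 30 ps late

print(on_time)   # 1 -- the intended value
print(jittered)  # 0 -- a bit error caused purely by edge timing
```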

In the early days of digital logic—the 1960s—the issue surrounding timing measurements and proper latching concerned setup and hold times. Investigation of setup and hold performance was relatively straightforward, even with the analog oscilloscopes of the day. One would trigger on the clock and measure the time from one edge to the next using cursors. In other words, you'd try to duplicate the timing diagrams on the datasheet to see if you fell within the requisite timing margins.

Figure 3: Remember the carefree 70s and 80s, when no one really cared very much about jitter?
The fact is that in those early digital days, jitter just wasn't much of an issue. Even into the 1970s and 1980s, with parallel buses, data rates in the tens of megabits/s, and rise times of nanoseconds, jitter still did not raise many alarms. With a unit interval that long and correspondingly long setup and hold times, the edge was so thin relative to the overall timing budget that timing uncertainty was extremely unlikely to cause a bit error (Figure 3).

But by the late 1990s, the scenario was very different with respect to jitter. The transition from parallel to serial data buses was well underway. Data rates had climbed into the gigabits/s range while rise times had dropped into the hundreds of picoseconds. As a result, a little fuzziness on a rising or falling edge had become much more significant with respect to the entire unit interval.
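A quick back-of-the-envelope calculation makes the shift plain. The numbers below are representative assumptions rather than figures from the article, but they show how the edge went from a sliver of the unit interval to a sizable chunk of it:

```python
# Rough era comparison: what fraction of the unit interval does the edge occupy?
# All rates and rise times below are assumed, ballpark values.

def edge_fraction(rise_time_s, bit_rate_bps):
    """Fraction of the unit interval consumed by the rising edge."""
    unit_interval_s = 1.0 / bit_rate_bps
    return rise_time_s / unit_interval_s

# 1970s/80s parallel bus: ~10 Mbit/s with a ~5 ns rise time
print(f"{edge_fraction(5e-9, 10e6):.1%}")    # ~5% of the UI -- plenty of margin

# Late-1990s serial link: ~1 Gbit/s with a ~200 ps rise time
print(f"{edge_fraction(200e-12, 1e9):.1%}")  # ~20% of the UI -- the edge now matters
```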

Thus, in the late 1990s, the question had become, "How do I characterize setup and hold times with any real level of certainty?" Which is to say, how much jitter is there? Ah, NOW it matters! One simplistic method that gained prevalence was to measure the peak-to-peak jitter on eight clock edges. Obviously, this is not a particularly accurate method, as there will be a good amount of variation across any given eight edges of a clock output. One thing had become clear: Jitter eats into setup and hold margins. The longer we measure, the more peak-to-peak jitter we capture, so the effective setup and hold times shrink and the margins get tighter.
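Here's a small sketch of that eight-edge approach. The clock period and the edge timestamps are invented for illustration; the point is simply that peak-to-peak jitter is the spread between the earliest and latest time-interval errors seen over the measurement:

```python
# Peak-to-peak jitter over eight clock edges, using made-up timestamps.

NOMINAL_PERIOD_PS = 1000.0  # assumed 1 GHz clock, ideal period in picoseconds

# Eight measured rising-edge times (ps), each wandering a little from ideal
edge_times_ps = [0.0, 1004.0, 1998.0, 3007.0, 3995.0, 5002.0, 5990.0, 7006.0]

# Time-interval error: how far each edge lands from where it ideally should
tie_ps = [t - n * NOMINAL_PERIOD_PS for n, t in enumerate(edge_times_ps)]

pk_pk_jitter = max(tie_ps) - min(tie_ps)
print(f"peak-to-peak jitter over 8 edges: {pk_pk_jitter:.1f} ps")  # 17.0 ps here
```

Measure a different set of eight edges and you'll get a different answer, which is exactly why the method is so crude: rare, large excursions simply may not show up in such a short observation.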

Around this time, some advances in measurement technology arrived that allowed edge times to be analyzed with a bit more detail. Stay tuned for subsequent posts that continue the story of jitter.


