Signals are commonly understood to be functions of time obtained by observation of physical variables. In this guide, a signal is defined more restrictively as a finite sequence of integer samples, usually obtained by digitizing a continuous observed function of time at a fixed sampling frequency expressed in Hz (samples per second). The time interval between any pair of adjacent samples in a given signal is a sample interval; all sample intervals for a given signal are equal. The integer value of each sample is usually interpreted as a voltage, and the units are called analog-to-digital converter units, or adus. The gain defined for each signal specifies how many adus correspond to one physical unit (usually one millivolt, the nominal amplitude of a normal QRS complex on a body-surface lead roughly parallel to the mean cardiac electrical axis). All signals in a given record are usually sampled at the same frequency, but not necessarily at the same gain (see the section on Multi-Frequency Records for exceptions to this rule). MIT DB records are sampled at 360 Hz; AHA and ESC DB records are sampled at 250 Hz.
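To make the gain concrete: if a signal's gain were, say, 200 adus per millivolt (a value assumed here purely for illustration; each signal defines its own gain), converting a raw sample value to millivolts is a single division. The following C sketch shows the arithmetic; the names are illustrative and not part of any library.

    #include <stdio.h>

    /* Assumed gain, in adus per millivolt, for illustration only;
       in practice the gain is defined separately for each signal. */
    #define GAIN 200.0

    /* Convert a raw sample value (in adus) to millivolts. */
    double adu_to_millivolts(int sample)
    {
        return sample / GAIN;
    }

    int main(void)
    {
        int sample = 512;   /* a raw sample value, in adus */
        printf("%d adus = %g mV at a gain of %g adus/mV\n",
               sample, adu_to_millivolts(sample), GAIN);
        return 0;
    }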
The sample number is an attribute of a sample, defined as the number of samples of the same signal that precede it; thus the sample number of the first sample in each signal is zero. Within this guide, the units of time are sample intervals; hence the "time" of a sample is synonymous with its sample number.
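Because the sampling frequency fixes the length of a sample interval, sample numbers convert to and from clock time by a single multiplication or division. A minimal C sketch, assuming a 360 Hz record (the MIT DB rate given above); the function names are illustrative:

    #include <stdio.h>

    #define SAMPLING_FREQUENCY 360.0   /* Hz; MIT DB rate, assumed here */

    /* Elapsed time, in seconds, of the sample with the given sample
       number; sample number 0 corresponds to time 0. */
    double sample_to_seconds(long sample_number)
    {
        return sample_number / SAMPLING_FREQUENCY;
    }

    /* Sample number of the sample nearest the given time in seconds. */
    long seconds_to_sample(double seconds)
    {
        return (long)(seconds * SAMPLING_FREQUENCY + 0.5);
    }

    int main(void)
    {
        printf("sample 3600 occurs at t = %g s\n", sample_to_seconds(3600L));
        printf("t = 10 s falls at sample %ld\n", seconds_to_sample(10.0));
        return 0;
    }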
Samples having the same sample number in different signals of the same record are treated as simultaneous. In truth, they are usually not precisely simultaneous, since most multi-channel digitizers sample signals in "round-robin" fashion. If this subtlety matters for your application, be prepared to compensate for inter-signal sampling skew in your programs.
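One simple model of round-robin skew, offered only as an assumption: if a digitizer visits n signals in a fixed order within each sample interval at frequency f, then the sample of signal i lags the same-numbered sample of signal 0 by roughly i/(n*f) seconds. The C sketch below encodes that model; real digitizers may behave differently, so the offsets here are purely illustrative.

    #include <stdio.h>

    #define NSIGNALS            2       /* signals per cycle (assumed) */
    #define SAMPLING_FREQUENCY  360.0   /* Hz (assumed) */

    /* Estimated true sampling instant, in seconds, of the sample with
       the given sample number on the given signal, under the assumption
       that the digitizer visits the signals in order within each
       sample interval. */
    double skew_corrected_time(long sample_number, int signal)
    {
        double nominal = sample_number / SAMPLING_FREQUENCY;
        double skew = (double)signal / (NSIGNALS * SAMPLING_FREQUENCY);
        return nominal + skew;
    }

    int main(void)
    {
        long t = 1000L;
        int s;
        for (s = 0; s < NSIGNALS; s++)
            printf("signal %d, sample %ld: t = %.6f s\n",
                   s, t, skew_corrected_time(t, s));
        return 0;
    }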