Time-Domain Signal Analysis
Prince E. Adjei
Kwame Nkrumah University of Science and Technology
Topic: Time Domain Analysis Module 1: Time and Frequency Domain Analysis
Biosignal Processing and Analysis (BME 366)
© 2025 Prince E. Adjei
Topics:
(1) Importance of Time-Domain Features
(2) Descriptive Statistics for Biomedical Signals
(3) RMS and Peak-to-Peak Analysis
(4) Segmentation Methods in Signal Processing
(5) Noise Quantification
Learning Objectives
Justify the use of time-domain features in biosignal analysis.
Compute and visualize descriptive statistics (mean, median,
variance) on biosignal data using graphical overlays.
Calculate RMS and peak-to-peak values, and relate them to
physiological conditions such as muscle fatigue.
Differentiate between fixed and sliding windowing methods,
and apply segmentation using hop-size heuristics.
Review
Noise in bio-signals refers to any unwanted interference or
disturbances that affect the quality of signals.
Types of noise
Random noise: Unpredictable background fuzz from electronics or digitization (e.g., thermal noise).
Physiological noise: Unwanted signals from other body processes (e.g., breathing or eye blinks in EEG).
Structured noise: Predictable interference with a clear pattern (e.g., 50/60 Hz powerline interference).
Time Domain Signal Analysis
Time domain signal analysis involves examining how the signal varies
over time.
Time-domain features are numerical characteristics derived from the
signal’s amplitude variation over time.
They describe the shape, energy, and structure of the signal without converting it into the frequency or any other domain.
These features are especially useful in biomedical signal processing
(like ECG, EMG, EEG), where signals are often analyzed in real-time or
on resource-constrained devices.
Why time-domain features?
Low computational cost: Efficient to calculate; suitable for real-time applications and embedded systems.
No transformation required: Unlike frequency-domain features, no FFT or complex processing is needed.
Clinically relevant: Directly relates to physiological events (e.g., heart rate from ECG, muscle activation from EMG).
Intuitive interpretation: Features like mean, peak, and RMS are easy to understand and visualize.
Robust to short windows: Can be computed over small segments, making them responsive to changes.
Descriptive statistics
Descriptive statistics are numerical measures that summarize and
describe the main characteristics of a dataset.
In signal processing, especially for biomedical signals like ECG or EMG, they help you understand the shape, central tendency, and variability of the signal over time.
Common descriptive statistics include the mean, median, variance, and standard deviation.
The mean is a measure of central tendency that gives the average value of a set of numbers.
In signal processing, it represents the average amplitude of a signal
over time and is often used to understand the baseline level of the
signal.
For a signal x[n] with N samples:
Mean: μ = (1/N) Σ x[n], summing over n = 1 to N
Where:
μ is the mean (average), N is the total number of samples, and x[n] is the signal value at index n.
The mean is sensitive to outliers: large spikes or noise can distort it.
Why use mean in signal analysis
Represents average signal level: The mean gives a quick summary of
the signal’s typical value over time.
Helps detect baseline drift: Changes in the mean across windows can
indicate baseline shifts, which are common in ECG recordings due to
movement or respiration.
Estimates signal strength (EMG): For rectified EMG signals, the mean
reflects muscle activation level, helping monitor fatigue or exertion.
Reduces random noise (averaging): Averaging multiple trials smooths
out noise and highlights consistent signal patterns.
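The window-by-window mean is a simple way to see baseline drift. A minimal NumPy sketch on a synthetic signal (the sampling rate, drift shape, and noise level are invented for illustration):

```python
import numpy as np

# Synthetic recording: small random noise riding on a slow upward drift
rng = np.random.default_rng(0)
fs = 250                               # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)           # 10 s of data
x = 0.5 * t / t[-1] + 0.05 * rng.standard_normal(t.size)

# Mean of each non-overlapping 1 s window tracks the baseline level
win = fs
means = x[: (x.size // win) * win].reshape(-1, win).mean(axis=1)
print(means.round(2))                  # steadily increasing -> baseline drift
```

A steady rise in the per-window means, while the noise averages out, is exactly the baseline-drift signature described above.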
The median is a measure of central tendency that represents the
middle value in a dataset that has been sorted.
It divides the signal into two halves: 50% of the values lie below it, and 50% lie above it.
How to Calculate the Median
Given a set of N signal values x[n], sorted in ascending order:
If N is odd: the median is the middle value, x[(N+1)/2].
If N is even: the median is the average of the two middle values, (x[N/2] + x[N/2+1]) / 2.
Robust to outliers: Unlike the mean, the median is not affected by
extreme values (spikes, noise).
Good for artifact-prone data: In ECG, EMG, or EEG, sudden artifacts
won’t heavily distort the median.
Useful for trend analysis: Tracking changes in median over time can
help detect drift or physiological transitions.
Why use Median in Signal Analysis
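The robustness of the median to artifacts is easy to check numerically; a minimal sketch with an invented artifact spike:

```python
import numpy as np

# 100 baseline samples at zero, plus one large artifact spike
x = np.zeros(100)
x[50] = 100.0

print(np.mean(x))    # 1.0 -> the single spike shifts the mean
print(np.median(x))  # 0.0 -> the median ignores it
```

One spike in 100 samples moves the mean by a full unit while leaving the median untouched, which is why the median is preferred for artifact-prone recordings.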
Variance is a measure of the dispersion of the data points around the mean, calculated as the average of the squared differences from the mean.
In other words, it quantifies the spread or variability of the signal.
Variance: σ² = (1/N) Σ (xᵢ − μ)², summing over i = 1 to N
Where:
xᵢ = each signal value
μ = mean of the signal
N = total number of values
Variance helps differentiate active vs. resting states in biosignals.
High Variance
Signal values fluctuate significantly
Indicates a dynamic or noisy signal
Example: EMG signal during muscle contraction
Low Variance
Signal values are relatively stable
Suggests a consistent baseline or low activity
Example: ECG signal at rest
Interpreting Variance in Signal Analysis
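The active-vs-resting contrast above can be demonstrated on synthetic data; the amplitudes below are invented stand-ins for a contraction burst and a quiet baseline:

```python
import numpy as np

rng = np.random.default_rng(1)
rest = 0.01 * rng.standard_normal(1000)    # stable baseline (rest)
burst = 0.5 * rng.standard_normal(1000)    # EMG-like contraction burst

print(np.var(rest) < np.var(burst))        # True: activity raises variance
```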
Graphical overlays on ECG strip
Root Mean Square (RMS) is a measure of the signal's power or
energy content over time.
Used extensively in EMG to quantify muscle activation levels.
RMS increases with muscle fatigue, effort, or stimulation.
Example: Higher RMS in the biceps EMG during lifting indicates
more muscle recruitment.
Mathematically:
RMS = √( (1/N) Σ x[n]² ), summing over n = 1 to N
Peak to Peak
Peak to peak is the difference between the signal’s maximum
and minimum values.
Mathematically: Peak to Peak (P2P) = max(x) − min(x)
Used in ECG to assess QRS complex strength or detect motion
artifacts.
In EEG, it helps detect epileptic spikes or large voltage shifts.
Example:
A reduced P2P in an ECG may indicate signal attenuation from
poor electrode contact.
RMS & peak-to-peak
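Both features are one-liners over a window of samples. A sketch on a unit-amplitude sine, whose RMS is known to be 1/√2 and whose peak-to-peak is 2:

```python
import numpy as np

def rms(x):
    """Root mean square: sqrt of the mean of the squared samples."""
    return np.sqrt(np.mean(np.square(x)))

def peak_to_peak(x):
    """Difference between the maximum and minimum sample."""
    return np.max(x) - np.min(x)

# Unit-amplitude 5 Hz sine sampled over exactly one second
t = np.linspace(0, 1, 1000, endpoint=False)
x = np.sin(2 * np.pi * 5 * t)
print(round(rms(x), 3))           # 0.707 (= 1/sqrt(2))
print(round(peak_to_peak(x), 3))  # 2.0
```

On a rectified EMG window, the same `rms` call gives the activation level discussed above; on an ECG window, `peak_to_peak` approximates QRS amplitude.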
Standard deviation measures how spread out or dispersed the values in
a dataset are from the mean (average).
Mathematically: σ = √( (1/N) Σ (xᵢ − μ)² ), summing over i = 1 to N
Where:
xᵢ = each data point, μ = mean of the data, and N = number of data points
Low σ: data points cluster near the mean, giving a smooth, stable signal.
High σ: data points are widely spread; the signal has more variability or noise.
Other descriptive statistics
Skewness is a statistical measure that describes the asymmetry of a
distribution.
Mathematically: skewness = (1/N) Σ ((xᵢ − μ)/σ)³, summing over i = 1 to N
Positive skew: Long tail on the right (values are concentrated on the
left).
Negative skew: Long tail on the left (values are concentrated on the
right).
Zero skewness: Symmetrical distribution (like a normal curve).
Helps detect bias in signal values. For example, skewed EMG
amplitude during fatigue.
Other descriptive statistics
Kurtosis is a statistical measure that describes the heaviness of a distribution's tails relative to its overall shape.
Mathematically: kurtosis = (1/N) Σ ((xᵢ − μ)/σ)⁴, summing over i = 1 to N
Describes the peakedness and tail heaviness of a distribution.
High kurtosis (>3): Sharper peak, heavy tails (more extreme
values/outliers).
Low kurtosis (<3): Flatter peak, lighter tails (fewer outliers).
Kurtosis ≈ 3: Normal distribution (mesokurtic).
Identifies signals with frequent spikes or bursts (e.g., seizure
activity in EEG).
Other descriptive statistics
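Both moment formulas above translate directly into NumPy. A sketch comparing a near-normal signal with a spiky one (the spike amplitude and rate are invented):

```python
import numpy as np

def skewness(x):
    """Third standardized moment: asymmetry of the distribution."""
    mu, sigma = np.mean(x), np.std(x)
    return np.mean(((x - mu) / sigma) ** 3)

def kurtosis(x):
    """Fourth standardized moment: tail heaviness (normal ~= 3)."""
    mu, sigma = np.mean(x), np.std(x)
    return np.mean(((x - mu) / sigma) ** 4)

rng = np.random.default_rng(2)
gauss = rng.standard_normal(100_000)       # ~normal: kurtosis near 3
spiky = rng.standard_normal(100_000)
spiky[::1000] += 20.0                      # rare large spikes -> heavy tails

print(round(kurtosis(gauss), 2))           # close to 3 (mesokurtic)
print(kurtosis(spiky) > 3)                 # True: spikes inflate kurtosis
```

The spiky signal mimics burst-like activity (e.g., EEG seizure spikes): its kurtosis rises well above 3 even though its variance barely changes.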
1. Why are time domain features
useful in biomedical signals?
2. When is median preferred over
the mean in signal analysis?
3. What do RMS and variance tell
us about a signal?
4. What does peak to peak of a
signal measure?
Questions
Windowing and Segmentation
Windowing is the process of breaking a long signal into smaller, shorter
segments called windows so we can analyze local behaviors within
those segments.
Segmentation is a more semantic process: dividing a signal into meaningful sections, often based on events or physiology.
Biomedical signals are often non-stationary: their properties (like frequency, amplitude, or shape) change over time.
Windowing helps us focus on short durations where the signal is quasi-stationary, and lets us detect time-localized events like QRS complexes, muscle fatigue, or epileptic spikes.
Fixed vs Sliding Window

Feature    | Fixed Window                             | Sliding Window
Definition | Non-overlapping segments                 | Overlapping segments
Use case   | When events are well aligned or periodic | For continuous monitoring and peak detection
Pros       | Faster computation                       | Better temporal resolution
Cons       | May miss key transitions                 | More computation; redundancy possible
Hop size is how much the window moves forward each step.
Common Rule:
50% overlap: balances redundancy and temporal resolution.
Other options:
25% hop (75% overlap): better detection accuracy
100% hop (0% overlap): fastest, but risk of missing fast events
Hop-Size Rule of Thumb
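The window/hop relationship can be sketched in a few lines; the window length of 100 samples is an arbitrary illustration:

```python
import numpy as np

def sliding_windows(x, win, hop):
    """Segment x into windows of length `win`, advancing by `hop` samples."""
    starts = range(0, len(x) - win + 1, hop)
    return np.array([x[s:s + win] for s in starts])

x = np.arange(1000)
fixed = sliding_windows(x, win=100, hop=100)  # hop = win: 0% overlap, 10 windows
half = sliding_windows(x, win=100, hop=50)    # hop = win/2: 50% overlap, 19 windows
print(fixed.shape, half.shape)
```

Halving the hop nearly doubles the number of windows, which is the redundancy/temporal-resolution trade-off in the table above.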
Event Based Segmentation
A method of dividing a signal based on the occurrence of meaningful
events, rather than fixed time intervals.
In ECG, this typically means segmenting around R-peaks (heartbeats)
or QRS complexes.
Event-based segmentation allows:
Focused analysis on individual events (heartbeats, spikes, bursts)
More meaningful feature extraction
Reduced noise by ignoring uninformative regions
Event Based Segmentation
1. Thresholding
You set a value threshold to detect significant events.
Useful for identifying sudden changes in the signal (like the QRS
complex).
2. Hysteresis
Prevents false or repeated detections due to noise.
Uses two thresholds:
A high threshold to detect event onset
A low threshold to end the event or ignore small fluctuations
This avoids jittery or unstable segmentation.
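A minimal sketch of two-threshold hysteresis detection (the signal values and thresholds are invented for illustration):

```python
import numpy as np

def hysteresis_events(x, high, low):
    """Mark samples as in-event from the high-threshold crossing
    until the signal falls back below the low threshold."""
    active = False
    mask = np.zeros(len(x), dtype=bool)
    for i, v in enumerate(x):
        if not active and v >= high:
            active = True      # onset: only the high threshold starts an event
        elif active and v < low:
            active = False     # offset: only falling below low ends it
        mask[i] = active
    return mask

x = np.array([0.0, 0.2, 1.0, 0.6, 0.4, 1.0, 0.1, 0.0])
print(hysteresis_events(x, high=0.8, low=0.3).astype(int))
# -> [0 0 1 1 1 1 0 0]: the dip to 0.6 and 0.4 does not end the event
```

A single threshold at 0.8 would have split this into two events at the dip; the low threshold keeps the detection stable.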
Event Based Segmentation
In the Pan-Tompkins algorithm, event-based segmentation is centered
on R-peaks, which are detected using:
Slope, amplitude, and width analysis
Bandpass filtering, differentiation, squaring, and moving window
integration.
Once R-peaks are detected, you can:
Segment each heartbeat from one R-peak to the next (R-R interval)
Or extract pre-R to post-R windows
These R-peaks become your anchors for event-based segmentation.
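A sketch of the segmentation step, using a plain amplitude threshold as a stand-in for the full Pan-Tompkins detector (the synthetic "ECG" and its peak spacing are invented):

```python
import numpy as np

# Synthetic "ECG": low-level noise with R-like spikes every 200 samples
rng = np.random.default_rng(3)
x = 0.05 * rng.standard_normal(1000)
x[100::200] += 1.0                    # R-peaks at samples 100, 300, ..., 900

# A simple amplitude threshold stands in for a full Pan-Tompkins detector
r_peaks = np.flatnonzero(x > 0.5)

# Segment beat-by-beat, from one R-peak to the next (R-R intervals)
beats = [x[a:b] for a, b in zip(r_peaks[:-1], r_peaks[1:])]
print(len(r_peaks), len(beats))       # 5 peaks -> 4 R-R segments
```

In practice the `r_peaks` array would come from the filtering/differentiation/squaring/integration pipeline; the anchor-based slicing afterwards is the same.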
1. What is the main difference
between windowing and
segmentation?
2. Why is windowing important in
analyzing biomedical signals?
3. What does hysteresis prevent
in event-based segmentation?
Questions
Noise taxonomy
Physiological Noise
Origin: From the patient’s own body, but not from the target
organ/system.
Examples:
Muscle activity (EMG) contaminating EEG
Eye blinks (EOG) affecting EEG
Respiratory movements distorting the ECG baseline
Motion artifacts during EMG or PPG
Challenge: Hard to remove, because it's still biological and often
overlaps with the signal of interest.
Noise taxonomy
Instrumentation Noise
Origin: From the hardware or electronics used to record the signal.
Examples:
Thermal noise in amplifiers
Quantization noise from ADC (analog-to-digital conversion)
Electrode-skin impedance fluctuations
Characteristics: Typically random, but can sometimes be mitigated
by high-quality hardware and filtering.
Noise taxonomy
Environmental Noise
Origin: From the external environment, not the body or the device.
Examples:
Powerline interference (50/60 Hz hum)
Electromagnetic interference (from phones, Wi-Fi, fluorescent
lights)
Movement of cables or devices causing artifacts
Characteristics: Often structured/predictable (periodic), and can be removed using notch filters or shielding.
Quantifying Noise
Baseline Wander (BW)
Slow, low-frequency drift in the signal baseline, usually below 0.5 Hz.
Cause: Respiration, body movement, electrode shifts
Seen in: ECG, EEG
How to quantify:
Estimate using a high-pass filter or polynomial fitting
Measure the standard deviation (σ) or peak-to-peak of the drift
Example: In ECG, BW can mask P-waves or distort ST segments
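The polynomial-fitting approach can be sketched as follows; the drift frequency, amplitude, and polynomial degree are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 250
t = np.arange(0, 10, 1 / fs)
# Noise plus a slow 0.1 Hz respiratory-like drift of amplitude 0.3
x = 0.05 * rng.standard_normal(t.size) + 0.3 * np.sin(2 * np.pi * 0.1 * t)

# Estimate the baseline with a low-order polynomial fit over time
baseline = np.polyval(np.polyfit(t, x, deg=5), t)

# Quantify the wander: standard deviation and peak-to-peak of the drift
print(round(np.std(baseline), 2), round(np.ptp(baseline), 2))
```

Subtracting `baseline` from `x` detrends the recording; the printed σ and peak-to-peak are the wander metrics listed above. A high-pass filter would serve the same role as the polynomial fit.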
SNR (Signal-to-Noise Ratio): Measures how strong your desired signal
is relative to background noise.
Formula: SNR (dB) = 10 log₁₀(P_signal / P_noise)
Higher SNR = Cleaner signal
Typical range in bio-signals:
ECG: ~10-30 dB
EEG: ~0-10 dB
EMG: Varies widely depending on contraction
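The dB formula above translates directly into code. A sketch with a known-power sine and an invented noise level:

```python
import numpy as np

def snr_db(signal, noise):
    """SNR in dB: 10*log10 of signal power over noise power."""
    p_signal = np.mean(np.square(signal))
    p_noise = np.mean(np.square(noise))
    return 10 * np.log10(p_signal / p_noise)

t = np.linspace(0, 1, 1000, endpoint=False)
clean = np.sin(2 * np.pi * 10 * t)                       # power = 0.5
noise = 0.1 * np.random.default_rng(5).standard_normal(1000)
print(round(snr_db(clean, noise), 1))                    # roughly 17 dB
```

With signal power 0.5 and noise power about 0.01, the ratio is about 50, i.e., 10·log₁₀(50) ≈ 17 dB, squarely in the "clean ECG" range quoted above.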
Quantifying Noise
6 dB/Bit Rule: What Is It?
Every 1-bit increase in the resolution of an analog-to-digital converter
(ADC) improves the Signal-to-Noise Ratio (SNR) by approximately 6.02
dB.
SNR (dB) ≈ 6.02 N + 1.76
Where:
N = number of bits of the ADC
1.76 dB is a constant from uniform quantization theory
In biomedical signal processing:
You want high signal fidelity
Low-resolution ADCs add quantization noise (discretization error)
More ADC bits = higher resolution = less quantization noise
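The rule is a one-line formula; a sketch tabulating the ideal SNR for common ADC resolutions:

```python
def adc_snr_db(n_bits):
    """Ideal SNR of an N-bit ADC for a full-scale sine: 6.02*N + 1.76 dB."""
    return 6.02 * n_bits + 1.76

for bits in (8, 12, 16, 24):
    print(bits, "bits ->", round(adc_snr_db(bits), 2), "dB")
```

Each extra bit adds exactly 6.02 dB, so moving from a 12-bit to a 16-bit ADC buys about 24 dB of quantization headroom.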
Quantifying Noise
Case Study: EMG Envelope Extraction and SNR Computation
To assess muscle activity and quantify noise in an EMG signal, we
extract its envelope and compute the signal-to-noise ratio (SNR).
Steps:
Rectify: Convert all EMG values to positive to reflect absolute muscle
activity.
Moving RMS: Apply a moving root-mean-square window (e.g., 100 ms)
to smooth the signal and obtain the amplitude envelope.
Log scale: Apply a logarithmic transformation to compress the range
and enhance small fluctuations.
Compute SNR: Estimate signal power from the RMS envelope and noise
power from baseline wander (via low-pass filtering), then compute the
SNR
Higher RMS suggests stronger activation or fatigue; lower SNR indicates
more noise (e.g., from movement artifacts or poor electrode contact).
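The four case-study steps can be sketched end to end on a synthetic EMG; the burst timing and amplitudes are invented, and noise power is estimated here from the quiet baseline segments rather than from a low-pass baseline-wander estimate:

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 1000
t = np.arange(0, 4, 1 / fs)
# Synthetic EMG: a 1 s activity burst on a quiet baseline
emg = 0.02 * rng.standard_normal(t.size)
burst = (t > 1.5) & (t < 2.5)
emg[burst] += 0.5 * rng.standard_normal(burst.sum())

# 1. Rectify: absolute value reflects activity regardless of sign
rectified = np.abs(emg)

# 2. Moving RMS (100 ms window) -> smooth amplitude envelope
win = int(0.1 * fs)
envelope = np.sqrt(np.convolve(rectified ** 2, np.ones(win) / win, mode="same"))

# 3. Log scale compresses the dynamic range (small offset avoids log(0))
log_env = 20 * np.log10(envelope + 1e-12)

# 4. SNR: envelope power during the burst vs the quiet baseline
p_signal = np.mean(envelope[burst] ** 2)
p_noise = np.mean(envelope[~burst] ** 2)
snr = 10 * np.log10(p_signal / p_noise)
print(round(snr, 1))
```

The envelope rises sharply over the burst and stays near the noise floor elsewhere, so the SNR comes out well above 0 dB; movement artifacts or poor electrode contact would raise the baseline power and pull it down.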
Summary
Time-domain analysis looks at how a signal's amplitude changes over time, with no need for frequency transforms.
Time-domain features like mean, RMS, and variance summarize
signal behavior and are useful for pattern detection and
diagnosis.
RMS indicates signal energy.
Peak-to-peak gives the total signal range (max minus min
amplitude).
Summary
Segmentation divides signals into smaller windows (fixed or sliding).
Sliding windows often use 50% overlap for smoother analysis.
Event-based segmentation detects key signal events using
thresholds and hysteresis to avoid false triggers.
Noise sources can be physiological (e.g., breathing), instrumental
(e.g., electrode movement), or environmental (e.g., powerline
noise).
SNR compares signal strength to noise level. Higher SNR means a
cleaner, more usable signal.
Recommended Text
Rangayyan, R. M. (2015). Biomedical signal analysis: A case-study approach (2nd ed.). IEEE Press Series in Biomedical Engineering. Wiley-IEEE Press. ISBN: 978-0-470-01139-6.
Palaniappan, R. (2011). Biological signal analysis. University of Essex.
Delgutte B (2007). Course materials for HST.582J / 6.555J / 16.456J,
Biomedical Signal and Image Processing, Spring 2007. MIT OpenCourseWare,
Massachusetts Institute of Technology.