Convolution Interactive Visualization

An interactive demonstration of convolution: the flip, slide, multiply, and sum process, shown in both the time and frequency domains.

Continuous:
(f * g)(t) = ∫ f(τ)·g(t-τ) dτ
Discrete:
(x * h)[n] = Σₖ x[k]·h[n-k]
Convolution Theorem:
FFT(x * h) = FFT(x) · FFT(h)  (exact for circular convolution; zero-pad the signals to get linear convolution)
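The theorem above can be checked numerically. The sketch below (assuming NumPy) zero-pads both signals to the full linear-convolution length N = len(x) + len(h) - 1 before taking the FFT, so the frequency-domain product matches the direct result:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
h = np.array([0.5, 0.5])

# Linear convolution has length len(x) + len(h) - 1; zero-pad to that
# length so the circular convolution computed by the FFT equals it.
N = len(x) + len(h) - 1

direct = np.convolve(x, h)                                      # flip, slide, multiply, sum
via_fft = np.fft.ifft(np.fft.fft(x, N) * np.fft.fft(h, N)).real

print(np.allclose(direct, via_fft))  # True
```

Without the zero-padding, the FFT product would wrap the tail of the output back onto its head (circular convolution).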

Time Domain Convolution

Input Signal x[k]
Flipped Kernel h[-k]
Shifted Kernel h[n-k] (shown at n=0)
Product x[k]·h[n-k]
Output y[n] = (x * h)[n]

Current Metrics

Position n: 0
Output y[n]: 0.000
Sum of Products: 0.000
Overlapping Samples: 0
Calculation: y[0] = 0

Animation Controls


Signal & Kernel Configuration

Understanding Convolution

Step 1: Flip the Kernel

First, flip the kernel h[k] to get h[-k]. This mirrors the kernel about k = 0.

Step 2: Slide to Position

Shift the flipped kernel to position n to get h[n-k]. This determines which output sample is being computed.

Step 3: Multiply

Multiply the input signal x[k] by the shifted kernel h[n-k] at each overlapping position.

Step 4: Sum

Sum all the products to get the output value y[n] at position n.
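The four steps above can be sketched directly as code. This is a minimal reference implementation (assuming NumPy) where the index arithmetic x[k]·h[n-k] performs the flip and shift implicitly:

```python
import numpy as np

def convolve_steps(x, h):
    """Direct convolution: for each output position n,
    flip h, shift it to n, multiply by x, and sum."""
    N = len(x) + len(h) - 1
    y = np.zeros(N)
    for n in range(N):
        for k in range(len(x)):
            # h[n-k] is the flipped kernel shifted to position n;
            # indices outside h's support contribute nothing.
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]
    return y

x = np.array([1.0, 2.0, 3.0])
h = np.array([1.0, 0.5])
print(convolve_steps(x, h))  # same values as np.convolve(x, h)
```

Each iteration of the outer loop corresponds to one frame of the animation: one shift position n, one set of products, one sum.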

Key Concepts

Commutative: x * h = h * x
Associative: (x * h) * g = x * (h * g)
Identity: x * δ = x (δ is the unit impulse)
Convolution Theorem: convolution in the time domain equals multiplication in the frequency domain
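The algebraic properties listed above can each be verified numerically. A quick sketch (assuming NumPy; the particular signals are arbitrary examples):

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0])
h = np.array([0.5, 1.0])
g = np.array([2.0, 0.0, 1.0])
delta = np.array([1.0])  # unit impulse δ[n]

# Commutative: x * h = h * x
print(np.allclose(np.convolve(x, h), np.convolve(h, x)))
# Associative: (x * h) * g = x * (h * g)
print(np.allclose(np.convolve(np.convolve(x, h), g),
                  np.convolve(x, np.convolve(h, g))))
# Identity: convolving with the unit impulse returns x unchanged
print(np.allclose(np.convolve(x, delta), x))
```

All three checks print True for any finite signals, not just these.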

Applications

  • Image filtering (blur, sharpen, edge detection)
  • Audio processing (reverb, equalization)
  • Neural network convolution layers
  • Probability distributions
  • Signal smoothing and noise reduction
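As one concrete application from the list, smoothing is just convolution with an averaging kernel. A minimal sketch (assuming NumPy; the signal and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
noisy = np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal(t.size)

# 5-point moving average: the kernel sums to 1, so the overall
# signal level is preserved while high-frequency noise is attenuated.
kernel = np.ones(5) / 5
smooth = np.convolve(noisy, kernel, mode="same")

print(np.var(smooth) < np.var(noisy))  # averaging reduces the variance
```

Image blurring works the same way in two dimensions, with a 2-D averaging (or Gaussian) kernel in place of the moving average.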