The Classical Nature of Quantum Mechanics from the Perspective of Signal Processing

Lei Yian
Associate Professor, Institute of Theoretical Physics, Peking University
Director, High Performance Computing Platform, PKU
Introduction: A Demystified Mathematical Framework

In the pantheon of modern physics, quantum mechanics is often revered as the most enigmatic and counterintuitive theory. From Schrödinger’s cat to quantum entanglement, from the uncertainty principle to wave–particle duality, these concepts appear to defy our most basic intuitions about reality. Yet when we shift our gaze toward an ostensibly unrelated field—signal processing—we encounter a striking revelation: the core mathematical structure of quantum mechanics is identical to the analytical tools engineers routinely employ to process signals. This correspondence is not merely superficial; it reflects a deep mathematical isomorphism, revealing a crucial insight: much of quantum mechanics’ perceived “mystery” is artificially constructed. At its heart, it is simply a universal mathematical framework for describing wave-like systems.

1. Hilbert Space: A Shared Mathematical Stage

1.1 Hilbert Space in Signal Processing

In signal processing, Hilbert space serves as the foundational mathematical framework for signal analysis. Each signal is treated as a vector in this infinite-dimensional space. Engineers decompose complex signals using orthogonal basis functions—most commonly sine and cosine functions—much as one naturally describes any point in three-dimensional space using x, y, and z coordinates.

The Fourier transform, a cornerstone of signal processing, is essentially a change of basis within Hilbert space. It maps a time-domain signal into its frequency-domain representation, revealing the constituent frequency components. This process is entirely deterministic, computable, and devoid of any mystery.
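This change of basis can be checked in a few lines. A minimal sketch in plain NumPy (the two-tone test signal and sampling rate are illustrative choices): the FFT re-expresses a time-domain signal in the orthogonal basis of complex exponentials, and the inverse transform recovers it exactly.

```python
import numpy as np

fs = 1000                    # sampling rate, Hz (illustrative)
t = np.arange(1000) / fs     # 1 second of samples
x = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X = np.fft.rfft(x)                         # change of basis: time -> frequency
freqs = np.fft.rfftfreq(len(x), d=1 / fs)  # frequency of each basis vector

# The two constituent tones appear as the two dominant coefficients.
peaks = freqs[np.argsort(np.abs(X))[-2:]]
print(sorted(peaks))         # the 50 Hz and 120 Hz components

# The transform is deterministic and invertible: no information is lost.
x_back = np.fft.irfft(X, n=len(x))
print(np.allclose(x, x_back))
```

The decomposition is fully deterministic, exactly as the text describes: given the input, the coefficients are fixed, and nothing is lost in the round trip.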

1.2 Hilbert Space in Quantum Mechanics

Quantum mechanics employs precisely the same mathematical structure. Quantum states are represented as vectors in Hilbert space, and observables correspond to operators acting on this space. Wavefunctions are expanded using orthogonal basis functions, analogous to spectral decomposition in signal processing.

The Schrödinger equation—the bedrock of quantum mechanics—belongs to the same mathematical family as classical wave equations; it is, for instance, formally identical to the paraxial wave equation of classical optics. It is a linear partial differential equation governing the time evolution of a system’s state. If we interpret an electron’s wavefunction not as a mysterious “probability cloud” but as a mathematical description of an oscillatory mode, the entire picture becomes immediately transparent.
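The wave-equation character of this evolution can be made concrete. A minimal numerical sketch (units ħ = m = 1, zero potential; the grid and packet parameters are illustrative): a free Gaussian wave packet is evolved exactly in Fourier space, where each plane-wave component simply accumulates phase. The norm is conserved and the packet spreads, exactly as a classical pulse disperses.

```python
import numpy as np

n = 1024
x = np.linspace(-50, 50, n, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)

sigma0 = 1.0
psi0 = np.exp(-x**2 / (2 * sigma0**2)) * np.exp(1j * 2.0 * x)  # moving packet
psi0 /= np.sqrt(np.sum(np.abs(psi0)**2) * dx)                  # normalize

t = 5.0  # free evolution: each Fourier mode accumulates phase exp(-i k^2 t / 2)
psi_t = np.fft.ifft(np.fft.fft(psi0) * np.exp(-1j * k**2 * t / 2))

norm = np.sum(np.abs(psi_t)**2) * dx          # unitary evolution: stays 1

def width(psi):
    """RMS width of the probability density |psi|^2."""
    p = np.abs(psi)**2 * dx
    mu = np.sum(x * p)
    return np.sqrt(np.sum((x - mu)**2 * p))

print(norm, width(psi0), width(psi_t))        # packet spreads, norm fixed
```

Dispersion of this kind was fully understood for classical wave packets long before quantum mechanics; here it falls out of one FFT, a phase factor, and an inverse FFT.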

2. Projection and Measurement: The Quantum Avatar of Filtering

2.1 The Essence of Signal Filtering

Filtering is among the most fundamental operations in signal processing. A low-pass filter preserves low-frequency components; a high-pass filter retains high-frequency ones. Mathematically, filtering is a projection of the signal onto a subspace of Hilbert space. The operation is linear and idempotent—applying the same ideal filter twice changes nothing after the first pass—which is precisely the formal definition of a projection operator.

When processing an audio signal with a digital filter, we are performing a sequence of such projections. Each filter corresponds to a projection operator mapping the original signal into a specific frequency band. There is no “collapse” or “uncertainty”—the output is entirely determined by the input and the filter’s characteristics.
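The projection property is easy to verify. A minimal sketch (an ideal brick-wall low-pass filter implemented as an FFT mask; the cutoff and test signal are illustrative): applying the filter twice gives exactly the same result as applying it once, the defining property P² = P of a projection operator.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(512)           # arbitrary test signal

def lowpass(sig, cutoff_bin=40):
    """Ideal low-pass: zero out all frequency bins above the cutoff."""
    S = np.fft.rfft(sig)
    S[cutoff_bin:] = 0                 # project out high-frequency components
    return np.fft.irfft(S, n=len(sig))

once = lowpass(x)
twice = lowpass(once)
print(np.allclose(once, twice))        # idempotent: P(P(x)) == P(x)
```

The output is entirely determined by the input and the mask, with no randomness anywhere.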

2.2 The Mathematical Nature of Quantum Measurement

In quantum mechanics, measurement is described as the “collapse” of the wavefunction—a notion that sounds profoundly mysterious. Yet mathematically, quantum measurement is nothing more than a projection operation in Hilbert space. Measuring an observable projects the system’s state onto the eigensubspace associated with that observable.

This is mathematically identical to signal filtering. The only difference lies in interpretation: in signal processing, we say we “extract specific frequency components”; in quantum mechanics, we say “measurement causes wavefunction collapse.” The former sounds mundane; the latter, mystical. But the mathematics tells us they are one and the same.
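The projection formalism of measurement is small enough to write out directly. A sketch in plain NumPy (the state amplitudes are illustrative): a measurement in the {|0⟩, |1⟩} basis is a projector, it is idempotent like any filter, and the Born-rule probability is simply the squared norm of the projected component—the “energy” landing in that subspace.

```python
import numpy as np

alpha, beta = 0.6, 0.8j                # |psi> = 0.6|0> + 0.8i|1>, normalized
psi = np.array([alpha, beta])

e0 = np.array([1.0, 0.0])
P0 = np.outer(e0, e0.conj())           # projector onto span{|0>}

print(np.allclose(P0 @ P0, P0))        # idempotent, like any ideal filter
prob0 = np.linalg.norm(P0 @ psi)**2    # Born rule = energy in the subspace
print(prob0)                           # 0.36
```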

3. Fourier Transform and Momentum Representation: Two Tellings of the Same Story

3.1 The Universality of Frequency-Domain Analysis

The Fourier transform is the Swiss Army knife of signal processing. It reveals a fundamental truth: any signal can be expressed exactly—not approximately—as a superposition of sinusoidal waves of different frequencies. Music, speech, images, video—all complex signals can be decomposed via the Fourier transform into simple frequency components.

In communication systems, modulation and demodulation techniques are built entirely on this frequency-domain understanding. AM encodes information in the amplitude of a carrier wave; FM encodes it in frequency. These techniques process vast amounts of data daily, underpinning the operation of modern communication networks.

3.2 Position–Momentum Duality

In quantum mechanics, the transformation between position and momentum representations is precisely the Fourier transform. This is no coincidence—it reflects a universal property of wave systems. Heisenberg’s uncertainty principle, often touted as quantum mechanics’ most “mysterious” feature, is in fact a direct mathematical consequence of the Fourier transform: precise localization in time precludes precise localization in frequency, and vice versa.

Signal processing engineers are intimately familiar with this trade-off. A short pulse necessarily has a broad spectrum; a narrowband signal must persist over a longer duration. This is why mobile communications must balance bandwidth against data rate, and why high-fidelity audio requires higher sampling rates. All are manifestations of the same underlying mathematical principle.
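The trade-off can be quantified. A sketch computing the time-bandwidth product numerically (sampling rate, window, and pulse widths are illustrative): a Gaussian pulse attains the Fourier (Gabor) limit Δt·Δf = 1/(4π), and the product stays pinned at that minimum no matter how the pulse is narrowed or widened—squeezing one width inflates the other.

```python
import numpy as np

def tb_product(sigma, fs=2000.0, T=4.0):
    """RMS time width times RMS bandwidth for a Gaussian pulse."""
    t = np.arange(-T / 2, T / 2, 1 / fs)
    x = np.exp(-t**2 / (2 * sigma**2))
    X = np.fft.fft(x)
    f = np.fft.fftfreq(len(t), d=1 / fs)

    def rms_width(axis, power):
        p = power / power.sum()
        mu = np.sum(axis * p)
        return np.sqrt(np.sum((axis - mu)**2 * p))

    dt = rms_width(t, np.abs(x)**2)
    df = rms_width(f, np.abs(X)**2)
    return dt * df

for sigma in (0.05, 0.1, 0.2):
    print(sigma, tb_product(sigma))    # all ~ 1/(4*pi) ~= 0.0796
```

Replacing time and frequency with position and wavenumber turns this calculation, line for line, into the Heisenberg relation.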

4. Time–Frequency Analysis and Quantum Wave Packets: The Art of Localization

4.1 Wavelet and Gabor Transforms

Wavelet and Gabor transforms (short-time Fourier transforms) address a key challenge in signal processing: how to obtain simultaneous time and frequency information. These methods employ localized oscillatory functions as bases, yielding a two-dimensional representation of signals in the time–frequency plane.

Gabor functions—sinusoids modulated by Gaussian envelopes—exhibit excellent localization in both time and frequency. They form an overcomplete basis capable of flexibly representing diverse signal features. Audio compression (e.g., MP3) and image processing (e.g., JPEG2000) rely heavily on such time–frequency analysis techniques.
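A Gabor atom takes one line to construct. A sketch (the center time t0, center frequency f0, and envelope width are illustrative choices): the energy centroid of the atom sits at t0 in time and at f0 in frequency, demonstrating the simultaneous localization described above.

```python
import numpy as np

fs = 1000.0
t = np.arange(2000) / fs - 1.0         # 2 s of samples on [-1, 1)
t0, f0, sigma = 0.2, 80.0, 0.05
g = np.exp(-(t - t0)**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f0 * t)

# Time centroid of the energy |g|^2 sits at t0 ...
p_t = g**2 / np.sum(g**2)
print(np.sum(t * p_t))                 # ~ 0.2

# ... and the spectral centroid (positive frequencies) sits at f0.
G = np.fft.rfft(g)
f = np.fft.rfftfreq(len(t), d=1 / fs)
p_f = np.abs(G)**2 / np.sum(np.abs(G)**2)
print(np.sum(f * p_f))                 # ~ 80
```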

4.2 Quantum Wave Packets and Coherent States

Quantum wave packets and coherent states are mathematically identical to Gabor functions. They describe quantum states that are partially localized in both position and momentum. Photons in a laser beam occupy coherent states; atomic vibrations in a crystal lattice form phonon wave packets—both are quantum manifestations of localized oscillatory modes.

Crucially, quantum wave packets evolve according to the same mathematical laws as classical wave packets. Phenomena such as dispersion, interference, and modulation were already thoroughly understood in classical wave theory long before quantum mechanics emerged. Quantum mechanics merely assigns a different physical interpretation to these shared mathematical structures.

5. Operator Algebra and Non-Commutativity: From Filter Banks to the Uncertainty Principle

5.1 Non-Commutativity in Filtering

In signal processing, the order of operations can matter profoundly. Two linear time-invariant filters do commute with each other, but a filter (multiplication in the frequency domain) does not commute with a gate or modulator (multiplication in the time domain). This non-commutativity is a generic feature of linear operators and arises in any system that combines projections or transformations in conjugate domains.

Digital signal processor (DSP) design must carefully account for operation sequencing. Cascaded filter banks, multirate signal processing, and adaptive filtering all involve the deliberate orchestration of non-commuting operators. Engineers routinely manage such “uncertainty” relationships—they simply don’t label them as mysterious.
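One concrete instance, sketched below (signal, gate, and cutoff are illustrative): a time-domain gate (multiplication by a window) and a frequency-domain low-pass filter give different results depending on their order—the same algebraic fact that underlies [x̂, p̂] ≠ 0, since position acts by multiplication and momentum by differentiation (multiplication in the conjugate domain).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(256)

def gate(sig):                          # time-domain multiplication:
    out = sig.copy()                    # keep only the first half in time
    out[len(sig) // 2:] = 0
    return out

def lowpass(sig, cutoff=20):            # frequency-domain multiplication:
    S = np.fft.rfft(sig)                # keep only low frequencies
    S[cutoff:] = 0
    return np.fft.irfft(S, n=len(sig))

a = lowpass(gate(x))
b = gate(lowpass(x))
print(np.linalg.norm(a - b))            # clearly nonzero: order matters
```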

5.2 Commutation Relations in Quantum Mechanics

The canonical commutation relation in quantum mechanics—[x̂, p̂] = iℏ—is often presented as a fundamental hallmark of the quantum world. Yet this is merely the mathematical expression of the duality between Fourier-conjugate variables. Position and momentum operators fail to commute for the same reason that multiplication by t and differentiation d/dt fail to commute in signal analysis.

A deeper insight is this: non-commutativity arises from our attempt to fully describe a system using limited information. When we choose one basis (e.g., position) to represent a system, information about its dual basis (momentum) becomes inherently “uncertain.” This is a fundamental limitation of information theory—not a mystical property of matter.

6. Entanglement and Correlation: From MIMO to Quantum Correlations

6.1 Correlations in Multi-Antenna Systems

Modern wireless communication leverages Multiple-Input Multiple-Output (MIMO) technology to boost data rates. Signals across multiple antennas exhibit strong spatial correlations, forming complex channel matrices. These correlations are exploited for spatial multiplexing and diversity, dramatically enhancing system performance.

In MIMO systems, the signal received at one antenna contains information about signals at others. Through appropriate signal processing, multiple parallel data streams can be separated and recovered. Such “entangled” signals are fully describable using classical linear algebra and probability theory.
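The separation step is ordinary linear algebra, as the following sketch shows (the 2×2 channel matrix H, the QPSK alphabet, and the noise level are all illustrative choices): each antenna receives a mixture of both streams, and a zero-forcing receiver—multiplication by the pseudoinverse of the known channel—recovers them.

```python
import numpy as np

rng = np.random.default_rng(2)
H = np.array([[1.0 + 0.2j, 0.5 - 0.1j],
              [0.3 + 0.4j, 1.2 + 0.1j]])          # known channel matrix

qpsk = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j])
x = rng.choice(qpsk, size=(2, 100))               # two parallel data streams
noise = 0.01 * (rng.standard_normal((2, 100))
                + 1j * rng.standard_normal((2, 100)))

y = H @ x + noise                # each antenna hears a mix of both streams
x_hat = np.linalg.pinv(H) @ y    # zero-forcing receiver: undo the mixing

print(np.mean(np.abs(x_hat - x)))   # small residual, set by the noise floor
```

Everything here is classical probability and linear algebra—strongly correlated channels, fully disentangled by a matrix inverse.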

6.2 Demystifying Quantum Entanglement

Quantum entanglement, famously dubbed “spooky action at a distance” by Einstein, has become the emblem of quantum mysticism. Yet mathematically, an entangled state is simply a vector in Hilbert space that cannot be expressed as a direct product of individual subsystem states—a structure that arises naturally in any multi-variable correlated system.
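The non-factorizability criterion is directly checkable. A sketch in plain NumPy (the example states are the standard Bell state and an arbitrary product state): reshaping a two-qubit amplitude vector into a 2×2 matrix, the state is a product state exactly when that matrix has rank 1 (Schmidt rank 1); the Bell state has rank 2, which is all “entangled” means mathematically.

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # (|00> + |11>) / sqrt(2)
product = np.kron(np.array([1.0, 0.0]),           # |0> (x) (0.6|0> + 0.8|1>)
                  np.array([0.6, 0.8]))

print(np.linalg.matrix_rank(bell.reshape(2, 2)))     # 2 -> entangled
print(np.linalg.matrix_rank(product.reshape(2, 2)))  # 1 -> separable
```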

The key insight is this: entanglement describes correlation, not causation. Measuring one particle in an entangled pair reveals information about the other, but no information or energy is transmitted. This is analogous to rolling two correlated dice: knowing the outcome of one allows inference about the other, yet no “spooky” influence occurs.

While violations of Bell’s inequalities demonstrate that quantum correlations exceed classical probabilistic bounds, this may simply reflect an overly narrow definition of “classical.” If we accept that wave-like behavior and resonance are fundamental attributes of matter, strong correlations become entirely natural.

7. Modulation and Quantum Superposition: A Unified View of Information Encoding

7.1 Classical Modulation Techniques

Modulation is central to communication systems. Quadrature Amplitude Modulation (QAM) encodes information as points in the complex plane—each symbol being a superposition of sinusoids with specific amplitudes and phases. Orthogonal Frequency-Division Multiplexing (OFDM) goes further, using thousands of orthogonal subcarriers to transmit data simultaneously.

These techniques succeed because linear superposition enables parallel transmission of multiple information streams over a single physical channel. 4G, 5G, Wi-Fi—all modern communication standards rest on this principle. The gigabits-per-second data rates they achieve are a testament to the power of superposition.
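The superposition-plus-orthogonality principle behind OFDM fits in a few lines. A miniature sketch (64 subcarriers and a 4-QAM constellation are illustrative sizes): an inverse FFT superposes one QAM symbol per subcarrier into a single time-domain waveform, and a forward FFT at the receiver separates them exactly, by orthogonality.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sub = 64
qam = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j])   # 4-QAM constellation
data = rng.choice(qam, size=n_sub)                   # one symbol per subcarrier

tx = np.fft.ifft(data)        # superposition: one time-domain OFDM symbol
rx = np.fft.fft(tx)           # orthogonality: symbols come back untouched

print(np.allclose(rx, data))
```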

7.2 Quantum Superposition and Quantum Computing

The quantum superposition state |ψ⟩ = α|0⟩ + β|1⟩ is formally analogous to a QAM symbol: both encode information in the complex amplitudes and phases of superposed basis states. The “parallelism” of quantum computing stems from linear superposition, just as OFDM’s parallelism arises from subcarrier orthogonality.

Quantum algorithms—such as Shor’s and Grover’s—cleverly exploit interference effects to amplify the probability amplitudes of correct answers. This is fundamentally the same principle as adaptive beamforming: by adjusting phases, signals constructively interfere in desired directions and destructively interfere elsewhere.
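The amplitude-amplification mechanism at the heart of Grover’s algorithm reduces to elementary linear algebra on a state vector. A sketch (an 8-element search space and 2 iterations are illustrative): the oracle flips the sign of the marked amplitude, the diffusion step reflects every amplitude about the mean, and after ~(π/4)√N rounds the interference has concentrated nearly all probability on the marked entry—phase steering, exactly as in beamforming.

```python
import numpy as np

N, marked = 8, 5
psi = np.full(N, 1 / np.sqrt(N))          # uniform superposition

for _ in range(2):                        # ~ (pi/4) * sqrt(N) iterations
    psi[marked] *= -1                     # oracle: flip the marked phase
    psi = 2 * psi.mean() - psi            # diffusion: reflect about the mean

print(abs(psi[marked])**2)                # probability piled onto 'marked'
```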

The real challenge lies not in superposition itself (a universal feature of linear systems), but in maintaining coherence. In communications, this corresponds to controlling phase noise and frequency offset; in quantum systems, it means suppressing decoherence. Both face the same engineering challenge.

8. From Classical to Quantum: Continuity, Not Revolution

8.1 A Continuous Transition Across Scales

Physical phenomena evolve continuously from macroscopic to microscopic scales—there is no sharp “quantum boundary.” Mesoscopic systems—such as Josephson junctions, quantum dots, and ultracold atoms—exhibit a smooth transition from classical to quantum behavior. In these systems, we can directly observe how wave-like properties become pronounced as scale decreases.

The Superconducting Quantum Interference Device (SQUID) is a perfect example. It is a macroscopic superconducting loop that exhibits quantized magnetic flux—not because it “becomes quantum,” but because the coherence length of the superconducting state allows the entire loop to behave as a single coherent wave mode. Quantum effects are thus macroscopic manifestations of coherence, not exclusive to the microscopic realm.

8.2 Resonance and Energy Levels: From Musical Instruments to Atoms

The harmonic series of musical instruments provides an ideal analogy for atomic energy levels. A violin string supports only certain vibrational modes—discrete frequencies (the fundamental and its integer multiples)—due to boundary conditions: the ends must be nodes.

Electron orbitals in atoms follow the same principle. Electron wave modes must satisfy specific boundary conditions (decaying to zero at infinity), naturally yielding discrete energy levels. The Balmer series of hydrogen emission lines is the spectroscopic analogue of a wind instrument’s harmonic series: both are discrete spectra dictated by boundary conditions, though the hydrogen levels follow a 1/n² law rather than integer multiples of a fundamental.

The difference lies only in complexity: instruments involve one- or two-dimensional vibrations; atoms are three-dimensional. The restoring forces are mechanical in instruments, electromagnetic in atoms. But the governing mathematics—wave equations and boundary conditions—is identical.
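The shared mathematics can be exhibited with one matrix. A sketch (a 500-point discrete Laplacian with fixed ends; the grid size is illustrative): the same operator yields both the string’s harmonic series (mode frequencies ∝ n, the square roots of the eigenvalues) and the particle-in-a-box spectrum (energies ∝ n², the eigenvalues themselves).

```python
import numpy as np

n = 500
# Discrete -d^2/dx^2 with Dirichlet (fixed-end) boundary conditions.
L = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))
lam = np.sort(np.linalg.eigvalsh(L))

freqs = np.sqrt(lam[:3])          # string: mode frequencies ~ n * f1
print(freqs / freqs[0])           # ~ [1, 2, 3]

energies = lam[:3]                # quantum box: energy levels ~ n^2 * E1
print(energies / energies[0])     # ~ [1, 4, 9]
```

Same operator, same boundary conditions; only the physical reading of the eigenvalues differs.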

9. Environment and Decoherence: From Noise to the Quantum–Classical Transition

9.1 Noise in Communication Systems

Every communication engineer knows that signals are inevitably corrupted by noise—thermal noise, shot noise, 1/f noise. These random disturbances limit channel capacity, as formalized by Shannon’s theorem. Strategies to combat noise include error-correcting codes, spread-spectrum techniques, and diversity reception.

Crucially, noise does more than interfere—it “classicalizes” the signal. In high-noise environments, fine phase information is lost first, while amplitude information can still be recovered by a simple envelope detector that needs no phase reference. This is one reason AM broadcasts remain intelligible under conditions that defeat phase-coherent demodulation, despite AM’s inferior sound quality.

9.2 The Essence of Quantum Decoherence

Quantum decoherence describes how quantum superpositions lose coherence through interaction with the environment. From a signal processing perspective, this is precisely the process by which phase noise destroys signal coherence. The countless degrees of freedom in the environment act like myriad noise sources, rapidly randomizing the phase relationships of a quantum state.

The decoherence timescale depends on the strength of system–environment coupling. At room temperature, molecular decoherence occurs on femtosecond timescales; in millikelvin dilution refrigerators, superconducting qubits can maintain coherence for milliseconds. This difference is fully explainable in terms of signal-to-noise ratio—lowering temperature reduces thermal noise.
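The phase-randomization picture of decoherence can be simulated in a few lines. A sketch (Gaussian phase noise; the sample size and spreads are illustrative): averaging e^{iφ} over random phase of spread σ leaves a coherence (fringe contrast) of exp(−σ²/2)—the same law governs a phase-noisy carrier and a decohering qubit.

```python
import numpy as np

rng = np.random.default_rng(4)
results = {}
for sigma in (0.2, 1.0, 2.0):
    phi = rng.normal(0.0, sigma, 200_000)           # accumulated random phase
    results[sigma] = np.abs(np.mean(np.exp(1j * phi)))

for sigma, contrast in results.items():
    print(sigma, contrast, np.exp(-sigma**2 / 2))   # measured vs predicted
```

As σ grows, the off-diagonal coherence decays smoothly toward zero—noise, not observation, does the classicalizing.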

Quantum error correction, dynamical decoupling, and topological protection—techniques for preserving quantum coherence—bear deep resemblance to classical noise-suppression methods. Both are battles against entropy, striving to preserve information integrity in noisy environments.

10. A Classical Picture of Quantum Mechanics: Synthesis and Outlook

10.1 The Universality of Mathematical Structure

Viewing quantum mechanics through the lens of signal processing reveals a startling truth: its mathematical framework is universally applicable to wave phenomena. Hilbert spaces, linear operators, Fourier transforms, projection measurements—these tools are indispensable across acoustics, optics, communications, and quantum physics.

This universality is no accident; it reflects a deep mathematical reality. Principles such as linear superposition, orthogonal decomposition, and duality are essential for describing any wave system. Much of quantum mechanics’ “strangeness” stems from our insistence on interpreting inherently wave-like phenomena through a particle-centric worldview.

10.2 The Significance of Demystification

Recognizing the classical roots of quantum mechanics carries profound implications:

First, it simplifies conceptual understanding. Students need not grapple with confusing notions like “wave–particle duality” or the “observer effect.” Instead, they can build intuition from familiar wave phenomena and gradually extend it to quantum systems.

Second, it fosters technological innovation. Once we acknowledge the deep connection between quantum techniques and classical signal processing, a wealth of mature engineering methods becomes available for adaptation. Quantum error-correcting codes, for instance, draw directly from classical coding theory.

Finally, it restores unity to physics. There is no need to erect a chasm between classical and quantum realms. Nature is continuous—and our theoretical frameworks should reflect that continuity.

10.3 Future Directions

This perspective points toward several promising research avenues:

  • Develop a unified wave theory encompassing classical waves to quantum fields across all scales.

  • Systematically apply signal processing methodologies to quantum technologies to accelerate the practical realization of quantum computing and communication.

  • Re-examine foundational questions in quantum mechanics using the language of information theory and signal analysis.

  • Explore new physical pictures that interpret “particles” like electrons and photons as stable wave modes or resonant structures.

Conclusion: Returning to Physics’ Simple Truths

The mystification of quantum mechanics is a historical accident of 20th-century physics. In their quest to understand atomic structure, physicists developed a powerful mathematical toolkit—but burdened it with excessive philosophical interpretation. Through the mirror of signal processing, we see quantum mechanics for what it truly is: the natural mathematical language for describing wave and oscillatory systems.

This does not diminish quantum mechanics’ greatness. On the contrary, recognizing its classical roots deepens our appreciation for nature’s unity and elegance. From a vibrating violin string to an electron in an atom, from a modulated radio wave to an entangled photon pair—the same mathematical principles resonate throughout.

Scientific progress lies not only in discovering new phenomena, but in uncovering deep connections between seemingly disparate ones. When we strip away the mystique of quantum mechanics, we reveal a more beautiful truth: the universe is a grand symphony, and we are only beginning to grasp its harmonious laws.

Demystification is not diminishment—it is elevation. It brings quantum mechanics down from the altar and into the workshop, transforming it from metaphysics into engineering, from confusion into inspiration. In doing so, we not only understand nature more deeply, but also open vast new pathways for technological innovation.

The next century of quantum mechanics will be the century of its reintegration with classical theory. By then, the qualifier “quantum” may become redundant—for we will have realized that so-called quantum phenomena are simply the natural expressions of a wave-dominated universe. And that is the ultimate goal of physics: to understand the most complex phenomena through the simplest principles.