What exactly is a quantum?

1. Fundamental Concepts of the Quantum

The term "quantum" in physics originally denotes the "smallest unit," referring to the indivisible minimal "portion" of certain physical quantities (such as energy, momentum, angular momentum, etc.). At its core, this concept embodies discreteness: under specific conditions, certain physical quantities can only assume finite, discontinuous values (a process known as "quantization").

However, it is crucial to note that "quantum" is not synonymous with "elementary particle." A quantum does not represent a specific microscopic entity. For instance, the Planck energy quantum, photons (quanta of the electromagnetic field), phonons (quanta of lattice vibrations), and others are manifestations of quantization in different physical processes or fields. Non-elementary particles, such as atoms and nucleons, can also be described as quanta or, more precisely, characterized within a quantum mathematical framework.

Elementary particles (e.g., electrons, quarks, photons) constitute the fundamental building blocks of matter or fields in modern physics. While certain properties of these particles are indeed "quantized," the term "quantum" encompasses a broader scope: it includes not only the particles themselves but also quantum fields, quantum states, quantum excitations, energy level transitions, and all related quantized phenomena.

Equating "quantum" with specific "elementary particles" is a common misconception. Quantum physics investigates the ubiquitous quantized behaviors and probabilistic superpositions observed in natural phenomena, rather than being confined to particular known microscopic particles.

Further implications of the quantum concept will be discussed below.

2. Historical Roots in Ancient Atomism

The historical lineage of the "quantum" concept can be traced back to ancient Greek atomism in the pre-Christian era. Philosophers such as Democritus and Leucippus posited that the world consists of exceedingly small, indivisible "atoms" (from the Greek atomos, meaning "uncuttable") and void. They attributed the generation and transformation of all things to the combinations and motions of these fundamental particles. This marked the initial introduction of the idea that nature is composed of discrete minimal units, which conceptually resonates with the "discreteness" inherent in quanta.

Nevertheless, it is worth emphasizing that atomism did not become the prevailing view at the time. Many Greek thinkers, such as Aristotle, maintained that the world is fundamentally continuous, with matter capable of infinite subdivision, thereby rejecting the notion of indivisible particles. This "continuum perspective" profoundly influenced subsequent European physics and philosophy, persisting into the modern era.

Similar ideas emerged not only in ancient Greece but also in ancient India. Around the 6th century BCE, the Indian Vaisheshika school proposed the concept of "anu" (or paramanu), denoting extremely minute, indivisible fundamental particles of matter. They argued that the world is essentially composed of these minute, independent entities and void, with all phenomena arising from their combinations and transformations. The Indian notion of anu, emphasizing indivisibility and discreteness, parallels Greek atomism in intriguing ways.

These Eastern and Western conceptions of "minimal units" serve as philosophical precursors to the "discreteness" and "minimal quanta" in modern quantum theory. Although ancient atoms or anu differ fundamentally from the "quantum" in contemporary physics in terms of theoretical content and formulation, they both reflect humanity's early philosophical inquiries into the structure of nature—specifically, whether it possesses minimal, indivisible, and discrete components. It was only millennia later that physics, through rigorous mathematical and experimental validation, transformed these ideas into core concepts of the discipline.

3. Historical Roots in Ancient Greek Elementalism

Parallel to the development of atomism was the ancient Greek theory of elements. The most renowned formulation, the "Four Elements Theory," was proposed by Empedocles in the 5th century BCE. He posited that all things in the universe are composed of four fundamental elements—earth, air, water, and fire—mixed in varying proportions. These elements undergo mutual transformations in different natural processes (such as combustion, evaporation, and solidification), giving rise to the diversity of entities and their changes.

Aristotle further elaborated on this idea, associating the transformations among elements with four qualities: dry, wet, cold, and hot. This framework provided the theoretical foundation for the natural worldview that dominated medieval Europe for two millennia. Elementalism emphasized continuous change and infinite divisibility: any substance could undergo continuous transformation through the recombination of elements. This represented a "fluid-continuum" perspective on the world, asserting that nature lacks minimal indivisible units and consists solely of infinitely subdivisible variations.

The "element-continuum view" was not unique to ancient Greece. In ancient China, there existed the "Five Elements Theory" (metal, wood, water, fire, earth), while in India, a similar system of "earth, water, fire, air, and space" as the five great elements prevailed. These theoretical systems interpreted the world as constituted by a few basic, continuous, and miscible substances, with their essence being a "continuum."

This line of thought profoundly influenced the development of physics. For example, the concept of fields can be regarded as a modern extension of the "continuum view": electromagnetic fields, gravitational fields, and quantum fields all describe physical quantities that are continuously distributed and vary smoothly across space. The emergence of quantum theory can be seen as a synthesis and continuation of the "continuum-discreteness" dichotomy in modern science. Today, scientists have evolved the foundational description of the physical world from "merely particles" or "merely continuous fields" to a unified portrayal that incorporates both discreteness and continuity, integrated within the framework of Quantum Field Theory (QFT).

4. Attitudes of Ancient Greek Scholars Toward Atomism and the "Continuum" View

Although atomism came to be widely celebrated in later eras and is regarded as a precursor to modern scientific thought, within the ancient Greek academic community the dominant position was actually held by the continuum view of matter. Mainstream scholars, represented by Aristotle, generally maintained that matter is infinitely divisible and that the world lacks absolute minimal indivisible units. They viewed space and matter as essentially seamless, fluid entities capable of arbitrary subdivision—a concept akin to what we today term a "continuous medium."

In works such as Physics and Metaphysics, Aristotle systematically expounded the principle of "infinite divisibility" and denied the existence of atoms and void. He argued that motion and change require the continuity of space and matter to ensure coherence; otherwise, the functioning of the world would encounter unacceptable "ruptures" and "voids." This perspective spread widely in the Mediterranean world, becoming the mainstream in Greco-Roman and medieval European natural science and philosophy.

This explains why the views of prominent ancient Greek atomists like Democritus and Leucippus failed to gain mainstream acceptance in Greece itself for an extended period, only to be "rediscovered" and exalted in later scientific history. In contrast, the notion that matter behaves like a flowing stream—capable of division into arbitrarily small parts while retaining its "continuity"—was deemed more aligned with "common sense" at the time.

The influence of this continuum view was extraordinarily profound, shaping Western conceptions of the universe's essence. It extended into the 17th-century Newtonian view of continuous space and the rise of "continuous medium theory" in the 19th century. Even in the early 20th century, prior to the quantum revolution, scientists generally assumed that all physical quantities should vary within a continuous and smooth background—the idea of discreteness in "quanta" was initially regarded by the mainstream scientific community as a temporary, even heretical, expedient.

Throughout the history of physics, this deeply ingrained "continuum-infinite divisibility" conception subjected theories of "quantization" and "discrete minimal units" to considerable skepticism. After over two millennia of intense intellectual debate, modern physics has sought to reconcile "continuity and discreteness"—that is, recognizing that the world simultaneously possesses both continuous and discrete essential attributes, manifesting in different ways across varying scales and phenomena.

5. Planck's Energy Quantum

At the dawn of the 20th century, physics was entering a pivotal phase of transformation. Toward the end of the 19th century, experiments on black-body radiation revealed energy distributions that defied explanation, particularly in the high-frequency region, leading to the "ultraviolet catastrophe"—classical (Rayleigh–Jeans) theory predicted that the radiated energy density should grow without bound as frequency increases, which starkly contradicted experimental observations.

In 1900, Max Planck proposed a revolutionary hypothesis: the energy radiated outward by a black body is not released continuously but only in the form of minimal, indivisible "energy units"—termed "energy quanta"—with each quantum having a magnitude of E = hν (where h is Planck's constant and ν is the radiation frequency).

Mathematically, the total energy of the black body becomes a discrete sum of integer multiples of the quantum hν, with the "energy level spacing" hν embodying the discreteness of energy transitions.

Physically, this implies that energy flow is no longer akin to a continuous trickle of water but rather occurs in "discrete packets" exchanged among entities. This approach shattered the prevailing notion of "continuous energy variation," marking the inception of quantum theory.
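As a minimal numeric sketch of the relation E = hν and of energy arriving only in integer multiples of hν (the frequency below is an illustrative choice, not a value from the text):

```python
# A rough numeric sketch of Planck's relation E = h*nu (SI units).
h = 6.62607015e-34   # Planck's constant, J*s (exact by the 2019 SI definition)
nu = 5.0e14          # an illustrative optical frequency, Hz (not from the text)

E = h * nu           # energy of a single quantum, in joules
print(f"one quantum at {nu:.1e} Hz carries E = {E:.3e} J")

# Exchanged energy comes only in integer multiples n*h*nu:
for n in range(4):
    print(f"n = {n}:  E_n = {n * h * nu:.3e} J")
```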

Notably, Planck himself initially regarded the "energy quantum" merely as a mathematical assumption and did not believe that energy was truly transmitted in discrete blocks. He inclined toward viewing it as a "statistical tool" necessitated for explaining experimental results. It was only several years later, when Albert Einstein further endowed quanta with physical reality, that this perspective gradually gained acceptance in the physics community.

6. Einstein's Light Quantum

In 1905, Albert Einstein, addressing the photoelectric effect experiment, proposed the light quantum hypothesis. He argued that light is not, as posited by classical electromagnetic wave theory, a continuous flow of oscillatory energy, but rather consists of discrete, indivisible "energy packets" (termed light quanta, later known as "photons"). The energy of each photon is given by E = hν (where h is Planck's constant and ν is the frequency), aligning precisely with the "energy quantum" in Planck's black-body radiation framework.

Einstein successfully explained the following experimental conundrums using this idea:

When light illuminates a metal surface, electrons are ejected only if the frequency exceeds a certain threshold, and the maximum kinetic energy of the ejected electrons increases solely with the light's frequency, independent of its intensity—this contradicts the expectations of pure wave theory.

Light of any frequency arrives as discrete packets, each "striking" the metal individually; an electron that absorbs the full energy of a single photon can overcome the binding forces and escape the metal.

Physically, Einstein conceived of light as sometimes exhibiting wave-like behavior and at other times resembling a "shower of photons"—a stream of countless "particles" that deliver energy in granular form to objects. What appears macroscopically as continuous light is, in reality, a particle flux, fundamentally overturning prevailing notions of energy, waves, and particles.

Einstein's contribution lay in elevating quanta from mere mathematical expedients for "energy exchange" to entities with physical reality. Every interaction between light and matter occurs via these "indivisible packets," embodying "light quantum realism."

In contrast to Planck: Einstein explicitly maintained that quanta constitute the essence of light, asserting that light is truly composed of "particles," and formulated the photoelectric effect equation accordingly. This not only established the experimental foundation for quantum mechanics but also prompted subsequent validations of fundamental quantum phenomena, such as wave-particle duality.
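A minimal numeric sketch of that photoelectric relation, KE_max = hν − W, under an illustrative work function (roughly that of sodium; neither the metal nor the frequencies are taken from the text):

```python
# A sketch of the photoelectric relation KE_max = h*nu - W.
# The work function below is illustrative (roughly sodium's), not from the text.
h = 6.62607015e-34    # Planck's constant, J*s
eV = 1.602176634e-19  # joules per electronvolt

W = 2.3 * eV          # work function of the metal (illustrative)
nu_threshold = W / h  # below this frequency, no electrons are ejected

for nu in (4.0e14, 6.0e14, 8.0e14):   # illustrative light frequencies, Hz
    ke = h * nu - W
    if ke > 0:
        print(f"nu = {nu:.1e} Hz: ejected, KE_max = {ke / eV:.2f} eV")
    else:
        print(f"nu = {nu:.1e} Hz: below threshold ({nu_threshold:.2e} Hz), no emission")
```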

Impact: Einstein's light quantum concept serves as the origin of the modern photon notion, directly advancing developments in lasers, quantum communication, quantum computing, and related applications. It also laid the groundwork for a century-long philosophical debate on "wave-particle duality" in quantum physics.

7. The Debate Between Planck and Einstein on the Light Quantum Concept

During the nascent stages of quantum theory, although the "energy quantum" elegantly resolved the black-body radiation puzzle, the two foundational figures—Max Planck and Albert Einstein—held profound disagreements regarding the essence of "quanta." These divergences extended beyond physical models to encompass underlying philosophical tensions.

Planck's Position: When proposing the energy quantum hypothesis, Planck consistently regarded it as a mathematical tool devised solely to explain specific experimental phenomena (such as black-body radiation), without implying that energy must be granular and indivisible in all contexts. He did not endorse Einstein's assertion that the essence of light consists of discrete "light quanta (photons)," instead favoring classical wave theory and maintaining reservations about the reality of energy quanta. It was only after subsequent experiments (e.g., the Compton effect) confirmed photon reality that this view gradually gained acceptance in the scientific community.

Einstein's Position: Einstein resolutely maintained that the nature of light comprises minimal energy units as "photons," with light-matter interactions explicable only through quantized energy "granules." He successfully applied this perspective to explain the photoelectric effect and predicted additional quantum phenomena, playing a pivotal role in advancing modern quantum theory.

Essence of the Debate and Later Reflections: The core disagreement centered on whether quanta represent instrumental mathematical assumptions or the ontological structure of the physical world—Planck leaned toward the former, Einstein the latter. Notably, although Einstein initiated the modern era of quantum physics with his light quantum hypothesis, he expressed confusion and ambivalence about the photon concept in his later years. As quantum mechanics evolved rapidly, the physics community no longer viewed light particles merely as simple microscopic "spheres" but incorporated more abstract and counterintuitive properties, such as wave-particle duality, probability amplitudes, and uncertainty. Adhering to realism, Einstein harbored deep suspicions toward quantum mechanics' non-locality and uncertainty principle. He even publicly stated: "If it were not absolutely necessary, I would never have proposed the concept of the photon." He critiqued the mainstream Copenhagen school's "wave-particle duality" interpretation as overly enigmatic, relying on probability and statistics on one hand while occasionally invoking "real particles" on the other; this theoretical hybrid left many physicists, including Einstein, perpetually uneasy.

Debates Often Overlooked in Popular Science and Textbooks: Contemporary textbooks and popular science literature frequently gloss over the matter with phrases like "Planck proposed the quantum hypothesis, and Einstein introduced the light quantum," without emphasizing that Planck himself never accepted Einstein's realist view of light quanta and explicitly rejected it on multiple occasions in public forums and academic debates. He long maintained that the wave theory of light would not be overturned and that energy quanta were merely an empirical expedient. It was not until the 1920s, with the establishment of modern quantum mechanics and experiments such as the Compton effect demonstrating photon momentum, that the perspective of mainstream physicists gradually shifted.

Insights and Reflections: This debate illustrates that scientific truth does not proceed along a smooth path with rapid consensus; even the originators of major concepts, upon deeper deliberation, may become increasingly perplexed. Revealing these disagreements and steadfast positions aids in understanding the complex mechanisms of scientific theory formation and their profound philosophical implications. It also enables the public and students to penetrate simplified narratives, confront the enlightening conflicts of great minds, and inherit a spirit of skepticism, rather than uncritically accepting purported "mainstream views" that may not even exist.

8. Several Mathematical Definitions of the Quantum, and the Lack of a Definition of "Quantum" in Textbooks

Remarkably, despite "quantum" having become one of the most central foundational concepts in modern physics, most prevailing undergraduate or graduate physics textbooks offer scant clear and systematic elucidation of what "quantum" precisely entails or how it should be rigorously defined. Typically, textbooks address it only in introductory or historical overview sections, referencing the ancient Greek meaning of "atomos" or Max Planck's 1900 "energy quantum" proposal, often stopping at superficial mentions without providing comprehensive definitions or substantive discussions of its physical and philosophical implications. Such content is largely historical: "'Quantum' originally refers to the smallest portion," followed immediately by formulas and theoretical derivations, devoid of argumentation, commentary, or explanation.

This phenomenon warrants significant attention. On one hand, the term "quantum" is deeply embedded in various theoretical structures and disciplinary systems; on the other, textbooks seem to tacitly assume that its meaning is "self-evident," requiring no dedicated clarification across fields. Consequently, beginners struggle to develop a comprehensive understanding of "quantum" as a physical entity, mathematical expression, and philosophical construct. Many may erroneously perceive "quantum" merely as "the smallest energy packet" or a generic term for all minimal units, failing to grasp its specific meanings, conditions, and scopes of applicability in different physical theories.

In fact, the quantum—or, more precisely, the quantum state—does possess clear and rigorous mathematical definitions within physical theory frameworks, which textbooks should introduce and emphasize as essential knowledge. Strictly speaking, in modern physical theory, the "quantum" is captured by the following standard mathematical representations:

State Vector Representation—The quantum state as a vector |ψ⟩ in a Hilbert space H.

Integral Representation—Described by the wave function ψ(x) and its modulus squared |ψ(x)|² in position space, characterizing the "probability amplitude density" and the probability distribution.

Summation Representation—Superposition of discrete eigenstates, ∑_n c_n |n⟩, embodying the "state superposition essence" of quanta.

Density (Reduced) Matrix Representation—Mathematical encapsulation of pure or mixed states: ρ = |ψ⟩⟨ψ| for a pure state, or a general density operator ρ.

Operator Eigenvalue/Eigenstate Definition—Physical properties of the system defined based on the eigenvalues and eigenstates of corresponding operators.

These mathematical definitions directly delineate the essential connotations of "quantum": it is not a concrete small particle but a superposition state of probability amplitudes (complex amplitudes). Each specific realization and mathematical expression reveals the ontological status and cognitive function of "quantum" in modern physical theory. Only by transcending historical introductions or crude analogies and delving into these structured, formulaic expressions can one truly comprehend what constitutes a "quantum" and why it so profoundly subverts human intuitive perceptions of the world, scientific paradigms, and philosophical presuppositions.
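As a minimal concrete sketch of these representations for a two-level system (the particular state and the Pauli-Z observable are illustrative choices, not taken from the text):

```python
# A minimal numpy sketch of the representations listed above for a
# two-level system; the particular state is an illustrative choice.
import numpy as np

# State-vector representation: |psi> as a vector in a Hilbert space.
psi = np.array([0.6, 0.8j])          # normalized: 0.36 + 0.64 = 1

# Born rule: |c_n|^2 is the probability of finding each eigenstate.
print("probabilities:", np.abs(psi) ** 2)        # [0.36, 0.64]

# Density-matrix representation of the same pure state: rho = |psi><psi|.
rho = np.outer(psi, psi.conj())
print("tr(rho)   =", np.trace(rho).real)         # 1.0 (normalization)
print("tr(rho^2) =", np.trace(rho @ rho).real)   # 1.0 exactly iff the state is pure

# Operator eigenvalue/eigenstate definition: measured values are the
# eigenvalues of a Hermitian operator, here the Pauli-Z matrix.
Z = np.diag([1.0, -1.0])
print("eigenvalues of Z:", np.linalg.eigvalsh(Z))   # [-1, 1]
print("<Z> =", (psi.conj() @ Z @ psi).real)         # 0.36 - 0.64 = -0.28
```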

Textbooks' neglect of quantum definitions and related controversies bears considerable responsibility for societal misconceptions and misuses of the quantum concept. Many textbooks merely recount the historical origins and basic formulas of quanta without providing systematic scientific definitions for the core question of "what is a quantum," nor elucidating the long-standing internal debates and diverse interpretations within the physics community. This leads to superficial, fragmented, or even erroneous understandings among numerous readers and some professionals. Such "ambiguous treatment" in pedagogy diminishes the professional depth of science education and indirectly fosters confusion and overgeneralization of the quantum concept in societal cognition, contributing to the proliferation of pseudoscientific phenomena like "quantum wellness" or "quantum medicine." Meanwhile, the genuine essence of rigorous quantum theory—probability amplitudes, state superposition, the uncertainty principle, etc.—remains poorly understood and disseminated. Thus, textbooks' avoidance of definitions and disputes not only undermines educational rigor but also exerts negative consequences on scientific popularization and rational discourse.

Nevertheless, human understanding of the "quantum" concept has undergone an evolutionary process, making it challenging for textbooks to elucidate it clearly.

9. According to Mathematical Definitions, a Quantum is a Collection or Superposition of Probability Amplitudes

From the various mathematical definitions outlined above (state vector, wave function, summation, density matrix, etc.), it is evident that a "quantum" is by no means a microscopic particle-like entity but rather a mathematical ensemble of system possibilities (i.e., all observable outcomes), composed jointly of "probability amplitudes" (complex amplitudes).

Within the Hilbert space framework, any quantum state |ψ⟩ can be expressed as a linear superposition of observable eigenstates: |ψ⟩ = ∑_n c_n |n⟩, where c_n represents the "probability amplitude" corresponding to each eigenstate, and its modulus squared |c_n|² denotes the probability of measuring that eigenstate.

In the case of continuous observables (such as a particle's position), |ψ⟩ is represented by the wave function ψ(x), which similarly constitutes a "superposition of amplitudes" across all possible components.
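A brief sketch of the Born rule |c_n|² as measurement statistics, repeatedly "measuring" the same superposition and tallying outcome frequencies (the amplitudes are illustrative):

```python
# Sampling measurement outcomes of a superposition |psi> = sum_n c_n |n>.
import numpy as np

rng = np.random.default_rng(0)

c = np.array([0.5, 0.5j, np.sqrt(0.5)])   # illustrative normalized amplitudes
probs = np.abs(c) ** 2                    # Born rule: [0.25, 0.25, 0.5]

# Each measurement yields exactly one eigenstate |n>, with probability |c_n|^2.
outcomes = rng.choice(len(c), size=100_000, p=probs)
print("theory:     ", probs)
print("frequencies:", np.bincount(outcomes) / len(outcomes))
```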

Intuitive Physical Imagery:

One may analogize a "quantum" to a complex harmony—not a single note, but an intricate superposition of multiple tones, where only upon "observation" does one perceive a specific melody, while prior to observation, it exists as an entangled whole of diverse harmonies. This corresponds to the wave picture, with eigenstates akin to pitches.

In general, the states describable for a particle (position, momentum, energy, etc.) are not determinate but "simultaneously encompass" all possible states—this is the essence of "quantum superposition" and the "collection of probability amplitudes."

This complex superposition implies that, until the moment of observation, outcomes cannot be predicted, and only upon measurement does it "collapse" into one localized state. It is akin to different listeners perceiving varied interpretations from the same sound, or detecting overtones.

This fundamentally subverts the classical worldview: the notion of "state = absolute attributes of physical entities" is supplanted by "state = weighted sum of all possibilities."

Quantum objects do not adhere to simple "either-or" (mutually exclusive) choices but reside in a peculiar "both-and" phase prior to observation, encompassing multiple states concurrently.

At its essence, a "quantum" is the collection and interplay of all possibilities constituted by probability amplitudes—a manifestation distinctly different from naive particles or classical probabilities.

10. Schrödinger's Debate on Quantum Superposition: "Both-And" or "Either-Or"?

Schrödinger's classic discourse on quantum superposition profoundly addresses the "essence of quanta"—through his thought experiment known as "Schrödinger's cat," he highlighted the fundamental distinction between quantum superposition and classical probability, although his original intent was to critique Bohr's interpretation, which is now regarded as mainstream.

Classical probability implies uncertainty arising from "our ignorance" of which state is actual—for instance, in coin tossing, the outcome is inherently heads or tails prior to landing, with our lack of knowledge being the source of uncertainty; this exemplifies the classical "either-or" (mutually exclusive) paradigm.

Quantum probability amplitudes, however, make the revolutionary assertion that, prior to observation, the system genuinely exists in a superposition of all possibilities, with each possibility authentically present in the physical description. This constitutes a true "both-and" state, rather than the traditional "either-or."

The "both alive and dead" state of Schrödinger's cat illustrates the "super-classical" character of quantum states: it is not a fuzzy ambiguity of a single entity but a mathematical multiplicity in superposition, which only "collapses" into a unique classical reality upon measurement (per the Copenhagen interpretation).

More profoundly: the superposition of quantum probability amplitudes differs essentially from ordinary probabilistic mixtures (such as lotteries or dice rolls):

Probability amplitudes possess not only magnitude but also phase (complex numbers), enabling interference among them—this is the root of the peculiarity in phenomena like the double-slit experiment.

When two possibilities superpose, they can either enhance or completely cancel probabilities, a "quantum interference" incomprehensible in classical probability.

Only when we measure the system—posing the question "which one is it?"—does this superposition "forcibly" project onto a single answer in the classical world.
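A minimal sketch of the contrast just described: amplitudes add with phase and can cancel, whereas classical probabilities merely add (the path amplitudes and phases are illustrative, in the spirit of the double-slit discussion above):

```python
# Amplitude interference vs. classical probability addition.
import numpy as np

a1 = 1 / np.sqrt(2)                      # amplitude through path (slit) 1
for phase in (0.0, np.pi / 2, np.pi):    # relative phase between the two paths
    a2 = np.exp(1j * phase) / np.sqrt(2) # amplitude through path 2

    p_quantum = abs(a1 + a2) ** 2              # add amplitudes, then square
    p_classical = abs(a1) ** 2 + abs(a2) ** 2  # add probabilities directly

    print(f"phase {phase:4.2f}: quantum {p_quantum:.2f}, classical {p_classical:.2f}")

# phase 0.00 -> quantum 2.00 (constructive), classical 1.00
# phase 3.14 -> quantum 0.00 (complete cancellation), classical 1.00
# (quantum values are relative intensities at one detector position)
```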

Thus, the core mystery of quanta resides in the fact that—

"Prior to measurement, the world is not 'either-or' but 'both-and'," this-and-that, coexisting simultaneously.

This ontological multiplicity in superposition represents the most distinctive, counterintuitive, and wondrous aspect of quantum physics.

However, it is important to remember that this is merely the perspective of the Copenhagen school.

11. The Uniqueness of Probability Amplitudes: Neither Physical Quantities Nor Concepts Shared by Other Disciplines

In the mathematical framework of quantum mechanics, the "probability amplitude" is an extraordinarily distinctive and central concept. This notion is entirely distinct from "probability density" or "distribution" in classical probability and statistics, with its uniqueness manifesting in several key aspects:

First, a probability amplitude is a complex number, rather than a mere real-valued probability. It possesses both magnitude and phase. The probability amplitude itself lacks classical physical significance and carries no directly measurable unit (the discrete amplitudes c_n, for instance, are pure dimensionless numbers). For example, in the wave function ψ(x), the magnitude and phase of the amplitude determine the full dynamics and physical implications of the quantum, yet ψ(x) cannot be observed or "directly measured" by any instrument; it serves merely as an abstract "direction" in the theoretical description space. Only when computing |ψ(x)|² (the modulus squared of the probability amplitude) does it connect to the real world, determining the probability of finding the particle at position x.

Probability amplitudes are not physical quantities. They differ fundamentally from "measurable, observable" real physical quantities in classical physics, such as force, energy, or electric fields. Probability amplitudes cannot yield experimental readings or independently influence apparatus; they function solely as mathematical tools for describing "quantum superposition processes," bridging theory and experiment.
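A short sketch of this point: the amplitude itself is not an instrument reading, so multiplying a state by a global phase changes every amplitude yet leaves all measurable probabilities untouched (the state and phase below are illustrative):

```python
# Global phase invariance: amplitudes change, probabilities do not.
import numpy as np

psi = np.array([0.6, 0.8])          # illustrative normalized state
psi_rotated = np.exp(1.234j) * psi  # same state, arbitrary global phase

print("amplitudes differ:  ", psi, psi_rotated)
print("probabilities agree:", np.abs(psi) ** 2, np.abs(psi_rotated) ** 2)
```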

Second, this concept finds no equivalents in other domains of the natural sciences. The structure and interference properties of probability amplitudes are meaningful only within quantum mechanics and related fields. In statistics, biology, classical mechanics, economics, and other disciplines, probabilities merely represent "the uncertainty of a particular outcome occurring," with no notion of "interference" and no description via complex amplitudes. The worldview of probability amplitudes implies that only their superposition—including coherence and interference—can accurately describe microscopic physical behavior, in stark contrast to macroscopic statistics, which "merely add probabilities, not probability amplitudes."

Furthermore, since probability amplitudes themselves are not physical quantities (merely mathematical tools), the "quantum states" they describe are likewise not physical realities in an ontological sense. They merely express, within the theory, the full spectrum of possibilities prior to observation, serving as "epistemological" constructs for human cognition, characterization, and prediction of the quantum world, rather than some independently "existing" entities in the universe.

Many physicists and textbooks often mislead readers by treating quantum states as physical realities per se, equating probability amplitudes with some "real component" of actuality. In truth, quantum states and probability amplitudes represent merely the mathematical structure of all possibility distributions; prior to observational acts, they do not imply "the object is thus." Only after observation does the probability amplitude collapse into probabilities, linking to actual physical quantities.

This constitutes one of the profound paradoxes of quantum mechanics: we describe the world using non-physical quantities (probability amplitudes, state vectors); yet our experimental outcomes yield only real measurements of probabilities and physical quantities themselves.

12. Quantum States Are Not Ontological Realities, But Epistemological Tools

Through the essential analysis of probability amplitudes and quantum states, we can discern that quantum states are not independently existing "ontological realities" in the physical world but rather cognitive tools employed by humans to describe and predict physical phenomena. This insight stands at the forefront of theoretical physics and philosophy of science, representing one of the most contentious yet essential aspects of quantum mechanics' development that requires clarification.

Why is this the case? On one hand, quantum states—whether represented by wave functions, state vectors, or density matrices—are not directly observable and cannot yield direct numerical readings from experimental instruments. Only through measurement or "interaction" with experiments do the "all possibilities" delineated by the quantum state "collapse" into a singular real outcome. This "space of possibilities" described by probability amplitudes constitutes merely our comprehensive cognition of nature prior to observation, without implying that the world is inherently composed of state vectors or wave functions.

On the other hand, for an extended period, certain physicists, textbooks, and even popular science works have tended to conflate quantum states with fundamental particles as "physical entities," as if quantum states exist objectively like spheres, ripples, or energy blocks. This is, in fact, a conceptual error that confuses physical representations with physical realities. Some have even boldly published articles in journals discussing "the collapse speed of wave functions." In reality, different experimental setups and measurement methods influence "which quantum state describes the system," underscoring that quantum states are "cognitive frameworks" infused with observer perspectives, rather than absolute objective entities existing independently of cognition and observation.

This recognition has long been overlooked in the history of quantum theory development, not only due to intuitive habits but also because of humanity's ontological presuppositions about the world—we persistently attempt to grasp nature through "what things are," yet quantum theory reminds us that sometimes "what things are" is not the sole imperative; more crucial is "how we describe and predict." As abstract representations for cognition and prediction, quantum states herald the "epistemological turn" in modern science.

Summary

Quanta are not tangible "minimal objects" that can be grasped; in modern physics, they manifest as abstract structures of "collections of probability amplitudes" and "superpositions of possibilities."

Historically, the concept of quanta originated from ancient Greek atomism and ideas of energy discreteness, but its truly revolutionary development occurred through theoretical breakthroughs by Planck, Einstein, and later figures such as Schrödinger and Heisenberg.

Textbooks often evade the essential definition of "quanta," citing only historical descriptions without commentary or argumentation, whereas in modern theory, quantum (states) possess rigorous mathematical expressions and multifaceted structures.

Probability amplitudes and their superpositions introduce novel logic and imagery for describing the world. However, it is imperative to acknowledge that these are merely cognitive tools we employ to characterize nature and predict experimental outcomes, rather than unassailable "real" ontologies inherent to nature itself.

The profundity of quantum physics lies in its subversion of the classical physical notion of "entity realities" and its guidance toward reflecting on the fundamental structure of "scientific knowledge"—the world sometimes cannot be reduced to independently existing "things" but can only be characterized by us through abstract structures such as probability amplitudes and state vectors.

The essence of quanta represents the ultimate embodiment of discreteness and probabilism in the coupling of theory and experiment within nature, serving as a crystallization of modern scientific thought that transcends boundaries between ontology and epistemology, reality and representation. A genuine understanding of the physical imagery of quanta necessitates integrating three perspectives: mathematical structure, experimental facts, and philosophical reflection, transcending the limited imaginative framework of "particle-wave-object" to approach the deeper principles of the world revealed by modern physics.

It is worth noting that the quantum attributes summarized in this article were not preconceived or meticulously designed at the outset of theoretical construction. On the contrary, the development of quantum theory was characterized by high degrees of exploration and practicality. Initially, scientists merely attempted various levels of hypotheses and inductions (such as energy quanta and photons) to address experimental phenomena inexplicable by classical physics, but these hypotheses were incomplete and did not foresee the full structure and features of subsequent quantum theory.

As understanding deepened regarding atomic structure, spectral lines, physical measurement processes, and other phenomena, scientists gradually constructed a quantum mathematical system encompassing probability amplitudes, superposition, operators, eigenvalues, commutation relations, and more. In the course of practical applications, to resolve new physical problems and develop novel computational methods, the specific expressions of the theory continually evolved and refined.

Only after these methods and theoretical constructs matured through extensive experimental validation and mathematical logical deduction could we retrospectively, at a higher level of abstraction, summarize and distill "what constitutes quantum attributes" from the established mathematical expressions. In other words, so-called "quantum attributes"—such as probability amplitudes, superposition states, uncertainty, identical particles, non-locality, etc.—were largely "discovered along the way," without any a priori philosophical or planned design in the early stages.

Thus, the quantum essence we enumerate and comprehend today is the outcome of prolonged theoretical evolution, practical feedback, and continual summarization. This "post hoc induction" path of disciplinary development reflects the open spirit of scientific exploration while reminding us that the complete connotations of the quantum world are far from being pre-planned or fixed in one go; rather, they gradually emerge through ongoing collisions, applications, and reflections.