The Imprint of Reality on Structure

Spyros Tserkis

Essay published on September 20, 2025

It is through direct or indirect perception of physical phenomena that human beings access the world around them. The regularities and patterns that emerge from observation call for a systematic description, which in turn requires the introduction of concepts sufficiently abstract to be embedded within a consistent theoretical framework. While the reality of percepts can readily be affirmed on an empirical basis, the reality of concepts poses an ontological problem that dates back to the origins of philosophy.

The view that refrains from attributing any reality to scientific concepts and their relations is known in contemporary philosophy as instrumentalism. According to this view, scientific theories function as predictive instruments rather than as descriptions of reality. In contrast, realism (short for scientific realism) holds that scientific concepts such as mass, force, and charge, together with the relations they enter into, correspond to reality or at least provide a close approximation of it.

Instrumentalism, due to its agnostic character, remains a safe position as long as questions about the ontological status of scientific concepts are set aside. Realism, on the other hand, faces significant challenges, particularly in light of developments in modern physics such as quantum mechanics, the branch of physics concerned with the subatomic world, and cosmology, which studies the nature and evolution of the universe at the largest scales.

The aim of this essay is to address the limitations that realism faces and to defend an alternative perspective, known as structural realism (short for structural scientific realism). Structural realism, a term coined by John Worrall,1 holds that reality manifests itself through the mathematical relations among scientific concepts. An early formulation of this view appears in the writings of Henri Poincaré, who argued that:2

Science [...] is a system of relations. [...] it is in the relations alone that objectivity must be sought; it would be vain to seek it in beings considered as isolated from one another.

To the question of whether science reveals the true nature of things, Poincaré answered:2

Not only science cannot teach us the nature of things; but nothing is capable of teaching it to us.

This essay focuses on this version of structural realism, according to which there is a barrier to accessing reality beyond its structure, whereas another version claims that reality itself is nothing but structure.

A concept that plays a central role in quantum mechanics is complementarity. In its broader sense, complementarity refers to a trade-off in the compatibility between two measured properties, which, in its most pronounced form, results in their mutual exclusivity. Complementarity is clearly demonstrated in photography: adjusting the aperture of a camera lens, while keeping all other settings constant, creates a trade-off between brightness and depth of field (illustrated in Figure 1). A smaller aperture allows less light to reach the sensor, resulting in a darker image but with greater depth of field, meaning that objects at varying distances remain in focus. Conversely, a wider aperture lets in more light, producing a brighter image but reducing the depth of field, so only a narrow range remains sharp while the rest becomes blurred. It should be emphasized that complementarity is not a consequence of imperfect measurements, but a fundamental constraint rooted in the structure of the system.

Figure 1. Three photos of the same composition taken with different aperture settings, while all other parameters remain constant (focal length: 50mm, exposure time: 1/100s, ISO: 100).

Complementarity in photography does not raise any metaphysical concerns, as it does not apply to individual pixels but to the image they collectively form. The situation becomes conceptually troubling, however, when complementarity arises in the context of properties attributed to single physical entities. This is precisely the case with photons, the fundamental constituents of light.

The ability of a photon to pass through a specific type of filter, called a polarizer, is associated with a property called polarization. A polarizer can be oriented in any direction. For instance, if a polarizer is oriented vertically and a photon passes through it, the photon is said to be vertically polarized, and analogously for any other orientation. When a photon is known to be polarized in a certain orientation, such as vertically, and is then sent through a polarizer oriented orthogonally, i.e., horizontally, the photon is blocked (as illustrated in the left panel of Figure 2).a Interestingly, if a vertically polarized photon is sent through a polarizer tilted at 45 degrees (as illustrated in the right panel of Figure 2), it passes through with 50% probability, and if the same photon is subsequently sent through a vertical polarizer, it again passes through with 50% probability. This behavior indicates that the property associated with the binary distinction between vertical and horizontal outcomes is complementary to the one associated with the diagonal outcomes, meaning that precise knowledge of one entails maximum uncertainty about the other. This suggests that the polarizer does not passively reveal a pre-existing polarization, but, counterintuitively, that the outcome depends on the orientation of the polarizer as well.

Figure 2. Two polarizers placed with orthogonal (left) and complementary (right) orientations along a photon's path.
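
To make the probabilities above concrete: the standard single-photon rule is that a photon polarized at an angle θ relative to a polarizer's axis passes with probability cos²θ. The minimal Python sketch below (the function name is illustrative and not part of the essay) reproduces the three cases just described, with 0 degrees taken as vertical:

```python
import numpy as np

# Transmission probability of a single photon through a polarizer,
# where theta is the angle between the photon's polarization and the
# polarizer's axis (the single-photon form of Malus' law):
# P(pass) = cos^2(theta).
def pass_probability(theta_deg):
    return np.cos(np.radians(theta_deg)) ** 2

print(pass_probability(0))    # vertical photon, vertical polarizer   -> 1.0
print(pass_probability(90))   # vertical photon, horizontal polarizer -> ~0.0
print(pass_probability(45))   # vertical photon, 45-degree polarizer  -> 0.5
```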

The existence of complementary properties within a single fundamental particle, such as a photon, challenges traditional realism, which assumes that fundamental entities possess definite and intrinsic properties. If two properties of an entity cannot be independently measured with certainty, can both truly correspond to real properties? In 1935, a seminal paper by Albert Einstein, Boris Podolsky, and Nathan Rosen, hereafter referred to as EPR, explored this issue in depth, proposing the following sufficient criterion for a property (or quantity, as they refer to it) to be considered an element of reality:3

If, without in any way disturbing a system, we can predict with certainty the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity.

This criterion raises doubts about whether both complementary properties can be elements of reality, although the situation becomes more nuanced once entanglement is taken into account. Entanglement between two (or more) particles is the phenomenon in which measurement outcomes are found to be correlated across different choices of measurement parameters that correspond to complementary properties (see Figure 3 for a schematic representation). For example, the polarizations of two entangled photons can be perfectly correlated when both photons are measured in the vertical orientation, meaning that either both pass through the vertical polarizer or both are blocked, but they are also perfectly correlated when they are measured in a complementary orientation, such as at 45 degrees. This behavior would not occur with two photons that were explicitly prepared to be vertically polarized, since in that case they would produce correlations only in the vertical orientation, while measurements in the complementary orientations would yield completely random results.

Figure 3. Schematic of two systems, A and B, each containing a particle. For each particle, a measurement parameter is selected that corresponds to one of two available complementary properties. For entangled particles, correlations exist between the outcomes.
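
As a rough numerical sketch of the correlations just described, the code below assumes the pair is prepared in the standard polarization-entangled state (|HH⟩ + |VV⟩)/√2 and compares it with two separately prepared vertical photons; the specific state and the helper functions are illustrative assumptions rather than details drawn from the essay:

```python
import numpy as np

# Polarization basis: H = [1, 0], V = [0, 1]; 0 degrees denotes vertical.
def pass_projector(theta_deg):
    t = np.radians(theta_deg)
    v = np.array([np.sin(t), np.cos(t)])  # linear polarization at angle theta
    return np.outer(v, v)

def same_outcome_probability(state, theta_deg):
    # Probability that both photons give the same result (both pass or
    # both are blocked) when each meets a polarizer at the same angle.
    P = pass_projector(theta_deg)
    Q = np.eye(2) - P
    return state @ np.kron(P, P) @ state + state @ np.kron(Q, Q) @ state

H, V = np.array([1.0, 0.0]), np.array([0.0, 1.0])
entangled = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2)  # assumed entangled state
product = np.kron(V, V)                                   # two separately vertical photons

for theta in (0, 45):
    print(theta,
          round(same_outcome_probability(entangled, theta), 3),
          round(same_outcome_probability(product, theta), 3))
# entangled pair: same outcome with probability 1.0 at both 0 and 45 degrees
# separately prepared pair: 1.0 at 0 degrees, but only 0.5 at 45 degrees
```

In this sketch the entangled pair gives identical outcomes in both the vertical and the 45-degree orientations, whereas the separately prepared pair correlates only in the vertical one, matching the behavior described above.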

By considering this non-trivial correlation, EPR argued that the outcome of a measurement on a distant particle could be predicted with certainty, and without disturbance, by measuring the corresponding property of its entangled counterpart. Then, relying on counterfactual reasoning (the assumption that the outcomes of mutually incompatible measurements can be meaningfully considered), they argued that both properties could in principle be determined without any uncertainty relation arising between them, thereby contradicting their complementary nature. This led them to conclude that quantum mechanics must be an incomplete theory, a conclusion that initiated a long-standing debate that continues to this day.

In 1964, John Bell took EPR’s objections seriously and suggested the following setup.4 Two particles are spatially separated, and for each, an experimental parameter is freely chosen to determine which of two complementary properties will be measured (as illustrated in Figure 3). The procedure is repeated with different choices of measurement parameters, each time measuring identically prepared pairs of particles. It should be noted that particles are described as identically prepared rather than identical, since there is no way of verifying in advance that they are exactly the same. Seeking to test the completeness of quantum mechanics, Bell considered the possibility that the system under investigation possesses additional properties, called hidden variables, whose existence would supply the missing information needed to render the theory complete. Relying on counterfactual reasoning, Bell derived the maximum value of an expression involving all four possible correlations for two measurement parameters on each side. The purpose of Bell’s derivation was to construct a testable criterion that was theoretically expected to be violated by entangled pairs of particles. This expectation was experimentally confirmed in the 1980s and led to the 2022 Nobel Prize in Physics for the scientists involved. Given this violation, at least one of the assumptions used in the derivation must be incorrect.b
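
For concreteness, the most widely tested form of such an expression is the CHSH combination of the four correlations, a later variant of Bell's original criterion. The sketch below uses the standard quantum-mechanical prediction for polarization-entangled photons, E(a, b) = cos 2(a - b), together with a conventional choice of angles; both are textbook assumptions rather than details given in the essay:

```python
import numpy as np

# Quantum prediction for the correlation between the two outcomes when
# polarization-entangled photons (state (|HH> + |VV>)/sqrt(2), assumed
# here) are measured with polarizers at angles a and b:
# E(a, b) = cos(2(a - b)).
def E(a_deg, b_deg):
    return np.cos(2 * np.radians(a_deg - b_deg))

# CHSH combination of the four correlations for two settings per side.
# Under Bell's three assumptions its value cannot exceed 2; quantum
# mechanics allows up to 2*sqrt(2), about 2.83.
a, a_alt, b, b_alt = 0, 45, 22.5, 67.5
S = E(a, b) - E(a, b_alt) + E(a_alt, b) + E(a_alt, b_alt)
print(S)  # ~2.828, violating the bound of 2
```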

The first assumption Bell used in his derivation, called outcome independence, is that the measurement outcomes are independent of each other when hidden variables are also taken into account. This means that the hidden variables of the system provide the missing information to fully describe each particle separately. The second assumption, called parameter independence, is that the outcome for one particle is independent of the measurement parameter chosen for the other. This reflects the idea that, given enough spatial separation, the choice of measurement parameter on one particle has no influence on the other particle. The third assumption, called measurement independence, is that the system's hidden variables are independent of the selected measurement parameters. This implies that prior to the measurement, hidden variables correspond to properties inherent to the system of particles.
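
In the notation commonly used in the literature (with measurement settings x and y, outcomes a and b, and hidden variables λ), the three assumptions can be summarized roughly as follows; this is a standard reconstruction rather than Bell's own wording:

```latex
% Outcome independence: given the hidden variables, one outcome carries
% no further information about the other.
P(a \mid x, y, b, \lambda) = P(a \mid x, y, \lambda)

% Parameter independence: one outcome does not depend on the distant
% measurement setting.
P(a \mid x, y, \lambda) = P(a \mid x, \lambda)

% Measurement independence: the hidden variables do not depend on the
% chosen settings.
\rho(\lambda \mid x, y) = \rho(\lambda)

% Together these yield the factorized form from which the bound follows:
P(a, b \mid x, y) = \int \mathrm{d}\lambda \, \rho(\lambda) \,
    P(a \mid x, \lambda) \, P(b \mid y, \lambda)
```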

For those who reject counterfactual reasoning, Bell’s criterion poses no challenge (nor does the EPR argument), as quantum mechanics can then be regarded as a complete but strictly statistical theory. That view is acceptable to both instrumentalists and structural realists. Traditional realists, however, cannot avoid counterfactual reasoning in this context, since, from their perspective, physical properties possess definite values a priori, even when the measurements that would reveal them are in practice mutually exclusive.

The denial of outcome independence suggests that entangled particles cannot be fully described in isolation from one another. Consequently, the measurement outcome of a particle that is part of an entangled pair may be random when considered in isolation, but it is correlated with the measurement outcome of its counterpart. The possibility that entangled systems cannot be fully described in terms of properties attributed to individual particles is compatible with instrumentalism and structural realism, but not with traditional realism. Traditional realists are therefore compelled to account for the randomness observed in measurements when entangled particles are considered individually.c One interpretation suggests that certain measurements are inherently random. Another proposes that multiple universes exist to realize all possible measurement outcomes. Both views are unsatisfactory, as they fill explanatory gaps with ad hoc metaphysical assertions.

Rejecting parameter independence implies a dependence between the outcome for one particle and the measurement parameter selected for the other, which is not limited by spatial separation.d A representative theory that adopts this perspective is the pilot-wave theory, in which the hidden variables are the particles’ actual positions, "guided" by a pilot-wave that acts globally in the sense that what happens to one particle in general depends on what is measured on the other. This perspective is fully compatible with traditional realism. Likewise, it is compatible with instrumentalism and structural realism, since the mathematical framework can be accepted devoid of ontological commitments at the level of individual properties.

Finally, rejecting measurement independence allows the hidden variables to depend on the selected measurement parameters. It is important to note that this dependence need not be causal in nature, since that would give rise to paradoxical scenarios such as retro-causality. The rejection of this assumption is compatible with both instrumentalism and structural realism, but is incompatible with traditional realism, which is committed to the existence of inherent properties, thereby implying properties that do not depend on the act of measurement.

Based on the above discussion, if any ontological claim about reality is to be made, structural realism appears to be the most reasonable view. Traditional realism requires adopting substantial theory-level modifications, as in pilot-wave theory, and treating the modified theory as the true description of reality. This viewpoint can be challenged by a philosophical argument known as pessimistic induction, which notes that many past theories once regarded as true were eventually abandoned or subsumed by more comprehensive theoretical frameworks, suggesting that the same fate may await current theories. Structural realism, in contrast, maintains that although the specific concepts employed by successive theories may change, the relational structure they describe tends to be preserved.

The appeal of structural realism can also be found in cosmology, although, in the absence of a result analogous to Bell’s criterion, the discussion is conducted at a higher level. Throughout history, cosmological models have reflected not only our evolving understanding of the universe but also deep philosophical assumptions about humanity's place within it. The geocentric model placed the Earth, and by extension humanity, at the center of the universe. A major shift in this anthropocentric worldview came with the adoption of the heliocentric model, which demoted the Earth from its special position, placing it among the other planets that orbit the Sun.

The anthropocentric worldview, however, was too deeply rooted to disappear from scientific thought, and it resurfaced in modern cosmology as the fine-tuning problem. Fine-tuning is based on the premise that the emergence of life depends on several conditions, including the existence of long-lived stars and habitable planets. Such conditions appear to be possible only when certain physical parameters, such as the strengths of the fundamental forces, lie within a narrow range of values. Thus, the existence of life on Earth has led many to contemplate the apparent coincidence that the universe possesses precisely those conditions required for life to arise.

Several explanations have been offered to resolve the fine-tuning problem. One proposes the existence of multiple universes, each with its own set of fundamental parameters, so that observers inevitably inhabit a universe that supports their existence. Another appeals to chance, suggesting that the universe simply happens to possess the right conditions for life. A third invokes an external designer who "tuned" Nature’s physical parameters to values suitable for life. All these explanations rely on traditional realism since they presuppose that fundamental parameters are independently existing concepts. From the perspective of structural realism, however, fine-tuning is a pseudo-problem, since the specific values of the fundamental parameters need not be regarded as independent features to which values are ascribed by design, by chance, or by any other mechanism, but rather as quantitative expressions of an underlying structure. In other words, once certain entities are conceptualized and then mathematically abstracted, the internal coherence of Nature is projected onto them.

In the development of any theoretical framework, concepts are introduced axiomatically and serve as reference points through which relations emerge, revealing the structure of reality. If anything lies beneath this structure, it may be, as Baruch Spinoza envisioned, a single substance with innumerable attributes, an indivisible whole, unified and expressed through all that is to be observed.

Acknowledgments

I would like to thank Karina Mikertumova for kindly providing photography consultation on Figure 1.

Footnotes

a This principle has practical applications, such as in sunglasses, where vertically polarized lenses reduce glare by blocking reflected light that is horizontally polarized.

b An unexpected connection is that Bell's criterion had already been considered as part of a broader set of criteria formulated over a century earlier by the logician George Boole, which he called conditions of possible experience.5

c The measurement randomness observed in particles that are not part of an entangled pair can also be attributed to the unavoidable entanglement they form with the measuring apparatus.

d Such dependence does not allow faster-than-light communication.

Bibliography

1. John Worrall, "Structural Realism: The Best of Both Worlds?" Dialectica 43, 99–124 (1989).

2. Henri Poincaré, The Value of Science (The Science Press, New York, 1907).

3. Albert Einstein, Boris Podolsky, and Nathan Rosen, "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?" Phys. Rev. 47, 777–780 (1935).

4. John S. Bell, "On the Einstein Podolsky Rosen paradox," Physics Physique Fizika 1, 195–200 (1964).

5. Itamar Pitowsky, "George Boole’s ‘Conditions of Possible Experience’ and the Quantum Puzzle," Br. J. Philos. Sci. 45, 95–125 (1994).

Contact

Reach the author at: contact@spyrostserkis.com