accelerator.32 Any attempt to predict the behavior of electrons in SLAC exclusively by Schrödinger's equation would lead to superpositions upon superpositions without yielding electrons with definite directions and energies. Engineers assume that electrons travel in definite trajectories through drift tubes and through the accelerating fields between these tubes. The resolution of this conflict is similar to the preceding one. If (F2) is treated as an isolated ontological claim about the nature and behavior of electrons, then it cannot be accepted as true. Within the context of the Franck-Hertz experiment, or SLAC, or particle detectors, it is a pragmatic presupposition. This acceptance fits LCP in a way that avoids contradictions, provided one accepts complementarity as a way of extending and limiting classical concepts in quantum contexts.
2.3 Extending Ordinary Language.
Reasoning in ordinary language not only involves material inferences; it also relies on a picture of the world, a cohesive framework in which novel discoveries or conjectures can be related to complex networks of established facts and presuppositions. The major difficulty artificial intelligence projects encountered was their inability to supply such a supporting framework. Attempts to fill this gap brought a realization of how complex and functional our normal world view is. LCP embodies a coherent, highly developed view of reality, one that plays a supporting role in interpreting and coordinating inferences. No one familiar with the practice of physics can really doubt this. This language, however, falls between the cracks of contemporary methodologies. Ordinary language analysis, in its various forms, does not accommodate the mathematical representation of quantities. The formal methods that philosophers have developed for reconstructing and interpreting theories can accommodate mathematical formulations and logical structures, but this is generally done in such a way that there is virtually no role for linguistic analysis. In a reconstructed theory, mathematical formulations are foundational and inferences are formal, i.e., governed by rules independent of the content to which they are applied. If a mathematical formulation is to play a foundational role, then it must be developed rigorously. This not only eliminates the dirty math that justified mathematical formulations on physical grounds; it also eliminates the inferential role played by physical concepts and the conceptual networks that support them. Philosophers still tend to rely on the myth of an observational language, supplemented by theoretical terms. The basic problem here is not with the practice of physics, but with the inadequacy of such reconstructions.
I do not intend to develop a general theory of LCP. Instead I will exploit the historical development of the first part to bring out two points: the dialectical interplay between mathematical and physical considerations that leads to the acceptance of new quantitative concepts; and the tendency to regard established concepts as facts about the world. On the first point I rely on Mary Hesse's (1974) network model of scientific inference. Any new quantitative predicates introduced into physics are subject to dual constraints. The empirical constraint stems from the primary process involved in introducing, learning, and coming to use such predicates through empirical association in some physical situation. Here perceived similarities generally supply an initial basis for general terms and stimulate a search for lawlike generalizations among the classes
32 See Cartwright, 1983, pp. 172-174. For an illustration of the role that inferences based on trajectories play in particle detectors, see Roe, 1996, pp. 44-54.