For much of the 20th century, the dominant approach in science was reductionism – "the idea that it is possible, at least in principle, to explain a phenomenon in terms of less complicated constituents".
This principle has ancient roots: Francis Bacon (1561-1626) quotes Aristotle as declaring "That the nature of everything is best seen in his smallest portions." In many fields, reductionist explanations are impractical, and in such fields all explanations involve 'high level' concepts. Nevertheless, the reductionist belief has been that the role of science is to progressively explain high-level concepts by concepts at a more and more basic level.
In the realm of biology, Ayala has suggested three kinds of reductionism:
- 1. Ontological: An organism is composed exhaustively of nonliving atoms and life processes do not involve non-material entities
- 2. Methodological: The study of organisms should begin at the lowest levels of complexity
- 3. Epistemological: Biology is built upon the laws and theories of the physical sciences.
As pointed out by Yates, each of these assertions can be seen as a question by inverting the verb-subject order. He rephrases the second assertion, in particular, as:
- "Does discovery of the properties of the smaller elements and reference to them give greater scientific understanding than reference to the larger?"
Yates says the answer to this question is unclear. There are several aspects to this. One is that reducing (say) chemistry to a quantum field-theoretic explanation conveys no useful information, and obscures the important features of chemical bonding that are more understandable at a more macroscopic level of explanation. In fact, the more microscopic explanation is so complex as to exceed our capacity to comprehend it. A more accessible example: "The antics of a troop of monkeys in the forest canopy are doubtless consistent with all of physics and chemistry, but this knowledge supplies no insights that will be useful to a student of animal behavior."
A related issue is whether a complex system can always be usefully divided into an assembly of subsystems. Such analysis abstracts away some of the actual system, making it in effect invisible, and can lead to failures of prediction, such as the roles played by 'emergence' and unanticipated macroscopic 'order parameters'.
Reductionism in physics
Perhaps the prototypical example of reductionism is the historical progress of physics, a history of ever more fine-grained explanations. In the 19th and 20th centuries, concepts evolved from the atom as the indivisible building block of nature; to the electron, proton, and neutron; on to the leptons and quarks of the Standard Model; and possibly still further to string theory, although that last step remains controversial. At each stage, the supposedly indivisible fundamental particles were found to have structure that could be explained only with new and tinier particles and new interactions between them. There is a belief that eventually all theories will be combined in a grand 'theory of everything', replacing the Standard Model and the general theory of relativity.
In an essay, Nobel-prize winning physicist P. W. Anderson says: "The reductionist hypothesis may still be a topic for controversy among philosophers, but among the great majority of active scientists I think it is accepted without question." Although Anderson agrees with this viewpoint, he adds a caveat: "The ability to reduce everything to simple fundamental laws does not imply the ability to start from these laws and reconstruct the universe. In fact, the more the elementary particle physicists tell us about the nature of the fundamental laws, the less relevance they seem to have to the very real problems of the rest of science...The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity...Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviors requires research which I think is as fundamental in its nature as any other."
Reductionism in biology
Brigandt and Love break reductionism in biology into three parts: ontological, epistemological, and methodological. Ontological reductionism is the view that a biological system (for example, the human brain) is nothing more than molecules and their interactions. Although not taking things all the way down to the molecular level, a great many biologists support a form of ontological reductionism:
- "You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules." (Francis Crick)
Although not stated, the idea that a 'nerve cell' also can be explained at a molecular level lurks in the background.
As an example, to explain the behavior of individuals we might refer to motivational states such as hunger. These reflect features of brain activity that are still poorly understood, but we can investigate, for example, the 'hunger centers' of the brain that house these drives. These centers involve many neural networks – interconnected nerve cells – and each network can be probed in detail. These networks in turn are composed of specialized neurons that can be analyzed individually. These nerve cells have properties that are the product of a genetic program activated in development, and so are reducible to molecular biology. However, while behavior is thus in principle reducible to basic elements, explaining the behavior of an individual in terms of the most basic elements has little predictive value, because the uncertainties in our understanding are too great.
Others find such reduction of all mental states not simply impractical but fundamentally misguided.
Reductionism in philosophy
In philosophy the question of reductionism is connected to the two large fields of ontology (what exists) and epistemology (how do we find out about what exists?). In both cases there are pluralistic (manifold approaches) and monistic (there is only one approach) schools of thought.
At bottom, reductionism faces a dilemma concerning the connection between the knower and the known: human cognition is always mediated by its own limits, a point emphasized by Merleau-Ponty.
A modern formulation of this issue is Hawking and Mlodinow's model-dependent realism: we rely upon observation to check our models of reality, but temper that reliance with the skepticism that comes from understanding that these models are creations of the minds doing the observing, and subject to whatever distortions and lack of perspicuity the limitations of our minds, our imagination, and our observational abilities may imply.
The reductionist approach assigns particular importance to the measurement of quantities: quantification makes observations accurately and objectively verifiable, and quantitative predictions are more readily testable than purely qualitative ones. On the other hand, the meaning of a measurement is conferred by its relation to a theory (or theories), and theories are tentatively held, refer to different (although somewhat overlapping) domains of experience, and can conflict.
For some things there is a natural scale by which they can be measured, but for many, measurement scales are human constructs. For example, the IQ scale, used purportedly to measure intelligence, in fact measures how well an individual performs on certain standardised tests, and how such performance relates to cognitive ability is open to debate. Nevertheless, such measurements are objectively repeatable.
Measurements may be tabulated, graphed, mapped, or statistically analyzed; often these representations of the data use tools and conventions that are, at a given time, accepted and understood by scientists within a given field. Measurements may need specialized instruments such as thermometers, microscopes, or voltmeters, whose properties and limitations are familiar within the field, and scientific progress is often intimately tied to their development. Measurements also provide operational definitions: a scientific quantity is defined precisely by how it is measured, in terms that enable other scientists to reproduce the measurements.
Scientific quantities are often characterized by units of measure, which can be described in terms of conventional physical units. Ultimately, this may involve internationally agreed 'standards'; for example, one second is defined as the duration of exactly 9,192,631,770 cycles of the radiation from a specified hyperfine transition of the cesium-133 atom. This definition may seem very specific, but underlying it is a theoretical assessment that decides what precautions are necessary in observing the cesium atom to ensure that its resonant frequency is unperturbed by extraneous influences (like gravity, or imperfections in the vacuum) and, of course, the belief that we understand under what circumstances the resonant frequency is invariant. A corollary of such error analysis is the selection process that settled upon the cesium atom as a standard: that choice rests both upon mundane considerations of realizing the standard with some modicum of ubiquity and upon a practical assessment of its accuracy. The choice may change.
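The operational character of such a definition can be sketched in code. The short example below (an illustration only, not a metrology tool; the function name is invented here) treats the cesium frequency as an exact conversion factor, exactly as the SI definition does:

```python
# Sketch: the SI second as an operational definition.
# By definition, one second is exactly 9,192,631,770 cycles of the
# cesium-133 hyperfine transition radiation; the constant is exact.
CESIUM_HZ = 9_192_631_770  # cycles per second, exact by definition

def seconds_from_cycles(cycles: int) -> float:
    """Convert a count of cesium transition cycles to a duration in seconds."""
    return cycles / CESIUM_HZ

# One full second's worth of cycles:
print(seconds_from_cycles(9_192_631_770))  # 1.0
# The period of a single cycle, in seconds:
print(1 / CESIUM_HZ)
```

The point of the sketch is that the unit has no content apart from the counting procedure: the quantity 'one second' just is this count of cycles.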
All measurements are accompanied by the possibility of error, so their uncertainty is often estimated by repeating measurements, and seeing by how much these differ. Counts of things, such as the number of people in a nation at a given time, may also have an uncertainty: counts may represent only a sample, with an uncertainty that depends upon the sampling method and the size of the sample.
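The practice of estimating uncertainty from repeated measurements can be illustrated with a short sketch; the readings below are invented illustrative values, assumed independent:

```python
# Sketch: estimating measurement uncertainty by repetition,
# assuming independent repeated readings (illustrative values).
import math

readings = [9.79, 9.82, 9.81, 9.80, 9.83]  # e.g. repeated measurements of g, m/s^2

n = len(readings)
mean = sum(readings) / n
# Sample standard deviation (n - 1 in the denominator):
s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
# Standard error of the mean: shrinks as the number of repetitions grows.
sem = s / math.sqrt(n)

print(f"mean = {mean:.3f}, s = {s:.4f}, SEM = {sem:.4f}")
```

The spread of repeated readings (the sample standard deviation) estimates the uncertainty of a single measurement, while the standard error of the mean quantifies how much the average itself is expected to vary.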
The scientific definition of a term sometimes differs substantially from its natural-language use; mass and weight overlap in meaning in common use, but have distinct meanings in physics.
- Francis Crick, quoted in Robert N. Brandon (1996). “Chapter 11: Reductionism versus holism versus mechanism”, Concepts and Methods in Evolutionary Biology. Cambridge University Press. ISBN 0521498880.
- Ingo Brigandt and Alan Love (Apr 30, 2012). “Reductionism in Biology”. In Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Summer 2012 Edition).
- Francis Bacon 'The Advancement of Learning' 
- Francisco J Ayala (1987). “Biological reductionism: The problems and some answers”, F Eugene Yates, ed: Self-Organizing Systems: The emergence of order. Springer, pp. 315-324. ISBN 9781461282273.
- F Eugene Yates (1994). "Order and complexity in dynamical systems: Homeodynamics as a generalized mechanics for biology". Mathematical and Computer Modelling 19 (6-8): pp. 49-74.
- Franklin M Harold (2001). The Way of the Cell: Molecules, Organisms, and the Order of Life. Oxford University Press, p. 218. ISBN 9780198030881.
- Philip W Anderson (August 4, 1972). "More is different: Broken symmetry and the nature of the hierarchical structure of science". Science 177: pp. 393-396.
- This quote is from: Eric R. Kandel (2007). In Search of Memory: The Emergence of a New Science of Mind. WW Norton, p. 9. ISBN 0393329372. However, the same language can be found in dozens of sources. Some philosophers object to the unsupported statement of such conjectures, for example, observing that consciousness has yet to be shown to be a process at all, never mind a biological process. See Oswald Hanfling (2002). Wittgenstein and the Human Form of Life. Psychology Press, pp. 108-109. ISBN 0415256453.
- A rather extended discussion is provided in Georg Northoff (2004). Philosophy of the Brain: The Brain Problem, Volume 52 of Advances in Consciousness Research. John Benjamins Publishing. ISBN 1588114171.
- Martin C. Dillon (1997). “Chapter 3: The thesis of the primacy of perception”, Merleau-Ponty's Ontology, Revised 1988 ed. Northwestern University Press, pp. 51 ff. ISBN 9780810115286.
- Maurice Merleau-Ponty (1964). James M Edie, ed: The Primacy of Perception: And Other Essays on Phenomenological Psychology. Northwestern University Press, p. 16-17. ISBN 9780810101647.
- Michael Shermer (2011). The Believing Brain: From Ghosts and Gods to Politics and Conspiracies---How We Construct Beliefs and Reinforce Them as Truths. Macmillan, p. 6. ISBN 9781429972611.
- Unit of time (second). NIST. Retrieved on May 28, 2014.
- DB Sullivan et al. (2005). "NASA's laser-cooled atomic clock in space". Advances in Space Research 36: 107-113. “After a brief overview...this paper discusses the systematic frequency shifts that limit the accuracy of the clock...We have set a goal of 5 × 10−17, which is 10 times lower than the uncertainty of today's best cesium fountain clocks on Earth.”
- A G Mungall, R Bailey and H Daams (January 17, 1966). "The Canadian Cesium Beam Frequency Standard". Metrologia 2 (3): 98. DOI:10.1088/0026-1394/2/3/002. Retrieved on May 28, 2014. “Sources of error are discussed, and a value of 1.9 × 10−11 is estimated as the upper limit.”
- Ichiro Ushijima et al. (May 16, 2014). Cryogenic optical lattice clocks with a relative frequency difference of 1×10−18.