Reductionism

For much of the 20th century, the dominant approach in science was reductionism – "the idea that it is possible, at least in principle, to explain a phenomenon in terms of less complicated constituents". This principle has ancient roots – Francis Bacon (1561–1626) quotes Aristotle as declaring "That the nature of everything is best seen in his smallest portions." In many fields, reductionist explanations are impractical, and in those fields all explanations involve 'high level' concepts. Nevertheless, the reductionist belief has been that the role of science is to explain 'high level' concepts progressively in terms of concepts at more and more basic levels.

For example, to explain the behavior of individuals we might refer to motivational states such as hunger. These states reflect features of brain activity that are still poorly understood, but we can investigate, for example, the 'hunger centers' of the brain that house these drives. These centers involve many neural networks – circuits of interconnected nerve cells – and each network can be probed in detail. These networks are in turn composed of specialized neurons that can be analyzed individually. These nerve cells have properties that are the product of a genetic program activated in development, and so are reducible to molecular biology. However, while behavior is thus in principle reducible to basic elements, explaining the behavior of an individual in terms of the most basic elements has little predictive value, because the uncertainties in our understanding are too great.

Measurement
The reductionist approach assigned particular importance to the measurement of quantities: quantification makes observations accurately and objectively verifiable, and quantitative predictions are more readily testable than purely qualitative ones. For some things there is a natural scale by which they can be measured, but for many, measurement scales are human constructs. For example, the IQ scale purportedly measures intelligence, but in fact measures how well an individual performs on certain standardised tests, and how such performance relates to cognitive ability is open to debate. Nevertheless, such measurements are objectively repeatable.

Measurements may be tabulated, graphed, mapped, or statistically analysed; these representations of the data often use tools and conventions that are, at a given time, accepted and understood by scientists within a given field. Measurements may require specialized instruments such as thermometers, microscopes, or voltmeters, whose properties and limitations are familiar within the field, and scientific progress is often intimately tied to their development. Measurements also provide operational definitions: a scientific quantity is defined precisely by how it is measured, in terms that enable other scientists to reproduce the measurement. Scientific quantities are often characterized by units of measure, which can be described in terms of conventional physical units. Ultimately, this may involve internationally agreed 'standards'; for example, one second is defined as exactly 9,192,631,770 oscillations or cycles of the cesium atom's resonant frequency. The scientific definition of a term sometimes differs substantially from its natural language use; mass and weight overlap in meaning in common use, but have distinct meanings in physics.

All measurements are accompanied by the possibility of error, so their uncertainty is often estimated by repeating measurements and seeing by how much the results differ.
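The idea of estimating uncertainty from repeated measurements can be sketched numerically. A common convention (one of several) is to report the mean of the repeats as the best estimate and the standard error of the mean as the uncertainty; the readings below are illustrative, not real data.

```python
import math

def mean_and_uncertainty(values):
    """Estimate a quantity and its uncertainty from repeated measurements.

    Best estimate: the mean of the repeats.
    Uncertainty: the standard error of the mean, i.e. the sample
    standard deviation divided by sqrt(number of repeats).
    """
    n = len(values)
    mean = sum(values) / n
    # Sample variance uses n - 1 (Bessel's correction).
    variance = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, math.sqrt(variance / n)

# Five hypothetical readings of the same length, in millimetres:
readings = [12.1, 11.9, 12.0, 12.2, 11.8]
estimate, uncertainty = mean_and_uncertainty(readings)
print(f"{estimate:.2f} ± {uncertainty:.2f} mm")  # → 12.00 ± 0.07 mm
```

The spread of the repeats, not any single reading, is what drives the reported uncertainty: more repeats shrink the standard error even when the individual readings scatter just as widely.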
Counts of things, such as the number of people in a nation at a given time, may also have an uncertainty: counts may represent only a sample, with an uncertainty that depends upon the sampling method and the size of the sample.
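The dependence of a count's uncertainty on the sample size can be made concrete. The sketch below assumes simple random sampling and uses the standard error of a sample proportion, sqrt(p(1 − p)/n), scaled up to the population; the survey numbers are hypothetical.

```python
import math

def population_count_estimate(successes, sample_size, population):
    """Scale a sampled count up to a population estimate with uncertainty.

    Assumes simple random sampling. The standard error of the sample
    proportion p is sqrt(p * (1 - p) / n); multiplying both p and its
    standard error by the population size gives the estimated count
    and its uncertainty.
    """
    p = successes / sample_size
    se = math.sqrt(p * (1 - p) / sample_size)
    return population * p, population * se

# Hypothetical survey: 320 of 1,000 people sampled report owning a
# bicycle, in a population of 1,000,000.
estimate, uncertainty = population_count_estimate(320, 1000, 1_000_000)
print(f"{estimate:,.0f} ± {uncertainty:,.0f} people")
```

Because the standard error shrinks only with the square root of the sample size, quadrupling the sample merely halves the uncertainty – which is why the sampling method and sample size both matter.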