"Science is far from the perfect instrument of knowledge. It's just the best we have."
—Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark[1]
The scientific method is an epistemological system for deriving and developing knowledge. Many consider it the best method for making useful and practical additions to human knowledge about the physical world, and it underlies the technological leaps that accompanied its spread throughout the Western world.[citation needed] The scientific method can also be described as a learning process.
Galileo Galilei (1564-1642) and Francis Bacon (1561-1626) were among the first European scientists to use the scientific method as we know it, prioritising it over the Ancient Greek tradition of knowledge-generation, which favoured rational thought over empiricism. Before them, thinkers of the "Islamic Golden Age"[2] (7th to 12th centuries) made wide use of scientific methodology. In 1021 CE, for example, Ibn al-Haytham, working in Egypt, emphasised the primacy of experimentation in his Book of Optics.[3][4][5][6] Arabic science also developed a system of peer review.[7][8] The scientific method was not always accepted during its period of development; the work of Hungarian physician Ignaz Semmelweis (1818-1865) on disinfection procedures provides a telling example of what happens when know-it-alls ignore the method, or the conclusions arrived at by its use.
At the core of modern scientific practice is the idea that the value of a hypothesis, theory, or concept is best determined by its ability to make falsifiable predictions that can be tested against empirical reality. This means that supernatural entities, or concepts that are meaningless or logically contradictory, cannot be included in a scientific hypothesis (not least because of the difficulty of putting a sample of a deity in a test-tube). Consequently, when carrying out investigations, scientists assume a position of methodological naturalism.
Humans, including scientists, are fallible and irrational apes by nature. The scientific method, accordingly, helps these highly imperfect beings iron out their biases, attain a reasonable degree of objectivity, and develop reliable and sometimes useful results.
"Aah, there's nothing more exciting than science. You get all the fun of sitting still, being quiet, writing down numbers, paying attention... Science has it all."
—Principal Skinner, Bart's Comet[9]
The scientific method isn't a simple, linear process, but is wrapped up in the complexities of research in the real world and the practicalities of what is possible. However, the idea of testing a hypothesis and refining knowledge based on observation is a constant theme of science.[10]
Despite the lack of simple linearity in reality, the method has often been codified into stages that make it easier to understand. Essentially, the following five steps make up the scientific method:

1. Observe a phenomenon in the world.
2. Form a hypothesis that explains the observation.
3. Use the hypothesis to make a testable, falsifiable prediction.
4. Test the prediction by experiment or further observation.
5. Refine (or discard) the hypothesis in light of the results, and repeat.
The testing of hypotheses allows for error correction and the development of better models. A notable example is the development of atomic theory - the theory describing what atoms "look like." From Dalton's indivisible model, to Thomson's "plum pudding" model, to Rutherford's model with its teeny-tiny nucleus, and then to the Bohr model and modern quantum physics, the picture of the atom developed in steps because each model made predictive statements that could be tested. A theory is thus refined over time as observational evidence accumulates in its support. Evidence that supports a hypothesis makes the hypothesis stronger (and more likely to be valid) than it was before the test, while evidence against a hypothesis falsifies it.[12] The scientific method is an inductive method, although its results can be used deductively as well.
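To put a number on "stronger (and more likely to be valid)," here is a minimal sketch of Bayesian updating in Python. It isn't tied to any specific historical episode, and the prior and likelihood values are invented purely for illustration: each successful prediction nudges confidence in the hypothesis up, while a failed prediction drags it back down.

```python
# Minimal sketch: how evidence shifts confidence in a hypothesis H.
# All probabilities below are invented for illustration.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: posterior P(H) after observing evidence E."""
    numerator = p_e_given_h * prior
    total = numerator + p_e_given_not_h * (1.0 - prior)
    return numerator / total

confidence = 0.5  # start undecided about H
# Each successful test observes an outcome H predicts strongly
# (P(E|H) = 0.9) but which is unlikely otherwise (P(E|not H) = 0.2).
for trial in range(3):
    confidence = update(confidence, 0.9, 0.2)
    print(f"after trial {trial + 1}: P(H) = {confidence:.3f}")

# A failed prediction (an outcome H says should be rare) drags
# confidence back down -- the quantitative face of falsification.
confidence = update(confidence, 0.02, 0.9)
print(f"after a failed prediction: P(H) = {confidence:.3f}")
```

Note how three confirmations never reach certainty, while a single clear failure undoes much of them: confirmation accumulates gradually, but falsification cuts deep.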
All but the first two steps are omitted from the process in pseudosciences such as intelligent design (where step 3 would be impossible) and most borderline-supernatural alternative medicines like homeopathy. Pseudosciences do observe the world, and do come up with explanations, but are often unable or unwilling to follow through by testing them more thoroughly. Refining the hypotheses is also undesirable in pseudoscience, as this could lead to abandoning the central dogma of the belief - imagine where modern technology would be if the scientists of the 20th century had refused to modify their models of the atom as new observational evidence came in. However, because observations and explanations still form part of pseudoscience and can be phrased in a scientific style, pseudosciences may mistakenly appear to have scientific authority.
In practice, different academic disciplines apply the scientific method in what may at first appear to be different ways, but fundamentally all use strong inference based on falsifiable hypotheses. In fields such as astrophysics, evolution, and geology, for example, experiments can be difficult or impossible because of the scales of space and time involved: we can't set up a controlled experiment spanning hundreds of light years or millions of years, or replicate hundreds of Earths. Instead, we can use mathematical models of planetary behaviour to understand orbital patterns, comparative analysis of the characteristics of fossil and extant organisms to build evolutionary trees, and bore-hole samples to interpret subsurface geology.
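As a concrete (and deliberately simplified) example of the first substitute, here is a sketch of a mathematical model of planetary behaviour: a two-body orbit integrated numerically under Newtonian gravity. The units, time step, and starting state are all invented for illustration.

```python
import math

# Minimal sketch: a two-body orbit integrated numerically -- the kind
# of model that substitutes for impossible planetary-scale experiments.
# Units are normalized (GM = 1); the starting state is invented and
# gives a mildly eccentric closed orbit.

GM = 1.0            # gravitational parameter of the central body
dt = 0.001          # integration time step
x, y = 1.0, 0.0     # initial position
vx, vy = 0.0, 0.9   # initial speed below circular speed -> ellipse

def acceleration(x: float, y: float) -> tuple[float, float]:
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3

# Leapfrog (velocity Verlet) integration: stable for orbital problems.
ax, ay = acceleration(x, y)
for step in range(20000):
    vx += 0.5 * dt * ax
    vy += 0.5 * dt * ay
    x += dt * vx
    y += dt * vy
    ax, ay = acceleration(x, y)
    vx += 0.5 * dt * ax
    vy += 0.5 * dt * ay
    if step % 5000 == 0:
        print(f"t={step * dt:6.2f}  r={math.hypot(x, y):.3f}")
```

The methodological point is that the model yields quantitative predictions (orbital distances, periods) that observation can then confirm or refute, just as a lab experiment would.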
Observation and insight are a key part of scientific inquiry. For example, in the history of biology, much of the early work involved detailed collection, description, and classification of organisms. The extensive early work documented in museum collections and old tomes, along with his personal experience as an exploratory biologist aboard HMS Beagle, served as the fodder for Charles Darwin's conception of evolution by natural selection. Similarly, Albert Einstein's theory of relativity was based on a solid understanding of Newtonian physics, along with personal observations of relative movement while gazing out a train window. Observation and insight are the grist from which hypotheses and theories are generated, and the full scientific method is necessary for hypotheses and theories to withstand the test of time.
Scientific skepticism is a vital element in the scientific process, ensuring that no new hypothesis is considered a Theory (capped T) until sufficient evidence is provided and other scientists have had their chance to debunk it. Even then, all of science is always considered a "good working model" and the "best understanding we have at the present time." No scientific idea is ever considered "the final word," nor the Word of God. It is always assumed that someone, somewhere is out to disprove the current theory.
"You must not say that this cannot be, or that that is contrary to nature. You do not know what Nature is, or what she can do; and nobody knows; not even Sir Roderick Murchison, or Professor Owen, or Professor Sedgwick, or Professor Huxley, or Mr. Darwin, or Professor Faraday, or Mr. Grove, or any other of the great men whom good boys [and girls] are taught to respect. They are very wise men; and you must listen respectfully to all they say: but even if they should say, which I am sure they never would, 'That cannot exist. That is contrary to nature,' you must wait a little, and see; for perhaps even they may be wrong."
—Charles Kingsley[13]
The scientific method helps us pursue the ideal of scientific objectivity, protecting against bias that could lead to false conclusions. Bias, in the sense of inclinations or preconceptions, is part of being human, and has a legitimate role in scientific inquiry insofar as it guides what questions to ask and how to ask them. At the same time, bias can lead to championing a particular conclusion a priori, independent of evidence - belief, not necessarily reality. The scientific method explicitly seeks to remove bias through rigorous hypothesis testing and the reproduction of results. Bias can enter in many different ways, including the initial framing of an inquiry, the time scale examined, and innate properties of the system being examined. For example, a pharmaceutical compound may be approved because it appears safe and effective in short-term studies, while later being shown to be ineffective or unsafe in long-term studies. In essence, the scientific method serves as a tool to keep bias in check.
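As a toy illustration of the short-term/long-term pitfall, consider a simulated drug whose adverse effect only begins after a delay. All numbers here are invented; the sketch simply shows how a 6-month trial can report zero adverse events while a 36-month follow-up of a similar population reports hundreds.

```python
import random

random.seed(42)

# Toy model (all numbers invented): an adverse effect that only begins
# after month 12, so a 6-month trial cannot observe it at all.
DELAY = 12           # months before the harm starts
MONTHLY_RISK = 0.02  # chance of an adverse event per month, after the delay

def adverse_events(n_patients: int, months: int) -> int:
    events = 0
    for _ in range(n_patients):
        for month in range(months):
            if month >= DELAY and random.random() < MONTHLY_RISK:
                events += 1
                break  # count each patient at most once
    return events

short = adverse_events(1000, 6)    # a short approval trial
long_ = adverse_events(1000, 36)   # long-term follow-up
print(f"6-month trial:  {short} adverse events in 1000 patients")
print(f"36-month study: {long_} adverse events in 1000 patients")
```

The framing of the study (here, its duration) determines what the data can even show - exactly the kind of bias the method's insistence on reproduction and continued testing is meant to catch.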
The philosophy of science dates back to the Greeks, but it began to take its modern form during the scientific revolution. Two competing schools of thought emerged at this point: the rationalist tradition associated with René Descartes and the empiricist tradition of Francis Bacon. During the 18th century, David Hume philosophically undermined the scientific method with his problem of induction[14] and his deconstruction of causation.[15]
A synthesis of rationalism and empiricism arose in the 18th century with the work of Immanuel Kant[16] and continued in the 19th century among pragmatist philosophers such as Charles Sanders Peirce.[17] During the 20th century, the logical positivists attempted to do away with pesky metaphysics and a number of other branches of philosophy altogether. The enterprise failed when it was noticed that the verification principle on which logical positivism was built was self-refuting. Karl Popper (1902-1994) replaced verifiability with falsifiability: for an idea to be popperly "scientific," it must be possible to devise an experiment (even a thought experiment) that could render it false. Popper intended falsification both as a solution to the demarcation problem and as a workaround for Hume's problem of induction.[18]

Thomas Kuhn took a more historical approach to thinking about science, aiming to get a better picture of how science was practiced in reality. He described the dynamics of scientific change, coining the terms scientific revolution and paradigm shift to describe how a fundamentally conservative set of ideas could be overturned and become a new, different set of conservative ideas. Kuhn rejected the idea that there was only one scientific method. This influenced the practitioners of what would become the sociology of science, as well as other philosophers such as Imre Lakatos. Lakatos conceived of science as split into numerous paradigms he called "research programmes," each making use of its own methodology and assumptions. (Summary: Humans remain humans and don't naturally think in a scientific manner, but have to learn it, and easily backslide.)
Other schools of "scientific criticism" look at science critically from an economic perspective, or focus on discourse, but these are more academic and less practical critiques.
In order to look for "data" you need to have a model or "structure" of how the world works. The problem, as James Burke pointed out in the "Worlds Without End" episode of The Day the Universe Changed, is that the structure can drive every part of your research, even what you accept as reliable data.
This possibility of the structure driving the data, rather than the data driving the structure, had been hammered home in anthropological circles back in 1956 with Horace Miner's bitingly satirical "Body Ritual among the Nacirema."[19] Often referenced as a satirical look at American culture, it was also a jab at the anthropological work of the time and the "look at these poor primitives who believe in magic, whom we are so much wiser than" attitude common in professional publications of the era. Miner showed that, with that model, any culture (even that of the then-modern 1950s United States) could be dismissed as a bunch of magic-using savages.
In "Worlds Without End" Burke points out one of the reasons the Piltdown hoax lasted as long at it did was it fitted the then prevalent structure of finding a human like skull with an ape-like face. In fact, in 1913, David Waterston of King's College London stated in Nature that the find and an ape mandible and human skull[20] and French paleontologist Marcellin Boule said the same thing in 1915. In 1923 Franz Weidenreich stated after careful examination that the Piltdown find was a modern human cranium and an orangutan jaw with filed-down teeth[21] but because Piltdown fit the structure so well other scientists let the model drive their thinking rather than the evidence itself.
Extra Credits points out in God Does Not Play Dice - The Danger of Unquestioned Belief that you have to have a series of postulates to even begin to formulate anything, but that if you hold on to those postulates as if they were fact, they can and will blind you to the possibility that the system being used is flawed.
A related problem is that more information makes one more confident in the theory one has formulated, but that confidence does not correlate with how accurate the theory is.[22]
Pseudoscientists have discovered an obvious way to 'cheat' the scientific method. It goes like this:

1. Start with the conclusion you want.
2. Gather observations and evidence that support it.
3. Ignore, dismiss, or explain away any evidence that contradicts it.
4. Declare the conclusion confirmed.
This is a blatant perversion of the scientific method, but to someone not versed in science, fallacies, or psychology, it might seem similar enough to be accepted as legitimate.
This manner of cheating has been used by proponents of intelligent design. It isn't limited to pseudoscientists, though: it is a mistake frequently made even by "proper" scientists who focus too much on finding evidence that supports their hypothesis (their "belief") instead of attempting to find evidence that would refute it, or that would refute competing hypotheses.
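A toy simulation makes the point vividly. Flip a fair coin, so the hypothesis "this coin favours heads" is false; honest counting shows no support for it, but recording only the supporting flips makes the "evidence" look unanimous. (The setup is invented purely for illustration.)

```python
import random

random.seed(1)

# Toy illustration: the coin is fair, so the hypothesis "this coin
# favours heads" is false. Honest counting shows that; cherry-picking
# does not.
flips = [random.random() < 0.5 for _ in range(1000)]  # True = heads

heads = sum(flips)
print(f"honest tally: {heads}/1000 heads")  # ~500: no support for bias

# The 'cheat': record only the flips that support the hypothesis.
supporting = [f for f in flips if f]
print(f"cherry-picked file: {len(supporting)}/{len(supporting)} heads")
```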
For those of you in the mood, RationalWiki has a fun article about Pseudoscientific method.