Title: Planning and Executing Credible Experiments
Author: Robert J. Moffat
Publisher: John Wiley & Sons Limited
Genre: Physics
ISBN: 9781119532866
1.5 A Systems View of Experimental Work
An experiment is a system designed to make a measurement. The system consists of the hardware (test rig and specimens), the instruments (including sensors, amplifiers, extension wires, etc.), and the interpretive software (calibration routines, data reduction programs, etc.). The whole system is an instrument designed to make a particular kind of measurement. As such, the system must be designed so that it can be calibrated. The system must be calibrated before it is used to generate new data. It must be possible, using diagnostic tests on the system, to confirm the accuracy of the measurements made with the system.
This view of an experiment is illustrated in Figure 1.1, which also shows some of the necessary features of the system design.
Perhaps the most important feature of this view of experimental work is the important role given to uncertainty analysis. There are uncertainties in every measurement and, therefore, in every parameter calculated using experimental data. When the results of an experiment scatter (i.e. are different on repeated trials), the question always arises, “Is this scatter due to the uncertainties in the input data or is something changing in the experiment?”
Uncertainty analysis provides a proven, credible way to answer that question. By quantifying and reporting the uncertainty of each value, we give our client grounds for confidence in our results. Figure 1.1 shows the uncertainty analysis as a key part of the data reduction program, although it is too often neglected. Using either Root‐Sum‐Squared estimation or Monte Carlo simulation, the uncertainty in experimental results can be calculated with little additional effort on the part of the experimenter.
Figure 1.1 The experiment viewed as an instrument. Adjust the instrument by analyzing Uncertainty in each Bubble.
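As a brief illustration of the two approaches just mentioned, the sketch below propagates input uncertainties through a hypothetical data reduction equation (a heat‐transfer coefficient computed from measured power, area, and temperatures) using both Root‐Sum‐Squared estimation and a Monte Carlo simulation. The function names, nominal readings, and uncertainty values are illustrative assumptions, not values taken from this book.

import numpy as np

# Hypothetical data-reduction equation: heat-transfer coefficient from
# measured heater power q, surface area A, and temperatures Ts, Tinf:
#     h = q / (A * (Ts - Tinf))
def reduce_h(q, A, Ts, Tinf):
    return q / (A * (Ts - Tinf))

# Illustrative nominal readings and standard uncertainties (assumed values)
nominal = {"q": 50.0, "A": 0.010,  "Ts": 80.0, "Tinf": 20.0}   # W, m^2, degC, degC
u_input = {"q": 0.5,  "A": 0.0002, "Ts": 0.5,  "Tinf": 0.5}

def rss_uncertainty(func, nominal, u, rel_step=1e-6):
    """Root-Sum-Squared propagation: u_R^2 = sum_i ((dR/dx_i) * u_i)^2,
    with the partial derivatives estimated by finite differences."""
    base = func(**nominal)
    total = 0.0
    for name, x in nominal.items():
        dx = max(abs(x), 1.0) * rel_step
        perturbed = dict(nominal, **{name: x + dx})
        dR_dx = (func(**perturbed) - base) / dx
        total += (dR_dx * u[name]) ** 2
    return base, total ** 0.5

def monte_carlo_uncertainty(func, nominal, u, n=100_000, seed=0):
    """Monte Carlo propagation: sample each input from a normal distribution
    and report the mean and standard deviation of the computed result."""
    rng = np.random.default_rng(seed)
    samples = {k: rng.normal(nominal[k], u[k], n) for k in nominal}
    results = func(**samples)
    return results.mean(), results.std()

h_rss, u_rss = rss_uncertainty(reduce_h, nominal, u_input)
h_mc,  u_mc  = monte_carlo_uncertainty(reduce_h, nominal, u_input)
print(f"h = {h_rss:.1f} W/(m^2 K),  u_RSS = {u_rss:.2f},  u_MC = {u_mc:.2f}")

With small input uncertainties and a smooth data reduction equation, the two estimates agree closely; Monte Carlo becomes the more robust choice when the inputs are correlated or the reduction equation is markedly nonlinear.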
1.6 In Defense of Being a Generalist
The last point we wish to make before sending you off into this body of work has to do with the level of expertise one needs to run good experiments.
We think it is more important for an experimentalist to have a working knowledge of many areas than to be a specialist in any one. The lab is a real place; Mother Nature never forgets to apply her own laws. If you are unaware of the Coanda effect, you will wonder why the water runs under the counter instead of falling off the edge. If you haven't heard of Joule–Thomson cooling, you will have a tough time figuring out why you get frost on the valve of a CO2 system.
Accordingly, if you aren't aware of the limitations of statistics, then using a statistical software package may lead you to indefensible conclusions.
It is not necessary to be the world's top authority on any of the mechanisms you encounter in the lab. You simply have to know enough to spot anomalies, to recognize that something unexpected or interesting is happening, and to know where to go for detailed help.
As an experimentalist, always beware of assumptions and presuppositions. See Figure 1.2 and “The Bundt Cake Story” (Panel 1.1). Step forward and predict. Then be ready, and humble enough, to correct course.
The lab is a great place for an observant generalist. The things that happen in the lab are real and reflect real phenomena. When something unexpected happens in the lab, if you are alert, you may learn something! As Pasteur said, “Chance favors only the prepared mind” (Pasteur 1854).
Let’s now launch toward planning and executing credible experiments.
Figure 1.2 The Bundt cake as delivered. A high heat‐transfer coefficient lifts the fluid batter like a hot air balloon. But which stagnation point is up, and which is down?
Panel 1.1 The Bundt Cake Story
One night, years ago, my wife baked a Bundt cake (chocolate and vanilla batter layered in a toroidal pan). When she presented me with a slice of that cake for dessert, I was impressed. But, also, I noticed something interesting about the pattern the batter had made as it cooked.
I recognized that the flow pattern, as drawn in Figure 1.2, was related to the heat‐transfer coefficient distribution around the baking pan.
I tried to impress my wife with my knowledge of heat transfer by explaining to her what I thought I saw. “Look,” I said, “see how the batter rose up in the center, and came down on the sides. That means that the batter got hot in the center sooner than it did on the edges. That means that the heat‐transfer coefficient is highest at the bottom center stagnation point for a cylinder in free convection with a negative Grashof number.”
My wife was silent for a minute, then gently corrected me: “I baked the cake upside down.”
Of course, as soon as I learned that, I was able to say with confidence, “The heat‐transfer coefficient is lowest at the bottom center stagnation point and high on the sides, for a cylinder in free convection with a negative Grashof number.”
The Moral of This Story?
It is critically important that you can trust your data before you try to interpret it. Beware! Once we accept our results as valid, how can we avoid constructing or searching for an explanation? Do not the scientific method and our human nature spur us to do so?