METROLOGICAL EXAMINATION OF QUANTITATIVE-ANALYSIS METHODS

UDC 545(018):389
V. M. Tslaf
There are difficulties in metrological research on quantitative-analysis methods because the objects and structures differ from those traditionally examined in metrology. A measurement consists in deriving the value of a physical quantity, by which is meant a feature that is qualitatively common to many physical objects but quantitatively individual for each such object [1]. The task of quantitative analysis is to determine composition; to apply measurement methods, one must express the composition in terms of properties, i.e., physical quantities.

Quantitative-analysis methods fall into two groups according to how the composition is expressed via the properties. If the substance in its initial state has a physical property or set of properties that can be taken as quantitatively defined, and whose magnitudes correspond unambiguously to the composition, then quantitative analysis requires only the measurement of those quantities and conversion to the result. That set of operations constitutes the analytical process in the first group, which includes colorimetry, conductometry, spectrophotometry, and most other physicochemical methods. If the substance in its initial state does not have such properties, it is treated with auxiliary substances to produce them: a reagent in titration, a sorbent in chromatography, etc. Such methods form the second group, which includes the chemical methods proper (volumetric and gravimetric) and, among the physicochemical ones, chromatography. The result of the interaction is not always directly accessible to measurement, so the second group often involves an additional transformation (a color indicator in titration, detection in chromatography, and so on) in order to obtain the directly measured physical quantity.

The purpose of a process in chemical analysis, in contrast to chemical engineering, is to obtain not a substance but information, so it can be classified as a data-yielding process.
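The first-group case, in which a measured physical quantity corresponds unambiguously to the composition, can be illustrated with the Beer–Lambert law as used in spectrophotometry. This is a minimal sketch; the function name and all numerical values are purely illustrative and not taken from the article:

```python
# First-group analysis: the measured quantity (absorbance A) corresponds
# unambiguously to composition, so analysis reduces to a measurement plus
# a conversion step via the Beer-Lambert law, A = epsilon * l * c.

def concentration_from_absorbance(absorbance: float,
                                  molar_absorptivity: float,
                                  path_length_cm: float) -> float:
    """Convert a directly measured absorbance A into concentration c:
    c = A / (epsilon * l)."""
    return absorbance / (molar_absorptivity * path_length_cm)

# Illustrative values: A = 0.42, epsilon = 2.1e4 L/(mol*cm), l = 1.0 cm
c = concentration_from_absorbance(0.42, 2.1e4, 1.0)
print(f"concentration = {c:.2e} mol/L")  # 2.00e-05 mol/L
```

Here the light flux plays only an auxiliary role, in line with the discussion of energy flows below: the result depends on the measured absorbance, not on the parameters of the beam.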
A major characteristic here is the structure of the process, which governs the point at which the information arises and how it moves and interacts. Major regularities can be established from process structures, as can the sources of distortion, which appear as errors in the result. The data structure in the first group coincides with the measurement structure [2], so traditional techniques can be used there. Difficulties arise in the second group because the structures differ substantially from those of measurements: a treatment is applied to the material, and errors in specifying that treatment affect the result, as does the accuracy of the model for the interaction with the auxiliary material.

Figure 1 shows the block diagram, where 1 is sample preparation; 2 the preparation of the auxiliary substances; 3 the preparation of the technical facilities (instruments, laboratory vessels, and so on); 4 the interaction; 5 the transformation of the result into the directly measured quantity; 6 the measurement; and 7 the conversion of the measurement result into the analysis result, e.g., calculating concentrations from chromatogram peak parameters.

One can estimate the errors in specifying the treatment on the basis of an extension of the measure concept. In metrology, a measure is a means of measurement designed to reproduce a physical quantity of a given size [1]. A generalized measure is a technical facility or substance for reproducing a physical or chemical object or phenomenon with qualitatively and quantitatively specified features. This includes measures in the ordinary sense, standard specimens for compositions and properties, and facilities for reproducing qualitatively or quantitatively specified treatments, such as reagents in titration or sorbents (or the column as a whole) in chromatography.
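As a minimal sketch of how the seven stages of Fig. 1 enter an error model, the following combines relative uncertainties attributed to each stage in quadrature, the usual rule for independent standard uncertainties. The stage labels follow the figure; the numerical values are assumed for illustration only:

```python
import math

# Hypothetical error budget over the seven stages of Fig. 1.
# Independent (uncorrelated) relative standard uncertainties, in percent,
# are combined by root-sum-of-squares.
stage_uncertainties = {
    "1 sample preparation": 0.8,
    "2 auxiliary-substance preparation": 0.5,
    "3 technical facilities": 0.3,
    "4 interaction": 1.0,
    "5 transformation to measured quantity": 0.4,
    "6 measurement": 0.2,
    "7 conversion to analysis result": 0.1,
}

combined = math.sqrt(sum(u * u for u in stage_uncertainties.values()))
print(f"combined relative uncertainty: {combined:.2f} %")  # about 1.48 %
```

The sketch makes the article's point concrete: such a budget is only possible when each stage's contribution can be characterized individually, which, as argued below, often fails for the second group of methods.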
Quantitative analysis is a data-producing process, so it is of fundamental importance whether the treatment applies to the data carrier itself or only to an auxiliary (energy) process serving to extract the internal information. This question is of general significance in metrology and has been considered in part in [2]. To derive general concepts, we first consider the measurement. Translated from Izmeritel'naya Tekhnika, No. 6, pp. 11-13, June, 1990.
0543-1972/90/3306-0543$12.50 © 1990 Plenum Publishing Corporation
Fig. 1. Block diagram of the analytical process (stages 1-7).

Any material is in a state of energy exchange with its environment, which involves the internal energy and the environmental energy. When an object is coupled to a means of measurement, there is a similar exchange between them, which may be called natural. If the quantity to be determined appears in the equation for the natural exchange, there is no need to involve additional energy in order to measure it; if it does not, one needs a new energy flow derived from the environment, and the interaction is provided to extract the information in the object, not to alter the object's characteristics. In an ideal organization, the measured quantities are invariant with respect to the parameters of the energy flows.

In analytical chemistry there are examples of measurements with special energy flows, such as colorimetry, conductometry, spectrophotometry, and other physicochemical methods, which employ light beams, currents, and so on. The analysis results are invariant with respect to the parameters of those fluxes over wide ranges, and in the ideal case the fluxes do not affect the material. Such an action on the material cannot be considered a data-bearing process, because it is not the source of the information in the result; that action is not reflected in the information structure. However, if the interaction affects the result substantially, the result includes information on that interaction, which is then to be considered information-bearing and is reflected in the information structure. In real experiments any energy input changes certain quantities, but the main feature distinguishing the two situations is whether the result, in an ideally organized experiment free of unforeseen or secondary effects, is invariant with respect to the scale of the interaction or depends on it. There is a close analogy between the information structures in such quantitative-analysis processes and those arising during testing [2].
The information structure in chromatography, for example, corresponds to the case where the input is provided by a measure and the result of the treatment is measured. A model for the errors in quantitative analysis should incorporate all the data sources, since those sources also generate noise, which leads to errors. Those sources in quantitative analysis are the auxiliary substances, the technical facilities, and the operators, who are involved at all stages. The error model should reflect the data interaction in the chemical or physicochemical process, which is defined by the corresponding equations. However, the data-bearing process and the chemical or physicochemical process are not identical here: there is no doubt that a flow of material can be an information carrier on the composition and properties of the substance, but with an ideally homogeneous substance the first batch contains all that information, and further batches in the flow provide no new information.

It is not always possible, however, to construct an adequate error model that allows metrological optimization: it may be impossible to characterize the individual elements of the method by theoretical or experimental research. We consider this in detail for gas chromatography. The information structure in chromatography is analogous to that of a test in which the scale of the action is specified by a measure (in the above generalized sense). If one could set up a chromatographic column with preset properties, or could certify it separately from the chromatograph, one would have an exact analogy between a column filled with sorbent and a measure. However, one cannot make a column with exactly preset properties or certify it apart from the chromatograph, i.e., without reference to all the other error sources (sample selection and input, detection, and so on).
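Stage 7 of Fig. 1 in the chromatographic case, converting a measured peak parameter into a concentration, can be sketched with a single-point external-standard calibration. The function name and all values are purely illustrative; in practice the calibration constant lumps together exactly the column and detector behaviour that, as just noted, cannot be certified separately:

```python
# Converting a chromatogram peak parameter (area) into a concentration,
# assuming a detector response that is linear in concentration:
#   c_sample / c_standard = A_sample / A_standard

def concentration_from_peak(peak_area: float,
                            standard_peak_area: float,
                            standard_concentration: float) -> float:
    """Single-point external-standard calibration (illustrative)."""
    return standard_concentration * peak_area / standard_peak_area

# Illustrative values: sample peak 1250, standard peak 1000 at 0.050 mol/L
c = concentration_from_peak(1250.0, 1000.0, 0.050)
print(f"sample concentration = {c:.4f} mol/L")  # 0.0625 mol/L
```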
Therefore, although the column in the information structure simultaneously plays the roles of a generalized measure and a means of action, the metrological characteristics of that measure remain unknown. In such cases the error model cannot be complete: many factors can be incorporated only as a group, and there is then a single way of providing metrological certification, known in cybernetics as the black-box method. One way of realizing it is by interlaboratory trials. The lack of scope for a reasonably accurate error model is characteristic of most quantitative-analysis methods in the second group. The structure of any such model is analogous to that for the errors in tests, not in measurements.
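A black-box certification by interlaboratory trial can be sketched as follows: each laboratory analyses the same sample, and the spread of the results characterizes the method as a whole, with no explicit model of the column, detector, or other internal error sources. The data below are invented for illustration:

```python
import statistics

# Hypothetical interlaboratory trial: one mean result per laboratory, in %.
lab_means = [4.98, 5.10, 4.87, 5.05, 4.95, 5.12]

grand_mean = statistics.mean(lab_means)
# Between-laboratory standard deviation: a black-box estimate of
# reproducibility covering all grouped error sources at once.
between_lab_sd = statistics.stdev(lab_means)

print(f"grand mean = {grand_mean:.3f} %, s_R = {between_lab_sd:.3f} %")
```

The design choice mirrors the article's argument: when the individual error sources cannot be characterized, the trial treats the entire analytical process of Fig. 1 as a single unit.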
It is therefore not entirely correct to identify quantitative analysis with measurements and to apply traditional metrological techniques to analytical methods without qualification. Metrological research on quantitative analysis thus involves the above division into two groups: the first is not associated with information interaction between the material and auxiliary substances acting as generalized measures, while the second is. Traditional metrological methods are applicable to the second group only for estimating the errors of the means of measurement and the individual methodological components of the error of the result.

LITERATURE CITED

1. All-Union State Standard 16263-70: The State System of Measurements: Metrology: Terms and Definitions [in Russian].
2. V. M. Tslaf, Izmer. Tekh., No. 2, 9 (1988).