1.
Anal Chem ; 88(7): 3504-11, 2016 Apr 05.
Article in English | MEDLINE | ID: mdl-26910125

ABSTRACT

Quantification of precious metal content is important for studies of ore deposits, basalt petrogenesis, and precious metal geology, mineralization, mining, and processing. However, accurate determination of metal concentrations can be compromised by microheterogeneity, commonly referred to as the "nugget effect", i.e., spatially significant variations in the distribution of precious metal minerals at the scale of instrumental analytical beam footprints. Few studies have focused on the spatial distribution of such minerals and its detrimental effects on quantification in the existing suite of relevant reference materials (RM). To assess the nugget effect in RM, pressed powder pellets of MASS-1, MASS-3, WMS-1a, WMS-1, and KPT-1 (dominantly sulfides) as well as CHR-Pt+ and CHR-Bkg (chromite-bearing) were mapped with micro-XRF. The number of verified nuggets observed was used to recalculate an effective concentration of precious metals for the analytical aliquot, allowing an empirical estimate of a minimum test portion mass. MASS-1, MASS-3, and WMS-1a did not contain any nuggets, so a conveniently small test portion (<0.1 g) suffices; CHR-Pt+ would require 0.125 g and WMS-1 would need 23 g to be representative. For CHR-Bkg and KPT-1, the minimum test portion mass would have to be ∼80 and ∼342 g, respectively. Minimum test portion masses may have to be greater still in order to provide detectable analytical signals. Procedures for counteracting the detrimental manifestations of microheterogeneity are presented. It is imperative that RM and field samples are treated in exactly the same way in the laboratory; otherwise powders of unknown nugget status (in effect, all field samples submitted for analysis) cannot be documented as representative at any safe minimum mass.
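The minimum-mass reasoning above can be sketched numerically. Assuming nuggets follow Poisson counting statistics, so the relative standard deviation of the nugget count in a test portion scales as 1/√n, a minimum test portion mass follows from the observed nugget density. All numbers below are hypothetical illustrations, not the paper's data.

```python
def min_test_portion_mass(nuggets_observed: int, mapped_mass_g: float,
                          target_rel_sd: float = 0.15) -> float:
    """Estimate the minimum representative test portion mass (g).

    Assumes nuggets are Poisson-distributed, so the relative standard
    deviation of the nugget count in a test portion is 1/sqrt(n).
    Reaching a target relative SD requires n >= (1/target_rel_sd)**2
    expected nuggets, and expected nugget count scales linearly with mass.
    """
    if nuggets_observed == 0:
        raise ValueError("No nuggets observed; any small test portion suffices.")
    nuggets_required = (1.0 / target_rel_sd) ** 2
    mass_per_nugget = mapped_mass_g / nuggets_observed
    return nuggets_required * mass_per_nugget

# Hypothetical example: 8 nuggets found in a 0.5 g mapped pellet
print(round(min_test_portion_mass(8, 0.5), 2))  # → 2.78 (g, for 15% rel. SD)
```

The Poisson assumption is the simplest defensible model; real nugget size distributions would require a heavier-tailed treatment.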

2.
J AOAC Int ; 98(2): 269-74, 2015.
Article in English | MEDLINE | ID: mdl-25807041

ABSTRACT

The target audience for this Special Section comprises parties related to the food and feed sectors, e.g., field samplers, academic and industrial scientists, laboratory personnel, companies, organizations, regulatory bodies, and agencies responsible for sampling, as well as project leaders, project managers, quality managers, supervisors, and directors. All these entities face heterogeneous materials, and the characteristics of heterogeneous materials need to be competently understood by all of them. Before analytical results can be delivered for decision-making, some form of primary sampling is always necessary, and it must counteract the effects of the sampling target's heterogeneity. Up to five types of sampling error may arise as a specific sampling process interacts with a heterogeneous material: two sampling errors arise from the heterogeneity of the sampling target, and three additional sampling errors are produced by the sampling process itself if it is not properly understood, reduced, and/or eliminated, which is the role of the Theory of Sampling. This paper discusses the phenomena and concepts involved in understanding, describing, and managing the adverse effects of heterogeneity in sampling.
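The five-error budget described above can be illustrated with a minimal sketch: assuming the error components are independent, their variances (not their standard deviations) add. The component values below are hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical relative standard deviations (%) for each error component.
# Two errors stem from material heterogeneity; three from the sampling
# process itself. Independent errors combine through their variances.
material_errors = {"fundamental": 2.0, "grouping_segregation": 3.0}
process_errors = {"delimitation": 1.5, "extraction": 1.0, "preparation": 0.5}

total_variance = sum(sd ** 2 for sd in {**material_errors, **process_errors}.values())
total_rsd = total_variance ** 0.5
print(round(total_rsd, 2))  # → 4.06 (combined relative SD, %)
```

Note that the largest single component dominates the total, which is why eliminating process-generated (incorrect sampling) errors comes first.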


Subject(s)
Animal Feed/analysis , Food Analysis/methods , Research Design/standards , Food Analysis/standards
3.
J AOAC Int ; 98(2): 295-300, 2015.
Article in English | MEDLINE | ID: mdl-25806601

ABSTRACT

Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of nonrandom distribution within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked when the prime focus is placed only on financial, time, equipment, and personnel constraints instead of mandating the acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS), practically tested over 60 years, provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.
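The case against assuming randomness can be made concrete with a small simulation: the same number of impurity kernels, scattered randomly versus confined to one cluster, yields very different sampling variances under an identical increment protocol. The lot size, impurity count, and increment scheme below are arbitrary illustration choices, not data from the EU study.

```python
import random
random.seed(1)

def sample_lot(lot, n_increments=10, increment_size=100):
    """Draw contiguous increments at random positions; return estimated impurity fraction."""
    total = 0
    for _ in range(n_increments):
        start = random.randrange(len(lot) - increment_size)
        total += sum(lot[start:start + increment_size])
    return total / (n_increments * increment_size)

N, impurities = 100_000, 1_000
# Random lot: impurity kernels scattered uniformly.
random_lot = [0] * N
for i in random.sample(range(N), impurities):
    random_lot[i] = 1
# Clustered lot: same impurity count confined to one contiguous region.
clustered_lot = [0] * N
for i in range(40_000, 41_000):
    clustered_lot[i] = 1

def sampling_sd(lot, repeats=200):
    """Standard deviation of repeated lot-mean estimates."""
    ests = [sample_lot(lot) for _ in range(repeats)]
    mean = sum(ests) / repeats
    return (sum((e - mean) ** 2 for e in ests) / repeats) ** 0.5

print(sampling_sd(random_lot) < sampling_sd(clustered_lot))  # → True
```

Both lots have the true impurity fraction 0.01, but the clustered lot's estimates mostly miss the cluster entirely and occasionally hit it hard, inflating the sampling variance dramatically.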


Subject(s)
Animal Feed/analysis , Food Analysis/methods , Research Design/standards , Food Contamination , Selection Bias , Glycine max/chemistry
4.
J AOAC Int ; 98(2): 282-7, 2015.
Article in English | MEDLINE | ID: mdl-25807044

ABSTRACT

Quality control (QC) is a systematic approach for estimating and minimizing significant error contributions to the measurement uncertainty from the full sampling and analysis process. Many types of QC measures can be implemented; the three dealt with here are primary sampling reproducibility, sample processing reproducibility, and contamination. Sampling processes can be subjected to QC by applying a replication experiment, conducted either from the top, by replicating the entire sampling/preparation/analysis process, or hierarchically, replicating each subsequent sampling stage in succession. The analytical repeatability is necessarily included in either alternative. The replication experiment results in a quality index, the Relative Sampling Variability, which is used to assess the total error associated with the full field-to-analysis pathway. Contamination can occur at essentially all locations in the sampling regimen in the food/feed realm, affecting sample containers, sampling tools, sample processing equipment, environmental conditions, and sampling personnel. QC events to detect contamination should always be included where appropriate; contamination is of most concern for low-concentration and/or volatile analytes. It is also of key importance in the development of new sampling protocols, or of carried-over protocols intended for use on types of materials/lots other than those for which they were originally developed. We here establish a first practical framework for QC as applied to the sampling context.
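The Relative Sampling Variability index from a top-level replication experiment can be sketched directly: it is the relative standard deviation of replicate results, each of which repeats the full field-to-analysis pathway. The replicate values below are hypothetical.

```python
def relative_sampling_variability(results):
    """RSV (%): relative standard deviation of replicate results, where
    each replicate repeats the entire sampling/preparation/analysis
    pathway, so RSV captures the total empirical sampling-plus-analysis
    error (analytical repeatability necessarily included)."""
    n = len(results)
    mean = sum(results) / n
    var = sum((x - mean) ** 2 for x in results) / (n - 1)
    return 100.0 * var ** 0.5 / mean

# Hypothetical 10-replicate experiment (analyte concentration, mg/kg)
replicates = [12.1, 11.8, 12.5, 13.0, 11.6, 12.3, 12.9, 11.9, 12.2, 12.7]
print(round(relative_sampling_variability(replicates), 1))  # → 3.8
```

Whether a given RSV is acceptable depends on the lot and analyte; the index itself is only the measuring stick.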


Subject(s)
Food Contamination/analysis , Quality Control , Research Design/standards , Selection Bias , Food Analysis , Reproducibility of Results
5.
J AOAC Int ; 98(2): 275-81, 2015.
Article in English | MEDLINE | ID: mdl-25807306

ABSTRACT

Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if quality control (QC) measures are based on valid analytical data stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements for ensuring sampling correctness and representativity, and that provides the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects of all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.


Subject(s)
Food Analysis/methods , Food Contamination/analysis , Research Design/standards , Animal Feed/analysis , Reproducibility of Results , Selection Bias
6.
J AOAC Int ; 98(2): 259-63, 2015.
Article in English | MEDLINE | ID: mdl-25807197

ABSTRACT

International acceptance of data is much desired in many sectors to ensure equal standards for valid information and data exchange, facilitate trade, support food safety regulation, and promote reliable communication among all parties involved. However, this cannot be accomplished without a harmonized approach to sampling and a joint approach to assessing the practical sampling protocols used. Harmonization based on a nonrepresentative protocol, or on a restricted terminology tradition forced upon other sectors, would negate any constructive outcome. An international discussion on a harmonized approach to sampling is severely hampered by a plethora of divergent sampling definitions and terms. Different meanings for the same term are frequently used by different sectors, and even within one specific sector. In other cases, different terms are used for the same concept. Before harmonization can be attempted, it is essential that all stakeholders can at least communicate effectively in this context; a clear understanding of the main vocabularies is therefore an essential prerequisite. As a first step, commonalities and dichotomies in terminology are here brought to attention by providing a comparative summary of the terminology as defined by the Theory of Sampling (TOS) and that in current use by the International Organization for Standardization, the World Health Organization, the Food and Agriculture Organization Codex Alimentarius, and the U.S. Food and Drug Administration. Terms having meanings contradictory to the TOS are emphasized. To the degree possible, we present a successful resolution of some of the most important issues outlined, sufficient to support the objectives of the present Special Section.


Subject(s)
Food Analysis/methods , Food Contamination/analysis , Research Design/standards , Terminology as Topic , Food Quality , Food Safety
7.
Environ Geochem Health ; 36(6): 1151-64, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24861191

ABSTRACT

In areas where water is a major source of dietary iodine (I), the I concentration in drinking water is an important factor for public health and epidemiological understanding. In Denmark, almost all drinking water originates from groundwater; understanding the variation of I in groundwater, and the factors and processes governing it, is therefore crucial. In this study, we perform uni- and multivariate analyses of all available historical Danish groundwater I data from 1933 to 2011 (n = 2,562) to give, for the first time, an overview of the I variability, and to discover possible geochemical associations between I and twenty other elements and parameters. Special attention is paid to the description and quality assurance of this complex compilation of historical data. The high variability of I in Danish groundwater (

Subject(s)
Groundwater/chemistry , Iodine/analysis , Water Pollutants, Chemical/history , Denmark , Environmental Monitoring , History, 20th Century , History, 21st Century , Time Factors , Water Pollutants, Chemical/analysis
8.
Anal Chim Acta ; 1193: 339227, 2022 Feb 08.
Article in English | MEDLINE | ID: mdl-35058013

ABSTRACT

For some real-world material systems, estimates of the incompressible sampling variance based on Gy's classical s2(FSE) formula from the Theory of Sampling (TOS) show a significant discrepancy with empirical estimates of the sampling variance. In instances concerning contaminated soils, coated particulate aggregates, and mixed material systems, theoretical estimates of sampling variance are larger than empirical estimates, a situation which has no physical meaning in TOS. This has led us to revisit the derivation of s2(FSE) from this famous constitutional heterogeneity equation and to explore the use of size-density classes for mixed material systems (mixtures of both analyte-enriched and coated particles), an approach which has been largely unused since Gy's original derivation. This approach makes it possible to avoid the granulometric and liberation factors of Gy's classical treatment, and presents grounds for criticising the use of 'standard' input values of critical parameters such as f = 0.5 and g = 0.25. But, as always, the liberation factor (l) still plays an important role, to which due attention is paid. The constitutional heterogeneity formula based on size-density classes is presented in a form that allows easy implementation in practice, within specified limitations. We present extensive experimental results from real-world systems. Using the "SDCD model" with published data reproduced the relative sampling variances calculated for standard "mineral-like matrices", but, more importantly, corrected the relative sampling variances calculated for real contaminants by several orders of magnitude. In all cases, the recalculated relative sampling variances decreased to below their corresponding experimental measurements, now fully as expected from TOS, substantiating our development.
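For orientation, Gy's classical formula that the paper revisits can be written out directly. The parameter values below, including the 'standard' f = 0.5 and g = 0.25 whose routine use the paper criticises, are illustrative only; they are not taken from the paper's experiments.

```python
def fse_variance(f, g, c, l, d_cm, sample_mass_g, lot_mass_g=float("inf")):
    """Gy's classical fundamental sampling error variance:
        s2(FSE) = f * g * c * l * d**3 * (1/Ms - 1/ML)
    f: shape factor ('standard' value 0.5)
    g: granulometric factor ('standard' value 0.25)
    c: mineralogical composition factor (g/cm3)
    l: liberation factor, 0..1
    d_cm: top particle size (cm); Ms, ML: sample and lot masses (g)."""
    return f * g * c * l * d_cm ** 3 * (1.0 / sample_mass_g - 1.0 / lot_mass_g)

# Hypothetical low-grade system: c = 1e4 g/cm3, l = 0.8, d = 0.1 cm, Ms = 50 g
s2 = fse_variance(0.5, 0.25, 1e4, 0.8, 0.1, 50)
print(round(100 * s2 ** 0.5, 1))  # → 14.1 (relative SD, %)
```

The paper's point is precisely that for mixed (coated plus enriched) systems this classical estimate can grossly overshoot reality, motivating the size-density-class reformulation.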


Subject(s)
Environmental Pollution , Specimen Handling , Soil
11.
Anal Chim Acta ; 1049: 47-64, 2019 Feb 21.
Article in English | MEDLINE | ID: mdl-30612657

ABSTRACT

Gy's Formula has been extensively misapplied throughout the history of applied TOS (Theory of Sampling), being used too liberally on almost any conceivable aggregate material across many material classes of extremely different composition with significant (to large, or extreme) fragment size distribution heterogeneity, for example many types of municipal and industrial waste. This pattern of misuse mostly reflects a lack of fundamental TOS competence and of awareness of the historical context of Gy's formula. The present paper addresses theoretical details of TOS that become important as sampling rates increase toward the conclusion of the full lot-to-analysis sampling pathway, specifically the finer details behind TOS' central equations linking sampling conditions to material heterogeneity characteristics, which allow estimation of Total Sampling Error (TSE) manifestations. We derive a new, complementary understanding of the two conceptual factors y, the grouping factor, and z, the segregation factor, intended to represent the local (increment-scale) and long-range (increment-to-lot-scale) heterogeneity aspects of lot materials, respectively. We contrast the standard TOS exposé with the new formulation. While the phenomenological meaning and content of the proposed factors y and z remain the same, their numerical values and bracketing limits differ, with z now representing more realistic combined effects of liberation and segregation. This new formulation makes it easier to gain a first comprehensive grasp of TOS' treatment of sampling of significantly heterogeneous materials. We believe this may present a slightly easier path into the core issues of TOS as sampling and sub-sampling get closer to the final aliquot scale.
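In the standard TOS exposé that the paper contrasts with its new formulation, the grouping and segregation factors enter multiplicatively on top of the fundamental sampling error. A minimal sketch of that classical relation (not the paper's revised one), with hypothetical factor values:

```python
def correct_sampling_variance(s2_fse, grouping_y, segregation_z):
    """Classical TOS relation: the grouping-and-segregation error scales
    the fundamental sampling error, s2(GSE) = y * z * s2(FSE), so the
    combined material-heterogeneity variance is (1 + y*z) * s2(FSE)."""
    return (1.0 + grouping_y * segregation_z) * s2_fse

# Hypothetical case: s2(FSE) = 0.01, strong grouping (y = 3) and
# segregation (z = 0.8); both factor values are assumptions for illustration.
print(round(correct_sampling_variance(0.01, 3.0, 0.8), 3))  # → 0.034
```

Increasing the number of increments per sample drives y down, which is the practical lever the classical treatment offers; the paper's reformulation changes the numerical ranges of y and z, not this structure.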

12.
Biotechnol Bioeng ; 99(2): 302-13, 2008 Feb 01.
Article in English | MEDLINE | ID: mdl-17626304

ABSTRACT

A study of near infrared (NIR) spectroscopy as a tool for process monitoring of thermophilic anaerobic digestion boosted by glycerol has been carried out, aiming at developing simple and robust Process Analytical Technology modalities for on-line surveillance in full-scale biogas plants. Three 5 L laboratory fermenters equipped with an on-line NIR sensor and special sampling stations were used as the basis for chemometric multivariate calibration. NIR characterisation using Transflexive Embedded Near Infra-Red Sensor (TENIRS) equipment, integrated into an external recurrent loop on the fermentation reactors, allows for representative sampling of the highly heterogeneous fermentation bioslurries. Glycerol is an important by-product of the increasing European bio-diesel production. Glycerol addition can boost biogas yields provided the concentration inside the fermenter does not exceed a limit of 5-7 g L(-1); further increase can cause strong imbalance in the anaerobic digestion process. A secondary objective was to evaluate the effect of glycerol addition in a spiking experiment that introduced increasing organic overloading, as monitored by volatile fatty acid (VFA) levels. High correlation between on-line NIR determinations of glycerol and VFA contents has been documented. Chemometric partial least squares (PLS) regression models between glycerol and NIR spectra needed no outlier removal, and only one PLS component was required. Test set validation resulted in excellent measures of prediction performance: precision r(2) = 0.96, and accuracy (slope of predicted versus reference) = 1.04. Similar prediction statistics for acetic acid, iso-butanoic acid, and total VFA show that process NIR spectroscopy is able to quantify all pertinent levels of both volatile fatty acids and glycerol.
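The two validation figures reported above, r2 (precision) and the predicted-versus-reference slope (accuracy), are simple to compute from a test set. The sketch below uses hypothetical reference/predicted pairs, not the study's data.

```python
def validation_stats(reference, predicted):
    """Test-set validation metrics common in chemometrics: squared
    correlation r2 (precision) and the least-squares slope of
    predicted versus reference (accuracy; ideal value 1.0)."""
    n = len(reference)
    mx = sum(reference) / n
    my = sum(predicted) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(reference, predicted))
    sxx = sum((x - mx) ** 2 for x in reference)
    syy = sum((y - my) ** 2 for y in predicted)
    r2 = sxy ** 2 / (sxx * syy)
    slope = sxy / sxx
    return r2, slope

# Hypothetical glycerol test set (g/L): reference vs NIR-predicted values
ref = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
pred = [1.1, 1.9, 3.2, 4.1, 4.8, 6.1]
r2, slope = validation_stats(ref, pred)
print(round(r2, 3), round(slope, 3))  # → 0.994 0.989
```

A slope near 1 with high r2 indicates both low bias and low scatter; either statistic alone can mask a deficiency in the other.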


Subject(s)
Bacteria, Anaerobic/metabolism , Biosensing Techniques/methods , Gasoline , Bioreactors/microbiology , Fatty Acids, Volatile/analysis , Glycerol/analysis , Spectroscopy, Near-Infrared
13.
J Biotechnol ; 133(1): 162-9, 2008 Jan 01.
Article in English | MEDLINE | ID: mdl-17996321

ABSTRACT

Production monitoring of "natural" 2-heptanone from octanoic acid in an industrial fed-batch cultivation based on Penicillium roqueforti requires a method for determination of octanoic acid dissolved in the water phase. An electronic tongue (ET) array using six non-specific potentiometric sensors with solid inner contact, plus a pH electrode, was introduced, with octanoic acid spiked into substrate obtained from four different cultivations representing variations in the relevant industrial matrix. Multivariate calibration was performed on acid concentrations spanning 0.65-20 mmol l(-1). Excluding the lowest concentration, a global partial least squares regression model with a predicted-versus-measured correlation of 0.98 and a relative root mean square error of prediction of 5.1% (ln units) (RPD = 5.5) signifies highly acceptable prediction ability. This model was further tested on undiluted as well as diluted samples obtained from a cultivation process in which octanoic acid was catabolized; this led to acceptable prediction errors within the same range as for the global model. It is concluded that the ET sensor array can be applied for determination of octanoic acid in cultivation systems of the general P. roqueforti type.


Subject(s)
Biomimetics/instrumentation , Bioreactors , Biosensing Techniques/instrumentation , Caprylates/analysis , Electrochemistry/instrumentation , Penicillium/metabolism , Tongue , Biomimetics/methods , Biosensing Techniques/methods , Culture Media/chemistry , Electrochemistry/methods , Feasibility Studies , Online Systems
14.
Thromb Haemost ; 98(2): 339-45, 2007 Aug.
Article in English | MEDLINE | ID: mdl-17721616

ABSTRACT

Fibrin clots with reduced permeability, increased clot stiffness, and reduced fibrinolysis susceptibility may predispose to cardiovascular disease (CVD). Little is known, however, about the structure of fibrin clots in patients with end-stage renal disease (ESRD). These patients suffer a high risk of CVD in addition to their chronic low-grade inflammation. Using permeability, compaction, and turbidity studies in 22 ESRD patients and 24 healthy controls, fibrin clots made from patient plasma were found to be less permeable (p < 0.001), less compactable (p < 0.001), and less susceptible to fibrinolysis (p < 0.001) than clots from controls. The maximum rate of turbidity increase was also higher for patients than controls (p < 0.001), and scanning electron microscopy revealed a higher density of fibrin fibers in clots from patients than in clots from controls (p < 0.001). Patients had higher plasma concentrations of fibrinogen, C-reactive protein, and interleukin 6 than controls. These plasma markers of inflammation correlated significantly with most of the fibrin structure characteristics observed in the patients. In contrast, plasma markers of azotemia showed no such correlations. The results suggest that fibrin clots in ESRD patients differ significantly from those of healthy controls, and that the fibrin structure characteristics in the patients are associated primarily with the inflammatory plasma milieu rather than with the level of azotemia.


Subject(s)
Blood Coagulation , Fibrin/chemistry , Kidney Failure, Chronic/blood , Azotemia/blood , Biomarkers/blood , Case-Control Studies , Female , Fibrinolysis , Humans , Inflammation/blood , Male , Microscopy, Electron, Scanning , Middle Aged , Nephelometry and Turbidimetry , Permeability
15.
Int J Pharm ; 499(1-2): 156-174, 2016 Feb 29.
Article in English | MEDLINE | ID: mdl-26707245

ABSTRACT

In spite of intense efforts over the last 20 years, the current state of affairs regarding evaluation of the adequacy of pharmaceutical mixing is at an impressive standstill, characterized by two draft guidances, one withdrawn and the other never approved. We here analyze the regulatory, scientific, and technological situation and suggest a radical but logical approach calling for a paradigm shift regarding sampling of pharmaceutical blends. In synergy with QbD/PAT efforts, blend uniformity testing should only be performed with properly designed sampling that can guarantee representativity, in contrast to the currently deficient thief sampling. This is necessary for suitable in-process specifications and for dosage units meeting desired specifications. The present exposé shows how process sampling based on the Theory of Sampling (TOS) constitutes a new asset for regulatory compliance, providing procedures that suppress hitherto adverse sampling errors. We identify the optimal sampling location as the point after emptying the blender, guaranteeing complete characterisation of the residual heterogeneity. TOS includes variographic analysis, which decomposes the effective total sampling and analysis error (TSE + TAE) from the variability of the manufacturing process itself. This approach provides reliable in-process characterization, allowing independent approval or rejection by the Quality Control unit. The science-based sampling principles presented here will facilitate full control of blending processes, including whether post-blending segregation influences the material stream that reaches the tabletting feed-frame.
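The variographic analysis invoked above rests on one computation: the empirical variogram of an equidistant process series. A minimal sketch, using a hypothetical blend-assay series rather than data from the paper:

```python
def variogram(series, max_lag=None):
    """Empirical variogram of an equidistant process series h_1..h_N:
        V(j) = sum_{i=1}^{N-j} (h_{i+j} - h_i)**2 / (2 * (N - j))
    Extrapolating V(j) back to lag 0 (the 'nugget' intercept) estimates
    the combined total sampling and analysis error (TSE + TAE); the rise
    of V(j) above that intercept reflects genuine process variability."""
    n = len(series)
    max_lag = max_lag or n // 2
    return [
        sum((series[i + j] - series[i]) ** 2 for i in range(n - j)) / (2 * (n - j))
        for j in range(1, max_lag + 1)
    ]

# Hypothetical blend-assay series (% API) taken at regular intervals
assays = [10.1, 10.3, 9.8, 10.0, 10.4, 9.9, 10.2, 10.1, 9.7, 10.3]
v = variogram(assays, max_lag=3)
print([round(x, 3) for x in v])
```

In practice far more than ten measurements are needed (60 is a common recommendation) for the lag-0 extrapolation to be trustworthy.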


Subject(s)
Chemistry, Pharmaceutical/methods , Drug Compounding/standards , Technology, Pharmaceutical/methods , Humans , Pharmaceutical Preparations/standards , Quality Control , Tablets
16.
Bioresour Technol ; 101(4): 1199-205, 2010 Feb.
Article in English | MEDLINE | ID: mdl-19837584

ABSTRACT

Optimization of 2nd generation bioethanol production from wheat straw requires comprehensive knowledge of the composition of the plant intake feedstock. Near infrared spectroscopy is evaluated as a potential method for instantaneous quantification of the salient fermentation wheat straw components: cellulose (glucan), hemicelluloses (xylan, arabinan), and lignin. Aiming at chemometric multivariate calibration, 44 pre-selected samples were subjected to spectroscopy and reference analysis. For glucan and xylan, prediction accuracies (slopes: 0.89 and 0.94) and precisions (r(2): 0.87) were obtained, corresponding to prediction error levels of 8-9%. Models for arabinan and lignin were somewhat poorer, and for lignin in particular a further expansion of the feasibility dataset was deemed necessary. The results are related to significant influences from sub-sampling/mass-reduction errors in the laboratory regimen. A relatively high proportion of outliers excluded from the present models (10-20%) may indicate that comminution sample preparation is most likely always needed. Different solutions to these issues are suggested.


Subject(s)
Biofuels/analysis , Ethanol/chemical synthesis , Power Plants/instrumentation , Spectroscopy, Near-Infrared/instrumentation , Spectroscopy, Near-Infrared/methods , Triticum/chemistry , Calibration , Feasibility Studies , Least-Squares Analysis , Lignin/analysis , Models, Chemical , Multivariate Analysis , Online Systems , Particulate Matter/analysis , Polysaccharides/analysis , Reproducibility of Results , Xylans/analysis
17.
Anal Chim Acta ; 653(1): 59-70, 2009 Oct 19.
Article in English | MEDLINE | ID: mdl-19800475

ABSTRACT

Sampling errors can be divided into two classes: incorrect sampling errors and correct sampling errors. Incorrect sampling errors arise from incorrectly designed sampling equipment or procedures. Correct sampling errors are due to the heterogeneity of the material in sampling targets. Excluding the incorrect sampling errors, all of which can be eliminated in practice (although informed and diligent work is often needed), five factors dominate sampling variance: two factors related to material heterogeneity (analyte concentration; distributional heterogeneity) and three factors related to the sampling process itself (sample type, sample size, sampling mode). Due to highly significant interactions, a comprehensive appreciation of their combined effects is far from trivial and has in fact never been illustrated in detail. Heterogeneous materials can be well characterized by the first two factors, while all essential sampling process characteristics can be summarized by combinations of the latter three. We here present simulations based on an experimental design that varies all five factors. Within the framework of the Theory of Sampling, the empirical Total Sampling Error is a function of the fundamental sampling error and the grouping and segregation error interacting with a specific sampling process. We illustrate absolute and relative sampling variance levels resulting from a wide array of simulated repeated samplings and express the effects by pertinent lot mean estimates and associated root mean squared errors/sampling variances, covering specific combinations of material heterogeneity and typical sampling procedures as used in current science, technology, and industry. Factors, levels, and interactions are varied within limits selected to match realistic materials and sampling situations, mimicking, e.g., sampling for genetically modified organisms; sampling of geological drill cores; sampling during off-loading of 3-dimensional lots (shiploads, railroad cars, truckloads, etc.); and scenarios representing a range of industrial manufacturing and production processes. A new simulation facility, "SIMSAMP", is presented with selected results designed to show its wider applicability potential. This contribution furthers a general exposé of all essential effects in the regimen covered by correct sampling errors, valid for all types of materials for which non-bias sampling can be achieved.
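The simulation approach described above can be sketched in miniature: repeatedly sample a synthetic trended lot with two of the process factors varied (sample type: single grab versus composite of spread-out increments), and compare the resulting root mean squared errors of the lot mean estimates. This is an independent toy illustration, not the SIMSAMP facility; all dimensions are arbitrary.

```python
import random
random.seed(7)

# A 1-D lot with a concentration trend (long-range heterogeneity) plus noise.
lot = [10 + 0.01 * i + random.gauss(0, 1) for i in range(1000)]
true_mean = sum(lot) / len(lot)

def grab_sample(lot, size=50):
    """Single contiguous grab sample at a random position."""
    start = random.randrange(len(lot) - size)
    chunk = lot[start:start + size]
    return sum(chunk) / size

def composite_sample(lot, n_increments=10, size=5):
    """Composite of several small increments spread over the lot
    (same total sample size as the grab)."""
    vals = []
    for _ in range(n_increments):
        start = random.randrange(len(lot) - size)
        vals.extend(lot[start:start + size])
    return sum(vals) / len(vals)

def rmse(sampler, repeats=300):
    """Root mean squared error of repeated lot-mean estimates."""
    return (sum((sampler(lot) - true_mean) ** 2
                for _ in range(repeats)) / repeats) ** 0.5

print(rmse(grab_sample) > rmse(composite_sample))  # → True
```

With identical total sample mass, the composite averages over the trend while the grab is hostage to its single position: the interaction between sample type and distributional heterogeneity that the full five-factor design explores systematically.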

18.
Bioresour Technol ; 100(5): 1711-9, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19006665

ABSTRACT

In this study, two process analytical technologies, near infrared spectroscopy and acoustic chemometrics, were investigated as means of monitoring a maize-silage-spiked biogas process. A reactor recirculation loop enabling sampling concomitant with on-line near infrared characterisation was applied. Near infrared modelling resulted in multivariate models for total and volatile solids with ratio of standard error of performance to standard deviation (RPD) values of 5 and 5.1, indicating good on-line monitoring prospects. The volatile fatty acid models had slopes between 0.83 and 0.92 (good accuracy) and RPD between 2.8 and 3.6 (acceptable precision). A second experiment employed at-line monitoring with both near infrared spectroscopy and acoustic chemometrics. A larger calibration span was obtained for total solids by spiking. Both process analytical modalities were validated with respect to total solids prediction. The near infrared model had an RPD of 5.7, while the acoustic chemometrics model resulted in an RPD of 2.6.


Subject(s)
Acoustics , Chemistry Techniques, Analytical/methods , Fatty Acids, Volatile/analysis , Manure/analysis , Silage/analysis , Spectrophotometry, Infrared/methods , Zea mays/chemistry , Fermentation , Models, Chemical
19.
Anal Bioanal Chem ; 378(2): 391-5, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14647952

ABSTRACT

An electronic tongue based on an array of 30 non-specific potentiometric chemical sensors has been applied to qualitative and quantitative monitoring of a batch fermentation process of starter culture for light cheese production. Process control charts were built using PLS regression and data from fermentations run under "normal" operating conditions. The control charts allow discrimination of samples from fermentation batches run under "abnormal" operating conditions from "normal" ones as early as 30-50% into fully evolved fermentations. The capability of the electronic tongue to quantify concentrations of important organic acids (citric, lactic, and orotic) in the present type of fermentation media was demonstrated. Average prediction errors were assessed in the range 5-13% based on test set validation. Correlation between peptide profiles determined using HPLC and the electronic tongue output was also established. The electronic tongue is thus a promising tool for fermentation process monitoring and quantitative analysis of growth media.
