Results 1 - 20 of 30
1.
J Theor Biol; 559: 111375, 2023 Feb 21.
Article in English | MEDLINE | ID: mdl-36513210

ABSTRACT

Serology testing can identify past infection by quantifying the immune response of an infected individual, providing important public health guidance. Individual immune responses are time-dependent, which is reflected in antibody measurements. Moreover, the probability of obtaining a particular measurement from a random sample changes with the changing prevalence (i.e., seroprevalence, or the fraction of individuals exhibiting an immune response) of the disease in the population. Taking these personal and population-level effects into account, we develop a mathematical model that suggests a natural adaptive scheme for estimating prevalence as a function of time. We then combine the estimated prevalence with optimal decision theory to develop a time-dependent probabilistic classification scheme that minimizes the error associated with classifying a value as positive (history of infection) or negative (no such history) on a given day since the start of the pandemic. We validate this analysis using a combination of real-world and synthetic SARS-CoV-2 data and discuss the type of longitudinal studies needed to execute this scheme in real-world settings.
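The prevalence-weighted decision rule described above can be sketched in a few lines. Everything here is illustrative: the Gaussian class-conditional densities, their means, and the measurement value are hypothetical stand-ins, not the paper's fitted models.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian density, standing in for a modeled antibody-measurement distribution."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(value, prevalence, mu_pos=3.0, mu_neg=0.0, sigma=1.0):
    """Optimal rule: call a sample positive when the prevalence-weighted
    positive-class density exceeds the negative-class one; as the estimated
    prevalence changes over time, so does the effective cutoff."""
    p_pos = prevalence * gauss_pdf(value, mu_pos, sigma)
    p_neg = (1.0 - prevalence) * gauss_pdf(value, mu_neg, sigma)
    return p_pos > p_neg

# The same borderline measurement flips class as prevalence rises.
print(classify(1.6, prevalence=0.05))  # low prevalence favors "negative"
print(classify(1.6, prevalence=0.50))
```

Weighting the densities by prevalence is the standard Bayes-optimal construction for minimizing total misclassification error; the scheme in the abstract makes the prevalence estimate, and hence this cutoff, track time.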


Subjects
COVID-19, SARS-CoV-2, Humans, COVID-19/epidemiology, Prevalence, Seroepidemiologic Studies, COVID-19 Testing, Viral Antibodies
2.
Int J Mol Sci; 24(21), 2023 Oct 28.
Article in English | MEDLINE | ID: mdl-37958688

ABSTRACT

COVID-19 has highlighted challenges in the measurement quality and comparability of serological binding and neutralization assays. Because of the many different assay formats and reagents, these measurements are known to be highly variable, with large uncertainties. The development of the WHO International Standard (WHO IS) and other pooled standards has facilitated assay comparability through normalization to a common material, but it provides neither assay harmonization nor uncertainty quantification. In this paper, we present the results of an interlaboratory study that led to the development of (1) a novel hierarchy of data analyses based on the thermodynamics of antibody binding and (2) a modeling framework that quantifies the probability of neutralization potential for a given binding measurement. Importantly, we introduce a precise mathematical definition of harmonization that separates the sources of quantitative uncertainty, some of which can be corrected to enable, for the first time, assay comparability. Both the theory and the experimental data confirmed that mAbs and the WHO IS performed identically as a primary standard for establishing traceability and bridging across different assay platforms. The metrological anchoring of complex serological binding and neutralization assays and the fast turn-around production of an mAb reference control can enable unprecedented comparability and traceability of serological binding assay results for new variants of SARS-CoV-2 and immune responses to other viruses.


Subjects
COVID-19, SARS-CoV-2, Humans, Monoclonal Antibodies, Bioassay, Data Analysis, Viral Antibodies, Neutralizing Antibodies
3.
Nucleic Acids Res; 48(10): 5268-5280, 2020 Jun 04.
Article in English | MEDLINE | ID: mdl-32347943

ABSTRACT

Structural DNA nanotechnology, as exemplified by DNA origami, has enabled the design and construction of molecularly precise objects for a myriad of applications. However, limitations in imaging and other characterization approaches make a quantitative understanding of the folding process challenging. Such an understanding is necessary to determine the origins of structural defects, which constrain the practical use of these nanostructures. Here, we combine careful fluorescent reporter design with a novel affine transformation technique that together permit rigorous measurement of folding thermodynamics. This method removes sources of systematic uncertainty and resolves problems with typical background-correction schemes. This in turn allows us to examine entropic corrections associated with folding and potential secondary and tertiary structure of the scaffold. Our approach also highlights the importance of heat-capacity changes during DNA melting. In addition to yielding insight into DNA origami folding, it is well suited to probing fundamental processes in related self-assembling systems.


Subjects
DNA/chemistry, Thermodynamics, Differential Scanning Calorimetry, Entropy, Fluorescence Resonance Energy Transfer, Fluorescent Dyes, Nanostructures/chemistry, Nucleic Acid Conformation, Nucleic Acid Denaturation
4.
Int J Mol Sci; 22(5), 2021 Mar 08.
Article in English | MEDLINE | ID: mdl-33800363

ABSTRACT

Quantitative and robust serology assays are critical measurements underpinning the global COVID-19 response in diagnostics, surveillance, and vaccine development. Here, we report a proof-of-concept approach for the development of quantitative, multiplexed, flow cytometry-based serological and neutralization assays. The serology assays test for IgG and IgM against both the full-length spike antigen and the receptor binding domain (RBD) of the spike antigen. Benchmarking against an RBD-specific SARS-CoV IgG reference standard, the anti-SARS-CoV-2 RBD antibody titer was quantified over the range of 37.6 µg/mL to 31.0 ng/mL. The quantitative assays are highly specific, with no correlative cross-reactivity with the spike proteins of the MERS, SARS1, OC43, and HKU1 viruses. We further demonstrate good correlation between anti-RBD antibody titers and neutralizing antibody titers. This suite of serology and neutralization assays helps to improve measurement confidence and is complementary and foundational for clinical and epidemiologic studies.


Subjects
COVID-19 Serological Testing/methods, COVID-19 Serological Testing/standards, COVID-19/blood, COVID-19/immunology, Neutralization Tests/methods, Neutralization Tests/standards, SARS-CoV-2/immunology, Neutralizing Antibodies/blood, Neutralizing Antibodies/immunology, Viral Antibodies/blood, Viral Antibodies/immunology, Cross Reactions, Flow Cytometry/methods, Fluorescence, Humans, Immunoglobulin G/blood, Immunoglobulin G/immunology, Immunoglobulin M/blood, Immunoglobulin M/immunology, Microspheres, Viral Receptors/chemistry, Viral Receptors/immunology, Coronavirus Spike Glycoprotein/chemistry, Coronavirus Spike Glycoprotein/immunology
5.
Anal Biochem; 607: 113773, 2020 Oct 15.
Article in English | MEDLINE | ID: mdl-32526200

ABSTRACT

Fluorescence-based measurements are a standard tool for characterizing the thermodynamic properties of DNA systems. Nonetheless, experimental melt data obtained from polymerase chain reaction (PCR) machines (for example) often yield signals that vary significantly between datasets. In many cases, this lack of reproducibility has led to difficulties in analyzing results and computing reasonable uncertainty estimates. To address this problem, we propose a data analysis procedure based on constrained, convex optimization of affine transformations, which can determine when and how melt curves collapse onto one another. A key aspect of this approach is its ability to provide a reproducible and more objective measure of whether a collection of datasets yields a consistent "universal" signal according to an appropriate model of the raw signals. Importantly, integrating this validation step into the analysis hardens the measurement protocol by allowing one to identify experimental conditions and/or modeling assumptions that may corrupt a measurement. Moreover, this robustness facilitates extraction of thermodynamic information at no additional cost in experimental time. We illustrate and test our approach on experiments with Förster resonance energy transfer (FRET) pairs used to study the thermodynamics of DNA loops.
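As a toy illustration of the data-collapse idea (not the authors' constrained convex program), one can fit the best affine map between two signals by ordinary least squares and use the residual as the consistency measure; the curve values below are made up:

```python
def fit_affine(signal, reference):
    """Closed-form least-squares scale a and offset b minimizing
    sum((a*signal + b - reference)**2)."""
    n = len(signal)
    sx, sy = sum(signal), sum(reference)
    sxx = sum(x * x for x in signal)
    sxy = sum(x * y for x, y in zip(signal, reference))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def collapse_residual(signal, reference):
    """RMS mismatch after the best affine map; a value near zero means the
    two datasets collapse onto a common 'universal' signal."""
    a, b = fit_affine(signal, reference)
    n = len(signal)
    return (sum((a * x + b - y) ** 2 for x, y in zip(signal, reference)) / n) ** 0.5

ref = [0.00, 0.05, 0.20, 0.55, 0.85, 0.98, 1.00]   # idealized melt curve
rescaled = [2.0 * v + 0.3 for v in ref]            # same shape, new gain/offset
print(collapse_residual(rescaled, ref))            # ~0: the curves collapse
```

A thresholded residual of this kind is one simple way to flag datasets whose background or gain model is inconsistent with the rest of a collection.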


Subjects
DNA/analysis, Factual Databases, Fluorescence Resonance Energy Transfer, Molecular Models, Nucleic Acid Conformation, Reproducibility of Results, Fluorescence Spectrometry, Thermodynamics, Transition Temperature
6.
Anal Bioanal Chem; 412(28): 7977-7988, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32951064

ABSTRACT

Motivated by the current COVID-19 health crisis, we consider data analysis for quantitative polymerase chain reaction (qPCR) measurements. We derive a theoretical result specifying the conditions under which all qPCR amplification curves (including their plateau phases) are identical up to an affine transformation, i.e., a multiplicative factor and a horizontal shift. We use this result to develop a data analysis procedure for determining when an amplification curve exhibits characteristics of a true signal. The main idea behind this approach is to invoke a criterion based on constrained optimization that assesses when a measurement signal can be mapped to a master reference curve. We demonstrate that this approach: (i) can decrease the fluorescence detection threshold by up to a decade; and (ii) simultaneously improve confidence in interpretations of late-cycle amplification curves. Moreover, we demonstrate that the master curve is transferable reference data that can harmonize analyses between different labs and across several years. Application to reverse-transcriptase qPCR measurements of a SARS-CoV-2 RNA construct points to the usefulness of this approach for improving confidence and reducing limits of detection in diagnostic testing of emerging diseases. Graphical abstract: left, a collection of qPCR amplification curves; right, an example of data collapse after affine transformation.
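A minimal sketch of the master-curve test, under simplifying assumptions (integer cycle shifts, a closed-form scale, and a synthetic logistic master curve) rather than the paper's continuous constrained optimization:

```python
import math

def master_mismatch(curve, master, max_shift=5):
    """Minimum RMS residual when mapping `curve` onto a*master shifted by s
    cycles; a small value suggests a true amplification signal, a large one
    a trace that cannot be collapsed onto the reference."""
    best = float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(c, master[i - s]) for i, c in enumerate(curve)
                 if 0 <= i - s < len(master)]
        den = sum(m * m for _, m in pairs)
        if den == 0.0 or len(pairs) < len(curve) // 2:
            continue
        a = sum(c * m for c, m in pairs) / den          # best scale for this shift
        rms = math.sqrt(sum((c - a * m) ** 2 for c, m in pairs) / len(pairs))
        best = min(best, rms)
    return best

master = [1.0 / (1.0 + math.exp(-(i - 20) / 2.0)) for i in range(40)]
signal = [0.7 * master[i + 3] for i in range(37)]   # true signal: scaled, shifted
flat = [0.5] * 37                                   # non-amplifying trace
print(master_mismatch(signal, master))  # ~0: collapses onto the master curve
print(master_mismatch(flat, master))    # clearly nonzero
```

Thresholding this residual is the spirit of the paper's true-signal criterion: curves that collapse onto the shared reference are accepted, and everything else is treated as artifact or noise.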


Subjects
Algorithms, Betacoronavirus/genetics, Coronavirus Infections/virology, Viral Pneumonia/virology, Viral RNA/genetics, Reverse Transcriptase Polymerase Chain Reaction/methods, Betacoronavirus/isolation & purification, COVID-19, Coronavirus Infections/diagnosis, Humans, Pandemics, Viral Pneumonia/diagnosis, Viral RNA/analysis, Real-Time Polymerase Chain Reaction/methods, SARS-CoV-2
7.
Anal Chem; 91(16): 10713-10722, 2019 Aug 20.
Article in English | MEDLINE | ID: mdl-31393105

ABSTRACT

The ultimate performance of flow-based measurements in microfluidic systems is currently limited by their accuracy at the nanoliter-per-minute scale. Improving such measurements (especially in contexts that require continuous monitoring) is challenging because of constraints associated with shrinking system geometries and limitations imposed by making precise measurements of smaller quantities in real time. A particularly interesting limit is the relative uncertainty as flow approaches zero, which diverges for most measurement methods. To address these problems, we have developed an optofluidic measurement system that can deliver and record light in a precise interrogation region of a microfluidic channel. The system utilizes photobleaching of fluorophore dyes in the bulk flow and can identify zero flow to better than 1 nL/min absolute accuracy. The technique also provides an independent method for determining nonzero flow rates based on a robust scaling relationship between the fluorescence emission and flow. Together, these two independent approaches enable precise measurement of flow to within 5% accuracy down to 10 nL/min and validation of flow control to within 5% uncertainty down to 2 nL/min. We also demonstrate that our technique can be used to extend a calibrated flow meter well below its specified range (e.g., 500 nL/min) and to make dynamic measurements of similar relative uncertainties to the calibrated meter, which would have otherwise expanded significantly in this regime.

8.
AIAA J ; 9712019.
Article in English | MEDLINE | ID: mdl-34149052

ABSTRACT

In computational materials science, coarse-graining approaches often lack a priori uncertainty quantification (UQ) tools that estimate the accuracy of a reduced-order model before it is calibrated or deployed. This is especially the case in coarse-grained (CG) molecular dynamics (MD), where "bottom-up" methods need to run expensive atomistic simulations as part of the calibration process. As a result, scientists have been slow to adopt CG techniques in many settings because they do not know in advance whether the cost of developing the CG model is justified. To address this problem, we present an analytical method of coarse-graining rigid-body systems that yields corresponding intermolecular potentials with controllable levels of accuracy relative to their atomistic counterparts. Critically, this analysis: (i) provides a mathematical foundation for assessing the quality of a CG force field without running simulations; and (ii) provides a tool for understanding how atomistic systems can be viewed as appropriate limits of reduced-order models. Simulated results confirm the validity of this approach at the trajectory level and point to issues that must be addressed in coarse-graining fully non-rigid systems.

9.
Physica D ; 16(1)2018.
Article in English | MEDLINE | ID: mdl-32165775

ABSTRACT

By linking atomistic and mesoscopic scales, we formally show how a local steric effect can hinder crystal growth and lead to a buildup of adsorbed atoms (adatoms) on a supersaturated, (1+1)-dimensional surface. Starting from a many-adatom master equation of a kinetic restricted solid-on-solid (KRSOS) model with external material deposition, we heuristically extract a coarse-grained, mesoscale description that defines the motion of a line defect (i.e., a step) in terms of statistical averages over KRSOS microstates. Near thermodynamic equilibrium, we use error estimates to show that this mesoscale picture can deviate from the standard Burton-Cabrera-Frank (BCF) step-flow model, in which the adatom flux at step edges is linear in the adatom supersaturation. This deviation is caused by the accumulation of adatoms near the step, which block one another from being incorporated into the crystal lattice. In the mesoscale picture, this deviation manifests as a significant contribution from many-adatom microstates to the corresponding statistical averages. We carry out kinetic Monte Carlo simulations to numerically demonstrate how certain parameters control the aforementioned deviation. From these results, we discuss empirical corrections to the BCF model that amount to a nonlinear relation for the adatom flux at the step. We also discuss how this work could be used to understand the kinetic interplay between adatom accumulation and step motion in recent experiments on ice surfaces.

10.
J Chem Phys; 146(9), 2017 Mar 07.
Article in English | MEDLINE | ID: mdl-34234386

ABSTRACT

Despite more than 40 years of research in condensed-matter physics, state-of-the-art approaches for simulating the radial distribution function (RDF) g(r) still rely on binning pair-separations into a histogram. Such methods suffer from undesirable properties, including subjectivity, high uncertainty, and slow rates of convergence. Moreover, such problems go undetected by the metrics often used to assess RDFs. To address these issues, we propose (I) a spectral Monte Carlo (SMC) quadrature method that yields g(r) as an analytical series expansion; and (II) a Sobolev norm that assesses the quality of RDFs by quantifying their fluctuations. Using the latter, we show that, relative to histogram-based approaches, SMC reduces by orders of magnitude both the noise in g(r) and the number of pair separations needed for acceptable convergence. Moreover, SMC reduces subjectivity and yields simple, differentiable formulas for the RDF, which are useful for tasks such as coarse-grained force-field calibration via iterative Boltzmann inversion.
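The flavor of the spectral approach can be conveyed by a one-dimensional toy: estimate a density's cosine-series coefficients as plain sample averages, so the result is an analytic, differentiable expansion rather than a binned histogram. This is a didactic stand-in, not the paper's SMC quadrature (which works with the RDF's radial measure):

```python
import math, random

def spectral_density(samples, length, n_modes=8):
    """Orthogonal-series density estimate on [0, length]; each coefficient is a
    Monte Carlo average of a cosine basis function over the samples."""
    n = len(samples)
    coeffs = [sum(math.cos(k * math.pi * x / length) for x in samples) / n
              for k in range(1, n_modes + 1)]
    def density(x):
        val = 1.0 / length                      # constant (k = 0) mode
        for k, c in enumerate(coeffs, start=1):
            val += (2.0 / length) * c * math.cos(k * math.pi * x / length)
        return val
    return density

random.seed(0)
uniform_samples = [random.random() for _ in range(20000)]
g = spectral_density(uniform_samples, 1.0)
print(g(0.5))   # near 1.0: the estimator recovers the flat density
```

Because the estimate is a short analytic series, it is smooth, differentiable, and free of bin-width choices, which is the property the abstract highlights for downstream tasks like iterative Boltzmann inversion.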

11.
Article in English | MEDLINE | ID: mdl-32165772

ABSTRACT

In computational materials science, predicting the yield strain of crosslinked polymers remains a challenging task. A common approach is to identify yield as the first critical point of stress-strain curves simulated by molecular dynamics (MD). However, in such cases the underlying data can be excessively noisy, making it difficult to extract meaningful results. In this work, we propose an alternate method for identifying yield on the basis of deformation-recovery simulations. Notably, the corresponding raw data (i.e., residual strains) produce a sharper signal for yield via a transition in their global behavior. We analyze this transition by nonlinear regression of computational data to a hyperbolic model. As part of this analysis, we also propose uncertainty quantification techniques for assessing when and to what extent the simulated data are informative of yield. Moreover, we show how the method directly tests for yield via the onset of permanent deformation and discuss recent experimental results, which compare favorably with our predictions.

12.
J Chem Phys; 144(15): 154101, 2016 Apr 21.
Article in English | MEDLINE | ID: mdl-27389203

ABSTRACT

Generating and calibrating forces that are transferable across a range of state-points remains a challenging task in coarse-grained (CG) molecular dynamics. In this work, we present a coarse-graining workflow, inspired by ideas from uncertainty quantification and numerical analysis, to address this problem. The key idea behind our approach is to introduce a Bayesian correction algorithm that uses functional derivatives of CG simulations to rapidly and inexpensively recalibrate initial estimates f0 of forces anchored by standard methods such as force-matching. Taking density-temperature relationships as a running example, we demonstrate that this algorithm, in concert with various interpolation schemes, can be used to efficiently compute physically reasonable force curves on a fine grid of state-points. Importantly, we show that our workflow is robust to several choices available to the modeler, including the interpolation schemes and tools used to construct f0. In a related vein, we also demonstrate that our approach can speed up coarse-graining by reducing the number of atomistic simulations needed as inputs to standard methods for generating CG forces.

13.
Soft Matter; 10(37): 7370-8, 2014 Oct 07.
Article in English | MEDLINE | ID: mdl-25080973

ABSTRACT

DNA origami is a powerful platform for assembling gold nanoparticle constructs, an important class of nanostructure with numerous applications. Such constructs are assembled by the association of complementary DNA oligomers. These association reactions have yields of <100%, requiring the development of methods to purify the desired product. We study the performance of centrifugation as a separation approach by combining optical and hydrodynamic measurements and computations. We demonstrate that bench-top microcentrifugation is a simple and efficient method of separating the reaction products, readily achieving purities of >90%. The gold nanoparticles play a number of critical roles in our system, functioning not only as integral components of the purified products, but also as hydrodynamic separators and optical indicators of the reaction products during the purification process. We find that separation resolution is ultimately limited by the polydispersity in the mass of the gold nanoparticles and by structural distortions of DNA origami induced by the gold nanoparticles. Our study establishes a methodology for determining the design rules for nanomanufacturing DNA origami-nanoparticle constructs.


Subjects
Centrifugation/methods, DNA/chemistry, Gold/chemistry, Metal Nanoparticles/chemistry, Computer Simulation, Single-Stranded DNA/chemistry, Diffusion, Hydrodynamics, Light, Nanocomposites/chemistry, Nanotechnology, Particle Size, Pressure, Radiation Scattering, Temperature, Viscosity
14.
medRxiv; 2024 May 24.
Article in English | MEDLINE | ID: mdl-38826359

ABSTRACT

COVID-19 disproportionately affected minorities, while barriers to engaging underserved communities in research persist. Serological studies reveal infection and vaccination histories within these communities; however, a lack of consensus on downstream evaluation methods impedes meta-analyses and dampens the broader public health impact. To reveal the impact of COVID-19 and vaccine uptake among diverse communities, and to develop rigorous serological downstream evaluation methods, we engaged racial and ethnic minorities in Massachusetts in a cross-sectional study (April-July 2022), screened blood and saliva for SARS-CoV-2 and human endemic coronavirus (hCoV) antibodies by bead-based multiplex assay and point-of-care (POC) test, and developed across-plate normalization and classification boundary methods for optimal qualitative serological assessments. Among 290 participants, 91.4% reported receiving at least one dose of a COVID-19 vaccine, while 41.7% reported past SARS-CoV-2 infections, which was confirmed by POC- and multiplex-based saliva and blood IgG seroprevalences. We found significant differences in antigen-specific IgA and IgG antibody outcomes and indications of cross-reactivity with hCoV OC43. Finally, 26.5% of participants reported lingering COVID-19 symptoms, mostly middle-aged Latinas. Hence, prolonged COVID-19 symptoms were common among our underserved population and require public health attention, despite high COVID-19 vaccine uptake. Saliva served as a less invasive sample type for IgG-based serosurveys, and hCoV cross-reactivity needed to be evaluated for reliable SARS-CoV-2 serosurvey results. Using the rigorous downstream qualitative serological assessment methods developed here will help standardize serosurvey outcomes and meta-analyses for future serosurveys beyond SARS-CoV-2.

15.
Math Biosci; 358: 108982, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36804385

ABSTRACT

An accurate multiclass classification strategy is crucial to interpreting antibody tests. However, traditional methods based on confidence intervals or receiver operating characteristics lack clear extensions to settings with more than two classes. We address this problem by developing a multiclass classification scheme based on probabilistic modeling and optimal decision theory that minimizes a convex combination of false classification rates. Classification is challenging when the relative fraction of the population in each class, or generalized prevalence, is unknown. Thus, we also develop a method for estimating the generalized prevalence of test data that is independent of classification of the test data. We validate our approach on serological data with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) naïve, previously infected, and vaccinated classes. Synthetic data are used to demonstrate that (i) prevalence estimates are unbiased and converge to true values and (ii) our procedure applies to arbitrary measurement dimensions. In contrast to the binary problem, the multiclass setting offers wide-reaching utility as the most general framework and provides new insight into prevalence estimation best practices.
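The idea of estimating class fractions without classifying any individual sample can be illustrated with expectation-maximization over the mixture weights alone, holding class-conditional models fixed. This is a standard stand-in, not the paper's estimator, and the Gaussian class means and synthetic data are hypothetical:

```python
import math, random

def gauss_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def estimate_prevalence(data, mus, iters=100):
    """EM updates for the mixture weights only: each sample contributes its
    posterior class responsibilities, so no hard classification is performed."""
    k = len(mus)
    w = [1.0 / k] * k
    for _ in range(iters):
        counts = [0.0] * k
        for x in data:
            post = [w[j] * gauss_pdf(x, mus[j]) for j in range(k)]
            total = sum(post)
            for j in range(k):
                counts[j] += post[j] / total
        w = [c / len(data) for c in counts]
    return w

random.seed(1)
mus = [0.0, 3.0, 6.0]        # hypothetical naive / infected / vaccinated means
true_w = [0.5, 0.3, 0.2]
data = [random.gauss(random.choices(mus, weights=true_w)[0], 1.0)
        for _ in range(4000)]
print(estimate_prevalence(data, mus))   # approaches [0.5, 0.3, 0.2]
```

Because the weights are fit from soft responsibilities rather than hard labels, the prevalence estimate does not inherit the bias of any particular classification boundary, which is the decoupling the abstract emphasizes.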


Subjects
COVID-19, SARS-CoV-2, Humans, COVID-19/diagnosis, COVID-19/epidemiology, Prevalence, COVID-19 Testing
16.
PLoS One; 18(12): e0295502, 2023.
Article in English | MEDLINE | ID: mdl-38134031

ABSTRACT

Signal analysis for cytometry remains a challenging task with a significant impact on uncertainty. Conventional cytometers assume that individual measurements are well characterized by simple properties such as the signal area, width, and height. However, these approaches have difficulty distinguishing inherent biological variability from instrument artifacts and operating conditions. As a result, it is challenging to quantify uncertainty in the properties of individual cells and to perform tasks such as doublet deconvolution. We address these problems via signal analysis techniques that use scale transformations to: (I) separate variation in biomarker expression from effects due to flow conditions and particle size; (II) quantify reproducibility associated with a given laser interrogation region; (III) estimate uncertainty in measurement values on a per-event basis; and (IV) extract the singlets that make up a multiplet. The key idea behind this approach is to model how variable operating conditions deform the signal shape and then use constrained optimization to "undo" these deformations for measured signals; residuals of this process characterize reproducibility. Using a recently developed microfluidic cytometer, we demonstrate that these techniques can account for instrument- and measurand-induced variability with a residual uncertainty of less than 2.5% in the signal shape and less than 1% in integrated area.


Subjects
Reproducibility of Results, Uncertainty, Particle Size, Flow Cytometry/methods
17.
PLoS One; 18(3): e0280823, 2023.
Article in English | MEDLINE | ID: mdl-36913381

ABSTRACT

The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic has emphasized the importance and challenges of correctly interpreting antibody test results. Identification of positive and negative samples requires a classification strategy with low error rates, which is hard to achieve when the corresponding measurement values overlap. Additional uncertainty arises when classification schemes fail to account for complicated structure in data. We address these problems through a mathematical framework that combines high-dimensional data modeling and optimal decision theory. Specifically, we show that appropriately increasing the dimension of data better separates positive and negative populations and reveals nuanced structure that can be described in terms of mathematical models. We combine these models with optimal decision theory to yield a classification scheme that better separates positive and negative samples relative to traditional methods such as confidence intervals (CIs) and receiver operating characteristics. We validate the usefulness of this approach in the context of a multiplex salivary SARS-CoV-2 immunoglobulin G assay dataset. This example illustrates how our analysis: (i) improves the assay accuracy (e.g., lowers classification errors by up to 42% compared to CI methods); (ii) reduces the number of indeterminate samples when an inconclusive class is permissible (e.g., by 40% compared to the original analysis of the example multiplex dataset); and (iii) decreases the number of antigens needed to classify samples. Our work showcases the power of mathematical modeling in diagnostic classification and highlights a method that can be adopted broadly in public health and clinical settings.


Subjects
COVID-19, SARS-CoV-2, Humans, COVID-19/diagnosis, Saliva, COVID-19 Testing, Diagnostic Techniques and Procedures, Viral Antibodies, Sensitivity and Specificity
18.
Lab Chip; 22(17): 3217-3228, 2022 Aug 23.
Article in English | MEDLINE | ID: mdl-35856829

ABSTRACT

Flow cytometry is an invaluable technology in biomedical research, but confidence in single-cell measurements remains limited due to a lack of appropriate techniques for uncertainty quantification (UQ). It is particularly challenging to evaluate the potential for different instrumentation designs or operating parameters to influence the measurement physics in ways that change measurement repeatability. Here, we report a direct experimental approach to UQ using a serial flow cytometer that measured each particle more than once along a flow path. The instrument was automated for real-time characterization of measurement precision and operated with particle velocities exceeding 1 m/s, throughputs above 100 particles per second, and analysis yields better than 99.9%. These achievements were enabled by a novel hybrid inertial and hydrodynamic particle focuser that tightly controls particle positions and velocities. The cytometer identified ideal flow conditions with fluorescence area measurement precision on the order of 1% and characterized tradeoffs between precision, throughput, and analysis yield. The serial cytometer is anticipated to improve single-cell measurements through estimation (and subsequent control) of uncertainty contributions from various other instrument parameters, leading to overall improvements in the ability to classify sample composition and to find rare events.


Subjects
Hydrodynamics, Flow Cytometry
19.
J Biomed Opt; 27(1), 2022 Jan.
Article in English | MEDLINE | ID: mdl-35102729

ABSTRACT

SIGNIFICANCE: Performance improvements in microfluidic systems depend on accurate measurement and fluid control at the micro- and nanoscales. New applications continuously demand lower volumetric flow rates. AIM: We focus on improving an optofluidic system for measuring and calibrating microflows down to the sub-nanoliter-per-minute range. APPROACH: Measurements rely on an optofluidic system that delivers excitation light and records fluorescence in a precise interrogation region of a microfluidic channel. Exploiting a scaling relationship between the flow rate and the fluorescence emission after photobleaching, the system enables real-time determination of flow rates. RESULTS: Here, we demonstrate improved calibration of a flow controller to 1% uncertainty. Further, the resolution of the optofluidic flow meter improved to less than 1 nL/min with 5% uncertainty, using a molecule with a 14-fold smaller diffusion coefficient than in our previous report. CONCLUSIONS: We demonstrate new capabilities in sub-nanoliter-per-minute flow control and measurement that are generalizable to cutting-edge studies of light-material interaction and molecular diffusion for the chemical and biomedical industries.


Subjects
Microfluidic Analytical Techniques, Microfluidics
20.
ArXiv; 2022 Jun 28.
Article in English | MEDLINE | ID: mdl-35795812

ABSTRACT

The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic has emphasized the importance and challenges of correctly interpreting antibody test results. Identification of positive and negative samples requires a classification strategy with low error rates, which is hard to achieve when the corresponding measurement values overlap. Additional uncertainty arises when classification schemes fail to account for complicated structure in data. We address these problems through a mathematical framework that combines high-dimensional data modeling and optimal decision theory. Specifically, we show that appropriately increasing the dimension of data better separates positive and negative populations and reveals nuanced structure that can be described in terms of mathematical models. We combine these models with optimal decision theory to yield a classification scheme that better separates positive and negative samples relative to traditional methods such as confidence intervals (CIs) and receiver operating characteristics. We validate the usefulness of this approach in the context of a multiplex salivary SARS-CoV-2 immunoglobulin G assay dataset. This example illustrates how our analysis: (i) improves the assay accuracy (e.g., lowers classification errors by up to 42% compared to CI methods); (ii) reduces the number of indeterminate samples when an inconclusive class is permissible (e.g., by 40% compared to the original analysis of the example multiplex dataset); and (iii) decreases the number of antigens needed to classify samples. Our work showcases the power of mathematical modeling in diagnostic classification and highlights a method that can be adopted broadly in public health and clinical settings.
