Results 1 - 20 of 860
1.
Sensors (Basel) ; 24(13)2024 Jul 05.
Article in English | MEDLINE | ID: mdl-39001139

ABSTRACT

The paper "Using Absorption Models for Insulin and Carbohydrates and Deep Leaning to Improve Glucose Level Predictions" (Sensors 2021, 21, 5273) proposes a novel approach to predicting blood glucose levels for people with type 1 diabetes mellitus (T1DM). By building exponential models from raw carbohydrate and insulin data to simulate absorption in the body, the authors reported a reduction in their model's root-mean-square error (RMSE) from 15.5 mg/dL (raw) to 9.2 mg/dL (exponential) when predicting blood glucose levels one hour into the future. In this comment, we demonstrate that the experimental techniques used in that paper are flawed, which invalidates its results and conclusions. Specifically, after reviewing the authors' code, we found that the model validation scheme was malformed: training and test data from the same time intervals were mixed. This means that the reported RMSE numbers in the referenced paper did not accurately measure the predictive capabilities of the approaches that were presented. We repaired the measurement technique by appropriately isolating the training and test data and discovered that their models actually performed dramatically worse than reported. In fact, the models presented in that paper do not appear to perform any better than a naive model that predicts future glucose levels to be the same as the current ones.
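For illustration, here is a minimal sketch of the two ideas at the heart of the repair: a chronological train/test split that never mixes time intervals, and the naive persistence baseline used for comparison. Function names and the synthetic CGM data are illustrative assumptions, not the comment's actual code.

```python
import numpy as np

def chronological_split(series, train_frac=0.8):
    """Split a time series without mixing intervals: all training
    samples strictly precede all test samples."""
    cut = int(len(series) * train_frac)
    return series[:cut], series[cut:]

def naive_rmse(glucose, horizon=12):
    """RMSE of the persistence baseline: predict the glucose level
    `horizon` steps ahead to equal the current one (12 x 5-min
    CGM samples ~= 1 h)."""
    pred = glucose[:-horizon]          # "future = now"
    truth = glucose[horizon:]
    return float(np.sqrt(np.mean((truth - pred) ** 2)))

# Example with synthetic CGM-like data (mg/dL):
rng = np.random.default_rng(0)
cgm = 120 + np.cumsum(rng.normal(0, 2, size=2000))
train, test = chronological_split(cgm)
print(f"Naive 1-h RMSE on held-out data: {naive_rmse(test):.1f} mg/dL")
```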


Subjects
Blood Glucose, Diabetes Mellitus Type 1, Insulin, Insulin/metabolism, Humans, Blood Glucose/metabolism, Blood Glucose/analysis, Diabetes Mellitus Type 1/metabolism, Carbohydrates/chemistry, Biological Models
2.
Sensors (Basel) ; 24(13)2024 Jul 05.
Article in English | MEDLINE | ID: mdl-39001161

ABSTRACT

This study measured the differences in commonly used summary acceleration metrics during elite Australian football games under three different data processing protocols (raw, custom-processed, manufacturer-processed). Estimates of distance, speed, and acceleration were collected with 10-Hz GNSS tracking devices from 38 elite Australian football players from one team across fourteen matches. Raw and manufacturer-processed data were exported from the respective proprietary software, and two common summary acceleration metrics (number of efforts and distance within the medium/high-intensity zone) were calculated for the three processing methods. Linear mixed models were used to estimate the effect of the three data processing methods on the summary metrics. The main findings demonstrated substantial differences between the three processing methods: the manufacturer-processed acceleration data had the lowest reported distance (up to 184 times lower) and efforts (up to 89 times lower), followed by the custom-processed distance (up to 3.3 times lower) and efforts (up to 4.3 times lower), while raw data had the highest reported distance and efforts. The results indicated that different processing methods changed the metric output and, in turn, altered the quantification of the demands of the sport (volume, intensity, and frequency of the metrics). Coaches, practitioners, and researchers need to understand that different processing methods alter summary acceleration metrics. By being informed about how these metrics are affected by processing methods, they can better interpret the available data and tailor their training programs to match the demands of competition.
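To make concrete what such summary metrics compute, the sketch below counts acceleration "efforts" and in-zone distance from a 10-Hz speed trace. The zone thresholds and function names are illustrative assumptions, not the study's definitions.

```python
import numpy as np

FS = 10.0  # GNSS sampling rate (Hz)

def accel_zone_metrics(speed, lo=1.5, hi=3.0):
    """Count acceleration 'efforts' and distance covered while the
    acceleration lies in [lo, hi) m/s^2. Thresholds are illustrative
    assumptions, not the study's zone definitions."""
    accel = np.gradient(speed) * FS                 # m/s^2 from a 10-Hz speed trace (m/s)
    in_zone = (accel >= lo) & (accel < hi)
    # An "effort" is a contiguous run of in-zone samples.
    n_efforts = int(np.sum(np.diff(in_zone.astype(int), prepend=0) == 1))
    distance = float(np.sum(speed[in_zone]) / FS)   # m: speed x dt summed over in-zone samples
    return n_efforts, distance

# Toy trace: repeated accelerate/decelerate cycles
t = np.arange(0, 60, 1 / FS)
speed = 4 + 4 * np.sin(0.7 * t)                     # synthetic speed signal (m/s)
print(accel_zone_metrics(speed))
```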

3.
Data Brief ; 54: 110254, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38962210

ABSTRACT

The current work presents the generation of a comprehensive spatial dataset of a lightweight beam element composed of four twisted plywood strips, achieved through the application of Structure-from-Motion (SfM) - Multi-view Stereo (MVS) photogrammetry techniques under controlled laboratory conditions. The data collection process was meticulously conducted to ensure accuracy and precision, employing scale bars of varying lengths. The captured images were then processed using photogrammetric software, leading to the creation of point clouds, meshes, and texture files. These data files represent the 3D model of the beam at different mesh resolutions (raw, high-poly, medium-poly, and low-poly), providing varying levels of detail for 3D visualization. The dataset holds significant reuse potential and offers essential resources for further studies in numerical modeling, simulations of complex structures, and the training of machine learning algorithms. The data can also serve as validation sets for emerging photogrammetry and form-finding methods, especially those involving large deformations and geometric nonlinearities, particularly within the structural engineering field.

4.
Article in English | MEDLINE | ID: mdl-39037623

ABSTRACT

The intelligent predictive and optimized wastewater treatment plant method represents a ground-breaking shift in how we manage wastewater. By capitalizing on data-driven predictive modeling, automation, and optimization strategies, it introduces a comprehensive framework designed to enhance the efficiency and sustainability of wastewater treatment operations. This methodology encompasses several essential phases, including data gathering and training, the integration of innovative computational models such as Chimp-based GoogLeNet (CbG), data processing, and performance prediction, all while fine-tuning operational parameters. The designed model is a hybrid of the Chimp optimization algorithm and GoogLeNet: GoogLeNet is a deep convolutional architecture, and Chimp optimization is a bio-inspired optimization model based on chimpanzee behavior. The hybrid optimizes the operational parameters of the wastewater treatment plant, such as pH, dosage rate, effluent quality, and energy consumption, by fixing the optimal settings in the GoogLeNet. The designed model includes steps such as pre-processing and feature analysis for effective prediction and optimization of the operational parameters. Notably, this innovative approach provides several key advantages, including reduced operating costs, improved environmental outcomes, and more effective resource management. Through continuous adaptation and refinement, this methodology not only optimizes wastewater treatment plant performance but also tackles evolving environmental challenges while conserving resources. It represents a significant step forward in the quest for efficient and sustainable wastewater treatment practices. The RMSE, MAE, MAPE, and R² scores for the suggested technique are 1.103, 0.233, 0.012, and 0.002, respectively. The model also decreased power usage by about 1.4% and greenhouse gas emissions by 0.12% relative to existing techniques.
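As a rough illustration of the optimization half of such a hybrid, the sketch below runs a generic population-based, chimp-inspired search over two hypothetical operational parameters against a stand-in cost function. It is a schematic of the idea only, not the paper's CbG model; the parameter ranges and cost are invented for the example.

```python
import numpy as np

def chimp_like_search(cost, bounds, n_agents=20, n_iter=100, seed=0):
    """Schematic population-based search in the spirit of bio-inspired
    optimizers such as ChOA; a minimal sketch, not the paper's hybrid
    CbG model. `cost` maps a parameter vector to a scalar penalty."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(n_agents, len(lo)))
    best = min(pop, key=cost)
    for t in range(n_iter):
        step = 2.0 * (1 - t / n_iter)          # exploration shrinks over time
        pop = best + step * rng.uniform(-1, 1, pop.shape) * (pop - best)
        pop = np.clip(pop, lo, hi)             # keep agents inside the bounds
        cand = min(pop, key=cost)
        if cost(cand) < cost(best):
            best = cand
    return best

# Hypothetical operational parameters: [pH, coagulant dose (mg/L)]
cost = lambda p: (p[0] - 7.2) ** 2 + 0.01 * (p[1] - 35.0) ** 2
print(chimp_like_search(cost, bounds=[(6.0, 9.0), (5.0, 80.0)]))
```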

5.
BMC Med Inform Decis Mak ; 24(1): 194, 2024 Jul 16.
Article in English | MEDLINE | ID: mdl-39014361

ABSTRACT

This study demonstrates an efficient scheme for the early detection of cardiorespiratory complications in pandemics by utilizing wearable electrocardiogram (ECG) sensors for pattern generation and convolutional neural networks (CNNs) for decision analytics. In health-related outbreaks, timely and early diagnosis of such complications is decisive in reducing mortality rates and alleviating the burden on healthcare facilities. Existing methods rely on clinical assessments, medical history reviews, and hospital-based monitoring, which are valuable but have limitations in accessibility, scalability, and timeliness, particularly during pandemics. The proposed scheme commences by deploying wearable ECG sensors on the patient's body. These sensors collect data by continuously monitoring the cardiac activity and respiratory patterns of the patient. The collected raw data are then transmitted securely and wirelessly to a centralized server and stored in a database. Subsequently, the stored data are passed through a preprocessing stage that extracts relevant features such as heart rate variability and respiratory rate. The preprocessed data are then used as input to the CNN model for the classification of normal and abnormal cardiorespiratory patterns. To achieve high accuracy in abnormality detection, the CNN model is trained on labeled data with optimized parameters. The performance of the proposed scheme was evaluated under different scenarios and showed robust performance in detecting abnormal cardiorespiratory patterns, with a sensitivity of 95% and a specificity of 92%. Prominent observations highlighting the potential for early intervention include subtle changes in heart rate variability preceding respiratory distress. These findings show the significance of wearable ECG technology in improving pandemic management strategies and informing public health policies, enhancing preparedness and resilience in the face of emerging health threats.
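For orientation, a minimal 1-D CNN of this kind can be sketched in PyTorch as below. The layer sizes, window length, and two-channel (ECG + respiration) layout are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class CardioRespCNN(nn.Module):
    """Minimal 1-D CNN for binary normal/abnormal classification of
    windowed ECG-derived signals; an illustrative architecture only."""
    def __init__(self, n_channels=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # global pooling over time
        )
        self.classifier = nn.Linear(32, 2)    # normal vs. abnormal

    def forward(self, x):                     # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

model = CardioRespCNN()
dummy = torch.randn(8, 2, 3000)               # 8 windows, ECG + respiration
print(model(dummy).shape)                      # torch.Size([8, 2])
```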


Subjects
Early Diagnosis, Electrocardiography, Neural Networks (Computer), Wearable Electronic Devices, Humans, Electrocardiography/instrumentation, COVID-19/diagnosis
6.
Article in English | MEDLINE | ID: mdl-39013167

ABSTRACT

Mass spectrometry is broadly employed to study complex molecular mechanisms in various biological and environmental fields, enabling 'omics' research such as proteomics, metabolomics, and lipidomics. As study cohorts grow larger and more complex with dozens to hundreds of samples, the need for robust quality control (QC) measures through automated software tools becomes paramount to ensure the integrity, high quality, and validity of scientific conclusions from downstream analyses and minimize the waste of resources. Since existing QC tools are mostly dedicated to proteomics, automated solutions supporting metabolomics are needed. To address this need, we developed the software PeakQC, a tool for automated QC of MS data that is independent of omics molecular types (i.e., omics-agnostic). It allows automated extraction and inspection of peak metrics of precursor ions (e.g., errors in mass, retention time, arrival time) and supports various instrumentations and acquisition types, from infusion experiments or using liquid chromatography and/or ion mobility spectrometry front-end separations and with/without fragmentation spectra from data-dependent or independent acquisition analyses. Diagnostic plots for fragmentation spectra are also generated. Here, we describe and illustrate PeakQC's functionalities using different representative data sets, demonstrating its utility as a valuable tool for enhancing the quality and reliability of omics mass spectrometry analyses.
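As a concrete example of one such peak metric, the snippet below computes precursor mass errors in parts per million over a hypothetical batch of identifications, the kind of per-ion statistic a QC tool aggregates; the values and names are illustrative, not PeakQC's API.

```python
import numpy as np

def ppm_error(observed_mz, theoretical_mz):
    """Mass accuracy of a precursor ion in parts per million."""
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

# QC-style summary over a batch of identified precursors
# (hypothetical values; a real tool would read them from raw files):
obs = np.array([524.2648, 785.8421, 421.7580])
theo = np.array([524.2642, 785.8413, 421.7577])
errs = ppm_error(obs, theo)
print(f"median mass error: {np.median(errs):.2f} ppm, "
      f"5-95% spread: {np.percentile(errs, 95) - np.percentile(errs, 5):.2f} ppm")
```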

7.
J Proteome Res ; 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38833568

ABSTRACT

Direct-to-Mass Spectrometry and ambient ionization techniques enable fast biochemical fingerprinting. Data processing is typically accomplished with vendor-provided software tools. Here, a novel, open-source functionality, entitled Tidy-Direct-to-MS, was developed for processing direct-to-MS data sets. It allows fast and user-friendly processing using different modules for optional sample position detection and separation, mass-to-charge ratio drift detection and correction, consensus spectra calculation, and bracketing across sample positions, as well as feature abundance calculation. The tool also provides functionality for the automated comparison of different sets of parameters, thereby assisting the user in the complex task of finding an optimal combination that maximizes the total number of detected features while also checking for the detection of user-provided reference features. In addition, Tidy-Direct-to-MS supports data quality review and subsequent data analysis, thereby simplifying the workflow of untargeted ambient MS-based metabolomics studies. Tidy-Direct-to-MS is implemented in the Python programming language as part of the TidyMS library and can thus be easily extended. Its capabilities are showcased on a data set acquired in a marine metabolomics study reported in MetaboLights (MTBLS1198) using a transmission mode Direct Analysis in Real Time-Mass Spectrometry (TM-DART-MS)-based method.
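To illustrate the drift-correction idea, here is a conceptual sketch that re-anchors each scan to a known reference feature; it is not the TidyMS implementation, whose API and drift model differ.

```python
import numpy as np

def correct_mz_drift(scans, reference_mz):
    """Shift each scan so that the peak nearest to a known reference
    feature lands exactly on the reference m/z; a conceptual sketch of
    per-scan drift correction, not the TidyMS implementation."""
    corrected = []
    for mz in scans:
        nearest = mz[np.argmin(np.abs(mz - reference_mz))]
        corrected.append(mz - (nearest - reference_mz))  # cancel the drift
    return corrected

# Two scans whose internal-standard peak (true m/z 554.2615) has drifted:
scans = [np.array([150.0500, 554.2630, 803.1100]),
         np.array([150.0400, 554.2598, 803.1000])]
print(correct_mz_drift(scans, reference_mz=554.2615))
```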

8.
Entropy (Basel) ; 26(6)2024 May 23.
Article in English | MEDLINE | ID: mdl-38920449

ABSTRACT

The causal structure of a system imposes constraints on the joint probability distribution of variables that can be generated by the system. Archetypal constraints consist of conditional independencies between variables. However, particularly in the presence of hidden variables, many causal structures are compatible with the same set of independencies inferred from the marginal distributions of observed variables. Additional constraints allow further testing for the compatibility of data with specific causal structures. An existing family of causally informative inequalities compares the information about a set of target variables contained in a collection of variables, with a sum of the information contained in different groups defined as subsets of that collection. While procedures to identify the form of these groups-decomposition inequalities have been previously derived, we substantially enlarge the applicability of the framework. We derive groups-decomposition inequalities subject to weaker independence conditions, with weaker requirements in the configuration of the groups, and additionally allowing for conditioning sets. Furthermore, we show how constraints with higher inferential power may be derived with collections that include hidden variables, and then converted into testable constraints using data processing inequalities. For this purpose, we apply the standard data processing inequality of conditional mutual information and derive an analogous property for a measure of conditional unique information recently introduced to separate redundant, synergistic, and unique contributions to the information that a set of variables has about a target.
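For reference, the standard data processing inequality for conditional mutual information invoked above can be stated as follows (a textbook fact, independent of this paper's new derivations):

```latex
% If, given W, the variable Z depends on X only through Y
% (i.e. X -> Y -> Z forms a Markov chain conditionally on W),
% then further processing cannot increase information:
\[
  X \perp\!\!\!\perp Z \mid (Y, W)
  \quad\Longrightarrow\quad
  I(X; Z \mid W) \;\le\; I(X; Y \mid W).
\]
```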

9.
Hum Brain Mapp ; 45(8): e26751, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38864293

ABSTRACT

Effective connectivity (EC) refers to directional or causal influences between interacting neuronal populations or brain regions and can be estimated from functional magnetic resonance imaging (fMRI) data via dynamic causal modeling (DCM). In contrast to functional connectivity, the impact of data processing varieties on DCM estimates of task-evoked EC has hardly ever been addressed. We therefore investigated how task-evoked EC is affected by choices made for data processing. In particular, we considered the impact of global signal regression (GSR), block/event-related design of the general linear model (GLM) used for the first-level task-evoked fMRI analysis, type of activation contrast, and significance thresholding approach. Using DCM, we estimated individual and group-averaged task-evoked EC within a brain network related to spatial conflict processing for all the parameters considered and compared the differences in task-evoked EC between any two data processing conditions via between-group parametric empirical Bayes (PEB) analysis and Bayesian data comparison (BDC). We observed strongly varying patterns of the group-averaged EC depending on the data processing choices. In particular, task-evoked EC and parameter certainty were strongly impacted by GLM design and type of activation contrast as revealed by PEB and BDC, respectively, whereas they were little affected by GSR and the type of significance thresholding. The event-related GLM design appears to be more sensitive to task-evoked modulations of EC, but provides model parameters with lower certainty than the block-based design, while the latter is more sensitive to the type of activation contrast than is the event-related design. Our results demonstrate that applying different reasonable data processing choices can substantially alter task-evoked EC as estimated by DCM. Such choices should be made with care and, whenever possible, varied across parallel analyses to evaluate their impact and identify potential convergence for robust outcomes.


Subjects
Bayes Theorem, Brain Mapping, Brain, Magnetic Resonance Imaging, Humans, Brain/physiology, Brain/diagnostic imaging, Male, Female, Brain Mapping/methods, Adult, Young Adult, Neurological Models, Image Processing, Computer-Assisted/methods, Neural Pathways/physiology, Neural Pathways/diagnostic imaging
10.
bioRxiv ; 2024 May 30.
Article in English | MEDLINE | ID: mdl-38854017

ABSTRACT

Light-sheet fluorescence microscopy (LSFM), a prominent fluorescence microscopy technique, offers enhanced temporal resolution for imaging biological samples in four dimensions (4D; x, y, z, time). Some of the most recent implementations, including inverted selective plane illumination microscopy (iSPIM) and lattice light-sheet microscopy (LLSM), tilt the sample plane by 30-45 degrees with respect to the light sheet to ease sample preparation. Data from such tilted-sample-plane LSFMs require subsequent deskewing and rotation for proper visualization and analysis. These transformations currently demand substantial memory allocation, which poses computational challenges, especially with large datasets. The consequence is long processing times compared to data acquisition times, which currently limits the ability to view the data live as they are captured by the microscope. To enable fast preprocessing of large light-sheet microscopy datasets without significant hardware demands, we have developed WH-Transform, a novel GPU-accelerated, memory-efficient algorithm that integrates deskewing and rotation into a single transformation, significantly reducing memory requirements and cutting preprocessing run time by at least 10-fold for large image stacks. Benchmarked against conventional methods and existing software, our approach demonstrates linear scalability. Processing large 3D stacks of up to 15 GB is now possible within one minute using a single GPU with 24 GB of memory. Applied to 4D LLSM datasets of human hepatocytes, human lung organoid tissue, and human brain organoid tissue, our method outperforms alternatives, providing rapid, accurate preprocessing within seconds. Importantly, such processing speeds now allow visualization of the raw microscope data stream in real time, significantly improving the usability of LLSM in biology. In summary, this advancement holds transformative potential for light-sheet microscopy, enabling real-time, on-the-fly data processing, visualization, and analysis on standard workstations, thereby revolutionizing biological imaging applications for LLSM, SPIM, and similar light microscopes.
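The core idea, composing the shear (deskew) and rotation matrices so the volume is interpolated only once, can be sketched as follows. The angle, axis convention, and scaling are illustrative assumptions; this is not the WH-Transform code.

```python
import numpy as np
from scipy.ndimage import affine_transform

def deskew_rotate_single_pass(stack, angle_deg=31.8, pixel_ratio=2.0):
    """Compose deskew (shear of y by z) and rotation into ONE affine
    matrix so the volume is resampled a single time; a minimal sketch
    of the single-transformation idea, not the WH-Transform algorithm."""
    a = np.deg2rad(angle_deg)
    shear = np.array([[1.0, 0.0, 0.0],
                      [pixel_ratio * np.cos(a), 1.0, 0.0],  # z-dependent y shift
                      [0.0, 0.0, 1.0]])
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0, np.cos(a), -np.sin(a)],
                    [0.0, np.sin(a),  np.cos(a)]])
    combined = rot @ shear                   # one matrix, one interpolation pass
    # affine_transform maps output coords to input coords, hence the inverse:
    return affine_transform(stack, np.linalg.inv(combined), order=1)

vol = np.random.rand(64, 128, 128).astype(np.float32)   # toy (z, y, x) stack
print(deskew_rotate_single_pass(vol).shape)
```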

11.
Foods ; 13(12)2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38928804

ABSTRACT

Cassava is a staple crop in developing countries because its starchy roots provide essential dietary carbohydrates. The aim of this research was to conduct a comprehensive inquiry and scientific evaluation of the nutritional value of cassava tubers. Eight nutritional characteristics were examined in native and imported cassava variants: starch, reduced sugar, anthocyanins, protein, dietary fiber, quinic acid, vitamin C, and dry matter content. Principal component analysis (PCA) was conducted to reduce the dimensionality of the nutritional markers, and a scientific assessment technique was developed to calculate a composite score for the various cassava samples. Analysis of the data revealed noticeable variance among the samples' nutritional indicators, with varying degrees of association. Starch correlated positively with reduced sugar, protein, and dry matter content (p < 0.01). Anthocyanins and quinic acid were positively correlated (p < 0.05), as were protein and dry matter content (p < 0.05); however, protein and dietary fiber were negatively correlated (p < 0.05). The contribution rate of the top three PCA factors was over 76%, demonstrating that these factors captured the primary information in the eight original nutritional indices while maintaining excellent representativeness and impartiality. The experimental results yielded a preliminary nutritional ranking of the 22 cassava tuber samples; the top five types were Guangxi Muci, Gui Cassava 4, Glutinous Rice Cassava, Huifeng 60, and Dongguan Hongwei. In the cluster analysis, the levels of similarity between the data showed that the 22 types of cassava tubers could be grouped into five categories, each with its own nutrient profile. This study promotes the directed breeding of cassava species and offers a theoretical foundation for creating and using various cassava varieties. Furthermore, this work lays the groundwork for a systematic and dependable technique for the quality assessment, comprehensive evaluation, and reasonable classification of cassava species and similar crops.
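A common way to build such a composite score is to weight the top principal-component scores by their explained-variance shares, as sketched below; the weighting is an assumption about the paper's exact formula, and the data are a random stand-in for the 22 x 8 indicator matrix.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Toy stand-in for the 22 x 8 matrix of nutritional indicators
# (starch, reduced sugar, anthocyanins, protein, fiber, quinic acid,
# vitamin C, dry matter); real values come from the assays.
rng = np.random.default_rng(1)
X = rng.normal(size=(22, 8))

Z = StandardScaler().fit_transform(X)       # indicators on a common scale
pca = PCA(n_components=3).fit(Z)            # top 3 PCs (>76% in the study)
scores = pca.transform(Z)

# Composite score: PC scores weighted by each PC's explained-variance share.
w = pca.explained_variance_ratio_ / pca.explained_variance_ratio_.sum()
composite = scores @ w
print(np.argsort(composite)[::-1][:5])      # indices of the top-5 samples
```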

12.
Biosens Bioelectron ; 261: 116499, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-38896981

ABSTRACT

With the advent of flexible electronics and sensing technology, hydrogel-based flexible sensors have exhibited considerable potential across a diverse range of applications, including wearable electronics and soft robotics. Recently, advanced machine learning (ML) algorithms have been integrated into flexible hydrogel sensing technology to enhance their data processing capabilities and to achieve intelligent perception. However, there are no reviews specifically focusing on the data processing steps and analysis based on the raw sensing data obtained by flexible hydrogel sensors. Here we provide a comprehensive review of the latest advancements and breakthroughs in intelligent perception achieved through the fusion of ML algorithms with flexible hydrogel sensors, across various applications. Moreover, this review thoroughly examines the data processing techniques employed in flexible hydrogel sensors, offering valuable perspectives expected to drive future data-driven applications in this field.


Subjects
Biosensing Techniques, Hydrogels, Machine Learning, Wearable Electronic Devices, Hydrogels/chemistry, Biosensing Techniques/instrumentation, Biosensing Techniques/methods, Humans, Algorithms, Robotics/instrumentation, Equipment Design
13.
Methods Mol Biol ; 2817: 177-220, 2024.
Article in English | MEDLINE | ID: mdl-38907155

ABSTRACT

Mass-spectrometry (MS)-based single-cell proteomics (SCP) explores cellular heterogeneity by focusing on the functional effectors of the cells-proteins. However, extracting meaningful biological information from MS data is far from trivial, especially with single cells. Currently, data analysis workflows are substantially different from one research team to another. Moreover, it is difficult to evaluate pipelines as ground truths are missing. Our team has developed the R/Bioconductor package called scp to provide a standardized framework for SCP data analysis. It relies on the widely used QFeatures and SingleCellExperiment data structures. In addition, we used a design containing cell lines mixed in known proportions to generate controlled variability for data analysis benchmarking. In this chapter, we provide a flexible data analysis protocol for SCP data using the scp package together with comprehensive explanations at each step of the processing. Our main steps are quality control on the feature and cell level, aggregation of the raw data into peptides and proteins, normalization, and batch correction. We validate our workflow using our ground truth data set. We illustrate how to use this modular, standardized framework and highlight some crucial steps.
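To illustrate one step of such a workflow conceptually, the Python/pandas sketch below aggregates peptide-level intensities to proteins by median after log transformation. The actual protocol uses the R/Bioconductor packages scp and QFeatures, whose API differs; the data frame here is hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical peptide-level intensity table (rows: peptides, columns: cells)
peptides = pd.DataFrame({
    "protein": ["P1", "P1", "P2", "P2", "P2"],
    "cell_A":  [1200.0, 980.0, 310.0, 295.0, 330.0],
    "cell_B":  [1510.0, 1105.0, 280.0, 260.0, 301.0],
})

log_int = peptides.set_index("protein").apply(np.log2)  # stabilize variance
proteins = log_int.groupby("protein").median()          # robust aggregation
print(proteins)
```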


Subjects
Mass Spectrometry, Proteomics, Single-Cell Analysis, Software, Workflow, Proteomics/methods, Proteomics/standards, Single-Cell Analysis/methods, Mass Spectrometry/methods, Humans, Computational Biology/methods, Proteome/analysis, Data Analysis
14.
IUCrJ ; 11(Pt 4): 464-475, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38864497

ABSTRACT

Hardware for data archiving has expanded digital storage capacity enormously over the past decade or more. The IUCr evaluated the costs and benefits of this within an official working group, which advised that raw data archiving would allow ground-truth reproducibility in published studies. Consultations with the IUCr's Commissions ensued via a newly constituted standing advisory committee, the Committee on Data. At all stages, the IUCr financed workshops to facilitate community discussions and possible methods of implementing raw data archiving. The recent launch of the IUCrData journal's Raw Data Letters is a milestone in the implementation of raw data archiving beyond currently published studies: it includes diffraction patterns that have not been fully interpreted, if at all. The IUCr 75th Congress in Melbourne included a workshop on raw data reuse, discussing its successes and ongoing challenges. This article charts the efforts of the IUCr to facilitate discussions and plans relating to raw data archiving and reuse within the various communities of crystallography, diffraction and scattering.

15.
J Synchrotron Radiat ; 31(Pt 4): 670-680, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38838166

ABSTRACT

Deflectometric profilometers are used to precisely measure the form of beam shaping optics of synchrotrons and X-ray free-electron lasers. They often utilize autocollimators which measure slope by evaluating the displacement of a reticle image on a detector. Based on our privileged access to the raw image data of an autocollimator, novel strategies to reduce the systematic measurement errors by using a set of overlapping images of the reticle obtained at different positions on the detector are discussed. It is demonstrated that imaging properties such as geometrical distortions and vignetting can be extracted from this redundant set of images without recourse to external calibration facilities. This approach is based on the fact that the properties of the reticle itself do not change - all changes in the reticle image are due to the imaging process. Firstly, by combining interpolation and correlation, it is possible to determine the shift of a reticle image relative to a reference image with minimal error propagation. Secondly, the intensity of the reticle image is analysed as a function of its position on the CCD and a vignetting correction is calculated. Thirdly, the size of the reticle image is analysed as a function of its position and an imaging distortion correction is derived. It is demonstrated that, for different measurement ranges and aperture diameters of the autocollimator, reductions in the systematic errors of up to a factor of four to five can be achieved without recourse to external measurements.
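The interpolation-plus-correlation step for sub-pixel shift estimation can be sketched with standard tools as below; this is a generic illustration of the idea (using upsampled phase cross-correlation on synthetic images), not the instrument's proprietary pipeline.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

# Synthetic "reticle" image and a copy displaced by a known sub-pixel shift:
rng = np.random.default_rng(0)
reference = rng.random((256, 256))
moved = nd_shift(reference, (0.37, -1.24), order=3)

# Upsampled cross-correlation resolves the displacement to ~1/100 pixel:
est, _, _ = phase_cross_correlation(reference, moved, upsample_factor=100)
print(est)  # recovers the applied sub-pixel shift (up to sign convention)
```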

16.
Int J Biol Macromol ; 273(Pt 2): 133160, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38889836

ABSTRACT

Lignin is a promising renewable source of valuable organic compounds and environmentally benign materials. However, its involvement in economic circulation and the creation of new biorefining technologies require an understanding of its chemical composition and structure. This problem can be overcome by applying mass spectrometry analytical techniques in combination with advanced chemometric methods for mass spectra processing. The present study is aimed at developing mass defect filtering to characterize the chemical composition of lignin at the molecular level. It introduces a novel approach involving resolution-enhanced Kendrick mass defect (REKMD) analysis for processing atmospheric pressure photoionization Orbitrap mass spectra of lignin. A set of priority Kendrick fractional base units was predefined in model experiments, substantially expanding the available mass defect range for informative visualization of lignin mass spectra. The developed REKMD analysis strategy allowed us to obtain the most complete data on all homologous series typical of lignin and thus facilitated the interpretation and assignment of elemental compositions and structural formulas to oligomers detected in extremely complex mass spectra, including tandem ones. For the first time, minor modifications (sulfation) of lignin obtained in ionic liquid-based biorefining processes were revealed.
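For context, the Kendrick mass defect with a fractional base unit can be computed as sketched below. The convention KMD = round(KM) - KM and the CH2 base unit with divisor 7 are common choices in the REKMD literature, and the m/z values are hypothetical; conventions and base units differ between tools and are assumptions here.

```python
import numpy as np

def rekmd(mz, base_exact=14.01565, divisor=1):
    """Kendrick mass defect with a fractional base unit (base/divisor),
    the idea behind resolution-enhanced KMD plots. divisor=1 gives the
    classic KMD; base_exact=14.01565 is the exact mass of CH2."""
    unit = base_exact / divisor                  # fractional base unit
    km = mz * np.round(unit) / unit              # Kendrick mass rescaling
    return np.round(km) - km                     # defect (sign conventions vary)

# Hypothetical CH2-spaced homologous series of oligomer ions:
mz = np.array([301.0712, 315.0868, 329.1025])
print(rekmd(mz, divisor=7))                      # series members align (equal KMD)
```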


Subjects
Lignin, Mass Spectrometry, Lignin/chemistry, Mass Spectrometry/methods
17.
Proteomics ; : e2400078, 2024 Jun 02.
Article in English | MEDLINE | ID: mdl-38824665

ABSTRACT

The human gut microbiome plays a vital role in preserving individual health and is intricately involved in essential functions. Imbalances or dysbiosis within the microbiome can significantly impact human health and are associated with many diseases. Several metaproteomics platforms are currently available to study microbial proteins within complex microbial communities. In this study, we attempted to develop an integrated pipeline to provide deeper insights into both the taxonomic and functional aspects of the cultivated human gut microbiomes derived from clinical colon biopsies. We combined a rapid peptide search by MSFragger against the Unified Human Gastrointestinal Protein database and the taxonomic and functional analyses with Unipept Desktop and MetaLab-MAG. Across seven samples, we identified and matched nearly 36,000 unique peptides to approximately 300 species and 11 phyla. Unipept Desktop provided gene ontology, InterPro entries, and enzyme commission number annotations, facilitating the identification of relevant metabolic pathways. MetaLab-MAG contributed functional annotations through Clusters of Orthologous Genes and Non-supervised Orthologous Groups categories. These results unveiled functional similarities and differences among the samples. This integrated pipeline holds the potential to provide deeper insights into the taxonomy and functions of the human gut microbiome for interrogating the intricate connections between microbiome balance and diseases.

18.
Expert Opin Drug Discov ; 19(7): 815-825, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38785418

ABSTRACT

INTRODUCTION: High-throughput mass spectrometry, which can deliver more than 10-fold faster sample readout than traditional LC-based platforms, has emerged as a powerful analytical technique enabling the rapid analysis of complex biological samples. This increased speed of MS data acquisition has brought a critical demand for automatic data processing capabilities that match or surpass the speed of data acquisition. Those data processing capabilities should serve the different requirements of drug discovery workflows. AREAS COVERED: This paper introduces the key steps of automatic data processing workflows for high-throughput MS technologies. Specific examples and requirements are detailed for different drug discovery applications. EXPERT OPINION: The demand for automatic data processing in high-throughput mass spectrometry is driven by the need to keep pace with the accelerated speed of data acquisition. The seamless integration of processing capabilities with LIMS, efficient data review mechanisms, and the exploration of future features such as real-time feedback, automatic method optimization, and AI model training are crucial for advancing the drug discovery field. As technology continues to evolve, the synergy between high-throughput mass spectrometry and intelligent data processing will undoubtedly play a pivotal role in shaping the future of high-throughput drug discovery applications.


Subjects
Drug Discovery, High-Throughput Screening Assays, Mass Spectrometry, Workflow, Drug Discovery/methods, Mass Spectrometry/methods, Humans, High-Throughput Screening Assays/methods, Artificial Intelligence, Time Factors, Animals
19.
Respir Care ; 2024 May 14.
Article in English | MEDLINE | ID: mdl-38744475

ABSTRACT

BACKGROUND: Patients with obesity are at increased risk of postoperative pulmonary complications. CPAP has been used successfully to prevent and treat acute respiratory failure, but in many clinical scenarios, high-flow nasal cannula (HFNC) therapy is emerging as a possible alternative. We aimed to compare HFNC and CPAP in a sequential study measuring their effects on gas exchange, lung volumes, and gas distribution within the lungs measured through electrical impedance tomography (EIT). METHODS: We enrolled 15 subjects undergoing laparoscopic bariatric surgery. Postoperatively, they underwent the following oxygen therapy protocol (10 min/step): baseline air-entrainment mask, HFNC at increasing (40, 60, 80, and 100 L/min) and decreasing flows (80, 60, and 40 L/min), washout air-entrainment mask, and CPAP (10 cm H2O). The primary outcome was the change in end-expiratory lung impedance (ΔEELI) measured by EIT data processing. Secondary outcomes were changes in the global inhomogeneity (GI) index and tidal impedance variation (TIV) measured by EIT, arterial oxygenation, carbon dioxide content, pH, respiratory frequency, and subject comfort. RESULTS: Thirteen subjects completed the study. Compared to baseline, ΔEELI was higher during 10 cm H2O CPAP (P = .001) and HFNC 100 L/min (P = .02), as well as during decreasing-flow HFNC 80, 60, and 40 L/min (P = .008, .004, and .02, respectively). The GI index was lower during HFNC 100 compared to HFNC 60 (increasing) (P = .044), HFNC 60 (decreasing) (P = .02), and HFNC 40 (decreasing) (P = .01), and during 10 cm H2O CPAP compared to the washout period (P = .01) and HFNC 40 (decreasing) (P = .03). TIV was higher during 10 cm H2O CPAP compared to baseline (P = .008). Compared to baseline, breathing frequency was lower at HFNC 60 (increasing), HFNC 100, and HFNC 80 (decreasing) (P = .01, .02, and .03, respectively). No differences were detected regarding arterial oxygenation, carbon dioxide content, pH, or subject comfort. CONCLUSIONS: HFNC at a flow of 100 L/min induced postoperative pulmonary recruitment in bariatric subjects, with no significant differences compared to 10 cm H2O CPAP in terms of lung recruitment and ventilation distribution.

20.
J Environ Manage ; 359: 120954, 2024 May.
Article in English | MEDLINE | ID: mdl-38692026

ABSTRACT

The widespread application of plastic products and their non-biodegradable nature have resulted in the continuous accumulation of microplastic waste, which has emerged as a significant ecological and environmental issue. In the field of microplastic detection, the intricate morphology of the particles makes rapid visual characterization challenging. In this study, photoacoustic imaging technology is first employed to capture high-resolution images of diverse microplastic samples. To address the limited dataset size, an automated data processing pipeline is designed to obtain sample masks while effectively expanding the dataset. Additionally, we propose Vqdp2, a generative deep learning model with multiple proxy tasks, for predicting six forms of microplastics. By simultaneously constraining model parameters through two training modes, outstanding morphological category representations are achieved. The results demonstrate Vqdp2's excellent classification accuracy and feature extraction, leveraging the advantages of multi-task training. This research is expected to be attractive for the detection, classification, and visual characterization of microplastics.


Subjects
Deep Learning, Microplastics, Photoacoustic Techniques, Microplastics/analysis, Photoacoustic Techniques/methods, Environmental Monitoring/methods, Plastics