Results 1 - 20 of 82
1.
Magn Reson Med ; 2024 Jul 10.
Article in English | MEDLINE | ID: mdl-38988088

ABSTRACT

PURPOSE: Retrospective frequency-and-phase correction (FPC) methods attempt to remove frequency-and-phase variations between transients to improve the quality of the averaged MR spectrum. However, traditional FPC methods like spectral registration struggle at low SNR. Here, we propose a method that directly integrates FPC into a 2D linear-combination model (2D-LCM) of individual transients ("model-based FPC"). We investigated how model-based FPC performs compared to the traditional approach (spectral registration followed by 1D-LCM) in estimating frequency-and-phase drifts and, consequently, metabolite levels. METHODS: We created synthetic in-vivo-like 64-transient short-TE sLASER datasets with 100 noise realizations at 5 SNR levels and added randomly sampled frequency and phase variations. We then used this synthetic dataset to compare the performance of 2D-LCM with the traditional approach (spectral registration, averaging, then 1D-LCM). Outcome measures were the frequency/phase/amplitude errors, the SD of those ground-truth errors, and amplitude Cramér-Rao lower bounds (CRLBs). We further tested the proposed method on publicly available in-vivo short-TE PRESS data. RESULTS: 2D-LCM estimates (and accounts for) frequency-and-phase variations directly from uncorrected data with equivalent or better fidelity than the conventional approach. Furthermore, 2D-LCM metabolite amplitude estimates were at least as accurate, precise, and certain as the conventionally derived estimates. 2D-LCM estimation of FPC and amplitudes performed substantially better at low-to-very-low SNR. CONCLUSION: Model-based FPC with 2D linear-combination modeling is feasible and has great potential to improve metabolite level estimation for conventional and dynamic MRS data, especially in low-SNR conditions, for example, long TEs or strong diffusion weighting.
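The drift model at the heart of FPC can be sketched compactly: each time-domain transient is the ideal FID multiplied by a drift factor exp(i(2π·Δf·t + φ)), and correction applies the inverse factor. The sketch below uses a synthetic FID and known drift values (all numbers hypothetical); it illustrates the correction model only, not the 2D-LCM estimation itself.

```python
import numpy as np

dt = 1.0 / 2000.0                          # dwell time (s), hypothetical
t = np.arange(1024) * dt
fid = np.exp(-t / 0.08) * np.exp(2j * np.pi * 150.0 * t)  # synthetic FID

df, phi = 3.5, 0.4                         # frequency (Hz) and phase (rad) drift
drift = np.exp(1j * (2 * np.pi * df * t + phi))
corrupted = fid * drift                    # transient affected by the drift

# FPC: multiply by the inverse drift factor (drift assumed known exactly here).
corrected = corrupted * np.conj(drift)
```

In practice the drift parameters must be estimated, either before averaging (spectral registration) or inside the model itself (2D-LCM); here they are simply given.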

2.
NMR Biomed ; 37(4): e5076, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38091628

ABSTRACT

Literature values vary widely for within-subject test-retest reproducibility of gamma-aminobutyric acid (GABA) measured with edited magnetic resonance spectroscopy (MRS). Reasons for this variation remain unclear. Here, we tested whether three acquisition parameters ((1) sequence complexity: two-experiment MEscher-GArwood Point RESolved Spectroscopy [MEGA-PRESS] vs. four-experiment Hadamard Encoding and Reconstruction of MEGA-Edited Spectroscopy [HERMES]; (2) editing pulse duration: 14 vs. 20 ms; and (3) scanner frequency drift: interleaved water referencing [IWR] turned ON vs. OFF) and two linear combination modeling variations ((1) three different coedited macromolecule models, called "1to1GABA", "1to1GABAsoft", and "3to2MM" in the Osprey software package; and (2) 0.55- versus 0.4-ppm spline baseline knot spacing) affected the within-subject coefficient of variation of GABA + macromolecules (GABA+). We collected edited MRS data from the dorsal anterior cingulate cortex of 20 participants (mean age: 30.8 ± 9.5 years; 10 males). Test and retest scans were separated by removing the participant from the scanner for 5-10 min. Each acquisition consisted of two MEGA-PRESS and two HERMES sequences with editing pulse durations of 14 and 20 ms (referred to here as MEGA-14, MEGA-20, HERMES-14, and HERMES-20; all TE = 80 ms, 224 averages). We identified the best test-retest reproducibility following postprocessing with a composite model of the 0.9- and 3-ppm macromolecules ("3to2MM"); this model performed particularly well for the HERMES data. Furthermore, sparser (0.55- compared with 0.4-ppm) spline baseline knot spacing yielded generally better test-retest reproducibility for GABA+. Replicating our prior results, linear combination modeling in Osprey compared with simple peak fitting in Gannet resulted in substantially better test-retest reproducibility. However, reproducibility did not consistently differ for MEGA-PRESS compared with HERMES, for 14- compared with 20-ms editing pulses, or for IWR-ON versus IWR-OFF. These results highlight the importance of model selection for edited MRS studies of GABA+, particularly for clinical studies that focus on individual patient differences in GABA+ or changes following an intervention.
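As a small illustration of the outcome metric, the within-subject coefficient of variation can be computed from paired test-retest values as the per-subject SD of the two scans divided by their mean. The GABA+ values below are hypothetical placeholders, not data from the study.

```python
import numpy as np

# Hypothetical GABA+ estimates (institutional units) for four subjects.
scan1 = np.array([2.1, 1.9, 2.4, 2.0])   # test
scan2 = np.array([2.0, 2.1, 2.3, 1.9])   # retest

pairs = np.stack([scan1, scan2])
pair_sd = np.std(pairs, axis=0, ddof=1)    # per-subject SD of the two scans
pair_mean = np.mean(pairs, axis=0)         # per-subject mean of the two scans
within_subject_cv = np.mean(pair_sd / pair_mean) * 100.0   # percent
```

A lower within-subject CV indicates better test-retest reproducibility of the acquisition-plus-modeling pipeline.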


Subject(s)
Brain , gamma-Aminobutyric Acid , Male , Humans , Young Adult , Adult , Reproducibility of Results , Magnetic Resonance Spectroscopy/methods , Phantoms, Imaging , Macromolecular Substances/metabolism , Brain/metabolism
3.
Nano Lett ; 23(20): 9399-9405, 2023 Oct 25.
Article in English | MEDLINE | ID: mdl-37877237

ABSTRACT

An accurate rule for predicting conductance is the cornerstone of developing molecular circuits and provides a promising route toward miniaturizing electric circuits. The successful prediction of series molecular circuits has proven the possibility of establishing a rule for molecular circuits under quantum mechanics. However, quantitatively accurate prediction has not been validated by experiments for parallel molecular circuits. Here we used 1,3-dihydrobenzothiophene (DBT) to build parallel molecular circuits. Theoretical simulation and single-molecule conductance measurements demonstrated that the conductance of the molecule containing one DBT is, unprecedentedly, a linear combination of the conductances of the two individual channels, with respective contribution weights of 0.37 and 0.63. With these weights, the conductance of the molecule containing two DBTs is predicted to be 1.81 nS, in close agreement with the measured conductance (1.82 nS). This feature offers a potential rule for quantitatively predicting the conductance of parallel molecular circuits.
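The combination rule described above reduces to a one-line weighted sum. In the sketch below, only the weights (0.37 and 0.63) come from the abstract; the individual channel conductances are hypothetical placeholders, not measured values.

```python
# Weights of the two transport channels, as reported in the abstract.
w_a, w_b = 0.37, 0.63

def combined_conductance(g_a: float, g_b: float) -> float:
    """Conductance of a molecule whose two channels contribute with
    fixed weights w_a and w_b (linear-combination rule)."""
    return w_a * g_a + w_b * g_b

# Hypothetical single-channel conductances in nS (illustrative only).
g_a, g_b = 2.0, 1.5
g_total = combined_conductance(g_a, g_b)
```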

4.
J Synchrotron Radiat ; 30(Pt 6): 1183, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37850564

ABSTRACT

The name of an author in the article by Saurette et al. (2022) [J. Synchrotron Rad. 29, 1198-1208] is corrected.

5.
Magn Reson Med ; 90(3): 823-838, 2023 09.
Article in English | MEDLINE | ID: mdl-37183778

ABSTRACT

PURPOSE: The Vespa package (Versatile Simulation, Pulses, and Analysis) is described and demonstrated. It provides workflows for developing and optimizing linear combination modeling (LCM) fitting for 1H MRS data using intuitive graphical user interfaces for RF pulse design, spectral simulation, and MRS data analysis. Command-line interfaces for embedding workflows in MR manufacturer platforms and utilities for synthetic dataset creation are included. Complete provenance is maintained for all steps in workflows. THEORY AND METHODS: Vespa is written in Python for compatibility across operating systems. It embeds the PyGAMMA library for spectral simulation. Multiprocessing methods accelerate processing and visualization. Applications use the Vespa database for results storage and cross-application access. Three projects demonstrate pulse, sequence, simulation, and data analysis workflows: (1) short-TE semi-LASER single-voxel spectroscopy (SVS) LCM fitting, (2) optimizing the MEGA-PRESS (MEscher-GArwood Point RESolved Spectroscopy) flip angle and LCM fitting, and (3) creating a synthetic short-TE dataset. RESULTS: The LCM workflows for in vivo basis set creation and spectral analysis showed reasonable results for both the short-TE semi-LASER and MEGA-PRESS data. Examples of pulses, simulations, and data fitting are shown in the Vespa application interfaces at various steps to demonstrate the interactive workflow. CONCLUSION: Vespa provides an efficient and extensible platform for characterizing RF pulses, pulse design, spectral simulation optimization, and automated LCM fitting via an interactive platform. Its modular design and command-line interface make it easy to embed in other platforms. As open source, it is free to the MRS community for use and extension. Vespa source code and documentation are available through GitHub.


Subject(s)
Software , Magnetic Resonance Spectroscopy/methods , Computer Simulation , Databases, Factual , Heart Rate
6.
NMR Biomed ; 36(3): e4854, 2023 03.
Article in English | MEDLINE | ID: mdl-36271899

ABSTRACT

Expert consensus recommends linear-combination modeling (LCM) of 1H MR spectra with sequence-specific simulated metabolite basis functions and experimentally derived macromolecular (MM) basis functions. Measured MM basis functions are usually derived from metabolite-nulled spectra averaged across a small cohort. The use of subject-specific instead of cohort-averaged measured MM basis functions has not been studied widely. Furthermore, measured MM basis functions are not widely available to non-expert users, who commonly rely on parameterized MM signals internally simulated by LCM software. To investigate the impact of the choice of MM modeling, this study therefore compares metabolite level estimates between different MM modeling strategies (cohort-mean measured; subject-specific measured; parameterized) in a lifespan cohort and characterizes their impact on metabolite-age associations. 100 conventional (TE = 30 ms) and metabolite-nulled (TI = 650 ms) PRESS datasets, acquired from the medial parietal lobe in a lifespan cohort (20-70 years of age), were analyzed in Osprey. Short-TE spectra were modeled in Osprey using six different strategies to account for the MM baseline. Fully tissue- and relaxation-corrected metabolite levels were compared between MM strategies. Model performance was evaluated by model residuals, the Akaike information criterion (AIC), and the impact on metabolite-age associations. The choice of MM strategy had a significant impact on mean metabolite level estimates and no major impact on variance. Correlation analysis revealed moderate-to-strong agreement between different MM strategies (r > 0.6). The lowest relative model residuals and AIC values were found for the cohort-mean measured MM. Metabolite-age associations were consistently found for two major singlet signals (total creatine (tCr) and total choline (tCho)) for all MM strategies; however, findings for metabolites that are less distinguishable from the background signals depended on the MM strategy. A variance partition analysis indicated that up to 44% of the total variance was related to the choice of MM strategy. Additionally, the variance partition analysis reproduced the metabolite-age associations for tCr and tCho found in the simpler correlation analysis. In summary, the inclusion of a single high-signal-to-noise-ratio MM basis function (cohort-mean) in short-TE LCM leads to lower model residuals and AIC values compared with MM strategies with more degrees of freedom (Gaussian parametrization) or subject-specific MM information. Integrating multiple LCM analyses into a single statistical model potentially allows the robustness of detected underlying effects (e.g., metabolite vs. age) to be identified, reduces algorithm-based bias, and estimates algorithm-related variance.


Subject(s)
Brain , Choline , Humans , Brain/metabolism , Feasibility Studies , Magnetic Resonance Spectroscopy/methods , Signal-To-Noise Ratio , Macromolecular Substances/metabolism , Choline/metabolism , Receptors, Antigen, T-Cell/metabolism
7.
J Magn Reson Imaging ; 58(3): 782-791, 2023 09.
Article in English | MEDLINE | ID: mdl-36373998

ABSTRACT

BACKGROUND: Balanced steady-state free precession (bSSFP) is important in cardiac MRI but suffers from off-resonance artifacts. The interpretation-limiting artifacts in patients with cardiac implants remain an unsolved issue. PURPOSE: To develop an interleaved radial linear combination bSSFP (lcSSFP) method with partial dephasing (PD) for improved cardiac cine imaging when implanted cardiovascular devices are present. STUDY TYPE: Prospective. PHANTOM AND SUBJECTS: A flow phantom adjacent to a pacemaker, and 10 healthy volunteers (mean age ± standard deviation: 31.9 ± 2.9 years, 4 females) with a cardioverter-defibrillator (ICD) positioned extracorporeally at the left chest in the prepectoral region. FIELD STRENGTH/SEQUENCE: 3 T; 1) Cartesian bSSFP, 2) Cartesian gradient echo (GRE), 3) Cartesian lcSSFP, and 4) radial lcSSFP cine sequences. ASSESSMENT: Flow artifact mitigation using PD was validated with phantom experiments. Undersampled radial lcSSFP with interleaving across phase cyclings and cardiac phases (RLC-SSFP), combined with PD, was then employed to improve the quality of cine images in the left ventricular short-axis view. Image quality in the presence of cardiac devices was qualitatively assessed by three independent raters (1 = worst, 5 = best) regarding five criteria (banding artifacts, streak artifacts, flow artifacts, cavity visibility, and overall image quality). STATISTICAL TESTS: Wilcoxon rank-sum test for the five criteria between Cartesian bSSFP cine and RLC-SSFP with PD. Fleiss kappa test for inter-reader agreement. A P value < 0.05 was considered statistically significant. RESULTS: Based on simulations and phantom experiments, 60 projections per phase cycling and 1/6 PD were chosen. The in vivo experiments demonstrated significantly reduced banding artifacts (4.8 ± 0.4 vs. 2.7 ± 0.7), fewer streak artifacts (3.7 ± 0.6 vs. 2.6 ± 0.7) and flow artifacts (4.4 ± 0.4 vs. 3.7 ± 0.6), and therefore improved cavity visibility (4.1 ± 0.4 vs. 2.9 ± 0.9) and overall quality (4.0 ± 0.4 vs. 2.7 ± 0.7). DATA CONCLUSION: The RLC-SSFP method with PD may improve cine image quality in subjects with cardiac devices. EVIDENCE LEVEL: 2. TECHNICAL EFFICACY: Stage 1.


Subject(s)
Heart , Magnetic Resonance Imaging, Cine , Female , Humans , Magnetic Resonance Imaging, Cine/methods , Prospective Studies , Heart/diagnostic imaging , Magnetic Resonance Imaging/methods , Heart Ventricles , Artifacts , Reproducibility of Results
8.
J Med Syst ; 47(1): 69, 2023 Jul 07.
Article in English | MEDLINE | ID: mdl-37418036

ABSTRACT

Magnetic resonance spectroscopy (MRS) can non-invasively measure levels of endogenous metabolites in living tissue and is of great interest to neuroscience and clinical research. To this day, MRS data analysis workflows differ substantially between groups, frequently requiring many manual steps to be performed on individual datasets, e.g., data renaming/sorting, manual execution of analysis scripts, and manual assessment of success/failure. Manual analysis practices are a substantial barrier to wider uptake of MRS. They also increase the likelihood of human error and prevent deployment of MRS at large scale. Here, we demonstrate an end-to-end workflow for fully automated data uptake, processing, and quality review. The proposed continuous automated MRS analysis workflow integrates several recent innovations in MRS data and file storage conventions. They are efficiently deployed by a directory monitoring service that automatically triggers the following steps upon arrival of a new raw MRS dataset in a project folder: (1) conversion from proprietary manufacturer file formats into the universal format NIfTI-MRS; (2) consistent file system organization according to the data accumulation logic standard BIDS-MRS; (3) execution of a command-line executable of our open-source end-to-end analysis software Osprey; (4) e-mail delivery of a quality control summary report for all analysis steps. The automated architecture completed successfully for a demonstration dataset. The only manual step required was to copy a raw data folder into a monitored directory. Continuous automated analysis of MRS data can reduce the burden of manual data analysis and quality control, particularly for non-expert users and multi-center or large-scale studies, and offers considerable economic advantages.
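The directory-monitoring idea can be sketched as a simple polling loop that runs the pipeline once per newly arrived folder. The folder names and the pipeline placeholder below are hypothetical, standing in for the actual NIfTI-MRS conversion, BIDS-MRS sorting, Osprey analysis, and e-mail steps.

```python
import tempfile
from pathlib import Path

def pipeline(raw_dir: Path) -> str:
    # Placeholder for the four steps described above: (1) convert to
    # NIfTI-MRS, (2) sort into BIDS-MRS, (3) run the Osprey command-line
    # analysis, (4) e-mail the QC summary report.
    return f"processed {raw_dir.name}"

def watch_once(project_dir: Path, seen: set) -> list:
    """One polling pass: run the pipeline on any folder not seen before."""
    results = []
    for entry in sorted(project_dir.iterdir()):
        if entry.is_dir() and entry.name not in seen:
            seen.add(entry.name)
            results.append(pipeline(entry))
    return results

# Demonstration: a new raw-data folder appears, is processed on the first
# pass, and is ignored on the next pass.
with tempfile.TemporaryDirectory() as d:
    project = Path(d)
    (project / "sub-01_mrs_raw").mkdir()
    seen: set = set()
    first_pass = watch_once(project, seen)
    second_pass = watch_once(project, seen)
```

A production service would use an OS-level file-system watcher rather than polling, but the trigger-once-per-dataset logic is the same.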


Subject(s)
Software , Humans , Workflow , Magnetic Resonance Spectroscopy/methods , Probability
9.
Environ Monit Assess ; 195(11): 1290, 2023 Oct 12.
Article in English | MEDLINE | ID: mdl-37821723

ABSTRACT

Proper disposal of solid waste is crucial for the protection of natural resources and human health. However, increasing population and changes in consumption habits have led to a global increase in solid waste production. Therefore, a site selection process for solid waste management that takes into account environmental, economic, and social factors is needed. The number of open-source GIS (geographic information system) software programs used in site selection analysis is steadily increasing. QGIS is an open-source GIS package developed by free-software developers; its popularity grows with each new version, and it allows plugins to be developed in the Python programming language. The shareability of plugins developed with QGIS brings open-source GIS users around the world together for common goals. In this study, a plugin called "LANDFILL SITE SELECTION (LFSS)" was developed in the QGIS environment for solid waste landfill site selection, and it was used to create a suitability map for Tokat, Turkey. For this purpose, 14 evaluation criteria and 8 exclusion criteria were selected, the importance levels of criteria and sub-criteria were determined using the AHP method, and a solid waste landfill site selection suitability map was created using the developed plugin.
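The AHP weighting step mentioned above derives criterion weights as the normalized principal eigenvector of a pairwise-comparison matrix. The 3 × 3 Saaty-style comparison matrix below is a hypothetical example, not the study's 14-criterion matrix.

```python
import numpy as np

# Hypothetical pairwise comparisons among three criteria (Saaty 1-9 scale):
# criterion 1 is moderately more important than 2, strongly more than 3.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
lambda_max = eigvals[principal].real       # used for the consistency check

weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()                   # normalized priority weights
```

In AHP, lambda_max is also compared against the matrix size (here 3) to compute the consistency ratio before the weights are accepted.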


Subject(s)
Refuse Disposal , Solid Waste , Humans , Refuse Disposal/methods , Turkey , Environmental Monitoring/methods , Geographic Information Systems , Waste Disposal Facilities
10.
Entropy (Basel) ; 25(8)2023 Jul 28.
Article in English | MEDLINE | ID: mdl-37628165

ABSTRACT

In this paper, we put forward the model of multiple linear-combination security multicast network coding, where the wiretapper desires to obtain some information about a predefined set of multiple linear combinations of the source symbols by eavesdropping on any one (but not more than one) channel subset up to a certain size r, referred to as the security level. For this model, the security capacity is defined as the maximum average number of source symbols that can be securely multicast to all sink nodes per use of the network under the linear-combination security constraint. For any security level and any linear-combination security constraint, we fully characterize the security capacity in terms of the ratio of the rank of the linear-combination security constraint to the number of source symbols. We also develop a general construction of linear security network codes. Finally, we investigate the asymptotic behavior of the security capacity for a sequence of linear-combination security models and discuss the asymptotic optimality of our code construction.

11.
J Synchrotron Radiat ; 29(Pt 5): 1198-1208, 2022 Sep 01.
Article in English | MEDLINE | ID: mdl-36073878

ABSTRACT

High-energy-resolution fluorescence-detected (HERFD) X-ray absorption near-edge spectroscopy (XANES) is a spectroscopic method that allows for increased spectral feature resolution and greater selectivity, decreasing complex matrix effects compared with conventional XANES. XANES is an ideal tool for speciation of elements in solid-phase environmental samples. Accurate speciation of As in mine waste materials is important for understanding the mobility and toxicity of As in near-surface environments. In this study, linear combination fitting (LCF) was performed on synthetic spectra generated from mixtures of eight measured reference compounds for both HERFD-XANES and transmission-detected XANES to evaluate the improvement in quantitative speciation with HERFD-XANES spectra. The reference compounds arsenolite (As2O3), orpiment (As2S3), getchellite (AsSbS3), arsenopyrite (FeAsS), kankite (FeAsO4·3.5H2O), scorodite (FeAsO4·2H2O), sodium arsenate (Na3AsO4), and realgar (As4S4) were selected for their importance in mine waste systems. The statistical methods of principal component analysis and target transformation were employed to determine whether HERFD improves identification of the components in a dataset of mixtures of reference compounds. LCF was also performed on HERFD- and total fluorescence yield (TFY)-XANES spectra collected from mine waste samples. Arsenopyrite, arsenolite, orpiment, and sodium arsenate were more accurately identified in the synthetic HERFD-XANES spectra than in the transmission-detected XANES spectra. In mine waste samples containing arsenopyrite and either scorodite or kankite, LCF of HERFD-XANES measurements resulted in fits with smaller R-factors than concurrently collected TFY measurements. The improved accuracy of HERFD-XANES analysis may provide enhanced delineation of the As phases controlling biogeochemical reactions in mine wastes, contaminated soils, and remediation systems.
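Linear combination fitting itself reduces to a least-squares problem: the measured spectrum is modeled as a weighted sum of reference spectra, and fit quality is summarized by an R-factor. The sketch below uses synthetic Gaussian "spectra" as stand-ins, not real XANES references.

```python
import numpy as np

rng = np.random.default_rng(0)
energy = np.linspace(0.0, 1.0, 200)

# Two hypothetical reference spectra (Gaussian stand-ins, e.g. for
# arsenopyrite- and scorodite-like components).
ref_a = np.exp(-((energy - 0.3) / 0.05) ** 2)
ref_b = np.exp(-((energy - 0.6) / 0.08) ** 2)
refs = np.column_stack([ref_a, ref_b])

# Synthetic "measured" mixture: 70 % A + 30 % B plus a little noise.
true_frac = np.array([0.7, 0.3])
measured = refs @ true_frac + 0.005 * rng.standard_normal(energy.size)

# LCF as ordinary least squares for the fractional contributions.
frac, *_ = np.linalg.lstsq(refs, measured, rcond=None)

# R-factor, the usual goodness-of-fit measure in LCF.
r_factor = np.sum((measured - refs @ frac) ** 2) / np.sum(measured ** 2)
```

Real LCF tools additionally constrain the fractions to be non-negative and often to sum to one; the plain least-squares fit here is the unconstrained core of the method.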


Subject(s)
Soil Pollutants , Arsenic Trioxide , Mining , X-Ray Absorption Spectroscopy/methods
12.
NMR Biomed ; 35(1): e4618, 2022 01.
Article in English | MEDLINE | ID: mdl-34558129

ABSTRACT

J-difference-edited spectroscopy is a valuable approach for the in vivo detection of γ-aminobutyric acid (GABA) with magnetic resonance spectroscopy (MRS). A recent expert consensus article recommends linear combination modeling (LCM) of edited MRS but does not give specific details regarding implementation. This study explores different modeling strategies to adapt LCM for GABA-edited MRS. Sixty-one medial parietal lobe GABA-edited MEGA-PRESS spectra from a recent 3-T multisite study were modeled using 102 different strategies combining six different approaches to account for co-edited macromolecules (MMs), three modeling ranges, three baseline knot spacings, and the use of basis sets with or without homocarnosine. The resulting GABA and GABA+ estimates (quantified relative to total creatine), the residuals at different ranges, standard deviations and coefficients of variation (CVs), and Akaike information criteria were used to evaluate the models' performance. Significantly different GABA+ and GABA estimates were found when a well-parameterized MM3co basis function was included in the model. The mean GABA estimates were significantly lower when modeling MM3co, while the CVs were similar. A sparser spline knot spacing led to lower variation in the GABA and GABA+ estimates, and a narrower modeling range (only including the signals of interest) did not substantially improve or degrade modeling performance. Additionally, the results suggest that LCM can separate GABA and the underlying co-edited MM3co. Incorporating homocarnosine into the modeling did not significantly improve variance in GABA+ estimates. In conclusion, GABA-edited MRS is most appropriately quantified by LCM with a well-parameterized co-edited MM3co basis function with a constraint to the nonoverlapped MM0.93, in combination with a sparse spline knot spacing (0.55 ppm) and a modeling range of 0.5-4 ppm.


Subject(s)
Magnetic Resonance Spectroscopy/methods , gamma-Aminobutyric Acid/metabolism , Humans , Linear Models
13.
Sensors (Basel) ; 22(3)2022 Jan 27.
Article in English | MEDLINE | ID: mdl-35161742

ABSTRACT

The phasor approach to fluorescence lifetime imaging, and more recently hyperspectral fluorescence imaging, has increased the use of these techniques and improved the ease and intuitiveness of the data analysis. The fit-free nature of phasor plots increases the speed of the analysis, reduces the dimensionality, and optimizes data handling and storage. The reciprocity principle between real and phasor space, whereby the phasor and the pixel it originated from are linked and can be converted into one another, has helped the expansion of this method. The phasor coordinates calculated from a pixel in which multiple fluorescent species are present depend on the phasor positions of those components. The relative positions are governed by the linear combination properties of phasor space. According to this principle, the phasor position of a pixel with multiple components lies inside the polygon whose vertices are occupied by the phasor positions of the individual components, and the distance from the image phasor to any of the vertices is inversely proportional to the fractional intensity contribution of that component to the total fluorescence from that image pixel. The higher the fractional intensity contribution of a vertex, the closer the resultant phasor lies to it. The linear additivity in phasor space can be exploited to obtain the fractional intensity contributions from multiple species and quantify their contributions. This review details the various mathematical models that can be used to obtain two/three/four components from phasor space with known phasor signatures, and then how to obtain both the fractional intensities and phasor positions without any prior knowledge of either, assuming the components are mono-exponential in nature. We note that, other than for blind components, there are no restrictions on the type of decay or the phasor positions for linear combinations to be valid; they are applicable to complicated fluorescence lifetime decays from components whose intensity decays are described by multi-exponentials.
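The linear-additivity rule can be made concrete: with three known component phasors, the two phasor coordinates plus the sum-to-one constraint form a 3 × 3 linear system for the fractional intensities. The phasor signatures below are hypothetical values, not measured lifetimes.

```python
import numpy as np

# Hypothetical (g, s) phasor signatures of three mono-exponential components.
components = np.array([
    [0.9, 0.28],
    [0.5, 0.50],
    [0.2, 0.40],
])

# A pixel's phasor is the fractional-intensity-weighted sum of the components.
true_fractions = np.array([0.2, 0.5, 0.3])
pixel_phasor = true_fractions @ components

# Recover the fractions: [g; s; 1] = M @ f, with a sum-to-one bottom row.
M = np.vstack([components.T, np.ones(3)])
rhs = np.append(pixel_phasor, 1.0)
fractions = np.linalg.solve(M, rhs)
```

Geometrically this is the statement from the text: the pixel phasor lies inside the triangle of component phasors, and the solved fractions are its barycentric coordinates.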


Subject(s)
Coloring Agents , Optical Imaging , Microscopy, Fluorescence
14.
Entropy (Basel) ; 24(7)2022 Jun 24.
Article in English | MEDLINE | ID: mdl-35885090

ABSTRACT

Non-Hermitian (NH) quantum theory has been attracting increased research interest due to its distinctive properties, novel phenomena, and links to open and dissipative systems. Typical NH systems include PT-symmetric systems, pseudo-Hermitian systems, and their anti-symmetric counterparts. In this work, we generalize pseudo-Hermitian systems to their complex counterparts, which we call pseudo-Hermitian-φ-symmetric systems. This complex extension adds an extra degree of freedom to the original symmetry. On the one hand, it enlarges the class of non-Hermitian systems relevant to pseudo-Hermiticity. On the other hand, conventional pseudo-Hermitian systems can be better understood as a subgroup of this wider class. The well-defined inner product and pseudo-inner product remain valid. Since quantum simulation provides a powerful method to investigate NH systems, we mainly investigate how to simulate this novel class in a Hermitian system using the linear combination of unitaries within the scheme of duality quantum computing. We illustrate in detail how to simulate a general P-pseudo-Hermitian-φ-symmetric two-level system. Duality quantum algorithms have recently been successfully applied to similar types of simulations, so we look forward to implementations on available quantum devices.
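The linear-combination-of-unitaries idea can be illustrated at the matrix level: any 2 × 2 operator, Hermitian or not, decomposes into the identity and the three Pauli matrices, each of which is unitary. The non-Hermitian operator below is an arbitrary illustrative example, not the P-pseudo-Hermitian-φ-symmetric system of the paper.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I, X, Y, Z]

# A hypothetical non-Hermitian two-level operator (illustrative values only).
A = np.array([[1.0, 0.5 + 0.2j],
              [0.1, -0.3]], dtype=complex)

# Coefficients c_k = tr(P_k A) / 2, since the Paulis are trace-orthogonal.
coeffs = [np.trace(P @ A) / 2 for P in paulis]

# Reconstruct A as a linear combination of unitaries.
A_lcu = sum(c * P for c, P in zip(coeffs, paulis))
```

In duality quantum computing, each unitary term is applied in a controlled fashion on an ancilla register, and the complex coefficients are encoded in the ancilla preparation and post-selection.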

15.
J Food Sci Technol ; 59(7): 2694-2704, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35734130

ABSTRACT

This study was performed to select chemical compounds and rheological parameters for the prediction of the bread volume and crumb firmness of breads made with wheat flour-fine bran blends, using multivariate analysis. Two main factors were considered: the source of the fine bran and its particle size. The experiment used a completely randomized design in a 2 × 3 factorial arrangement; the statistical analysis showed that the particle size of the fine bran influenced almost all the analytical variables. In addition, discriminant analysis was performed to determine which rheological parameters and chemical components had the greatest influence on dough behaviour and bread quality. Biaxial extensional viscosity, farinograph consistency, dough development time, and stability were the main rheological parameters governing specific volume and crumb firmness, and both were closely related to the fibre, protein, and starch content of the flour-fine bran blends. The particle size-genotype interaction had a significant influence on gelatinisation enthalpy and biaxial extensional viscosity, which change the bread volume and crumb firmness. The simplicity of a linear equation in five independent variables that predicts bread quality with high accuracy could be advantageous in both basic research and the routine quality control of wheat mills. Supplementary Information: The online version contains supplementary material available at 10.1007/s13197-021-05290-3.

16.
Genet Epidemiol ; 44(1): 67-78, 2020 01.
Article in English | MEDLINE | ID: mdl-31541490

ABSTRACT

Emerging evidence suggests that a genetic variant can affect multiple phenotypes, especially in complex human diseases. Therefore, joint analysis of multiple phenotypes may offer new insights into disease etiology. Recently, many statistical methods have been developed for joint analysis of multiple phenotypes, including the clustering linear combination (CLC) method. Because the number of clusters for a given dataset is unknown, a simulation procedure must be used to evaluate the p-value of the final CLC test statistic, which makes the CLC method computationally demanding. In this paper, we use a stopping criterion to determine the number of clusters in the CLC method. We name our method hierarchical clustering CLC (HCLC). The HCLC test statistic has an asymptotic distribution, which makes the method very computationally efficient and applicable to genome-wide association studies. Extensive simulations, together with an analysis of the COPDGene data, were used to assess the type I error rates and power of the proposed method. Our simulation results demonstrate that the type I error rates of HCLC are effectively controlled in different realistic settings. HCLC either outperforms all other methods or has statistical power very close to that of the most powerful method with which it was compared.


Subject(s)
Cluster Analysis , Genetic Variation/genetics , Models, Genetic , Genome-Wide Association Study , Humans , Phenotype
17.
J Comput Chem ; 42(23): 1662-1669, 2021 09 05.
Article in English | MEDLINE | ID: mdl-34114237

ABSTRACT

Resonance theory remains very useful for understanding valence electron structure. However, such a viewpoint is not usually obtained from general-purpose quantum chemical calculations; instead, it requires rather special treatments such as valence bond methods. In this study, we propose a method based on second quantization to analyze the results of general-purpose quantum chemical calculations from a local point of view of the electronic structure, and we analyze diazadiboretidine and the tautomerization of formamide. The method requires only the "PS"-matrix, built from the density matrix (P-matrix) and the overlap matrix, and can be computed at a cost comparable to that of Mulliken population analysis. A key feature of the method is that, unlike other methods proposed so far, it makes direct use of the results of general-purpose quantum chemical calculations.
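For reference, the "PS"-matrix the method builds on is the same object that underlies Mulliken population analysis: the diagonal of PS gives gross orbital populations, and its trace gives the electron count. The two-orbital P and S below are hypothetical values for a minimal H2-like model, not output of an actual calculation.

```python
import numpy as np

# Overlap matrix S for two normalized basis functions (hypothetical overlap).
S = np.array([[1.0, 0.6],
              [0.6, 1.0]])

# Density matrix P for two electrons in the normalized bonding combination
# c = (1, 1) / sqrt(2 * (1 + S12)).
c = np.array([1.0, 1.0]) / np.sqrt(2 * (1 + 0.6))
P = 2.0 * np.outer(c, c)

PS = P @ S
orbital_populations = np.diag(PS)   # Mulliken gross orbital populations
total_electrons = np.trace(PS)      # should equal the electron count (2)
```

The cost claim in the abstract follows from this structure: forming PS is a single matrix product over quantities every SCF program already has.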

18.
NMR Biomed ; 34(4): e4482, 2021 04.
Article in English | MEDLINE | ID: mdl-33530131

ABSTRACT

Short-TE proton MRS is used to study metabolism in the human brain. Common analysis methods model the data as a linear combination of metabolite basis spectra. This large-scale multi-site study compares the levels of the four major metabolite complexes in short-TE spectra estimated by three linear-combination modeling (LCM) algorithms. 277 medial parietal lobe short-TE PRESS spectra (TE = 35 ms) from a recent 3-T multi-site study were preprocessed with the Osprey software. The resulting spectra were modeled with Osprey, Tarquin and LCModel, using the same three vendor-specific basis sets (GE, Philips and Siemens) for each algorithm. Levels of total N-acetylaspartate (tNAA), total choline (tCho), myo-inositol (mI) and glutamate + glutamine (Glx) were quantified with respect to total creatine (tCr). Group means and coefficients of variation of metabolite estimates agreed well for tNAA and tCho across vendors and algorithms, but substantially less so for Glx and mI, with mI systematically estimated as lower by Tarquin. The mean coefficient of determination over all pairs of LCM algorithms, across all datasets and metabolites, was R2 = 0.39, indicating generally only moderate agreement of individual metabolite estimates between algorithms. There was a significant correlation between local baseline amplitude and metabolite estimates (cohort mean R2 = 0.10). While mean estimates of major metabolite complexes broadly agree between linear-combination modeling algorithms at group level, correlations between algorithms are only weak-to-moderate, despite standardized preprocessing, a large sample of young, healthy and cooperative subjects, and high spectral quality. These findings raise concerns about the comparability of MRS studies, which typically use one LCM software package and much smaller sample sizes.


Subject(s)
Linear Models , Proton Magnetic Resonance Spectroscopy/methods , Algorithms , Choline/metabolism , Creatine/metabolism , Glutamine/metabolism , Humans
19.
Sensors (Basel) ; 21(9)2021 Apr 30.
Article in English | MEDLINE | ID: mdl-33946508

ABSTRACT

Digital image correlation (DIC) for displacement and strain measurement has flourished in recent years. The DIC approach extracts displacement from a series of images in two steps, integer-pixel and subpixel matching, and identification accuracy depends mainly on the latter. This study proposes a subpixel displacement matching method named the double-precision gradient-based algorithm (DPG). First, the integer-pixel displacement is identified using a coarse-fine search algorithm. To improve accuracy and anti-noise capability in the subpixel extraction step, the traditional gradient-based analysis of the speckle patterns is extended to account for the influence of noise: the two nearest integer pixels in one direction are both used as interpolation centers, and two subpixel displacements are extracted by a five-point bicubic spline interpolation algorithm around these centers. A novel combination coefficient that accounts for noise contamination merges the two subpixel displacements into the final identified displacement. Results from a simulated speckle pattern and a painted-beam bending test show that the proposed method is about four times more accurate than the traditional gradient-based method, matching the high accuracy of the Newton-Raphson method. The proposed method reaches an accuracy of 92.67%, higher than the Newton-Raphson method, and has better anti-noise performance and stability.
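The integer-then-subpixel idea can be illustrated with the standard parabolic refinement around the best integer shift. This is deliberately simpler than the DPG method described above, which instead interpolates around the two nearest integer pixels with five-point bicubic splines and merges the two estimates with a noise-aware combination coefficient.

```python
import numpy as np

# Generic subpixel refinement sketch (NOT the paper's DPG algorithm):
# after the integer-pixel match (argmax of the matching scores), fit a
# parabola through the peak score and its two neighbours to locate the
# peak with subpixel precision.
def subpixel_peak(scores: np.ndarray) -> float:
    """Return the peak location of `scores` with subpixel precision."""
    m = int(np.argmax(scores))                  # integer-pixel step
    if m == 0 or m == scores.size - 1:
        return float(m)                         # no neighbours available
    c_l, c_m, c_r = scores[m - 1], scores[m], scores[m + 1]
    return m + 0.5 * (c_l - c_r) / (c_l - 2.0 * c_m + c_r)

# Matching scores sampled from a parabola peaking at shift 2.3,
# so the estimator recovers the subpixel location exactly.
shifts = np.arange(5, dtype=float)
scores = 1.0 - (shifts - 2.3) ** 2
print(subpixel_peak(scores))  # 2.3
```

Noise in the scores perturbs the three samples used by the parabola, which is the failure mode the paper's two-center, noise-weighted combination is designed to mitigate.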

20.
J Environ Manage ; 285: 112049, 2021 May 01.
Article in English | MEDLINE | ID: mdl-33578210

ABSTRACT

Forests play an important role in maintaining water ecosystem services, such as drinking water provision. Payment for ecosystem services is therefore an essential instrument to promote forest restoration in agricultural watersheds. However, funds are limited and must be planned carefully to ensure water resource conservation and the improvement of water ecosystem services. In this context, our study aimed to identify priority areas for forest restoration in agricultural landscapes, based on water ecosystem services. To this end, we developed a decision-making support model for agricultural watersheds (in the Atlantic Forest region) based on mixed approaches: multicriteria evaluation (MCE) and a participatory technique. The model will help decision-makers and stakeholders set priorities for implementing payment-for-ecosystem-services programs. We evaluated its application in watersheds with different forest cover patterns to check whether it can be applied to different landscape patterns. The model is based on the following criteria, produced with high-resolution data and ranked through the participatory technique according to their importance for the study: proximity to springs, slope, soil erodibility, topographic index, and land use/land cover (LULC). The criteria were aggregated by the weighted linear combination (WLC) method, an MCE method. The priority maps showed areas classified as high priority near the rivers (at most 200 m from rivers), on the steepest slopes (>40%), on soils with high erosion potential, and predominantly on agricultural land. However, this class had a larger share of area associated with native forest in the forested watershed (native forest covers 55% of its area) than in the non-forested watershed (native forest covers 25%). Another notable feature of the final maps was the high percentage of area in the medium-priority class, which is characteristic of the WLC method.
Areas classified as high and medium priority were therefore defined as targets for forest restoration in the watersheds. We conclude that, for small watersheds, the MCE method with high-resolution data supports an appropriate prioritization of areas for forest restoration, aiming at the improvement of water ecosystem services. Our model can thus be applied to various payment-for-ecosystem-services schemes in agricultural landscapes worldwide.
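The WLC aggregation step can be sketched as a weighted sum of normalized criterion rasters. The tiny 2x2 rasters, criterion names, and weights below are invented for illustration (the study uses five criteria with weights from the participatory ranking), and each criterion is assumed to be pre-oriented so that larger normalized values mean higher restoration priority.

```python
import numpy as np

# Sketch of the Weighted Linear Combination (WLC) MCE step: min-max
# normalize each criterion raster to [0, 1], then sum with weights
# that add up to 1.  Rasters and weights are hypothetical.
def wlc(criteria: dict, weights: dict) -> np.ndarray:
    """Weighted sum of min-max-normalized criterion rasters."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    out = np.zeros_like(next(iter(criteria.values())), dtype=float)
    for name, raster in criteria.items():
        r = raster.astype(float)
        norm = (r - r.min()) / (r.max() - r.min())  # scale to [0, 1]
        out += weights[name] * norm
    return out

# Two invented criteria on a 2x2 grid, already oriented so that
# larger values indicate higher priority.
criteria = {
    "slope":       np.array([[10, 40], [5, 20]]),
    "erodibility": np.array([[200, 50], [400, 100]]),
}
weights = {"slope": 0.6, "erodibility": 0.4}
priority = wlc(criteria, weights)   # higher value = higher priority
```

Because every cell is a weighted average of [0, 1] scores, intermediate values dominate the output, which matches the abstract's observation that the medium-priority class covers a large share of the final maps.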


Subject(s)
Conservation of Natural Resources , Ecosystem , Agriculture , Forests , Rivers , Water